When does a regular person become a “user”? It’s long been an inside joke in tech companies that the word for drug addicts and consumers of technology is the same. But today addiction is becoming a competitive advantage. By bringing insights about the human operating system together with proven principles of user interaction, designers have cracked the formula for creating habit-forming technologies.
The principles may be psychological, but the effects are real, and they span a broad spectrum. Software apps and hardware devices alike can nudge their users into healthy habits or lure them into destructive addictions. It’s one of technology’s secret superpowers, and the ethics of wielding it have been left to the discretion of private companies.
In the span of a few decades, technology has evolved from industrial equipment to personal accessory to an extension of the self. Smartphones permanently in hand, we use our apps and the networks they connect to not so much for any particular function they provide but as a general way of relating to the world. In parallel, there’s been growing interest from psychology and behavioural economics in understanding the human mind as its own kind of system software. Books like Predictably Irrational: The Hidden Forces That Shape Our Decisions (2008), by Dan Ariely, and Thinking, Fast and Slow (2011), by the Nobel Memorial Prize-winning economist Daniel Kahneman, have demonstrated that we use the rational, logical part of our brains much less often than we’d like to believe. Most of the time, we’re reacting emotionally or unconsciously to stimuli from our surroundings instead of making conscious, rational choices.
Above all, we’re creatures of habit. The consumer psychologist Wendy Wood of the University of Southern California notes, “About 45 percent of people’s behaviour is repeated almost daily and usually in the same context.” With nearly half of our actions being that predictable, it’s no surprise that, as she puts it, “habits characterise a significant segment of consumer behaviour that is linked to important marketing outcomes”. In the fierce competition over users’ limited attention, tech companies need customers who are habituated. The industry calls this “optimising for engagement”, and it is quantified with measures like “time on device” and DAU/MAU (daily-to-monthly active user) ratios. A DAU/MAU ratio of 50 percent or above means that, on an average day, half of the people who use the service in a given month open it. Because popular apps like Instagram commonly meet those metrics, other companies make that level of “engagement” their goal too. “When you’re setting up a business based on numbers rather than people, you’re more likely to be desensitised to the ethics of this or its impact on users,” says Harry Brignull, a user-experience consultant who writes the blog www.90percentofeverything.com. “Online businesses in particular have a lot of physical and emotional distance from their customers.”
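To make the metric concrete, here is a minimal sketch of how a DAU/MAU ratio might be computed from an activity log. The data and function names are hypothetical, invented for illustration; they are not from any company mentioned in this article.

```python
from datetime import date

# Hypothetical activity log (illustrative only): user -> set of dates active.
activity = {
    "alice": {date(2016, 3, d) for d in range(1, 32)},  # every day in March
    "bob": {date(2016, 3, d) for d in (1, 5, 12)},      # three days only
}

def dau_mau(activity, month_days):
    """Average daily active users divided by monthly active users."""
    month = set(month_days)
    # MAU: anyone active on at least one day of the month.
    mau = sum(1 for days in activity.values() if days & month)
    # DAU for each day, then averaged over the month.
    daily_counts = [sum(1 for days in activity.values() if day in days)
                    for day in month_days]
    avg_dau = sum(daily_counts) / len(month_days)
    return avg_dau / mau if mau else 0.0

march = [date(2016, 3, d) for d in range(1, 32)]
ratio = dau_mau(activity, march)
print(f"DAU/MAU: {ratio:.0%}")  # one daily user plus one rare user -> about 55%
```

One heavy user and one occasional user already push this toy service past the 50 percent mark, which shows why the ratio rewards products whose users return every day rather than products with many casual users.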
The focus, instead, is on the success of the product. Here again, questions of ethics or even the quality of the user’s experience are overshadowed by the pressures of business. “Product managers are almost always the ones making these decisions, and they have an especially tough job,” says Cennydd Bowles, a former design manager at Twitter, who is now an independent digital product designer. “Because they are accountable for the success or failure of their business area, they are more driven by numbers and less open to deviating from that.”
Finally, there seems to be a consensus in the broader culture on the desirability of creating addicted customers. With Hooked: How to Build Habit-Forming Products (2014), one might say that author Nir Eyal literally wrote the book on this trend. (It is the No 1 bestseller in the Product Management category on Amazon.) But Eyal’s contribution was not so much to invent a formula as to identify a pattern. From Facebook and Twitter to online gaming, he outlined the methods being used successfully by companies with highly engaged users to create a cycle he calls the ‘Hooked Model’. Here’s how it works. Prompts such as a new-message notification on your phone offer an immediate way of responding to ambiguous internal states such as anxiety, boredom or FOMO (fear of missing out). Because it’s easier to react than to think, external triggers such as the sound of a bell or the flash of a red button cue most people to do exactly what the app designers want, which is to get out their phone and start clicking. That much may be unconscious or automatic, but it’s not yet addictive. The habit itself is forged in the second half of the cycle.
When you get that new message notification, you don’t know yet what awaits you in your inbox: It could be good news or bad, an old problem or a new opportunity. This unpredictability is known as “variable rewards”: The payout for opening the app is uncertain. Yet this lends the action an almost narcotic effect. “Variable rewards are one of the most powerful tools companies implement to hook users,” writes Eyal. “[They create] a focussed state, which suppresses the areas of the brain associated with judgment and reason while activating the parts associated with wanting and desire.” In other words, the more we shake the gift-wrapped box of variable rewards, wondering what is rattling around inside, the more excited and less rational we become. This is why we check email or Facebook even when we haven’t received a notification. Like a prospector panning for gold, we keep returning to the muddy water in the hope of finding that one gleaming nugget.
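The mechanism resembles what behavioural psychologists call a variable-ratio reinforcement schedule: each check pays off with some fixed probability, so the gap between rewards is unpredictable. A purely illustrative simulation (my sketch, not anything from Eyal’s book):

```python
import random

def checks_until_reward(p=0.1, rng=None):
    """Simulate checking an inbox until an interesting message ('reward')
    finally arrives. Each check pays off independently with probability p."""
    rng = rng or random.Random()
    checks = 1
    while rng.random() >= p:
        checks += 1
    return checks

rng = random.Random(42)  # fixed seed so the run is reproducible
trials = [checks_until_reward(p=0.1, rng=rng) for _ in range(5)]
print(trials)  # the interval between rewards varies wildly from trial to trial
```

Some checks pay off immediately, others only after dozens of tries; it is precisely that uncertainty, rather than the average payout, that keeps the prospector panning.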
Even when that fails to happen, the effort we spend sifting through our newsfeed or inbox makes it more important to us. “People view resources they spend time on as having value,” says Victor Yocco, author of the forthcoming book Design for the Mind: Seven Psychological Principles of Persuasive Design. “It’s the principle known as escalation of commitment.” As in any relationship, commitment requires proof of being invested in the outcome. This is the final stage of the Hooked Model, where users contribute something of their own to the product: posting a photo on Facebook, sending a tweet, or adding to their LinkedIn profile. Eyal calls this ‘reloading the trigger’, because it not only increases the chances that the cycle will soon start over again (you will get a notification that someone liked your photo or retweeted your tweet) but also makes you more emotionally attached to the product, because now it’s “yours”. With each pass through the cycle, using the product feels more like a personal habit, making it harder to tell whether it’s something you actually want to do or a compulsion scripted by design.
But where is the line between habits and addictions, when it comes to the technologies we use most often? “A habit is something you benefit from, where there’s a symbiotic relationship between you and the brand owners,” suggests Brignull. “I’m meeting people and expanding my network by using Facebook and Twitter.” And there is a whole range of personal productivity and health apps, such as SuperBetter and Fitocracy, that employ similar psychological principles to get users hooked on good habits instead of bad. On the dark side, Brignull cites free-to-play games that are designed to keep their users playing, and that offer dubious value relative to the amount of time—and sometimes money—spent.
Hooked Model aside, what many addictive technologies have in common is that they are free. From Facebook to games such as Candy Crush and FarmVille, users don’t need to pay money to get that first notification. But from that point on, they start to pay in terms of time and attention. And the cost can be high. A study published in October 2015 by researchers at Bournemouth University in the UK, in tandem with a local rehabilitation agency, found that four out of five students reported feeling “panic, confusion and extreme isolation” when they were separated from their devices of choice for 24 hours. As a result of that experiment, the study’s authors advocated putting warning messages on smartphones similar to those employed on cigarette and alcohol packaging.
Apart from warnings to consumers and, potentially, oversight by government regulators, the parallels with drugs and alcohol might also suggest a model for individuals to deal with addictive technology. “What is effective in terms of the psychology of health behaviours is the perception of control over where the addiction comes from,” says Yocco. “For people trying to quit smoking or drinking, it may be a support programme or medication that controls their urges. It’s key for people to see where that control lies.” In a post on his blog titled ‘Un-Hooked: Increasing Focus in the Age of Distraction’, Eyal says the issue is putting technology in its place. Sometimes literally: Among other things, he advocates charging your phone outside the bedroom and even fitting your internet router with a timer to shut off the Wi-Fi at a certain hour each night.
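For readers without a timer plug, the same curfew can be improvised in software. A minimal sketch for a Linux machine using cron and rfkill (my example, not Eyal’s; assumes rfkill is installed and the cron jobs run with sufficient privileges):

```shell
# Hypothetical crontab entries: cut Wi-Fi at 23:00, restore it at 07:00.
0 23 * * * /usr/sbin/rfkill block wifi
0 7 * * * /usr/sbin/rfkill unblock wifi
```

The point is the friction, not the enforcement: re-enabling Wi-Fi by hand is easy, but having to do so deliberately interrupts the automatic reach for the feed.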
Creating physical barriers to using addictive technology is one part of the equation. But it’s also a matter of recognising the internal triggers that lead to unwanted habits and compulsive behaviour. As Predictably Irrational author Dan Ariely notes, “To make informed decisions, we need to somehow experience and understand the emotional state we will be in at the other side of the experience.” If you know where your weaknesses lie, he recommends, you can plan ahead with pre-commitments that divert you onto a different course of action when temptation strikes: taking a book to read on the train rather than playing a game on your phone, for example, or deciding you will take a walk or call a friend when you feel lonely, rather than checking Facebook.
But is there anything that Facebook or other businesses with a “highly engaged” user base could do to promote healthier use of their technology? “I would love to see a company say, ‘We think some of the behaviours you are demonstrating with our product are not great for you’ or ‘We’ll turn that off’. But with their internal reward systems so predicated on time on device, it’s hard to imagine that happening,” says Cennydd Bowles. “There would be no business benefit to them doing it.”