This discussion was originally filmed on February 19, 2020, and the panelists reunited via Zoom on May 4 to reconsider the issue (see final question).
What is tech addiction?
Adam Alter: Tech addiction is a behavioral addiction: a behavior that, in the short run, you want to enact over and over again. You do it compulsively, and you enjoy doing it, but in the long run it harms you in some respect, affecting your psychological, social, or physiological well-being. All sorts of things happen when we are glued to our screens; people spend more money than they'd like, for example. Substance addiction and tech addiction are different, but many of the consequences are similar. It may be less immediately unhealthy to be a tech addict, but in the long run, when you add up all the tech addicts, there are negative consequences for society in how we interact with each other. Playgrounds, restaurants, and the dinner table are degraded because we all spend so much time on screens.
Nicholas Epley: Other people are deeply important for human well-being. The quality of our connections, the nature of our relationship to other people, is the biggest determinant of our happiness or well-being. Anything that distracts us from positive connections with other people has the potential to undermine our well-being. You might really enjoy your phone, but the data suggest that it doesn’t bring you the same well-being as connecting positively with another person. Experiments find that when you’re stressed, connecting with your mother over instant messaging reduces that stress a tiny bit, but only trivially compared with talking to her. My research suggests that the voice is really important for conveying the presence of another person’s mind and creating connection, whereas text lacks the paralinguistic cues that help create a sense of connection with another person.
How do developers of websites and apps keep us hooked?
Marshini Chetty: A lot of websites use dark patterns, a term for user-interface design patterns that coerce you into making a decision you might not otherwise have made. I coauthored a study of countdown timers on websites that try to get you to buy by creating a sense of urgency. We found that in many instances these timers were fake, resetting at random. Or, for example, if you watch an episode of a show on Netflix, at the end, the next episode automatically starts to play. In some video games, you have to play by appointment. In Pokémon Go, for example, some Pokémon are nocturnal. They're only available at night, so even if you're supposed to be sleeping, you have to be up to catch that Pokémon. In email notifications from social media, the "likes" are aggregated according to a variable-ratio reinforcement schedule that prompts you to constantly check your feed, because you don't know when that counter is going to be updated.
So it's not just that we might be susceptible to this kind of addiction, but that technology is being engineered to give us these quick fixes. These kinds of rewards are designed to keep us addicted to the technology. They play on our cognitive and behavioral biases to manipulate us into staying on the technology longer. Dark patterns have always been around in some form, such as candy bars displayed next to the checkout line, but they can now operate at a much larger scale. They can also be personalized, to identify which dark patterns an individual is most responsive to. And because this is not really being regulated, these patterns could well be adding to the addiction problem.
Alter: We are much better at getting people addicted than we ever were before, which means you can be much more purposeful about designing things with that in mind. But you don’t even need to be purposeful anymore, because you can throw a billion data points at the wall and look at what sticks best, and you don’t need theory. So you don’t need to understand humans. You don’t need a degree. You don’t need a PhD. All you need is billions of data points looking at how people engage with an experience. You can do lots of little trials, see what sticks, and create a weaponized version of the experience. We didn’t have the feedback or the access to the data to do that before. We’re in a losing battle on the other side of the screen, fighting against reams of data and people with a really smart sense of what makes us tick.
Epley: All of this just describes the scientific process. This is experimentation. Researchers and scientists have been doing this for centuries. The method isn't problematic; any good method can be used with ill intent. The scientific method that gave us lifesaving medicines also gave us opioids. The same scientific practice that allows us to understand what makes people happy and sad, what helps people make good choices and bad choices, and what helps people live better lives can also be deployed to help a company make money. How the method is used is really driven by the user's intent. If the intent is to capture our attention, the method creates addiction, and that's not a great outcome for us.
Are we becoming more and more addicted to technology all the time?
Chetty: One thing that's changed over time is that we all have access to these technologies all the time, and they're becoming more ubiquitous. It's not just the phone in your pocket, but also an Alexa [smart speaker] in your home or a smart TV, and so on. It's hard to avoid, and behavior such as constantly checking your email in a social setting is no longer frowned upon.
Epley: With a lot of substance addiction, the negative consequences are obvious. But not all the negative consequences of this tech use are obvious. Take texting. In our research, we find that a person's voice is really critical for conveying the presence of mind: you sound more thoughtful, intelligent, and rational. I make a different inference about your intellectual character when I hear what you have to say than when I read the same thing. But the effect of that is not obvious. We don't get feedback on some of the negative social consequences of technology. If I don't talk to another passenger on the train, I don't learn that we would have had a great conversation and she'd have been super interesting. I just learn whatever was in my email. So I don't find out that the tech was keeping me from another wise, pleasant experience. We get really smart in the world as human beings when we get really good feedback, and technology doesn't always give us great feedback. It's even harder to aggregate individual experiences to pinpoint the social consequences. The long causal chain is hard to see out there in the world. It's hard to see all the different ways that technology, which is affecting humanity at a really unprecedented scale, is shaping our social lives. For example, is the divisiveness that we see here in the United States partly because technology has enabled us to connect easily with folks who are part of our "tribe"? Anyone can find a Facebook group that creates endless opportunities for "us versus them" thinking.
Alter: In 2014, I wrote the proposal for my book Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked. When I spoke to a number of potential editors, some of them said, “This is just not an issue. No one cares about it. No one thinks it’s a big deal. This is a storm in a teacup.” The editor whom I worked with said, “I think you might be on to something. This is interesting. Let’s run with it. Let’s see what happens.” No one says that anymore. I used to have to spend the first 20 minutes of any talk saying, “This is the thing you should care about.” I don’t have to do that anymore. Everyone now understands that this is a concern. The consequences are not as immediate as watching someone experience heroin addiction, but certainly we’ve gotten to the point now where we are convinced as a society that this is a problem. But there’s been some evolution. When I published the book, parents said to me, “I don’t know what to do about my kids.” Now kids are saying to me, “I don’t know what to do about my parents.” Younger people are getting better at dealing with the technology. They’ve grown up with it, and it’s older generations who are struggling with it more. If I have an audience, I’ll ask them to indicate where they lie on a spectrum from 1: “I am perfectly happy with my interactions with technology,” to 10: “This is destroying my life. I need to make major changes.” In almost every room I’ve ever done this, the most common response is between 6 and 8. So most adults say, “It’s not ruining my life, but it’s a real problem for me personally, and I need to do something different.”
How should we address the problem?
Alter: There are two basic approaches: the grassroots, bottom-up approach, where each individual has certain tactics to deal with the problem; and the top-down approach of government legislation and workplace policies. You have to hope the latter happens, either through pressure from consumers or pressure on governments, legislatures, and so on. It's happening in certain parts of East Asia, Northern Europe, and Western Europe, but not so much in the US right now. If there's enough pressure, over time that could evolve, and it seems to be leaning in that direction. More and more countries are introducing legislation. Perhaps the US government will at some point. But because not much is happening currently at the top-down level, apart from a few isolated organizations, we as individual consumers have to do the grassroots work ourselves.
Chetty: There is a proposal in the US Senate to regulate the use of dark patterns. One of the practices it specifically targets is nudging kids under the age of 13 into compulsive gaming habits. So in the US, there does seem to be more concern about children in general being on technology. The American Academy of Pediatrics changed its guidelines for screen time. Even in the devices themselves, more operating systems have rolled out screen-time awareness tools. I agree that as an individual, you should try to curb your tech addiction, but you can't expect everyone to do that. Not everyone is informed. Children need help to protect themselves. For the elderly, or those with cognitive impairments, regulation is needed so that someone is providing oversight. The proposal in the Senate is geared only toward big companies with more than 100 million monthly active accounts. Regulators can't go after everyone, but if they make an example of some of the bigger players, hopefully others will follow suit. You can't police everything, but I do think there's a place for that as well.
Epley: Businesses need to make money to sustain themselves, and as a general rule, they haven't been great at acting prosocially unless doing so also aligns with doing well financially. To the extent that investors start caring about these issues with their pocketbooks, it will matter. And there are companies that are trying to do real good. Facebook, for instance, probably has good intent behind lots of its products. One thing that's not always obvious, though, is how its business practices might detract from that goal. A company such as Facebook has to draw people's attention because that's the only way it makes money: through ads. That's its business model. If Facebook wanted to design a product that was systematically better for users, it might also build a separate channel where people pay for a subscription, and then it's not incentivized to keep them as hooked. So businesses can make choices that are more socially responsible. My hope would be that as the negative social consequences of these phenomena become more widely known, companies become better at doing that.
Has the COVID-19 crisis changed the way you think about our relationship to screens?
Alter: When we're forced to use screens, it throws into relief how important it is to understand how to maximize the benefits, to get as much good from screens as possible, and to minimize the costs. This goal that people have long had of disconnecting completely from screens, I've never thought that's realistic. The key is to understand as much as you can about what screens are doing and what different aspects of screen time do to us, so you can then decide how to structure your life. Screens are not monolithic. There isn't one thing known as "screen time." You could sit in front of a screen doing work. You could have birthday parties in front of a screen. You could watch mindless content. One important step for everyone to take is to do an audit of what you're doing with that screen time. Maybe track your usage for a couple of days, and try to break it down into the benefits and the costs, and what you're actually doing. It needs to be an audit process where you say to yourself, "What does a screen mean to me right now? Are there other things I could be doing?" That will vary from person to person.
Epley: My thinking on the impact that screens have on us has not changed, but the crisis does highlight different aspects of the effects that screens can have on us. I study social connection in my research, and what we've seen over the past few months is just how good screens can be for people, as long as we use them in high-fidelity ways. We're being asked to socially distance from each other, but that's a misnomer. What we're really being asked to do is to physically distance from each other. We can use technology to keep ourselves socially connected, even when we're physically apart, as long as we use the technology optimally. That means connecting with each other using voice. Video doesn't hurt either, but in our research we find that using voice to connect with somebody creates a much stronger sense of connection than texting or typing.
Chetty: We need to be even more conscious about the way we're spending time on technology during the pandemic. That's because we're dependent on technology for many different reasons: homeschooling, connecting with other people, and entertainment. There are a few things we can do. One, we have to go easy on ourselves, because we might be on screens more than usual. Two, we can use technology positively, but at the same time, we can't forget those rules we talked about from before the pandemic. If we're not being mindful about how we're spending time online, it can be time badly spent. We need to ask ourselves: Is it productive time, is it helping us relax, or is it harmful?