Tracking how long we spend online, how many times we pick up our phones, and how many hours we devote to certain apps has become a bit of a global obsession in the news media and within families. If it is true that the average American adult spends nearly 6 hours per day on digital media, should we automatically call this “addiction”? We might question what is “too much” and what is healthy, but we should also resist scaremongering and moral panic about technology, and carefully assess claims to scientific certainty when high-quality research is lacking.
It’s no accident that the time we spend online has increased dramatically over the last decade. And it isn’t only because mobile phones and internet connections are becoming faster and more affordable in most parts of the world. Our phones have become our alarm clocks, navigation aids, memory enhancers and constant companions. Smartphone apps and social media are often also explicitly designed to optimize engagement, like comments and shares, and to increase the amount of time we spend watching, reading, scrolling or playing.
Natasha Dow Schüll calls this “addiction by design”. Schüll, an associate professor at New York University, spent 15 years studying how casinos and slot machines pull people into an addictive “machine zone” that is hard to escape. She and many others see the same design principles being applied in smartphone apps, social media platforms and recommendation engines. Such intent on the part of companies has been documented, but evidence of how much control they actually wield over users remains inconclusive.
To test this point, scientists Amy Orben and Andrew Przybylski at Oxford University examined existing data sets on the relationship between technology use and well-being in young people. The results, published in Nature Human Behaviour in 2019, show no overwhelmingly consistent correlation –– good or bad. Other factors had greater impact.
“In one dataset, for example, the negative effect of wearing glasses on adolescent well-being is significantly higher than that of social media use. Yet policymakers are currently not contemplating pumping billions into interventions that aim to decrease the use of glasses,” writes Orben in a behind-the-scenes analysis for the Nature Research Community.
Anecdotally, countless people report feeling anxious, sad or depressed about the way technology has meshed with their lives, or dissatisfied with the terms on which they are offered free services that vacuum up personal data. Many actively seek to change their relationships with their devices: digital detoxes, social media hiatuses, or buying phones that can’t go online are but a few of the tactics employed by those privileged enough to be able to choose to go offline.
One of the most visible organizations working to stop the design of addictive technologies is the Center for Humane Technology, whose co-founder Tristan Harris was formerly a design ethicist at Google. The organization, launched in 2016 (originally under the name Time Well Spent) and advised by former and current technology executives, helped spark a public debate about the vast potential for harm from technology that is not designed with humanity’s best interests in mind.
Tech industry leaders responded to a deluge of bad publicity by designing new tools to assist people in managing the time they spend with devices and in apps. In an apparent nod to the organization, Facebook CEO Mark Zuckerberg announced in 2018, “One of our big focus areas for 2018 is making sure the time we all spend on Facebook is time well spent…”
Later that year, Facebook introduced new tools to support “safety” and “well-being”, including options to mute notifications and set time limits for Facebook and Instagram. Meanwhile, Apple introduced a new iPhone feature called Screen Time to help users “understand and take control of the time” they spend with their devices. And as part of a digital well-being initiative, Google announced similar controls for Android and YouTube, including an app timer.
But such tools do little to change underlying design practices. Business models that incentivize engagement still reign. As awareness of the questions and potential risks of the current systems grows, so do the ways to help us understand how we’re using technology — and make choices about how and whether to do things differently. For instance, one whimsical browser extension, Facebook News Feed Eradicator (for Firefox or Chrome), aims to counteract the lure of social media by replacing your news feed with “an inspiring quote”.
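The mechanics behind an extension like this are refreshingly simple. As a rough sketch — not the actual News Feed Eradicator source; the selector, quote list and function names here are illustrative assumptions — a browser extension’s content script might locate the feed element on the page and swap its contents for a quote:

```javascript
// Illustrative content-script logic for a feed-replacing extension.
// The CSS selector and quotes below are hypothetical examples.
const QUOTES = [
  "The best way out is always through. – Robert Frost",
  "Well begun is half done. – Aristotle",
];

// Pick one quote at random from the list.
function pickQuote(quotes) {
  return quotes[Math.floor(Math.random() * quotes.length)];
}

// Find the feed element and replace its contents with a quote.
// Returns false if the feed isn't found (e.g. the page layout changed).
function eradicateFeed(doc, selector, quotes) {
  const feed = doc.querySelector(selector);
  if (!feed) return false;
  feed.textContent = pickQuote(quotes);
  return true;
}
```

In a real extension this would run against the live page’s `document`, with the script registered in the extension manifest to run on the social network’s domain; the fragility lies in the selector, which breaks whenever the site’s markup changes.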
But the responsibility for change shouldn’t lie with individuals alone.
We also need collective action to design different incentives and business models. There is an opportunity for people within the tech sector –– developers, designers, content creators, marketers and others –– to be leaders in creating apps and services that do not encourage addictive behaviours and instead incentivize positive, healthy online experiences.