Smart Phones and Drained Brains

As we use our mobile phones to do more and more things, we are paradoxically able to accomplish less, even when the phones are face down and turned off.

My last column explored how smart glasses (“heads-up displays,” or “HUDs”) will increase the amount of digital information we look at, with the ironic twist that these same devices will erode our shared experience of reality. But we don’t need to look to a future technology to see how challenging it is to pay attention to what’s around us. We already carry a dislocating technology around in our pockets: our phones.

I’m deliberate when I say “dislocating” rather than “distracting,” because we’re not necessarily distracted: often we’re fiercely focused on our phones. But we’re dislocated because, unless we’re taking pictures or videos, we’re not engaged with our immediate physical environment. Distraction is a subset of dislocation.

The charts below show the many ways we use our phones, as described in the newest version of the Center for the Digital Future’s longitudinal “Surveying the Digital Future” report (it comes out next month):

As the report will observe, texting (93%) has edged out talking (92%) as the most common use of a mobile phone because texting increased six percentage points year over year while talking stayed flat.

It’s easy to get sucked into data on the individual functions (for example, 67% of people take videos with their phones, a nine-point increase), but doing so misses the big picture: with the exception of talking, Americans have increased their use of every mobile phone function over four years (2012 to 2016).

Phones and the Future of Focus

As with all technologies, increased mobile phone use has both an upside and a downside.

On the positive side, we’re more connected to our loved ones and the information we want than ever before. We get news of the world instantly, and we store our important information (from shopping lists to medical documents to that pinot grigio we liked so much at that restaurant that we took a picture of the label) in our phones and online, where we can always get to it. (I’m the king of productivity apps and can no longer imagine life without Evernote.) With games and apps and email and social media, mobile phones have engineered boredom out of our lives because there is always something fun to do.

But on the negative side, we use our phones more often to do more things, and that time and attention have to come from somewhere: they come from our engagement with the physical reality around us, including the people we are with, who increasingly feel ignored unless they too have their noses in their smart phones. If we’re playing Candy Crush while waiting in the supermarket checkout line, then we’re not chatting with the cashier or the other people in line who might have something interesting to say. And while it sucks to be bored, boredom leads to daydreaming, and most of the great ideas in human history started with a daydream.

Brain Drain

First we’re dislocated, then we’re distracted. In other words, when we finally want to focus on the world around us, it’s getting harder to do so because of our mobile phone use. This is the finding of an important study that came out in the Journal of the Association for Consumer Research in April.

The article, “Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity” by Adrian F. Ward, Kristen Duke, Ayelet Gneezy, and Maarten W. Bos, usefully distinguishes between the things we think about (the orientation of our attention) and how much energy we have to think about those things (the allocation of our attention).

Mobile phones, the authors find, suck attentional energy away from non-phone-based activities, and since we have a limited amount of attention to spend, we’re less capable of tackling the task in front of us.

What’s startling about the study is that mobile phone distraction does not just happen when our phones are on, beeping and flashing and vibrating for our attention. Our mobile phones reduce our ability to function even when the phones are turned off and face down on the table or desk where we’re working. As the authors observe, trying to increase your focus using “intuitive ‘fixes’ such as placing one’s phone face down or turning it off are likely futile.”

Performance gets slightly better if the phone is out of sight in a pocket or bag. Performance substantially increases only when the mobile phone is in another room, entirely out of sight and somewhat out of mind. And the more dependent you are on your mobile phone, the more your focus blurs when your phone is in sight or nearby.

It gets worse: the data convincingly shows that our ability to perform erodes when our phones are nearby, but we do not recognize the degradation:

Across conditions, a majority of participants indicated that the location of their phones during the experiment did not affect their performance (“not at all”; 75.9%) and “neither helped nor hurt [their] performance” (85.6%). This contrast between perceived influence and actual performance suggests that participants failed to anticipate or acknowledge the cognitive consequences associated with the mere presence of their phones.

In other words, we think that we can handle the distraction that comes with our phones being around, but we can’t. In this regard, mobile phones are a bit like drunk driving or texting while driving: we think we can do it without consequence, but often we aren’t aware that we’re impaired until it’s too late. (Psychology Today has a nice summary of the study findings.)

Implications: Budgeting Attention

We have a limited amount of attention: this is why a common metaphor for directing our attention towards someone or something is “to pay attention.” Attention is like a currency that we can budget or hoard, but we tend not to do so. Instead, we are attention spendthrifts, throwing our cognitive capacity at all the tasty tidbits that come out of our screens.

The problem with the “pay attention” metaphor is that it obscures something important: our attention can disappear without our having made a conscious decision to pay. For example, when we have notifications enabled on our laptops, tablets, and mobile phones — especially the latter — those bleeps and flashes and buzzes are attention taxes that we don’t realize we’re paying.

What the “Brain Drain” study shows is that even if we have our phones turned off and face down, we’re still paying an attention tax that acts like hidden fees on credit cards.

Brain Drain is different from Information Overload because with Brain Drain there is no information, just the potential for information. Likewise, Brain Drain is different from FOMO (Fear of Missing Out), because Brain Drain happens even when we aren’t fretting about what might be going on somewhere else.

The paradox of mobile phones is that as we use them to do more and more things, it becomes harder and harder to do any one thing. Always using our everything devices means that we’re often nowhere in particular, and in order to be somewhere we have to make a pre-emptive, conscious decision to put the everything device into an entirely different room.

That’s hard to do.

[Cross-posted on the Center for the Digital Future website.]

Why Avengers ROCKS + Top 5 Superhero Movies

Yesterday my almost-seven-year-old son and I took in a 3D matinee of “The Avengers” and had a blast. We loved it so much that we plan to see it again in IMAX.

The movie has everything: Joss Whedon directing from a terrific screenplay he wrote (working from a story he co-authored), a huge budget with huge stars, and, startlingly for a popcorn movie, emotional arcs for several of the main characters.

However, what has sent Avengers directly into my top 5 list of all-time best superhero movies is how it culminates the previous four Paramount Pictures movies: Iron Man, Iron Man 2, Thor, and Captain America.

Each of those movies seeded hints about the movies that followed and about Avengers, and fans like me stuck around all the way through the credits on each film to see the parting hints.

You don’t have to have seen any of the previous movies to thoroughly enjoy Avengers, but if you have seen any or all of them, your pleasure will be amplified by your memories of those earlier films.

Avengers’ optional relationship to its predecessors makes this series different from, for example, The Matrix series, where by the third movie nothing much made sense if you hadn’t seen the first two, played the video game, and read the comic book. (Henry Jenkins wonderfully describes all this in his book Convergence Culture.)

This amplification is the topic that I spent eight years of my life researching in a different context—Shakespeare’s audience. (See the top video on this blog or the summary of my doctoral dissertation under “Writing” for more.)

It’s a complicated trick to smuggle into a movie extra experience that thrills the fans without distracting the uninitiated. It’s similar to, but different from, pop-cultural throwaways like @CNNLADavid’s favorite line, “Better clench up, Legolas.” Said to Hawkeye by Iron Man as they zoom roofward, it’s a one-second tribute to The Lord of the Rings’ elven archer from another fan-favorite movie cycle.

What’s different about in-series amplification is that it lasts: 100 years from now, a viewer of The Avengers who has seen the previous movies will have the amplified experience, whereas the Legolas throwaway is contingent on knowledge of an entirely separate franchise.

For those of you (anyone? anyone?) who care about good old-fashioned aesthetics, this phenomenon is the cinematic version of what Stephen Pepper described as “funding” in his classic book “The Basis of Criticism in the Arts” (1949). Pepper was writing about painting: the second time (T2) you see the Mona Lisa, you are actually viewing both the painting itself and your earlier experience (T1); the third time it’s T3 + T2 + T1, and so on. (I’ve written about this here before: click the “Cognitive Funding” tab to the right to find those posts.)

Moving on to my top 5 list of superhero movies, and why each one made the cut:

Mystery Men (1999): A hysterical, brilliantly cast exploration of every comic book cliché and why they’re still powerful even as clichés. A passion project for everyone involved, it was probably the only superhero team movie to work well before Avengers.

Superman II (1980): Like “The Empire Strikes Back” in the Star Wars series, and like Spider-Man 2, the second movie with Christopher Reeve surpassed the first in the series by light years when it came to the story. Although the first Superman had the inspiring flight sequences and John Williams’s magnificent score, it was in essence a two-hour preview for Superman II, which still blows me away.

Iron Man (2008): The movie so effectively channeled star Robert Downey Jr.’s real-life brilliant, charismatic bad-boy persona into the Tony Stark protagonist that it took my breath away, and watching Downey and Jeff Bridges square off in the movie was like the Foreman vs. Ali “Rumble in the Jungle” back in 1974: two titans.

The Dark Knight (2008): Heath Ledger’s performance as The Joker was so intense that some blamed it for his death. Unlike earlier portrayals across all media, Christopher Nolan’s movie didn’t try to explain The Joker or to resolve his paradoxes: each time the Joker explained how he came to be, the story contradicted the earlier telling. It’s a brilliant exercise in Keatsian “Negative Capability.” That paradox was the engine that drove the movie, poised in perfect tension with the equally powerful engine of Batman’s motivation to fight the evil that killed his parents. Batman is the only superhero whose origin story and motivation are identical, and that identity is an inexhaustible supply of narrative.

Avengers (2012): See the first half of this post: it’s not just one movie—it’s five blockbusters tied together.

What movies are on your list? Please share in comments.