The inexorable spread of new technologies has spurred a cottage industry of intellectual hand-wringing about their adverse effects on our lives. Books like Nicholas Carr’s The Shallows: What the Internet Is Doing to Our Brains or Sherry Turkle’s Alone Together: Why We Expect More From Technology and Less From Each Other offer research from neuroscience and psychology to analyze the deleterious influence of our omnipresent devices.
I find the work of one astute thinker in this area especially compelling: "For a multitude of causes, unknown to former times, are now acting with a combined force to blunt the discriminating powers of the mind, and, unfitting it for all voluntary exertion, to reduce it to a state of almost savage torpor. The most effective of these causes are the great national events which are daily taking place, and the increasing accumulation of men in cities, where the uniformity of their occupations produces a craving for extraordinary incident, which the rapid communication of intelligence hourly gratifies."
That last sentence really hits it on the head, even though it was written more than 200 years ago by William Wordsworth. As new technologies have made our lives increasingly uniform — sitting all day at desks, doing the same thing over and over again, and then sitting around in the evening in front of our screens, watching the same plotlines repeated in movies and television shows — we seek variety and stimulation through the brief bursts of pleasure and interest we get from our devices, our social-media connections, and our society’s endless news cycles.
That, at least, is my modern updating of Wordsworth’s 1802 preface to Lyrical Ballads, a collection of poems he wrote with Samuel Taylor Coleridge. Wordsworth was writing about the movement of people into cities to find work in the early factories and mills of the budding industrial revolution, and of the proliferation of newspapers and pamphlets during that era.
But the story is not quite so simple, as the neuroscientist Adam Gazzaley and the psychologist Larry D. Rosen argue in The Distracted Mind: Ancient Brains in a High-Tech World, the book I am featuring in this multipart series on the challenges that phones, tablets, and laptops pose to sustained learning, both for teachers and students.
Last month’s column outlined the attention problem in the classroom, where the learning goals of our courses clash with the cognitive-control limitations of students. The Distracted Mind presents some compelling findings about the difficulties we humans face in directing and sustaining our attention, remembering things, and switching back and forth between tasks.
This month, I want to pinpoint the special role that our modern-day devices play in exacerbating our cognitive limitations.
Technology advocates would argue that, with the arrival of any new technology, humans have always fretted about our inability to pay attention — and then adjusted. One part of Gazzaley and Rosen’s research would support that argument. New devices did not create our current struggles with attention and distraction. Our cognitive limitations are part of the architecture of our brains, and they have plagued us for as long as we have kept records.
In the past, however, the arrival of new technologies — and especially ones with the capacity to reshape our thought, like literacy or the printing press — occurred over very long stretches of time, and with long gaps in between. We had time to adjust to the rapid spread of the printed word, as that technology arrived and then remained the primary means of transmission of thought for centuries.
That is no longer the case. The arrival and widespread adoption of new technologies has occurred in increasingly intense bursts. In The Distracted Mind, Gazzaley and Rosen point out that, if you assume a benchmark of 50 million worldwide users, radio arrived at that level within 38 years of its invention. The time frame shrinks with each new invention: telephone, 20 years; television, 13 years; cellphones, 12 years; the internet, four years. Social media amped up the curve: Facebook, two years; YouTube, one year. And the winner, at least at the time they wrote the book? "Angry Birds" took over our lives in 35 days.
As any neuroscientist will tell you — and as handy little books like Moheb Costandi’s Neuroplasticity will explain — our brains have vast capacities to change and adapt to new circumstances. But the rapid pace with which changes are arriving today makes such adaptation challenging for even the most plastic of brains.
All of which brings us to those handy devices which pose such a tantalizing temptation to our distractible brains: laptops, tablets, and phones. They have rapidly and intensely penetrated the lives of students and the college classroom. Most faculty members are still torn between seeing those devices as tools for exciting new learning and recognizing them as barriers to attention and focus.
Gazzaley and Rosen use a fascinating argument to articulate why today’s technologies pose a much more serious threat to our brains than inventions from previous eras.
Humans, like other primates, are information-foraging animals. "Molecular and physiological mechanisms that originally developed in our brain to support food foraging for survival," they write in the book, "have now evolved in primates to include information foraging." They cite studies which show that macaque monkeys "respond to receiving information similarly to the way that they respond to primitive rewards such as food or water." That response includes receiving small bursts of pleasure in the brain, which then drives us to repeat the behavior when we are bored or in search of stimulation.
For most of human history, we had limited access to information beyond what we could see in our immediate environment. In 2017 and beyond, we have instant access to a magical information source in our pockets.
When we are in want of a quick burst of stimulation — because we are sitting in the waiting room at the dentist’s office, or slogging through a difficult article in our field, or listening to Professor Lang introduce the poetry of William Wordsworth in "British Literature Survey II" — we turn to that device and get a little short-term reward from our brains.
As anyone who has ever eaten too many cookies or drunk too many beers will tell you, though: Short-term rewards don’t always have positive long-term effects. And the same goes for our constant search for quick stimulation from phones and laptops.
The Distracted Mind points to three separate studies — conducted by different research teams — which found that when people engaged in a conversation with a cellphone present but not in use (such as resting on a table or held in their hands), they tended to rate the quality of the conversation lower, or rate their relationship with their conversation partner lower, than when those conversations were conducted without the presence of cellphones. The mere presence of a visible device, this research would suggest, led people to focus less on the person in front of them, as one part of their brain remained angled toward the phone, wondering whether a neurological cookie was waiting for them there.
Such findings present difficult challenges for those of us who teach. Learning is hard work and requires sustained attention — whether it takes place in students’ bedrooms as they watch a video lecture, in the library as they pore over a textbook, or in the classroom as they participate in a difficult discussion.
The learning we offer can bring students both deep satisfaction and immediate pleasure, but it must compete with the quick bursts available to them from their phones and computers.
Just as cookies (or beer) affect us for longer than the few seconds they spend on our palate, so do device checks have a lingering effect on our attention.
Multiple studies of both workers and students have found that when people pause from a task for a digital distraction, they might devote a very short time to the distraction itself, but the effect lingers: According to one study cited by Gazzaley and Rosen, observations of people interrupted during the pursuit of a task revealed that it took them almost 30 minutes to refocus and fully engage with the original task.
To paraphrase Wordsworth, that quick check of a phone in class trails clouds of distractions well beyond the experience itself.
None of this necessarily means that we should ban technology from the classroom, since — as I mentioned in my previous column as well — we are graduating students into a world full of distractions. Rather than keep them in a technology-free bubble, we have to help them learn to manage those distractions. We can also help them understand how to use computers and phones most effectively to support their learning, rather than distract from it.
In the final two columns of this series, we’ll consider the general recommendations that The Distracted Mind offers for managing distraction, and the implications of those recommendations for teachers and students alike.
James M. Lang is a professor of English and director of the Center for Teaching Excellence at Assumption College, in Worcester, Mass. His latest book, Small Teaching: Everyday Lessons From the Science of Learning, was published in the spring of 2016. Follow him on Twitter at @LangOnCourse.