If you're looking for a contrarian take on technology, Nicholas Carr is your man. In 2003 the author touched off a debate about the role of computers in business with his article "IT Doesn't Matter." He caused another kerfuffle five years later with an Atlantic piece, "Is Google Making Us Stupid?"
Now the 51-year-old, Colorado-based writer has published a new book, The Shallows, which warns that the Internet is rewiring our brains and short-circuiting our ability to think. The Chronicle called Mr. Carr to get his opinion about what this means for teaching and research.
Q. The idea of neuroplasticity is central to your argument. What does this mean, and what does it have to do with how the Internet is changing our brains?
A. For a long time, even when I was going to school, we were taught that the structure of the human brain was basically fixed by the time we got to our early 20s. But it's become clear in the last few decades that in fact, even the adult human brain is quite malleable. Our neural circuitry is always in the process of adapting to circumstances, to our environment, and to the tools we use, particularly those for finding information and making sense of it. When you look across all of the evidence, there are strong suggestions that the way we take in information online or through digital media impedes understanding, comprehension, and learning, mainly because all of those things combine to create a very distracted, very interrupted environment.
Q. You write that educators assumed multimedia would aid learning, but that has been "contradicted by research." Explain.
A. Whenever we have a new information technology, there tends to be a lot of enthusiasm throughout society, but also in the educational community. That was true with hypertext in the '80s and '90s, and I think it continues to be true with multimedia. But what the evidence suggests is that, unless it's very carefully planned with an eye to how the brain processes information, multimedia actually impedes learning rather than enhancing it, simply because it divides our attention. Studies pretty clearly show that when our attention is divided, it becomes much more difficult to transfer information from our short-term memory, which is just a very temporary store, to our long-term memory, which is the seat of understanding.
Q. What studies?
A. There's a study called "The Laptop and the Lecture" that divided a class into two groups. Half of the students could use their laptops in the classroom while listening to a lecture. They were free to surf the Web. The other half had to keep their laptops closed. And then there was a test of comprehension. The students who used their laptops scored significantly lower on the test of how well they could remember the content of the lecture. An interesting twist was that students who visited sites relevant to the content of the lecture actually did even worse on the test than students who browsed unrelated sites. It indicates that, even if you think that allowing students to look at other information relevant to what they're being taught might enhance their learning, it actually appears to have the opposite effect.
Q. Some professors are interested in integrating social technology—blogs, wikis, Twitter—into their teaching. Are you suggesting that is a misguided approach?
A. I'm suggesting that it would be wrong to assume that that path is always the best path. I'm certainly not suggesting that we take a Luddite view of technology and think it's all bad. But I do think that the assumption that the more media, the more messaging, the more social networking you can bring in will lead to better educational outcomes is not only dubious but in many cases is probably just wrong. It has to be a very balanced approach. Educators need to familiarize themselves with the research and see that in fact one of the most debilitating things you can do to students is distract them.
Q. Most people would praise the Internet's power to open access to information for researchers, but you suggest a downside.
A. I should start off by saying—I don't want to come off as a hypocrite—I use the Net for research all the time. It's an extremely valuable way to uncover information. I think that there are at least two possible downsides. We know that as human beings we love new information. That's pretty well proven by psychological research. If given the opportunity to find something new, we'll usually go in that direction, whether that information is trivial or important. And what the Net does is give us huge opportunities to get new information. We can see that in our habits when we're on Facebook or social networking or just on the Web in general. But that same instinct can bleed over even when we're doing formal academic research. Because there's so much information at our fingertips, we can get stuck just constantly uncovering new relevant information and never stopping and actually reading and thinking deeply about any one piece of information.
The other is the study by James Evans that was in Science magazine a couple of years ago. He looked at what happens when academic journals publish their archives online. The assumption is that this will be a great boon to research, because all these articles that used to be difficult to find can suddenly be searched. What he discovered is that, in fact, the effect was kind of the opposite of what we expected: the number of articles cited actually goes down when these journals go online. People also tend to cite more-recent articles and not go back in time to older ones. His hypothesis is that we become so dependent on search, and search results are determined by popularity of one sort or another. The risk of using search for online research is that everybody gets led in the same directions, to a smaller number of citations which, as they become ever more popular, become the destination for more and more searches. And ... he suggested that simply the act of flipping through paper copies of journals may actually expose researchers to a wider array of evidence.
Q. If the Internet is making us so distracted, how did you manage to write a 224-page book and read all the dense academic studies that much of it is based on?
A. It was hard. The reason I started writing it was because I noticed in myself this increasing inability to pay attention to stuff, whether it was reading or anything else. When I started to write the book, I found it very difficult to sit and write for a couple of hours on end or to sit down with a dense academic paper. One thing that happened at that time is I moved from outside of Boston, a really highly connected place, to renting a house in the mountains of Colorado. And I didn't have any cellphone service. I had a very slow Internet connection. I dropped off of Facebook. I dropped out of Twitter. I basically stopped blogging for a while. And I fairly dramatically cut back on checking e-mail. After I got over the initial period of panic that I was missing out on information, my abilities to concentrate did seem to strengthen again. I felt in a weird way intellectually or mentally calmer. And I could sit down and write or read with a great deal of attentiveness for quite a long time.
Q. Did you ever wish you could stick a hyperlink in your book?
A. No. There are already enough footnotes. I'm not the only one now who's starting to question all these hyperlinks. Particularly if you're writing something long and thoughtful, do you want to keep encouraging readers to jump out and jump back in? Or is there something to be said for reducing the temptation to hop around and actually encouraging attentiveness?
Q. What do you make of Steven Pinker's critique of you? The Harvard cognitive psychologist is dismissive of one of your key points, that experience can change the brain. He says cognitive neuroscientists "roll their eyes at such talk."
A. I think he's overly dismissive. I say that with great respect. It's important to read his thoughts in the context of a broader battle going on in the world of cognitive neuroscience, between those who, like Pinker, are strongly behind evolutionary psychology, which basically says that our behavior is very much determined by our genetic heritage, and those who believe that the brain is adaptable, that we're not locked into that kind of behavior, and that in fact our brain changes as the environment changes. Pinker's background and views are very much antithetical to the "highly adaptive" argument. There are a whole lot of neuroscientists who are uncovering evidence that in fact our use of digital media and media multitasking is having a substantial effect on the way we think.
Q. Colleges refer to a screen-equipped space as a "smart classroom." What would you call it?
A. I would call it a classroom that in certain circumstances would be beneficial and in others would actually undermine the mission of the class itself. I would maybe call it a questionable classroom.