The Chronicle Review

Programmed for Love

In a skeptical turn, the MIT ethnographer Sherry Turkle warns of the dangers of social technology

Photography by Erik Jacobs

Sherry Turkle
January 14, 2011

Imagine standing in front of a robot, gazing into its wide, plastic eyes, and falling in love. Your heart revs up, and you hope this Other—this humanoid machine—turns your way again, tilts its head in interest, likes you back.

It happened one summer to Sherry Turkle, at a lab at the Massachusetts Institute of Technology, where she is a professor studying the impact of technology on society. She met a metallic robot named Cog—made to resemble a human, with moving arms and a head—which was programmed to turn toward whoever was speaking, suggesting that it understood what was being said. To Turkle's surprise, she found that she deeply wanted Cog to interact with her rather than with a colleague who was there that day. She realized this human-looking machine was tapping into a deep human desire to see it as alive—as good a companion as any human. She describes it almost like a schoolgirl crush.

The experience unnerved her, she says as she recounts the story one recent morning in the kitchen of her townhouse here. A trim woman with chin-length, dark hair and a ready laugh, she shivers at what she now views as a creepy moment: "I understood what I felt, even though I know that there's nobody home in Cog."

She has spent some 15 years since that day studying this emerging breed of "sociable robots"—including toys like Furbies and new robotic pets for the elderly—and what she considers their seductive and potentially dangerous powers. She argues that robotics' growing trend toward creating machines that act as if they were alive could lead people to place machines in roles she thinks only humans should occupy.

Her prediction: Companies will soon sell robots designed to baby-sit children, replace workers in nursing homes, and serve as companions for people with disabilities. All of which to Turkle is demeaning, "transgressive," and damaging to our collective sense of humanity. It's not that she's against robots as helpers—building cars, vacuuming floors, and helping to bathe the sick are one thing. She's concerned about robots that want to be buddies, implicitly promising an emotional connection they can never deliver.

The argument represents a skeptical turn for a researcher who was one of the first humanities scholars to take human interactions with computers seriously as an area of study. Having begun her academic career at MIT in 1976, she had an early look at the personal computer and the Internet, and now has a front-row seat for robotics. She's a Harvard-trained psychologist and sociologist and refers to herself as an "intimate ethnographer," looking at how people interact with their devices. "I'm fascinated by how technology changes the self," she says.

By the mid-90s, her largely enthusiastic explorations of online chat rooms and video games had landed her on the cover of Wired magazine, making her a celebrity among the geek set. Back then her main interest was how creating alter egos in virtual worlds helped people shape their identities, as captured in her seminal 1995 book, Life on the Screen (Simon & Schuster).

She still believes in those benefits. But in recent years, she has spent more time documenting the drawbacks and hazards of technology in daily life. Warnings fill her new book, Alone Together: Why We Expect More From Technology and Less From Each Other (Basic Books), out this month, which devotes roughly half of its pages to her studies of robots and the rest to information overload and the effects of social networks and other mainstream technologies. At points the prose seems designed to grab readers by the shoulders, shake them as if out of a dream, and shout: "Put down the BlackBerry—you're forgetting how to just be!"

"We talk about 'spending' hours on e-mail, but we, too, are being spent," Alone Together concludes. "We have invented inspiring and enhancing technologies, and yet we have allowed them to diminish us."

In Turkle's view, many of us are already a little too cozy with our machines—the smartphones and laptops we turn to for distraction and comfort so often that we can forget how to sit quietly with our own thoughts. In that way, she argues, science fiction has become reality: We are already cyborgs, reliant on digital devices in ways that many of us could not have imagined just a few years ago.

Perhaps it's not so far-fetched to think that walking, talking machines will soon come a-courting—and that many people might welcome their advances.

Turkle says her shift in attitude about the influence of digital technologies grew not just from personal experiences like those with Cog, but also from her field research—hundreds of interviews with children, teenagers, adults, and the elderly encountering the latest tech gadgets. Again and again, she saw how even a relatively clumsy robot dog or electronic baby doll could spark a deep emotional response.

In her home office, she cues up a videotape from her archive to demonstrate her research method. It's from 2001, showing a 9-year-old girl interacting with a robot named Kismet that was developed by researchers at MIT's artificial-intelligence laboratory (the same group that built Cog). Kismet looks as if it could be a prop in a science-fiction film—a metal face with large eyes, wide eyebrows, and a mouth that switches among expressions of surprise, delight, disgust, and other simulated emotions.

The girl holds up a yellow toy dinosaur and waves it in front of Kismet. She moves the toy to the right, then to the left, and the robot turns its head to follow. Turkle, who can be seen off to the side (with a shorter haircut and larger glasses than now), says she gave almost no guidance to the girl—the goal was to put robot and child together and see what would happen. "It's called a first-encounter study. I say, 'I want you to meet an interesting new thing.'"

As we watch, the girl tries to cover the robot with a cloth to dress it, and then tries to clip a microphone to the robot. Soon Kismet is saying the girl's name and other simple statements, and the girl experiments with other ways to communicate with this mix of steel, gears, and microchips.

Most of the kids in the study loved Kismet and described the robot as a friend that liked them back, despite careful explanations by Turkle's colleagues that it was simply programmed to respond to certain cues and was definitely not alive. The response appears to be a natural one, Turkle says. "We are hard-wired that if something meets extremely primitive standards, either eye contact or recognition or very primitive mutual signaling, to accept it as an Other because as animals that's how we're hard-wired—to recognize other creatures out there."

Yes, children have long thought of their dolls as alive, or near enough. But Turkle argues that a crucial shift occurs when dolls are programmed so that they seem to have minds of their own. She devotes chapters of her new book to studies she did using popular robot toys in the late 1990s, including Tamagotchi, a hand-held digital pet with a small screen and buttons that asked kids to feed and nurture it at regular intervals, and Furbies, stuffed toys programmed to speak gibberish until they "learn" English from their owners (actually, they automatically spoke preprogrammed phrases after a given amount of time).

One day during Turkle's study at MIT, Kismet malfunctioned. A 12-year-old subject named Estelle became convinced that the robot had clammed up because it didn't like her, and she became sullen and withdrew to load up on snacks provided by the researchers. The research team held an emergency meeting to discuss "the ethics of exposing a child to a sociable robot whose technical limitations make it seem uninterested in the child," as Turkle describes in Alone Together. "Can a broken robot break a child?" they asked. "We would not consider the ethics of having children play with a damaged copy of Microsoft Word or a torn Raggedy Ann doll. But sociable robots provoke enough emotion to make this ethical question feel very real."

Turkle, to be clear, praises the work of the team that engineered Kismet—and she defends the ethics of her project, which in many cases presented students with commercial toy robots rather than MIT prototypes. A leader of the Kismet project, Cynthia Breazeal, says she hopes the technology can be used to create tutors for distance education that are more engaging than educational software or games, or to create robot assistants that supplement rather than replace humans.

"There are advantages to it not being a person—robots can be seen as not judgmental; people are not at risk of losing face to a robot," Breazeal says. "People may be more honest and willing to disclose information to a robot that they might not want to tell their doctor for fear of sounding like a 'bad' patient. So robots working with other people can help the patient and the care staff."

During her research, Turkle visited several nursing homes where residents had been given robot dolls, including Paro, a seal-shaped stuffed animal programmed to purr and move when it is held or talked to. In many cases, the seniors bonded with the dolls and privately shared their life stories with them.

"There are at least two ways of reading these case studies," she writes. "You can see seniors chatting with robots, telling their stories, and feel positive. Or you can see people speaking to chimeras, showering affection into thin air, and feel that something is amiss."

Some robotics enthusiasts argue that these sociable machines will soon mature, and that new models may one day be judged as better than humans for many tasks. After all, robots don't suffer emotional breakdowns, oversleep, or commit crimes.

In his 2007 book, Love and Sex With Robots (Harper Perennial), David Levy, who is an expert in conversational computer software, argues that by the year 2050, some people will even choose to marry robots. By then, he says, many human couples will bring in robot baby sitters when they want to head to a holographic movie (or whatever the entertainment is by then).

"The concept of robots as baby sitters is, intellectually, one that ought to appeal to parents more than the idea of having a teenager or similarly inexperienced baby sitter responsible for the safety of their infants," he writes. "Their smoke-detection capabilities will be better than ours, and they will never be distracted for the brief moment it can take an infant to do itself some terrible damage or be snatched by a deranged stranger."

Levy says his book was inspired by Turkle's earlier work. It is dedicated to one of Turkle's research subjects, a young man named Anthony, who, in Levy's words, "tried having girlfriends but found that he preferred relationships with computers." Levy sent a copy of the book to Turkle, hoping she would pass it along to Anthony and believing that he would find it encouraging.

Turkle was not pleased. She expresses frustration with the notion that Anthony would be happier with a robot than with gaining the social skills necessary to connect with a human companion.

"David Levy is saying: For someone who is having trouble with the people world, I can build something. Let's give up on him. I have something where he will not need relationships, experiences, and conversations. So let's not worry for him. For a whole class of people, we don't have to worry about relationships, experiences, and conversations. We can just issue them something."

Turkle continues: "Who's going to say which class of people get issued something? Is it going to be the old people, the unattractive? The heavy-set people? Who's going to get the robots?"

Levy's response: "Who is going to get the robots is an ethical question, and I am no ethicist. What I am saying is that it is better for the 'outcasts' to be able to have a relationship with a robot than to have no relationship at all."

For Turkle, though, the most chilling moment in the Kismet study came when the robot was at its best: when a child left the MIT lab feeling that she had had a deep moment of connection. Kismet can't reciprocate friendship, after all, or prepare kids for the more dynamic experience of interacting with people.

"What if we get used to relationships that are made to measure?" Turkle asks. "Is that teaching us that relationships can be just the way we want them?" After all, if a robotic partner were to become annoying, we could just switch it off.

Turkle's new book, Alone Together, is a coming-of-age story, with several protagonists. Robots take the first turn, with a look at how they might develop from early prototypes. The second character is the Internet, which, during the roughly 15 years since Turkle wrote Life on the Screen, has gone from infancy to young adulthood, growing up so fast that those early years are all but forgotten.

Turkle's argument is that the "always on" culture of constant checking of e-mail messages and Facebook updates has appeared so quickly that we haven't yet developed ways to balance our networked and physical lives. "Because we grew up with the Net, we assume that the Net is grown-up," she writes, in what she says is her favorite line from the book.

We've reached a moment, she says, when we should make "corrections"—to develop social norms to help offset the feeling that we must check for messages even when that means ignoring the people around us.

"Today's young people have a special vulnerability: Although always connected, they feel deprived of attention," she writes. "Some, as children, were pushed on swings while their parents spoke on cellphones. Now these same parents do their e-mail at the dinner table." One 17-year-old boy even told her that at least a robot would remember everything he said, contrary to his father, who often tapped at a BlackBerry during conversations.

Several times during our interview, Turkle's computer beeped in the corner of the room, signaling a batch of new e-mails. "I'm getting something like 30 messages a minute," she said. "Trust me, 30 people a minute don't really need to be in touch with me." She laughed. "It's not good for them, it's not good for me."

She constructs her case with story after story from her field research: Interviews with teenagers complaining about growing up "tethered" by cellphones and endlessly required to phone home; lawyers lamenting that their clients now want quick answers by BlackBerry rather than longer, more nuanced advice; college students so carefully constructing their Facebook profiles that one worried he might forget what was real and what was posturing.

Turkle says her earliest work on computers and networks may have been too optimistic. "In some ways, I look at this as a book of repentance. I do," she says. She always included "caveats," she emphasizes, even in her first book about technology, The Second Self: Computers and the Human Spirit (1984). But the issues she is concerned about now, such as privacy and how experimenting with identity online could cause people embarrassment later if everything is archived, simply weren't on her mind back then.

Now that they are, she hopes to take a more "activist stance" in provoking debate about technology.

"I'm contemplating a book that's more prescriptive," she tells me. "I'm interested in writing more on how to navigate this. Particularly in the areas of technology for children and the elderly."

The final major character of Turkle's new book is her daughter, Rebecca Willard, who just started college down the street, at Harvard. The book is dedicated to her and ends with an open letter to her—which amounts to advice to the members of her generation as well—urging them to be thoughtful about how they use technology.

Does Rebecca buy it? I ask.

"Why don't you text her and ask her yourself?" Turkle says with another laugh.

So I do, and a couple of hours later I'm sitting in a coffee shop near Harvard Yard with Willard. It seems appropriate that the only seats available are at a communal table, where everyone can hear our conversation. Apparently that bothers none of the students there.

"Whenever I bring this up with friends, people always go, 'Oh, I've thought that—it's like you're with people, but you're not completely connected with them,'" Willard says, recalling a time she met a friend for dinner and they both stopped to check their e-mail on their phones.

She kept her own iPhone out of reach during our talk, and she says she hopes her mother's book sparks a conversation about the appropriate uses of technology. "Because," she says flatly, "I don't like it when someone uses their phone while I'm talking to them."

Meanwhile, in years to come, if anyone tries to give her mother a robot caretaker, Willard has received strict instructions: Keep it away. Sherry Turkle would rather have the complete works of Jane Austen played continuously.

Jeffrey R. Young is a senior writer for The Chronicle.