“Life should be lived as play.” Plato’s famous dictum appears as the preamble to countless books and syllabi on learning. It also helped inspire the pedagogical revolutions of the modern world. From Rousseau to Dewey, Froebel to Piaget, educational theorists have insisted that one learns best through play.
But while Plato endorsed play in principle, his Socrates denounced Athenians for playing in the wrong way. He disapproved of their habit of transforming everything, whether beard growing or choral singing, into contests. He was especially appalled by the battles among rival orators in the Assembly, where Athenians stomped and cheered “till the rocks and the whole place re-echo.”
Worse than the Athenians’ love of competition was their passion for make-believe. They would sit transfixed as Homeric orators pretended to be Achilles or Odysseus. They would spend entire days at the theater, identifying with mighty warriors and deathless gods. Little wonder, Socrates observed, that dim-witted shoemakers and shepherds thought themselves competent to govern by public vote. For him, much of what was wrong with ancient Athens was due to the Athenians’ bad play—their addiction to social competition and make-believe.
In The Republic, Plato’s Socrates endorsed play of an antithetical character. Athenians should give up their raucous amusements and abandon their vulgar and childish preoccupation with Homer and the theater. Indeed, the highest excellence of his ideal society was its exclusion of orators, actors, playwrights, and anyone else skilled at persuading people to imagine themselves to be someone else. Athenians should instead embrace “chaste and serious play,” which would inculcate “orderly habits” and teach them to become “serious citizens.”
Rousseau’s Émile, or On Education (1762), became the classic treatise on the merits of experiential and child-centered learning, yet it was also an indictment of bad play. Rousseau dispatched Émile to the remote countryside, where the boy would have no one with whom to compete. And although the boy’s tutor would devise all sorts of interesting learning activities, he would eschew competition. “True satisfaction is neither gay nor wild,” Rousseau explained. “Boisterous games and turbulent joy veil disgust and boredom.”
The tutor was also charged with keeping Émile away from books. (The one exception was Robinson Crusoe, a primer on living alone.) Rousseau’s reasoning echoed Plato: Whenever students read books, especially history, they imagine themselves to be the people about whom they’re reading. “He who begins to become alien to himself does not take long to forget himself entirely,” Rousseau warned. “If Émile just once prefers to be someone other than himself—were this other Socrates, were it Cato—then everything has failed.” The boy would more readily find his own true self if he remained isolated from peers and the imaginative seduction of books.
Over the next half-century, countless educators tinkered with Rousseau’s pedagogical scheme. Later came Darwin, who revolutionized educational theory along with everything else. In On the Origin of Species, he hypothesized that some types of play promoted evolution. When young animals gamboled in the woods or batted at one another with their paws, they learned skills that would help them survive. Darwinian theorists subsequently defined good play as that which promoted evolutionary progress—for human beings as well as animals. Good play advanced mankind; bad play hampered it.
But educators intent on developing Darwin’s insight confronted a major difficulty. The natural play of children seemed to consist mostly of teasing, fighting, make-believe, and all sorts of irrational nonsense. Such behavior may have suited Paleolithic societies, but it was incompatible with the needs of a modernizing industrial society. In The Play of Man (1899), the German psychologist Karl Groos suggested that society had evolved more rapidly than its forms of play had. Therefore, educators should impose a “moral law of temperance,” restraining the “deadly poison” of imagination and suppressing competitive passions.
In his classic study, Adolescence (1904), G. Stanley Hall, first president of the American Psychological Association, dismissed Groos’s nostrums: Children, he argued, could not be induced to play in more suitable ways. Borrowing from the theories of Ernst Haeckel, a German biologist who contended that the evolutionary stages of any organism were reflected in its growing embryo (“Ontogeny recapitulates phylogeny”), Hall declared that psychological development recapitulated the stages of man’s evolutionary ascent. This explained why the child, our “half-anthropoid ancestor,” naturally “revels in savagery.” Children could not help indulging in “tribal, predatory, hunting, fishing, fighting, roving, idle playing proclivities.” The sooner children and adolescents passed through those stages, the sooner they could progress to rational study and adult work. Hall accordingly regarded college as a waste of money and effort for all but a “few hundred picked and ripened adolescents” capable of embracing the life of the mind. When he became founding president of Clark University, he sought to restrict it to graduate study.
Freud shared some of Hall’s views on the inherently bad play of children: Dominated by a craving for pleasure, children often lacked the capacity to satisfy their desires. They therefore resorted to fantasy, identifying with superheroes and venting aggressive impulses through games. In “Formulations on the Two Principles of Mental Functioning” (1911), Freud explained that the transition to adulthood entailed a shift from childhood fantasy to adult competence, from the pleasure principle to the reality principle—and thus from bad play to work. The central task of education was the “conquest of the pleasure principle and its replacement by the reality principle.” Freud later added that adults who failed to make the transition—who remained trapped within the illusory world of bad play—were “madmen,” fitting subjects for treatment.
Hall and Freud doubted that much could be done to improve the play of children. But by the early 20th century, in the United States especially, progressive thinkers insisted that knowledgeable experts could accelerate evolution. John Dewey, the intellectual leader of the movement, focused on education as the best mechanism to precipitate rapid social change. Dewey also concurred that children’s natural play was regressive—a “natural recurrence” to the “typical activities of primitive peoples.”
But Dewey, who had studied with Hall, refused to accept his mentor’s pessimism: Children could be weaned from bad play. Dewey counseled parents to avoid the danger of fanciful stories, which weakened character, dissipated mental energy, and drew children into imaginary worlds instead of “the world of actual things.” Good teachers, too, should “introduce positive material of value” in order to “lead the child on,” allowing him to “pass naturally, and by continuous gradations,” from play to study and, ultimately, to work. Young girls, for example, should play with brooms; boys should cooperate in productive undertakings rather than pummel one another on the playground. “It is necessary that the play attitude should gradually pass into a work attitude,” he explained.
While educational theorists such as Jean Piaget focused on the “infantile dynamics” of play in children, it was left to the Harvard psychologist Erik Erikson to analyze the later stages of the passage from adolescence to adulthood. In his view, the central developmental challenge was overcoming “role confusion” caused by exposure to multiple role models—friends, parents, teachers, pop-culture icons. Adolescent boys, for example, growing up in an era of “decaying paternalism,” often had difficulty finding their way to manhood.
In Toys and Reasons (1977), Erikson blamed bad play for much of what ailed American society. Generals, forever mired in adolescence, played war games with nuclear weapons. American soldiers, having played at war as children, could not distinguish between their childish fantasies and the Vietnamese women and children they slaughtered at My Lai. The “imaginative fictions” of the modern world were so seductive that young adults often failed to accept the world as it really was. Erikson’s solution was to replace bad play with “true play”—serious and chaste play rooted in reality, exactly as Plato had proposed.
Few professors or college administrators keep Plato’s Republic on their nightstands, much less the works of Rousseau, Dewey, Piaget, or Erikson. But opposition to “bad play” nevertheless remains embedded in the academy. Many professors regard competitive games and “make-believe” as tolerable among children or adolescents but obviously inappropriate for college classrooms. The purpose of college is to prepare students for work as adults: Welcome to the real world.
By insisting that competitive and make-believe play is bad, the intellectual titans of enlightened pedagogy succeeded in banning it from higher education. But they did not so much win the war against bad play as push it out of the classroom. In fraternity and sorority houses, football stadiums and dorm rooms, bad play has prevailed. Students continually plunge into brutal social competitions—fraternity hazings, beer pong, Lulu, and Tinder. And they spend much of every day pretending to be someone else—assuming new identities as fraternity “brothers,” as better versions of their real selves on Facebook, as misogynistic car thieves or bloodthirsty warriors in online games. The simple fact, which Plato’s Socrates well perceived, is that bad play is often fun, which is why efforts to suppress it have failed.
But during the past decade, some faculty members and administrators have discovered that the motivational power of “bad play” can be harnessed to academic purposes. Perhaps the most vivid illustration of the phenomenon is the spread of Reacting to the Past, a pedagogical system I helped start, in which students play monthlong games, set in the past, with roles informed by classic texts. For the game set in Athens in 403 BC, for example, students become democrats or oligarchs, and compete by debating the respective merits of Pericles and Plato; for the game set in the Holy Office in Rome in 1632, students pretend to be mathematicians, natural philosophers, and conservative cardinals, and debate whether Galileo’s Dialogue Concerning the Two Chief World Systems proves that the Earth moves. Reacting—the epitome of Platonic bad play—has now spread to more than 350 campuses.
Professors report, and studies confirm, that students playing Reacting to the Past games work far harder than do classmates in regular classes. One example: Martina Saltamacchia, a history professor at the University of Nebraska at Omaha, found that her students, after playing a version set during the Second Crusade, had become consumed by the subject. When she mentioned that an international symposium on the Crusades would soon be held in St. Louis, they asked if they could attend. Ten of them raised money for the trip, participated in all of the sessions, buttonholed presenters afterward, and so impressed Adrian Boas, the keynote speaker, that he offered to pay for their food and lodging to work on his excavation at Montfort Castle, in Israel’s northern Galilee. Three of Saltamacchia’s students are there now.
Bad play is psychologically powerful. Sometimes it does much harm. But it can also be an effective learning tool. Rather than banish it from higher education—allowing Anheuser-Busch and ESPN to usurp its motivational power—we should welcome it as another way to revitalize the classroom experience.
Mark C. Carnes is professor of history at Barnard College. He is the author of Minds on Fire: How Role-Immersion Games Transform College, just out from Harvard University Press.