As a graduate student in the 1960s, I joined Irven DeVore and Richard Lee in a multifaceted study of the !Kung San, then still hunter-gatherers, in northwestern Botswana. Some anthropologists were persuaded that such studies would shed light on human origins, and some psychologists were convinced that infancy research had a similar role to play in helping us understand the individual. So it seemed logical to investigate, so to speak, the origins of the origins.
Not that either of those propositions was uncontroversial. Both Franz Boas’s disciples in the United States and the structural anthropologists in Europe had rejected any notion that evolution orders cultures, and so there were those who found the claims of researchers on hunter-gatherers to be nothing less than offensive. We were, however, careful to point out that hunter-gatherers were not different from other people biologically or even psychologically, but were perfectly modern human beings living in the very circumstances that dominated human evolution. It was this overlap between them and early human beings—the ones who lived before the invention of agriculture—that led us to think that those who persisted in this way of life could shed light on our origins.
Then there was the question of childhood development. The idea that what happens in infancy might be of overriding importance in later development was also questionable. Some observers argued that the first three years of life were all that really mattered. (The re-emergence of that idea about a decade ago, in the language of brain science, didn’t make it any more valid.) It was probably the lingering sway of psychoanalysis that made this such a tempting hypothesis in the 1960s, but attempts to reconstruct in retrospect the influences that shape patients’ lives do not constitute scientific evidence.
While I was in Botswana, Jerome Kagan—one of the most brilliant of infancy researchers and one of my advisers—was doing research in rural Guatemala, where he, Robert Klein, and other collaborators saw infants who got none of the stimulation thought essential by middle-class parents in America, but who at age 10 performed very nicely, thank you, on basic age-appropriate cognitive tests. Kagan became deeply skeptical of the importance of early experience. By the late 1990s, as Judith Rich Harris conducted a frontal assault on “the nurture assumption,” Kagan began to think that the pendulum had swung too far. But by then he, and many other developmentalists, were committed to genetic, temperamental, and neurobiological investigations and were less interested in the nurture assumption or its challenges.
I returned from Africa in the early 1970s to a revolution in the study of evolution. The new scholarship incorporated sociobiology, behavioral ecology, and what would become evolutionary psychology, but it is best and most comprehensively called neo-Darwinian theory. At first it seemed so mechanistic and trivializing that when applied to human behavior it often produced psychological and political revulsion. A letter to The New York Review of Books in 1975 that was signed by a number of distinguished scientists accused E.O. Wilson, one of the field’s leaders, of joining “the long parade of biological determinists whose work has served to buttress the institutions of their society by exonerating them from responsibility for social problems.” Yet this revulsion was often followed by critical appraisal, and then grudging and partial acceptance. I went through those stages, and by 1976 I was convinced that neo-Darwinism would someday have a small but important place in the spectrum of behavioral and social science—a prediction that was considered weak by enthusiasts and anathema by critics, but one now widely recognized to be true.
In 1979 I signed a contract with Harvard University Press to write a book on evolution and childhood. I thought it would take three years; it took three decades. In that time, advances in the fields of sociobiology, evolutionary psychology, behavior genetics, and brain development greatly enhanced our understanding of childhood. There were thousands of person-years of studying animal behavior in the wild, hundreds of well-designed experiments testing Darwinian hypotheses about human behavior, enormous samples analyzed by advanced statistics in twin and adoption studies, accelerating gene technology, and functional brain imaging in real time in adolescents and even in children.
Those and other advances were both causes and results of a rapidly changing intellectual atmosphere. For one thing, both neo-Darwinism and behavioral genetics gained traction at a pace and in ways that I never predicted. A watershed moment came in 1997, when Newsweek splashed across the top of two pages in a special issue on childhood, “Scientists Estimate That Genes Determine Only About 50 Percent of a Child’s Personality.” To the extent that such a number is meaningful, it made good sense to me, but 20 years earlier, you would have been savaged for a far more modest guesstimate. Wilson, the author of Sociobiology: The New Synthesis (Harvard University Press, 1975), had ice water poured on his head on the stage at a national scientific meeting, and Sandra Wood Scarr, a leading developmental psychologist, was spat upon on a major university campus; neither of them is remotely a genetic determinist. So the fact that “only about 50 percent” was now news showed just how far we had come.
Behavioral ecology and ethology, too, were transformed by neo-Darwinian ideas. Stephen Jay Gould, Richard Lewontin, and a few other important biologists continued to oppose them, but if Gould actually read Natural History, the magazine he wrote for so eloquently for decades, he must have noticed that hardly an issue went by without an article that was suffused with concepts like competition, reproductive success, life-history theory, the evolution of altruism, and other attempts to find and measure adaptations. (This phenomenon was even more evident in scholarly journals.)
Evolutionary psychology, meanwhile, secured a niche in psychological science, and behavior-genetic analysis went from being easily challenged, and occasionally even fraudulent, to achieving scientific credibility. Then genetics took its greatest step: the ability to study genes and genomes directly. True, the promise of linking specific genes to complex behavior remains mainly a promise; unlike decoding the genome, this is an enterprise not of decades but of centuries. Still, genes are no longer an abstraction, and the hard work of figuring out how they shape the brain, and therefore behavior, is under way.
But this work is not the death knell for environmental influences on human development; quite the contrary. For instance, the genetic disorder phenylketonuria (PKU), in which a simple mutation causes progressive mental retardation in infants, can be managed by maintaining a special diet. And there are recent examples of how studying genes deepens our understanding of environmental influence. Genes like those for the neurotransmitter-related enzyme monoamine oxidase, for certain types of dopamine receptors, and perhaps for the serotonin transporter all have variants that in some studies make individuals more vulnerable to psychological stress during early life. Those findings and countless more like them might one day enable us to tailor environments to infants and children, focusing our interventions with uncanny specificity.
The era when genetic hypotheses and discoveries resulted in a nihilistic attitude toward the prospects of some children is behind us—and good riddance. That said, there are still political and moral hazards in this work; vigilance is always needed. Discoveries will always be abused by some ideologues. But it is no longer possible to stop, slow, or ignore the advance of a science that now has great and well-deserved intellectual momentum.
Two other changes over the past 30 years make this a good moment to explore the evolution of childhood. First, advances in brain imaging are now as impressive as those in genomics, and imaging has influenced every branch of behavioral and social science. Before, we could look at brains only after death, or very crudely during life, supplement those meager findings with evidence from the study of other animals, and then guess at how the brain generates its major product, behavior. Now we can watch brain circuits in action, down to the level of millimeters, while mental processes are going on.
For technical reasons, this has not been as easy to do in infants and children as in adults, but those difficulties are being addressed. Studies by Mark Johnson on the development of face processing; by Jessica Dubois and Jesús Pujol on the emergence of language; and by Eric Nelson, Lawrence Steinberg, and Sarah-Jayne Blakemore on the tug of war between impulse and inhibition in adolescent social behavior all demonstrate the tremendous power of imaging to refine our understanding of child development. Behavioral changes can’t be explained by brain maturation alone, but imaging brings a whole new kind of information to bear on children’s mental life, whether as cause, effect, or both.
Second, cognitive neuroscience is no longer concerned merely with how the brain enables us to see a line, remember a word, or execute a calculation. In the field’s early stages, cognition meant the things that can be measured by intelligence tests. With few exceptions, emotional intelligence, relationships, and emotions themselves were not considered suitable objects for serious study. Those areas were left to the psychoanalysts to speculate about as best they could. By the 1990s, however, prominent scientists like Kagan, Antonio Damasio, Richard Davidson, Robert Sapolsky, and Stephen Suomi had turned their attention in these once-disdained directions and begun to see crucial new dimensions of brain and behavior.
All of this research suggests that the evolution of intelligence and mind is driven not just by things like making tools and remembering food locations, but also by the vital need to negotiate emotions and relationships in the course of achieving reproductive success. That need is the essence of higher-brain function; it is where the biobehavioral rubber meets the evolutionary road.
Where does anthropology, especially cross-cultural research, fit into this story? By 1970 psychological anthropology seemed on the cusp of a scientific revolution, with thinkers like Roy D’Andrade, Robert LeVine, and Beatrice and John Whiting developing careful methods of measuring child behavior and child-rearing in cultures across the globe. But as Patricia Greenfield deftly put it, anthropology took postmodernism “on the chin,” and it did so at a time when opportunities for both scientific and humanistic research were dissolving. The result was a generation of critiques of past work, sometimes verging on political and philosophical cant, instead of primary studies of vanishing cultures.
Fortunately, some anthropologists ducked the blow and kept empirically oriented cultural anthropology alive. Many were motivated mainly by evolutionary or ecological hypotheses. Some collaborated with ethologists and psychologists to put the study of childhood on an ever-firmer base of empirical evidence. And although postmodernism was almost as inimical to Boasian descriptive ethnology as it was to the new forms of evolutionary anthropology, it was the latter that drew the greatest ire. Some anthropology departments, including those at Harvard and Stanford, even broke apart for a time over the role of science and evolution in the discipline, but progress continued.
So where do we stand now in our grasp of how evolution shapes child development?
Human development is a legacy of the remote past and the basis of all we think about and do in relation to infants and children. The first three months of life, which have aptly been called the fourth trimester, are a legacy of the necessary early expulsion of human fetuses from the womb to avoid an even worse crunch than childbirth already is. Erect posture, followed by brain expansion, made this necessary. The result is a newborn not exactly asocial, but not yet responsive to social cues, and certainly in need of care. And parents should be patient. The programmed social awakening of the third month of life will meet almost all expectations, and it can’t be rushed.
Another legacy of human evolution is the expansion of middle childhood, the period between age 6 or so and puberty. Alan Mann, a professor of anthropology at Princeton University and perhaps the leading authority on childhood in the fossil record, now sees this as a major human advance. In the course of what psychologists call the 5-to-7-year shift, the hard-to-control emotions of early childhood are left behind and replaced by logical patterns of thought and the ability to think about thought itself. Across cultures, it is a time when more is expected of children and more responsibility assigned to them. Biologically, middle childhood is a period of slower growth and calmer hormonal flux, ideal for a unique human enterprise: the acquisition of large stores of cultural knowledge.
That doesn’t stop with the advent of puberty, but the dynamic changes greatly. Teenagers enter, in some form, the human mating dance, and that involves competition even in cultures where it is largely controlled by elders through arranged marriage. And groups beset by enemies must turn boys into warrior-defenders. It’s a developmental phase fraught with danger for both sexes, and the evolutionary legacy is evident. Hormones mobilized by maturational change enable sexual and aggressive behavior, eventually in an adult mode. But there’s the rub: How long will it, or should it, take?
The news of the past decade or so is that the human brain continues its maturational march between the ages of 10 and 20. The frontal cortex and other areas needed for mature thought are not fully developed until at least the end of that period. Meanwhile, the average age at which children reach puberty (as defined by hormonal change) has dropped at least two or three years over the past two centuries. That is not evolution but revolution, and it is likely that the endocrine change now occurs earlier in relation to brain development as well as to chronological age. If so, we have an even starker problem than the slowness of brain growth: hormonal surges at ever-younger brain ages and ever-lower levels of inhibition. The implications for schooling, for the increasing sexualization of the young, and for the culpability of juvenile offenders are potentially transformative.
That brings us to another way that evolution aids our understanding of childhood. If through most of human history puberty began later, then we now face a mismatch between our evolutionary design and our current environment. A clear example of this discordance is found in studies of childhood nutrition and activity. Children throughout our evolution were continually active, mostly in play and exploration, but also in providing some of their own subsistence. Their diets included substantial amounts of lean meat and fish, extremely low levels of saturated fat, salt, and refined carbohydrates, high intake of vegetables and fruits, large amounts of fiber, and a broad spectrum of micronutrients like calcium and vitamin C.
If there is any such thing as a natural lifestyle, that is it, and our modern departure from it has predictably engendered an epidemic of childhood and adolescent obesity, as well as what used to be called adult-onset diabetes. Calls for more acceptance of obesity are at odds with evolution and, more important, with children’s health.
What about other characteristics of hunter-gatherer childhood, such as nursing, mother-infant co-sleeping, immediate parental response to crying, and the like? Here I would be more cautious, since, unlike in the case of diet and activity, we do not have decisive evidence for the advantages of those choices. However, neither do we have evidence that there is anything wrong with them, and, since they are part of the deep human past, parents should be left alone, pending further study, to make their own decisions.
Another thing is clear from the evolutionary record: Mothers have never done the job of child-rearing alone. Among primates, only humans provide for their young after weaning. As Sarah Blaffer Hrdy, a professor emerita of anthropology at the University of California at Davis, showed in her book Mothers and Others: The Evolutionary Origins of Mutual Understanding (Harvard University Press, 2009), that provisioning required the support of grandmothers, fathers, and others. We should think of the natural human adaptation for child-rearing as one in which mothers are central but have large amounts of support.
Evolutionary thinking is particularly useful in illuminating our view of childhood in the realm of facultative adaptation—a sort of “if-then” proposition built into our genes. Evolution and genes sometimes say, This is how it must always be, but often they say, If in such-and-such an environment, respond with this adaptation, but if in this other, very different context, respond with that one. Sometimes the consequences are dire for children. Martin Daly and Margo Wilson, of McMaster University, in Hamilton, Ontario, have shown that abuse and neglect, up to and including killing children, are almost 100 times more likely in households with an adult male who is not genetically related to the child. Nothing, I think, could make it clearer that evolutionary explanations must be kept completely separate from moral and legal judgments. Yet this well-established fact about violence committed against children, independent of socioeconomic status and shown across national boundaries, should lead us to new ways of thinking about abuse prevention. Those measures can be subtle, not draconian, but they should recognize the facts.
From the viewpoint of the child, early-life experience may serve as an important signal, helping her make sense of the environment she is likely to face. The lack of trust that most psychologists believe stems from unstable nurturance can also be thought of, in evolutionary perspective, as an adaptive response to a situation that is at best unpredictable. The adaptation may even include maturing and initiating sexual activity earlier. That needn’t constrain us to accept such harsh environments as inevitable, much less to condone the conditions that give rise to them. But since they do exist, we should adopt a more positive view of childhood adaptation in less-than-favorable circumstances. Respecting children rather than pathologizing them (or even while trying to help with their pathology) can in some cases be a good thing.
The evolutionary theorist Theodosius Dobzhansky used to say that nothing in biology makes sense except in light of evolution. We are now in a position to say that very little in childhood does, either—or, at a minimum, that children’s behavior, their developmental course, and even our treatment of them make much more sense in that light. In a world in which religious fundamentalists and some postmodern liberals stand in unholy alliance against Darwin’s science, we will do well to keep our minds open. Our children will benefit from a view of them and their care that includes our best understanding of that science.