For some time, we’ve been seeing a curious trend: Young adults are attempting to delay adulthood, while preteen children are hurrying—or being hurried—into the roles and attitudes of young adults. It’s no accident that child psychologists have extended their definition of adolescence into the 20s or that primary-school kids are pushed into beauty pageants where they are dressed like Miss America.
As different as the two trends appear, adults evading adulthood and children hurtling through childhood share a common longing for the sweet spot of youth, that quintessential time of autonomy and self-expression coupled with the conformity of a peer culture from which the boring older generation is excluded. Ten-year-olds want that as much as 30-year-olds do. Thus, we have created a contradictory culture that combines jaded children, whose parents wonder where their kids’ innocence went, and callow adults, whose elders fret when their kids boomerang back home and "deny" them grandchildren. But should we worry?
Thirty-five years ago, the historian Joseph F. Kett observed that modern adulthood is marked by the early and often simultaneous events of marriage, childbearing, and permanent employment. In today’s world, that clustering of rites of passage has largely disappeared. The median age of first marriage in the United States, once a key indicator of maturity, has risen from about 23 for men and 21 for women in 1970 to 28 and 26 respectively in 2010. Married couples today make up only 48 percent of households (down from 78 percent in 1950). Late marriage or permanent singlehood has also become relatively common; in the past three decades, the percentage of men in their early 40s who had never married has risen fourfold to 20 percent. For women, the never-married group has grown from 6 to 14 percent.
Even more telling, in 2011, almost a fifth of men between 25 and 34 still lived with their parents; in 2005, that figure had been 14 percent. (For women, the figure had grown more modestly, from 8 to 10 percent.) Women are also delaying motherhood (waiting until an average age of 25, up from 21 in 1970). By 2010, only 20 percent of households contained children (less than half the percentage of 1950). These patterns are repeated in Europe and elsewhere in the developed world.
Popular culture has mirrored these changes. Since the 1990s, beginning with Friends, sitcoms have featured unrelated young people living together. Such shows have replaced the family and workplace comedies that had prevailed from the dawn of television. Instead of humor based on the foibles of children growing up in the homes of sometimes bewildered parents (think Leave It to Beaver in the 50s and The Cosby Show in the 80s), modern television half mocks and half celebrates the antics of young-adult singles who would have been long settled down in the not-so-distant past. The characters in shows like The Big Bang Theory behave like immature siblings (though all but a female foil are accomplished scientists), teasing yet supporting one another through their obsessions, without the benefit or bother of an older generation. In Friends, group members eventually "graduated" into marriage or break-up. It’s hard to imagine that happening today to the guys in shows like Workaholics, The League, or It’s Always Sunny in Philadelphia. Although within the world of sitcoms male immaturity predominates, programs such as 2 Broke Girls offer equal-opportunity stupidity.
Still, in real life, delayed social adulthood seems to have particularly affected men, evidenced by a reluctance to commit, a holding on to the playthings of youth (according to the video-game industry, the average player is 30 years old), and an embrace of a hedonistic lifestyle that is endlessly promoted in ads, especially during televised football games.
The avoidance of traditional markers of maturity takes many forms: a relentless quest for new thrills and experiences, a mocking rejection of formality in dress and manners, and even the celebration of unrestrained appetite for supersized fast food. Even some older adults admire the freedom of youth and turn it into a lifestyle rather than a life stage (witness the billions spent to retain the bodies, hair, and sexuality of youth). A recent Jeep ad warns that as "the world started to feel smaller" in adulthood, "you’re still here. And you’re still you. The horizons haven’t gone anywhere." You can get your youth back if you take to the wilds in a four-wheel drive.
At the other extreme is the rush of kids, often pushed by parents, out of childhood. Marketers even have an acronym for this trend: KGOY (Kids Getting Older Younger, aka "age compression"). The stories are endless: kids playing soccer with the idea that doing so will help them get into an Ivy League college (as the sociologist Hilary Levey Friedman found in her new book Playing to Win: Raising Children in a Competitive Culture); the success of clothing lines like Tween Brands (originally marketed as the Limited Too) and Victoria’s Secret Pink, which lure "tween" girls (9 to 13) into the glamour of fashion and cosmetics. If grown men play the latest version of Grand Theft Auto and Call of Duty, so do 10-year-olds (no matter the rating system intended to protect kids from M-Mature games featuring violence). At the same time, children can hardly wait to get tattoos or piercings, even if they are sometimes fake.
In the past, that type of activity was usually restricted to working-class or minority kids, whose parents, the received wisdom presumed, failed to protect them from growing up too fast. Indeed, guarding children from premature exposure to adult life has been the quintessential mark of the middle-class family for centuries. But the liberating power of access to money and choice has turned even the children of college-educated, churchgoing suburbanites into "rebels without a cause."
The Global Association for Marketing at Retail estimates that $200-billion will be spent on tweens in 2013, of which $43-billion will come directly out of the pockets of kids themselves (whose allowances average $65 per month, or just more than $16 per week). Most of the $2,047 spent annually, on average, on or by tweens is for "nonessentials" like video games, music, fashion, and now even electronic cigarettes. Twenty percent of 8- to 9-year-olds already have cellphones, and 60 percent use them by 11, marking for many not only the end of childhood play but an escape from parental control.
Prolonged adolescence is usually explained by economic trends: It is harder for young adults to establish families today because it takes longer to train for and obtain "permanent" good-paying jobs; most men’s wages are stagnant; and the costs of home buying and child rearing have risen drastically since the 1970s. But while the economic prospects of many young adults may have discouraged them from taking on family responsibilities, some young adults have more pocket money than their predecessors. Hence the ads that appeal to instant gratification, especially those directed toward men in their 20s, and the success of cable channels like Spike TV aimed at that age and gender cohort. Perhaps too, witnessing the failure of many of their parents’ early-age marriages has made the current generation skittish about too-early commitment.
The explanations for KGOY tend to be psychological: the desire of upper-elementary-school children for freedom from nagging and hovering parents and to fit into peer groups, longings again encouraged by merchandisers. The fact that puberty arrives ever earlier is also certainly a factor.
Early childhood, especially from the beginning of the 20th century, has been seen as a treasured time, as the sociologist Viviana A. Zelizer famously noted in her 1985 book Pricing the Priceless Child. We have protected childhood by prolonging schooling, restricting child labor, and raising the age of consent.
Though economic and psychological trends have certainly shaped the current longing for youth and escape from childhood and adulthood, the new glorification of youth is also related to the rise of modern commercial culture and its promotion of generational consciousness. About 1900, King Gillette marketed his safety razors (and the clean-shaven look) to young men whose fathers had worn gray beards, and cosmetics were sold to respectable ladies with the promise to restore or preserve the blossom of youth. Kodak cameras were aimed at Gibson Girls who wore liberating post-Victorian fashions and were told in advertisements to take snapshots of outings with their boyfriends (something their mothers would never have done). The historians John F. Kasson (Amusing the Million, 1978) and, more recently, Woody Register (The Kid of Coney Island: Fred Thompson and the Rise of American Amusements, 2001), have found young adults and teens of 1900 seeking playful escape from the adult world of work, competition, and responsibility at Coney Island’s beaches and amusement parks.
For a century, consumer commerce has cultivated youth (an age that adopts brand loyalty more easily and spends more readily than "settled" age groups). The not-so-subtle commercial message: Buy now when you are young. (The growth of consumer credit has paralleled this trend.) By the 1980s, even ads for luxury cars invited the young to abandon the old idea that they had to wait until they had reached the top of the corporate ladder before they bought their Buick or Cadillac. And for years, we have heard about abuses in the aggressive marketing of credit and now debit cards to college students.
Consider too how, during the 50s, the escape from male responsibility became a kind of subculture (think only of Jack Kerouac and Hugh Hefner). In the 60s, boomers, recalling their teenage reading of The Catcher in the Rye, broadened their rejection of maturity, in everything from clothing (abandoning the formality of the fedora and the white shirt and tie for jeans and T-shirts) to entertainment (rejecting the paternal guidance of family sitcoms like Father Knows Best and the old moralities of Western heroes like those in Gunsmoke).
In popular music, the often cross-generational Tin Pan Alley gave way to the teen-oriented sound of rock ’n’ roll, beginning with the first documented use of the term in 1954. The long-term impact was that adults, as they aged, held on to the music of their youth even more so than had adults in the past. The generation that grew up on Bill Haley, doo-wop, and Elvis embraced oldies concerts and radio and made it into a cult of youth nostalgia.
Roughly paralleling all these developments, in the first decades of the 20th century, parents were beginning to shower their young kids with teddy bears, electric model trains, and lifelike "companion dolls." Huge ad campaigns taught mothers and fathers (a generation before "permissive" child-rearing manuals) that kids had the right to expect peanut butter, Jell-O, and even fun-filled vacations (later to Disneyland). These gifts were often supposed to create bonds between parent and child: Electric trains, for example, could bring dads into the play.
But by the 1930s, media companies and toy manufacturers had gone a step further by creating a culture from which parents were largely excluded. Movie matinees catered to a more youthful audience; radio stations offered fantasy programming increasingly more detached from adults and the paths to adulthood. Boys bought cheap Buck Rogers toys and conducted their derring-do in a world largely shorn of both signs of maturity and innocence, and anticipated later play with action heroes. By the mid-1970s, video games had both pulled boys into KGOY and kept young adults playing long after they had put down other playthings. Especially from the late 1980s, Nintendo and other game manufacturers found that their customer base was expanding as child players became teens and entered their 20s. Games became more sexual and violent (as well as complex), holding many players for a lifetime while luring kids away from the more "babyish" games like Mario.
The female escape from childhood’s dependency was delayed in the troubled 30s and 40s by mothers needing help at home and fearing the premature sexuality of their daughters. But it came in 1959 with a vengeance in the Barbie fashion doll, the nemesis of motherhood’s baby and companion dolls, and a wish fulfillment of girls longing for teenage liberation. And when over time Barbie got domesticated and perhaps too associated with early childhood, Bratz dolls came along, painted up like streetwalkers to do what Barbie had once done: separate girls from their mothers. On the media front in the 1990s, as Sarah Banet-Weiser (in her Kids Rule!, 2007) shows, the Nickelodeon channel offered kids longing to be free of PBS’s patronizing programming (for example, Mr. Rogers’ Neighborhood and Sesame Street) more edgy, even adult-mocking shows like Rugrats and The Fairly OddParents.
The rush into young adulthood, at least in fantasy, may be a bit easier for girls than delaying maturity is for women. The mother’s role is far more deeply rooted in modern culture than is fatherhood, and maternity is harder to abandon or ignore than paternity (explaining, in part, the 35.7 percent rate of births to single mothers in 2011). By contrast, the flight of men from social responsibility is deeply embedded in the culture—in everything from images of the lonely but free mountain man or cowboy of the 19th century to modern icons of male freedom like Charlie Sheen.
So what does all this mean and does it matter?
Is it really so bad if tweens attempt to escape smothering parents? And is there a real alternative today to the commercialized youth culture? There may even be some advantages of delaying "growing up" into adult roles. Putting off marriage, especially for completing education and initiating careers, has probably contributed to the stabilization of divorce rates. After all, the delay was part of Betty Friedan’s plea in The Feminine Mystique, when in 1963 she called for women to obtain an "identity" before committing to marriage and family. Moreover, delay has paid off for many young people, especially the privileged ones. By contrast, the continuing early passage into adult roles of less-educated and disadvantaged young adults often leads to unstable liaisons, divorce, single-parent families, and drifting from job to job.
The problem lies less in the delay (or speeding up) in life transitions than in the celebration of a form of youth that denies rather than prepares for adulthood. It’s a question of what young adults do during this lengthy transition period.
The fact is that, since the mid-20th century, returning to (or never leaving) childhood has become far more attractive than in the past. In recent years, the young have experienced much less subordination to their elders and much less need to sacrifice and save today for old age. That is hardly all bad; democratization across generations is largely a positive thing; deferred gratification may be overrated. Few of us would want to return to many of the old markers and roles of maturity that constrained our predecessors, oppressed the young, and fostered much intergenerational conflict. But "youth" is less often a stage of life than a refuge from the now tangled and obscured path to maturity. Modern consumer and media culture has profited from all this by offering a treadmill of packaged pleasures.
The famous "psychosocial moratorium" of adolescence that in the 1950s and 60s the psychologist Erik Erikson had hoped might lead to a rich chosen identity has all too often become a period of uncertainty. Prolonged singlehood may bring self-doubt and indecision, according to a report last year, "Knot Yet: The Benefits and Costs of Delayed Marriage in America," sponsored by, among others, the National Marriage Project at the University of Virginia. Delaying marriage, the report found, led to a decrease in divorce and more economic security for college-educated women, but it also resulted in most Americans without a degree having their first baby before marrying. Those who delayed "settling down" beyond their late 20s also reported more depression, were more likely to abuse drugs and alcohol, and were, in the long run, less happy than those who married in their mid-20s (but not earlier). The quest for the ideal maturing moment may well be more elusive than that finding suggests, but there surely are social and psychological costs to an overly long suspension of growing up.
And the larger issue still remains: The modern fixation on youth rejects intergenerational culture. In a quest for autonomy, the child escapes the parent and the young adult avoids parenting. However, both may also join an ever-expanding commercialized peer group. At the beginning of the 20th century, that community of youth was limited to street corners and the occasional visit to the nickelodeon or amusement park. By midcentury it had become more pervasive, with the coming of the transistor radio and the drive-in. Today, with smartphones, the youth peer group is far more ubiquitous, and far more accessible—anytime, almost anywhere. The loss of intergenerationality inevitably creates a shallower culture for all of us and reduces memory to our own experience, often excluding a longer historical perspective. Can this bode well for a society that is built around intergenerational transfers (Social Security and public-school education, for example)?
By escaping elders and the past, the new culture of youth is a culture of the infinite present, not driven by experience or anticipation. Accepting generational difference, being a child and being an adult, is to embrace the path to maturity. Becoming a grown-up doesn’t mean reaching a plateau, but rather accepting a lifelong quest for growing and relating to one’s own, but especially to other, generations. Why this rush out of childhood and evasion of adulthood?
Gary Cross is a professor of history at Pennsylvania State University at University Park. Among his books is Men to Boys: The Making of Modern Immaturity (Columbia University Press, 2008).