Encoded in the word “creative” is a long history of secularization. The power of creation (from the Latin creare: to make, beget, or cause) was once reserved for the gods. Later, it was claimed by poets and artists. Today, creativity is the common property of humanity: All of us, supposedly, are “creative” to some degree. This gradual lexical shift, in which a quintessentially superhuman act came to be recognized as a quintessentially human one, brought in its train a degradation of the central term. Who now, in hearing the word “creative,” feels the breath of the divine, or sees new worlds taking shape under the poet’s hand? Creativity has become a conventional value, immersed in the practical and the utilitarian. For evidence, look no further than the lamentable noun form “creatives,” which we use to refer to those among us who harness their imaginative faculties to generate slogans, slide decks, and gritty reboots.
We might be tempted to accuse corporate America of cheapening the idea of creativity by borrowing the prestige of the arts in order to lend a moral veneer to profit-making. The truth is more complex and disturbing. The business world not only trumpets its allegiance to creativity as a way of glamorizing its activities, it also helped develop the concept in the first place. This is, at any rate, what Samuel Weil Franklin argues in his irreverent and informative work of cultural history, The Cult of Creativity: A Surprisingly Recent History.
Creativity, as Franklin’s subtitle suggests, is a recent invention, a 20th-century concept that patched together prior notions of inventiveness, ingenuity, and imagination into a new synthesis. While the coinage is not entirely new — the Oxford English Dictionary cites usages from 1659 (referring to God) and 1875 (Shakespeare) — the notion of creativity as we know it today emerged in Cold War America. Psychologists, advertisers, and businessmen found in creativity a solution to the stifling conformity of the postwar order. This was a period saturated with anxieties about the inhumanity of the modern social order. The drive for efficiency and rationality had, critics feared, created a culture that was affluent yet meaningless. Civilization had become a “soulless machine.” The new doctrines of creativity rescued irrationality and play, and assimilated their fertile energies into consumer capitalism.
By nourishing their creative abilities, experts promised, Americans could resist the deadening pressures of bureaucracy (incarnated in the era’s dreaded figure of the “organization man”). They could find meaning in life while, at the same time, delivering new product lines for their employers. Creativity could even help beat the Soviets. America needed to ramp up its “idea production,” one creativity guru advised, while the Navy hired a specialist in the art of brainstorming — one of the signature activities of the postwar creativity movement — in hopes of generating “more Washington imagination” to meet “the new Communist tactics.”
The elasticity of the term proved key to its success. More soulful than “originality” and more universal than “genius,” creativity was a quality associated with artists and visionaries but available, in theory, to everyone. It imbued science and technology with the allure of art. Linked with idleness and play while whispering promises of innovation and economic productivity, creativity became a value everyone could agree on. A new field of creativity research “brought military brass into the same room” as educational reformers “who wanted oddball schoolkids to thrive.” Much as the “imagination,” in the Romantic period, swept aside rivals such as “fancy” to become the ruling concept for our faculties of inventiveness, in the postwar period “creativity” became the new dominant framework for how we think about human acts of making.
The intellectual foundations of the creativity movement were laid not by poets and philosophers, as with Romanticism, but by psychologists. By the 1940s, growing numbers of psychologists had come to doubt whether the concept of general intelligence (developed by turn-of-the-century eugenicists, and tested via IQ) provided an accurate model of human intellect. And behaviorism, with its reduction of human behavior to stimulus and reflex, was also falling out of fashion; there was an appetite instead for models of human interiority that stressed agency and complexity.
Creativity was, in this context, a perfect topic. Early creativity researchers found in divergent thinking — the ability to proliferate many solutions to a problem — a possible basis on which to differentiate creativity from other kinds of intelligence. Later, psychologists made stronger normative claims for creativity. Abraham Maslow, best known for his pyramidal theory of the hierarchy of needs, linked creativity to self-actualization, pointing to creative achievement as evidence of inner flourishing.
This flurry of research into creativity left many issues muddled or unresolved. Psychologists never succeeded in establishing that creativity named a single faculty of mind rather than many. Some studies were tautological, treating eminence and acclaim as proxies for creativity: If you’re successful in your field, you must ipso facto be highly creative. And much creativity research was, in Franklin’s telling, dismally clueless about how factors such as education and class might influence creative achievement. Creativity tests administered in the 1950s, for example, interpreted the apparent preference of “creative” persons for abstract art in naturalizing terms, as evidence of a deeper “tolerance for ambiguity” rather than as an index of midcentury highbrow taste.
Nonetheless, the psychologists helped entrench several ideas about creativity that have now become common sense. They bequeathed to us a view of creativity as a unitary phenomenon that varies only quantitatively, not qualitatively, across persons and fields of achievement. On this view, the difference between Galileo and a corporate engineer is a difference in degree, not in kind: Both have creativity, but Galileo has “more” of it.
The corporate sector, meanwhile, seized on creativity as a way of combating workers’ alienation while encouraging the development of new products, brands, and slogans. A new practice entered the corporate environment: the brainstorming session. The method was popularized in the 1950s by the advertiser and creativity advocate Alexander Faickney Osborn, who in a series of self-help books and pamphlets exhorted readers to make use of “the gold mine between your ears.” In brainstorming, the organizational hierarchy was momentarily suspended, as a group of workers attempted to generate as many ideas as possible. Another idea-generation method called synectics was more radical. It sought to tap into the unconscious by leading participants through a progression of analogies (a transcript of engineers working on “gear rounding” shows them moving psychedelically from the gear wheels to the curve of a pregnant belly to the arc of a “beautiful colored rainbow”). These sessions — part workshop, part group therapy — aimed, Franklin writes, “to heal a split in the American professional self,” to help buttoned-up engineers to get in touch with their “childlike,” “playful,” and “poetic” qualities. They led to such Promethean inventions as Pringles and the Swiffer mop.
Corporate America’s love affair with creativity drew its share of detractors. Some artists felt that the creative techniques employed by businesses were a shallow mimicry of genuine art-making. The graphic designer Saul Bass, for example, argued that brainstorming perverted the imaginative process by putting creativity “on the production line.” And for businessmen of a more-austere stripe, the creativity movement looked suspiciously like idleness. The advertising tycoon David Ogilvy dismissed brainstorming sessions as “the delight of sterile loafers who would rather fritter away their day in meetings than shut their doors and get down to work.” Indeed, creativity evangelists, hoping for a partial integration of play into work, occasionally argued for workplace reforms such as flexible hours. Articles such as “The Creative Man: His Moods and Needs” advised managers to lead their sensitive underlings with a light touch: “If someone wants to work at home, or take off during the day to sit in Central Park, that’s all right, too.” While the siren of creativity has, in our own era, “normalized the precarity and overwork of the post-Fordist world,” as Franklin puts it, in its early days the creativity movement looked more like a liberal reaction against consumer capitalism’s stultifying pressures.
The concept of creativity that emerged from the nexus of actors Franklin describes — psychologists interested in human potential, industry leaders seeking innovation, and advertisers who saw themselves as demi-artists engaged in the crafting of new desires — remains a dominant value in business, in education, and in American culture more generally. Creativity is one of the few American pieties. Whatever it might be, we all agree that it is good, and that we should have “more” of it.
But this way of thinking about the productive powers of the imagination, though appealingly democratic in its acknowledgement of the creative abilities of all people, has some serious defects. To appropriate the glamor of art and disperse it across other fields, postwar creativity advocates needed to insist that creativity was a single faculty, one that could explain achievement not only in the arts but also in science, technology, advertising, and other more-humdrum areas of endeavor. This broad understanding of creativity may have had the happy consequence of facilitating linkages between the arts and the sciences, but its looseness clouds the precise nature of those linkages. A mechanical engineer may benefit from trying her hand at landscape painting not because the art class stimulates her “creativity,” but because both painting and mechanical engineering rely on spatial reasoning and the mental rotation of parts and images. If we are trying to understand the springs of human invention, “creativity” is too vague a concept to get us very far.
The same universality of application that made creativity a democratic value also compromised its moral status from the start. Franklin quotes several descriptions of the creative process that are adamantly amoral. Acts such as “devising new instruments of killing” or “devising a new and more subtle form of torture for political prisoners” are placed alongside composing a symphony or arriving at a new scientific theory. Such claims insist, perversely, that creation and destruction emanate from the same impulse. (Studies of creativity were funded, Franklin notes, by “every branch of the military.”) Creativity was also nursed by capitalists, who saw in it the promise of profit. Truly creative men, one advertiser declared in The Wall Street Journal, savor the satisfactions of “applying creative talents to make a mark not on posterity, but on next month’s sales curve.” The boosting of creativity by industry meant that no bohemian compromises were needed in order to pursue the artistic life. “One could exercise one’s creativity,” Franklin comments witheringly, “from the comfort of a midtown suite.” Creativity’s ties to industry are reflected today in the fact that the bulk of academic research into “creativity” occurs not in departments of the arts and humanities but in design, business, and engineering schools.
Franklin is not quite ready to dispense with creativity. But he believes the concept has done harm. Not only does creativity paper over the excesses and contradictions of capitalism, it also displaces other values, encouraging a relentless emphasis on novelty and disruption. The problem is that “new” does not always equal “good.” Perhaps, Franklin considers, we should pay greater tribute to the humbler goods of maintenance, care, and infrastructure — the quiet labor that keeps the world going. This point is a crucial one. Yet even here, creativity perks up its head. Maintenance, care, and infrastructure may seem removed from creativity because of the steadiness of the labor required and the invisibility of the systems in place that keep our world from collapsing. But the delivery of essential goods and services amid conditions of rapid flux — including the dislocations caused by climate change — will require enormous acts of invention and ingenuity in the years to come. “Creativity” denotes a cluster of gifts and capacities that humanity cannot do without.
The deeper problem, it seems to me, is that today’s notion of creativity, in its business-bred focus on the “new,” risks obscuring a key path by which many acts of creation occur. This path involves an orientation not toward the new but toward the old. The historian of technology Brooke Hindle, in his 1981 study Emulation and Invention, calls this process “emulation.” Emulation entails the copying and studying of models. The learner begins by imitating. Eventually, she begins to compete against existing models and works to surpass them. More a “striving for quality and recognition than a marketplace competition,” the ethos of emulation developed from traditions of instruction in the arts and crafts. Major American inventions such as the steamboat and the telegraph, Hindle argues, emerged from a culture in which emulation was prized. Hindle’s focus is on mechanical invention, but the role of emulation in spurring developments in other disciplines, including the arts, seems apparent enough. The philosophy of emulation provides a corrective to the amnesiac promises of novelty claimed for all kinds of “creative” goods and products.
Emulation insists that the quest for excellence begins with the careful study of what has been achieved in the past. The implications for education are clear. To cultivate the imagination and — yes — the creativity of our students in a way that is neither frivolous nor superficial, we need to show them what has come before. To accomplish this, courses in the arts and the humanities, as well as in the history of science, technology, and medicine, ought to assume greater centrality. Such courses are not museums, or shrines to obsolete relics. They are the furrows in which the creative thinking of the future is planted. Whether we reap a harvest of art and invention in the future — or not — depends on whether this germination takes place.
One thing is certain: If we cede creativity to business, all we will get is more Pringles.