The Chronicle Review

Hey, Physics, Get Real!

[Illustration by Mario Wagner for The Chronicle Review]

March 13, 2011

I'm not sure if I've changed or physics has changed, but the thrill is gone. Recent books by practitioner-popularizers—notably The Grand Design, by Stephen Hawking and the Caltech physicist Leonard Mlodinow (Bantam Books, 2010); The Hidden Reality, by Brian Greene (Alfred A. Knopf, 2011); and the forthcoming Cycles of Time: An Extraordinary New View of the Universe, by Roger Penrose (Knopf, 2011)—are leaving me peeved rather than inspired. In fact, I no longer recommend books like those to students hungry for far-out ideas from science's frontiers.

I feel a little guilty knocking physics, which more than any other field lured me into science journalism three decades ago. As a teenager, I lapsed out of the Catholicism in which I was raised, but I never stopped wrestling with the riddles that religion supposedly answered. Physics provided me with a kind of scientific theology, an empirical, rational way of probing, if not solving, the mysteries of existence. Physicists were discerning deep resonances between the smallest and largest scales of reality and spinning out astonishing conjectures about our universe and even other universes.

Just before I graduated from college with a degree in English, in 1982, I discovered the writings of John Wheeler, the archetypal physics-for-poets physicist. Wheeler was no flake. He helped Niels Bohr with the liquid-drop model of the nucleus, pioneered the study of black holes (and coined the term), and contributed to nuclear-weapons designs. Musing over the odd manner in which observation seems to determine the outcome of quantum experiments, Wheeler challenged materialism itself. He proposed that we live in a "participatory" cosmos, which emerges from the interaction of consciousness and the physical realm.

One of my final college papers explored the anthropic principle, postulated by Brandon Carter in the 1970s and popularized by Wheeler. The anthropic principle responded to one of the deepest of all questions: Why are the laws of physics—embodied in quantum mechanics and relativity—as we find them rather than some other way? What explains the precise strength of gravity and other constants of nature? The universe seems so, well, arbitrary. According to the anthropic principle, the universe must be as we observe it to be, because otherwise we wouldn't be here to observe it.

And our universe may be one of many. The big-bang theory led some theorists, including Einstein, to suggest that our cosmos will eventually stop expanding, collapse back in a "big crunch," and rebound in yet another big bang. According to the oscillating-universe hypothesis, this cycle of cosmic death and rebirth is never-ending. Friedrich Nietzsche had tormented himself with a similar idea, which he called "eternal recurrence"; he feared that everything we do happens over and over again, ad infinitum.

Quantum mechanics, which implies that a fleck of light or matter, when we're not looking at it, exists in a haze of probabilities, yielded an even weirder multiverse theory. In the late 1950s, the physicist Hugh Everett III proposed that each particle wanders down every possible path—in other universes. Everett's many-worlds hypothesis evoked "The Garden of Forking Paths," the spooky tale by Jorge Luis Borges, who, like Nietzsche, was one of my favorite writers. (Question: Did quantum mechanics inspire Borges, who in turn influenced Everett?)

My favorite cosmic conjecture was the Omega Point Theory, a fusion of artificial intelligence and cosmology proposed in the late 1980s by John Barrow and Frank Tipler. They speculated that, in the not-so-distant future, our machines will become conscious, superintelligent, autonomous beings, which will quickly embark from Earth and begin colonizing the galaxy.

The machines will eventually transform the entire universe into one gigantic computer, or brain, Barrow and Tipler said. When the universe stops expanding and collapses toward an infinitely small point, the density and computational power of this cosmic brain will spike toward infinity. This all-powerful computer will be able to simulate any possible reality, making it, in effect, God. (Could Watson, the IBM computer that recently trounced two human Jeopardy champs, be a harbinger of the Omega Point?)

I once asked Tipler if our reality could be a simulation created by an Omega Point in the past. That's possible, he said, but not likely, given how much suffering there is in our world. Now that's scientific theology.

What can you do to top suppositions like those? Not much, it seems. The new books on physics promise "a state-of-the-art tour of cutting-edge science that is changing the way we see our world," as the jacket blurb for The Hidden Reality puts it. But they are just recycling the once-startling propositions of Carter, Everett, Wheeler, Barrow, Tipler—and Nietzsche and Borges, for that matter.

In The Grand Design, Hawking proclaims string theory—which postulates that reality consists of extremely small strings wriggling in an extremely hypothetical hyperspace of 10 or more dimensions—to be the ultimate account of reality. String theory comes in an infinite number of versions, but Hawking tries to turn this bug into a feature: All possible stringy universes actually exist, he asserts, and the anthropic principle explains why we find ourselves in this one.

In Cycles of Time, Penrose offers a new and improved version of the old oscillating-universe theory, which is supposedly compatible with the recent discovery that the universe is expanding at an accelerating rate. And Greene's book, subtitled Parallel Universes and the Deep Laws of the Cosmos, touts the oscillating universe, stringy multiverses, Everett's many-worlds theory, and a host of variants. Greene teases us with the idea that in an infinite, eternal multiverse, everything must happen countless times; somewhere out there your doppelgänger is reading this sentence, and elsewhere "she has skipped ahead or feels in need of a snack."

Greene takes Nietzsche's eternal recurrence and makes it cute. His suggestion that our universe may be a simulation run on the computer of an alien civilization is also old hat. These ideas, in fact, are just scientized versions of stoner thought experiments: What if our whole world is just a grain of dirt in the pocket of a giant? And there is a whole universe inside one grain of dirt in our pockets? What if our world is really just an experiment created by evil machines? And so on.

Physicists' fantasies about parallel and virtual realms are not just stale. Increasingly they strike me as escapist and irresponsible. Scientists shouldn't have to serve the public good any more than poets or musicians. I value truth for its own sake, even if the truth disturbs. But I do think that theories—if they are being passed off as science—should have at least a remote chance of being empirically corroborated. Otherwise, how do they differ from pseudoscientific ideas like intelligent design?

Susan Sontag's 2002 essay "Looking at War" captures my jaded attitude toward the books of Hawking, Greene, and Penrose. Sontag castigated the philosopher Jean Baudrillard, among others, for claiming that there is no reality anymore; there are only media "representations," "spectacles," and "simulated realities." (Baudrillard's book Simulacra and Simulation inspired the Wachowski brothers to make the film The Matrix, about a world that is actually a computer simulation created by intelligent machines.) This sort of philosophical claptrap, Sontag argued, is reprehensible in a world filled with real people suffering from real injustice, tyranny, and wars.

I haven't entirely given up on physics. I'm intrigued by quantum computation, which seeks to harness quantum weirdness to make computers more clever. The ability of electrons, say, to exist in a "superposition" of many possible states means that they can, in principle, carry out multiple, parallel computations and hence outperform conventional computers. (Imagine how smart Watson would be if it were a quantum computer.)
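The superposition described above can be made concrete with a toy amplitude calculation. This is a minimal sketch of the standard textbook formalism for a single qubit—the function names and the two-amplitude representation are my own illustrative choices, not any particular library's API.

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) for the basis
# states |0> and |1>, with |a|^2 + |b|^2 = 1.

def hadamard(state):
    """Apply a Hadamard gate, which turns a definite state into a superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1 + 0j, 0 + 0j)      # the definite state |0>
superposed = hadamard(zero)  # now "both possibilities at once"

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = [abs(amp) ** 2 for amp in superposed]
print(probs)  # each outcome has probability 1/2, up to floating-point rounding
```

A computation applied to such a state acts, in effect, on both branches at once—the source of the parallelism that quantum-computing researchers hope to harness.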

I'm also fascinated by the potential fusion of physics with information theory. Invented in 1948 by the mathematician Claude Shannon, information theory is a method for quantifying the improbability—the surprisingness, if you will—of a message. The theory laid the foundation for the digital age; it helps engineers optimize the efficiency of information processing, communication, and storage systems, from cellphones to compact discs. As the journalist James Gleick reports in his marvelous new book The Information (Pantheon, 2011), physicists—notably my old hero John Wheeler—have discovered deep connections between information theory and quantum mechanics. Wheeler coined the phrase "it from bit" to summarize the idea that the "it" of objective, physical reality stems from "answers to yes-or-no questions, binary choices, bits."
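Shannon's measure of improbability can be sketched in a few lines. The formulas below are the standard definitions of surprisal and entropy; the function names and example messages are illustrative.

```python
import math
from collections import Counter

# Shannon's surprisal of an outcome with probability p is -log2(p) bits:
# rare (improbable) messages carry more information than common ones.

def surprisal(p):
    return -math.log2(p)

def entropy(message):
    """Average surprisal per symbol, estimated from symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * surprisal(c / n) for c in counts.values())

print(surprisal(0.5))   # a fair coin flip carries exactly 1 bit
print(entropy("aaaa"))  # a fully predictable message: 0 bits per symbol
print(entropy("abab"))  # two equally likely symbols: 1 bit per symbol
```

The entropy figure is what sets the limit on how far a message can be compressed—the engineering payoff behind the "foundation for the digital age."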

But more than new ideas, physics desperately needs new facts. Budget cuts have forced into retirement the biggest American particle accelerator, the Tevatron, which helped flesh out the standard model of particle physics. Physicists pining for new data are now pinning their hopes on the Large Hadron Collider, in Switzerland. If and when that hypersensitive contraption gets up to full speed, perhaps it will stumble into a discovery so bizarre that it entices theorists lost in simulated and parallel universes back to reality.

Until then, I tell my students, if you're looking for science books that pose profound metaphysical puzzles, don't bother with the physics best sellers. Instead, check out works by Oliver Sacks, V.S. Ramachandran, and other intrepid explorers of the brain. Science's most thrilling frontier is the one inside our skulls.

John Horgan is a science journalist and director of the Center for Science Writings at the Stevens Institute of Technology. His books include The End of Science (Addison-Wesley, 1996) and Rational Mysticism (Houghton Mifflin, 2003).