Over the next decade, the objective study of subjectivity will transform the academy and the world beyond.
Westerners have long labored under the assumption that objectivity and subjectivity operate in distinct and mutually exclusive realms. Traceable at least back to Plato’s chariot of the self, in which the charioteer of reason strains to rein in a noble white horse of spirit and an unruly dark horse of appetite, the theme reappears in Descartes’s politically expedient split of the mind from the body. A similar divide arguably separates C.P. Snow’s two academic cultures, with the sciences revering the objective and the humanities reveling in the subjective. Forces on both sides continue to support the distinction.
Thanks to accelerating technological advances, however, the gap between the objective and the subjective is closing fast. Scientists are beginning to objectively study subjectivity. By “subjectivity,” I refer here not only to the more traditional topics involving awareness of input (as in perception) or output (as in action), but also to the intermediate processes related to emotion and thought. Beyond simple input and output, these flexible and dynamic forces interpret and animate, transforming perception into action.
The quiet revolution behind the objective study of subjectivity feeds on data rather than rhetoric. Armed with new technologies that can resolve changes in brain activity at a spatial scale of millimeters and a temporal scale of fractions of a second, neuroimagers are exceeding the dreams of their peers of a scant decade ago. Already, they can infer not only when people see or hear something, or are preparing to move, but also when people are committing something to memory, when they feel excited about something they want, when they are inhibiting an impulse, when they are paying attention, when they experience conflict, and even when they are thinking about themselves. At present, those inferences are crude and sometimes even irresponsible, but their sophistication will improve as scientists insist on adhering to predictive standards. The end product will be maps of subjective experience. Because prediction inevitably invites modification, the maps also insinuate plans: ways of changing the flow of activity to alter behavior. Inevitably, scientists will crack the neural code of subjectivity; it’s only a matter of technology and time.
What might these developments mean for the academy? Many disciplines purport to study human behavior, each at a different level. Yet they remain fractionated and operate largely in isolation. Dynamic maps of subjectivity, and their implications for behavior, might provide the conceptual spark that melds these disparate realms.
Witness the rise of new hybrid fields like “neuroeconomics” and “social neuroscience,” where consideration of the mind necessarily implies consideration of the brain. Of course, academic adoption may not be a leading indicator of change (the joke being that it often results from changes in staff rather than changes of mind). Lay interest in inferring subjective states continues to rise as well, spanning applications as broad as predicting market choices and improving psychiatric diagnoses.
While practical applications inevitably raise ethical issues, the objective study of subjectivity also poses existential questions. Given human limits on time, energy, and insight, what if the maps of subjectivity turn out to know us better than we know ourselves? “Know thyself” was carved explicitly into the temple walls at Delphi and implicitly into the foundations of the academy. But how well do we really want to know ourselves?