The Review

Teaching in the Time of Google

When the world’s collective knowledge is at our fingertips, what becomes of college?

By Michael Patrick Lynch April 24, 2016

Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain. With a single mental command, those who have this technology — let’s call it neuromedia — can access information on any subject. Want to know the capital of Bulgaria or the airspeed velocity of an unladen swallow? It’s right there. Users of neuromedia can take pictures with a literal blink of the eye, do complex calculations instantly, and find at once the contact information for anyone they’ve ever met. No need to remember the name of the person you were introduced to last night at the dinner party — a subcellular computing device does it for you.


For the people of this society, it is as if the world is in their heads. It is a connected world, one where knowledge can be shared immediately with everyone in a very intimate way. From the inside, tapping into the collective wisdom of the ages is as simple as tapping into one’s own memory. Knowledge is not only easy; everyone knows so much more.

So why should anyone go to college?

This is no idle question. The migration of technology into our bodies — the cyborging of the human — is no longer just fantasy. And as technology moves ever inward, the Internet of Things is well on its way to becoming the Internet of Us. The possibilities are hardly lost on the lords of Silicon Valley: “When you think about something and don’t really know much about it, you will automatically get information,” Google co-founder Larry Page is quoted as saying in Steven Levy’s 2011 book, In the Plex. “Eventually you’ll have an implant, where if you think about a fact, it will just tell you the answer.”

This scenario raises all sorts of disquieting questions, especially for those of us in the knowledge business. Some of those questions are familiar to anyone who has struggled to craft a policy for the use of personal technology in the classroom. But these practical questions only mask the more fundamental issue: What’s the role of a liberal-arts education in a society that can call up a world of knowledge with a handheld device — or, one day, with a simple stream of neurons?

The answer depends on what kind of knowledge we value, and what kind we want our students to acquire.

Worries about technology’s impact on education are hardly new. Plato is said to have been hostile to that antique innovation — writing — fearing that it would diminish cognitive abilities like memory. At one time, calculators were verboten in math classrooms. Even the library, a trove of knowledge for anyone who wants it, could be viewed as a threat from this perspective. The question “Why go to college if you have neuromedia?” might seem not much different from the question (one I took seriously myself as a know-it-all youth) “Why go to college when you have a library?”

But while worries about technology aren’t new, there is a difference in how we acquire information now. And appreciating that difference is epistemologically and culturally important — not because it should cause us to be hostile to information technology or its uses in the classroom, but because it underlines the distinctive value of higher education.


Much of what we know now we know via what we might call “Google-knowing” — by which I mean getting information not just via search engines but via all manner of digital interfaces, such as the apps on our smartphones. There was a time when some snarled at the thought that “Google-knowing” was real knowing at all. (Remember when Wikipedia was controversial?) But that battle is thankfully over, and it was never necessary in the first place. According to one pretty standard definition of knowledge that goes back to Plato, Google-knowing obviously fits the bill. To know in this minimal sense is to have accurate and warranted information from a reliable source. If we are looking for a restaurant, and the directions we get online turn out to be accurate and from a reliable source, then we “know.”

But we can’t let the similarities between Google-knowing and other ways of getting information blind us to the unusual combination of features that make it epistemologically distinctive. First, Google-knowing is cognitively integrated — meaning our use of it is so ingrained in our lives that we don’t even notice how seamless our acquisition of information in this way really is. We rely on it every day, all day long. We routinely allow it to trump other sources. It is our default. In a way, it is like sense perception: Where we used to say seeing is believing, now we think Googling is believing.

Second, Google-knowing is also outsourced. It is not just in our heads. When we Google-know, we are really knowing via testimony: Ultimately we are relying on the say-so, the design work, and the sheer cumulative weight of others’ preferences. We are outsourcing and, as a result, interconnected by the strings of 1s and 0s that make up the code of the digital atmosphere. That is the truest sense in which knowledge is more networked now, and why it is not an exaggeration to say, as the economist Jeremy Rifkin does, that the Internet “dissolves boundaries, making authorship a collaborative, open-ended process over time.” It is also why our online life is more affected by the opinions and biases of others than we often appreciate, as even the most casual web search illustrates (search for “Climate change …”, for example, and Google will helpfully suggest “is a hoax”).

It is this combination that makes Google-knowing distinctive: at once seamlessly integrated into individual experience but outsourced and guided by the preferences of others. It is both in and out of our heads. That is what makes it so useful, and also so problematic. The Internet is at one and the same time the most glorious fact-checker and the most effective bias-affirmer ever invented. Google-knowing allows us to share in and with the world. And sharing, as Mom always said, is good — except when it isn’t. It depends on what we share (whether it is good information) and whom we share it with (do we stay in our own circle, or do we try to expand our information horizon beyond our personal prejudices?). As any teacher knows, these are the sorts of problems that overreliance on Google-knowing can cause.

But there is also a less noticed factor here, one with more than a hint of paradox to it. The more information we have — even when much of it is perfectly accurate — the more epistemically overconfident we can become. It can lull us into thinking we know more, or can know more, than we actually do. Just Google it, we tell each other. It is all right there to be found. In some ways that’s true — but it depends on where you look. And partly because there is just so much information to sort through, we online humans tend to look at small sets or “families” of reinforcing sites. Unnoticed, this can make us more intellectually passive and deferential than is good for us — but it can also make us dig in, stick to our guns, come what may.


The epistemic overconfidence that Google-knowing encourages is one reason teaching critical, reflective thinking matters more than ever. In a world where the sharing of information has never been easier, it is not enough to luck into information from good sources; we need to know how to tell which sources are reliable, how to recognize evidence, and how to employ that evidence when challenged.

But while critical thinking is important, it isn’t the end of higher education itself. It is a means to that end, which is a different kind of knowledge — what philosophers have sometimes called understanding.

Understanding incorporates the other ways of knowing, but goes farther. It is what people do when they are not only responsive to the evidence, but also have insight into how that evidence hangs together. Understanding is what we have when we know not only the “what” but the “why.” Understanding is what the scientist is after when trying to find out why Ebola outbreaks happen (not just predict how the disease spreads). It is what you are looking for when trying to grasp why the Battle of Vicksburg was a turning point in the Civil War (as opposed to simply knowing that it was).

To gain understanding is to comprehend hidden relationships among different pieces of information. These relationships can, of course, come in different forms depending on what it is we are trying to understand. In the case of history and science, the relationships are causal; in the case of literature, symbolic and emotional; in philosophy and mathematics, logical.


In one really obvious sense, information technology is helping us understand more than ever before. Google-knowing is a terrific basis for understanding. You can’t connect the dots if you don’t have the dots in the first place. Yet Google-knowing, while a basis for understanding, is not the same as understanding, because it is not a creative act.

Understanding is different from other forms of knowledge because it is not directly conveyed by testimony. It is something you must attain yourself, not something you can outsource. The creativity involved in understanding helps explain our intuitive sense that it is a cognitive act of supreme value, not just for where it gets us but in itself. Creativity matters to human beings. Sure, that’s partly because the creative problem-solver is more apt to survive, or at least to get what she wants. But we also value creativity as an end. It is something we care about for its own sake; being creative is an expression of some of the deepest parts of our humanity.

This fact is why the real value of higher education lies in its ability to cultivate the creative abilities that understanding requires. Not all ways of teaching do this, as we well know. It doesn’t take a rocket scientist to realize that monotonously reading PowerPoint slides is not the best way to get your students to understand rockets. But that’s why we spend so much time crafting discussion and lab sections, engaging in the Socratic method during lectures, or breaking students into groups so they can engage directly with the ideas we are trying to get across. We do all this because we want our students to develop the skills and abilities that both feed understanding and manifest it. We want our students not only to go to the library — we want them to know what to do with what they find there, to use it to move beyond mere lists of facts to seeing how those facts relate. Higher education, at least at its best, is an engine for understanding.

That will remain the case, I hope, no matter what comes down the technological pike. But there is still a danger here. I don’t mean danger from impending technology per se. It is probably true that if Google ever really makes Larry Page’s dreamed-of neural implant, universities will face a real disruption that far exceeds the impact of MOOCs or interactive classrooms. But we don’t have to wait till then. Google-knowing has already changed our culture’s attitude toward information and its availability; it has already increased epistemic overconfidence. That’s not the fault of technology itself; it is the fault of our use of it, our human tendency to favor convenience without regard to consequences.


The consequences that particularly matter here concern the role education can play in creating an informed public. The neuromedian society we imagined at the start will be more informed and collaborative in some ways, without a doubt. But the networking of minds will only increase understanding to the degree those minds are willing and able to do the hard work of extending themselves. And it will decrease to the degree they allow their collective biases to guide their network. The question is whether — without the challenges that higher education can bring at its best — they’d even notice. We often say we know not to believe everything on the Internet, we know we are subject to confirmation bias, we know that it is important to look beyond the appearances of the topmost link. But our uses of information technology in our current political climate — particularly when it comes to discussions over elections, or race, or climate change — make our insistence on our own sophistication sound rather hollow.

Hence the danger: If we want our institutions of higher education to continue to be engines of understanding, we’d better make sure they are fine-tuned to deliver. We’d better make sure, in short, that they don’t drift into becoming expensive mechanisms for passing on Google-knowledge.

Michael Patrick Lynch is a professor of philosophy and director of the Humanities Institute at the University of Connecticut. He is the author, most recently, of The Internet of Us: Knowing More and Understanding Less in the Age of Big Data (Liveright).


© 2025 The Chronicle of Higher Education