Brainstorm

Ideas and culture.

Assessing Critical Thinking

By Laurie Fendrich March 7, 2008

I wish Jonathan Swift could come back to earth, sit side-by-side with me in front of this keyboard, and guide me as I write this post. Since that’s not going to happen, I’ll write this without the attitude it deserves. (Where, oh where, is my secretary when I need her?)

At the most recent faculty meeting at my college, I listened patiently (well, sort of) while a very smart, well-intentioned, and hard-working colleague made a presentation about how our college is going to “measure” the specific “learning goal” of “critical thinking” — one of several “learning goals” that we faculty, in fulfillment of our mandate to participate in “outcomes assessment,” identified as part of the purpose of our distribution courses.

I’ll skip the details of the presentation (it’s no fault of my colleague, but there’s not only no way to make a silk purse out of the sow’s ear of outcomes-assessment practices, there’s no way to make outcomes-assessment jargon worth listening to) and go straight to the summary. Our plan of attack is to collect “meaningful data” on “critical thinking” as demonstrated in a select group of distribution courses. After we’ve gathered this “meaningful data,” we’ll figure out if students are learning the “critical thinking” we’ve identified as one of our “learning goals.” In practice, this means that student papers will be randomly selected, analyzed, and then analyzed again, a year or two later, for evidence that our students are improving (or getting worse) in their “critical thinking.”


I know what “critical thinking” is (although I have to admit it took me a long time to figure out why plain, old-fashioned thinking — the kind used by such inferior thinkers as Aristotle, Fibonacci, Descartes, Hume, or Poincaré — no longer worked). I also know what data are. “Hel-lo,” as my students would say: Data are information — often presented in the form of statistics or lists.

Without interpretation, data are meaningless. To become “meaningful data,” they must be gathered purposively and sifted through a conceptual scheme generated from outside the data themselves. Data are most meaningful when they’re obtained and interpreted rigorously — i.e., obtained using the methods of the hard sciences and mathematics. Data should always be interpreted carefully, and applied narrowly, the way scientists and mathematicians handle them.

The inherent flaw in relying on data is that their nature is to seduce their handlers, and their readers, into believing they’re seeing “truth.” Lists, columns, and charts, no matter their relation to reality, and no matter how sloppily the underlying data were gathered, always appear to sum up the truth.

Moreover, when data are collected on those aspects of human beings involving quality rather than quantity — for example, something like “critical thinking” — the result is that the thing that’s purportedly being analyzed is likely to be lost in the process. When it comes to quality, the whole is always, without exception, more than the sum of its parts. (Love, to take an obvious example, is always more than recordable measurements of pulse, blood flow, and the observable actions of lovers in the throes of love.)

Let’s assume my college successfully collects writing samples from students and then has a committee look over these samples to check for “critical thinking.” They’ll most likely assign numbers from, say, 1 to 5, to rank the writing samples according to the evidence for “critical thinking” found in them.

Let’s then assume the committee successfully comes up with some data, based on numbers they assign to the writing samples, and compares these data with the data they obtain from assessing writing samples they collect a couple of years later. (Note that this is very similar to assessing how grades change over time, and that had faculty been doing their jobs these past years, grades — a form of assessment that allows for quality as well as quantity — would do the trick without the smoke and mirrors of outcomes assessment; but that’s another story.)

I see it already — the report, neatly charted out, demonstrating that there’s been a 7.6-percent increase in the “critical thinking” on the part of our students — or, conversely, a 7.6-percent decrease in their “critical thinking.” Following outcomes-assessment protocol, we will then “close the loop” — either by taking steps to improve the decline in our students’ skills of “critical thinking,” or, conversely, by concluding that our students are doing a very good job at “critical thinking.”
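Stripped of the jargon, the arithmetic behind such a report is almost embarrassingly simple. A minimal sketch, with invented rubric scores (the scores here are hypothetical — which is rather the point, since nothing anchors the committee’s numbers to anything real):

```python
# Hypothetical rubric scores (1 to 5) assigned by a committee to randomly
# selected student papers in two collection years. The values are invented
# for illustration only.
year_one = [3, 2, 4, 3, 3, 2, 4, 3]
year_two = [3, 3, 4, 3, 4, 2, 4, 4]

def mean(scores):
    """Average rubric score across the sampled papers."""
    return sum(scores) / len(scores)

# The headline number: percent change in average "critical thinking."
change = (mean(year_two) - mean(year_one)) / mean(year_one) * 100
print(f"'Critical thinking' changed by {change:+.1f} percent")  # → +12.5 percent
```

The entire “meaningful data” apparatus reduces to one division: whether the result is reported as an increase or a decline depends only on which arbitrary numbers the committee wrote down in the first place.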


Although some people complain about the onerousness of outcomes assessment, that’s not a significant problem. And we all want to improve “critical thinking” (or at least “thinking”). The problem here lies in the pseudoscientific measurement of things that are neither measurable by numbers nor translatable into data, but instead defined by their quality. To “measure” quality is absurd. The verb we need to use is judge — a very different action that many people apparently find repugnant.

What are the real indices of “critical thinking” in college students? The “critical thinking” we’re trying to track, using data that are collected without any rigor, can’t be found in any data. Moreover, the bad writing of many of today’s college students will easily be mistaken for a lack of “critical thinking.” It’s very plausible that many of our students, although writing very poorly, exercise critical thinking very well — in such actions as dropping and adding courses according to their analysis of which courses teach them something important and which don’t.

Now that outcomes-assessment practices are humming along, and we university professors are flapping our wings in the mud of actual assessment practices, we should take responsibility for what we’ve done. We should, at the very least, own up to the fact that we are the ones who need help with critical thinking.
