[Illustration: giant checkmarks falling from the sky, destroying academic buildings. Paul Windle for The Chronicle]

The Terrible Tedium of ‘Learning Outcomes’

Accreditors’ box-checking and baroque language have taken over the university.

The Review | Essay
By Gayle Greene January 4, 2023

Every six years, the accountability police swoop down on my campus in the form of WASC, the Western Association of Schools and Colleges. The West Coast accreditation organization comes to Scripps, as it comes to all colleges in our region, to do our reaccreditation. The process used to take a couple of months, generating a flurry of meetings, self-studies, reports to demonstrate we’re measuring up. We’d write a WASC report — “wasp,” we called it, for the way it buzzed around making a pest of itself.

The WASC committee would come to campus, stirring up much hoopla and more meetings. They’d write up a report on our report, and after their visit, we’d write a report responding to their report on our report; the reports would be circulated, and more meetings would take place. Then it was over, and we could get back to work. It’s fairly pro forma with us; Scripps College runs a tight ship.

At least that’s how it used to be, just one of those annoying things to be got through, like taxes. Now that the reaccreditation process has become snarled in proliferating state and federal demands, it’s morphed from a wasp into Godzilla, a much bigger deal — more meetings, reports, interim reports, committees sprouting like mold on a basement wall. WASC demands that we come up with “appropriate student-outcome measures to demonstrate evidence of student learning and success,” then develop tools to monitor our progress and track changes we’ve made in response to the last assessment.

There are pre-WASC preps and post-WASC post mortems, a flurry of further meetings to make sure we’re carrying out assessment plans, updating our progress, and updating those updates. Every professor and administrator is involved, and every course and program is brought into the review. The air is abuzz with words like models and measures, performance metrics, rubrics, assessment standards, accountability, algorithms, benchmarks, and best practices. Hyphenated words have a special pizzazz — value-added, capacity-building, performance-based, high-performance — especially when one of the words is data: data-driven, data-based, benchmarked-data. The air is thick with this polysyllabic pestilence, a high-wire hum like a plague of locusts. Lots of shiny new boilerplate is mandated for syllabi, spelling out the specifics of style and content, and the penalties for infringements, down to the last detail.

All of this is perceived to be so complicated that a Director of Assessments and Institutional Research is hired. Yes, there are such things — it’s a burgeoning industry. It’s where the jobs in academe are — though they have different titles at different colleges: Assessment Officer, Officer of Institutional Effectiveness, Director of Assessment and Regional Accreditation.

Our new director has “areas of expertise” in “WASC, course evaluations, survey administration and analysis” — no teaching experience, of course. At a time when the college can ill afford new appointments, when every faculty opening and sabbatical replacement is carefully vetted, suddenly there’s a new administrator. And administrators require staff and offices, and though they don’t have tenure, they might as well; generally, they’re here to stay. The less real work they do, the more make-work they generate for faculty.

When I came to Scripps, there were a dozen or so administrative offices whose functions I understood, and whose staff I knew by name, who genuinely facilitated the work of the college. But nationwide, between 1993 and 2009, administrative positions increased by 60 percent, 10 times the rate of tenured faculty. Now administrative offices and functionaries outnumber faculty members, a bureaucracy that drains resources and drives up costs.

We are required to work up an “assessment plan and logic model.” As specified in a memo intended to clarify, this means we must create “rubrics for student-learning outcomes,” “assessment method type(s) to assess each SLO,” “measurement tools to assess the selected student work assignment(s),” and must also suggest “potential work assignments your department could collect to measure your SLOs.”

There are worksheets with boxes for comments, results, and “action summaries.” A section for “additional design-tools” provides extra space for “entries of design methods.” Then there’s a “results/actions” section in which to recap each SLO, design method, selected student work, and measurement tool in the “logic model worksheet tab.”

“Unbelievable!” sighed a colleague after one of our meetings. “If I’d wanted this kind of crap, I could have gone into business and be making money.”

Then the boxes with “comments, results, and summaries” are to be incorporated into an Educational Effectiveness Review Report. “By applying the rubric to last year’s senior theses enables you to evaluate both the rubric and your results to help fine-tune the assessment of this year’s theses.” (That sentence is why some of us still care about dangling participles.) This is all written in a language so abstract and bloodless that it’s hard to believe it came from a human being. But that is the point, phasing out the erring human being and replacing the professor with a system that’s “objective.” It’s lunacy to think you can do this with teaching, or that anyone would want to.

My colleagues roll their eyes, roll up their sleeves, set to work. Awfully good sports they are, also awfully glad to have jobs. Ours is a faculty that works very hard. Each year I’ve seen demands ratcheted up, committee work proliferating, more pressure to publish. Students at colleges like ours expect a lot in terms of faculty availability, advising, mentoring; they have a right to — tuition runs about $60,000 a year.

“Let me get this straight,” said a colleague, storming into my office, “we give them a number instead of a letter — no, we give lots of numbers — and that makes it ‘objective’?” He was waving a new directive that instructs us to “assess randomly selected students by number, assigning numbers 1 to 3, exemplary to unacceptable, initial-to-highly developed, and, using an Excel spreadsheet, rate their work numerically according to things like design process, argument or focus, authority, attribution, evaluation of sources, and ‘communicative effectiveness.’”

“I thought this college prided itself on not treating our students as numbers,” he spluttered.

Do not think I am singling out Scripps College for special criticism. From what I’ve heard, it’s as bad or worse elsewhere. I think most of our faculty see our dean and president as indefatigable women who work for and not against us and genuinely respect the liberal arts. This outcomes-assessment rigmarole has been foisted on all colleges, adding a whole new layer of bureaucratic make-work. Reports and meetings bleed into one another like endless war. Forests die for the paperwork, brain cells die, spirits too — as precious time and energy are sucked into this black hole. And this is to make us more … efficient? Only in an Orwellian universe. This is to establish a “culture of evidence,” we’re told. Evidence of what? Evidence of compliance, I’m afraid.

Our new “director of assessments” is a pleasant young woman, only it’s hard sometimes to figure out what she means. Colleagues huddle in the halls, bent over the latest memo: Could she be saying ... ? Might this mean ... ? Look, it says here, no, that’s goals, not objectives. ... Wait a minute, it says outcomes, not objectives. ...

I swear that O stood for objectives when these directives first started appearing. There’s a big difference between outcomes and objectives. An objective is a goal, a purpose aimed for, aspired to, sought after. An outcome is a result or conclusion; and, as mandated here, it must be measurable.

OK, “department goals,” we can do that, make up a bunch of goals. But wait — now we’re told to recast these to fit “goal/outcome” structure. And that requires a lot more verbiage. What was before “Students will learn basic skills in literary studies” is now a mouthful:

Student exhibits the ability to read primary texts closely. Student is able to pose effective questions about form, content, and literary devices. Student engages with relevant critical approaches and with secondary material in literary studies.

(That’s the kind of garbage I try to purge from my students’ writing, but never mind.)

“Students will learn to see their argument in historical context” becomes:

Student demonstrates an awareness that her arguments participate in a long-term conversation about the nature, function, and value of literary work.

“Students will learn to recognize and construct well-formed arguments” becomes:

Student recognizes well-formed argument, including recognition of argumentative structure, use of evidence, and a disciplinary framework. Student constructs such arguments.

Student in her right mind will flee this major and find another, except they’re all drowning in this gobbledygook.

A guideline is circulated explaining the difference between outcomes and objectives, to make sure we know it’s outcomes, not objectives, we’re being asked to produce. “Objectives are generally less broad that [sic] goals and more broad than student learning outcomes.” I do a Google search because I’m still confused, and sure enough, there’s a boatload about this online.

Outcomes are “what a student must be able to do at the conclusion of the course,” explains an online source, and to ensure these, it is best to use verbs that are measurable and avoid misinterpretation. Verbs like write, recite, identify, sort, solve, build, contract, prioritize, arrange, implement, summarize, estimate are good because they are open to fewer interpretations than verbs like know, understand, appreciate, grasp the significance of, enjoy, comprehend, feel, learn. This latter set of verbs is weak because the words are less measurable, more open to interpretation.

Wait a minute, I thought getting students to understand, feel, learn, appreciate, grasp the significance of, comprehend, and enjoy was sort of the point. No more, apparently. A friend who teaches poetry at a community college was instructed to take the word appreciate out of her SLO. Now we’re supposed to be teaching students to prioritize, arrange, implement, summarize, recite, sort, solve, build, contract — because these verbs are less open to interpretation? And here I was, thinking interpretation was kind of central to what I teach.

For anyone who wishes to know more about assessment-friendly verbs, I refer you to a 27-page typology from the National Institute for Learning Outcomes Assessment: To Imagine a Verb: The Language and Syntax of Learning Outcomes Statements, by Clifford Adelman. It is staggeringly specific. The document explains that “non-operational verbs” are not useful because they do not refer to outcomes: “These verbs do not produce observable behaviors or objects: recognize, develop, relate, consider, prepare, comply, reflect, realize, anticipate, foresee, observe, review, extend, work... Unless the learning outcome statement specifies what kind of ‘work,’ e.g. construct, build, model, shape, compose, it cannot be observed and judged.”

The author lists 16 categories of “Productive Active, Operational Verbs Groups,” A through P. A few of them give the gist:

F) Verbs falling under the cognitive activities we group under “analyze”: compare, contrast, differentiate, distinguish, formulate, map, match, equate

G) Verbs describing what students do when they “inquire”: examine, experiment, explore, hypothesize, investigate, research, test

H) Verbs describing what students do when they combine ideas, materials, observations: assimilate, consolidate, merge, connect, integrate, link, synthesize, summarize

I) Verbs that describe what students do in various forms of “making”: build, compose, construct, craft, create, design, develop, generate, model, shape, simulate

It goes on. The list is “by no means ... exhaustive,” says the author, concluding, “you folks do a good job, but all of you — not just some of you — have to be far more explicit in your student-learning-outcome standards than you are at present.”

Remember Scholasticism? The medieval theological-philosophical system that strangled knowledge with dogma for six centuries. Scholars debated how many angels fit on the head of a pin as the Turks were battering down the gates of Constantinople. So it is with the assessment orthodoxy. A reader of a 2015 Chronicle article on assessment by Erik Gilbert comments astutely: “While we are agonizing about whether we need to change how we present the unit on cyclohexane because 45 percent of the students did not meet the learning outcome, budgets are being cut, students are working full-time jobs, and debt loads are growing.”

“Academics are grown-up people who do not need the language police to instruct them about what kind of verbs to use,” wrote Frank Furedi in a blistering denunciation of “learning outcomes” in Times Higher Education in 2012. Warning faculty against using words like know, understand, appreciate because “they’re not subject to unambiguous test” is fostering “a climate that inhibits the capacity of students and teachers to deal with uncertainty.” Dealing with ambiguity is one of the most important things the liberal arts can teach.

Most disturbing, as Furedi notes, is that “responsibility becomes equated with box-ticking.” Responsibility, the lifeblood of a class or a college, has been reduced, in the name of “accountability,” to the ticking off of boxes. What matters is that the directives have been complied with, not whether students have actually learned.

And how could you devise a metric for the kindling of imagination, for joy, wonder, wisdom, enlightenment, empathy, humanity? These are matters of the spirit, not the spreadsheet. You could easily, however, devise a metric for the costs in time and money of this pernicious nonsense.

We in the humanities try to teach students to think, question, analyze, evaluate, weigh alternatives, tolerate ambiguity. Now we are being forced to cram these complex processes into crude, reductive slots, to wedge learning into narrowly prescribed goal outcomes, to say to our students, “here is the outcome, here is how you demonstrate you’ve attained it, no thought or imagination allowed.”

This essay is adapted from Immeasurable Outcomes: Teaching Shakespeare in the Age of the Algorithm (Johns Hopkins University Press).

A version of this article appeared in the January 20, 2023, issue.
About the Author
Gayle Greene
Gayle Greene is a professor emerita at Scripps College.