Measuring Stick

Experts explore the quality and assessment of higher education.

Measurement of ‘Learning Outcomes’ Comes to Graduate School

By David Glenn December 1, 2010

Graduate-level programs were once relatively immune from pressure to define and measure “learning outcomes” for their students. But for good or ill, the student-learning-assessment movement has begun to migrate from the undergraduate world into master’s and doctoral programs. (At some institutions, there is even talk of defining a set of “foundational outcomes” for all graduate students—that is, a set of learning goals that would be analogous to general-education goals for undergraduates.)

On Wednesday morning, as the annual meeting of the Council of Graduate Schools got under way in Washington, three graduate deans led a workshop on assessing graduate students’ learning and using such assessments to improve programs.

Formal assessment for improvement, they said, is more useful and less painful than many faculty members believe. (And in any case, accreditors are insisting on it.)

The three deans sat down for an interview after the workshop.

Q. In doctoral programs with intense mentor-apprentice relationships, the idea of establishing rubrics and other lists of learning outcomes might seem off-key. If I’m a senior professor of comparative literature and I’ve supervised 30 dissertations during my career, I probably know in my bones what successful learning in my program looks like. Why should I be asked to write out point-by-point lists of the skills and learning outcomes that my students should possess?

Charles Caramello, associate provost for academic affairs and dean of the graduate school at the University of Maryland: If you write out lists of learning outcomes, you’re making the invisible visible. That’s really my answer. We’ve all internalized these standards. They’re largely invisible to us. Assessment brings them out into visibility, and therefore gives them a history.

William R. Wiener, vice provost for research and dean of the graduate school at Marquette University, who is currently dean in residence at the Council of Graduate Schools: There’s no way to aggregate and to learn unless you’ve got some common instruments. By having common instruments, we can see patterns that we couldn’t see before.

James C. Wimbush, dean of the University Graduate School at Indiana University: Part of the story has to do with the external environment. Because of the decrease in funding for state institutions, because of political pressures from state legislators, we are forced to be much more accountable. Our boards of trustees now are looking for more accountability. They don’t necessarily say, “We want to make sure that you’re doing assessments of graduate programs.” But they’re questioning, Do we have too many graduate programs? We have to do a better job of being accountable for how we use our resources from the state and elsewhere. Assessment is one of the ways of doing that.

William Wiener: And not only at public institutions. My Board of Directors asks the same questions.

Charles Caramello: Faculty care about standards. They really care about excellence. They really care about evaluation, and they really care about peer review. To the extent that you can say, Look, assessment is a form of all of these things—it’s not alien to what you do every day. It’s another name for it, and a slightly different way of doing it. And the great advantage of it is that it gives you a way to aggregate information, and therefore to see patterns.

Q. What about graduate programs that are now being asked to do student-learning assessments for two accrediting bodies? An engineering program, for example, might now be expected to do student-learning evaluation both for the specialized engineering accreditor and for its university’s regional accreditor.

James Wimbush: Yes, that happens. The school of education, the school of business—they have very rigid accreditation standards from their associations. They tend to focus on meeting those particular criteria.

Charles Caramello: But those programs tend to come on board most quickly with student-learning assessment because for them this is familiar. One important thing that we try to do at Maryland is not ask these programs to do the same thing twice. If they’re already using an assessment model for their specialized accreditor, we don’t want to tell them that they have to create a second model. We’ll find a way to work with them.

William Wiener: But sometimes there are elements that are missing. The outside accreditors are concerned with their own standards. They’re not always so concerned with the mission of the university.

Q. Once a university has developed learning goals for its graduate programs and has been through several cycles of assessment, how public do you want to make that information?

William Wiener: I think it should be public. I think it will give our public confidence in what we’re doing. I think the universities are afraid. But I think that will change. Where a program is low, so be it. What’s important is, Do they improve over time? And if you don’t start with something, you’re not going to go to the next level.

Charles Caramello: Programs are wary, and with some reason. You can’t create a situation where a program is shamed. Publicly, the message to put forward is, This is what we’ve discovered, and this is what we’re doing to improve. That’s useful to students, it’s useful to prospective students, it’s useful to the faculty in the program, it’s useful to the dean and provost. And that’s a real form of accountability. That’s not numbers. It’s “We found this problem. We’re going to fix this problem.” And then you can look two or three or five years later and see. Has the problem been fixed?

About the Author
David Glenn
David Glenn joined The Chronicle of Higher Education in 2002. His work explored how faculty members are trained, encouraged, and evaluated as teachers; how college courses and curricula are developed; and the institutional incentives that sometimes discourage faculty members from investing their energy in teaching.