An Insider’s Take on Assessment: It May Be Worse Than You Thought

By Erik Gilbert
January 12, 2018

Katherine Streeter for The Chronicle

No doubt many of you will spend part of the month of January looking over assessment material from the fall semester. Equipped with some pre- and post-tests, a couple of artifacts, a rubric, a curriculum map, and, perhaps, a little bourbon, you will study your data carefully, make a few quick inferences, and then identify a minor problem that you can address by making equally minor changes to your course or program.

However, you may find that upon close examination the data don’t seem to be saying anything at all to you. You may even be tempted to just make something up. If you do go that route, it’s probably because you have concluded that assessment data do not tell you anything useful about your program, so there is no harm in fudging your analysis of the data.

If that was you, don’t worry. It turns out that the assessment program your college imposed on you was probably never going to improve anything. A new article by an assessment insider explains why this is so and suggests that assessors have known for some time now that assessment does not work.

The article, in Intersection, the journal of the Association for the Assessment of Learning in Higher Education, is by David Eubanks, assistant vice president for assessment and institutional effectiveness at Furman University and a board member of the association. In it, Eubanks details the methodological flaws inherent in assessment and argues that, because of the broad scale on which assessment is done, few of the methods employed in social-science research are used.

In Eubanks’s words: “The whole assessment process would fall apart if we had to test for reliability and validity and carefully model interactions before making conclusions about cause and effect.” But not doing so, and then applying “common sense” analysis to the dubious data, “is akin to a Rorschach test.”

This does not mean that it is impossible to do meaningful research on what works in the classroom. He shows in a case study how looking at the success rates of students in foreign-language classes can produce either meaningless conclusions or interesting and nuanced ones, depending on how the data are gathered, contextualized, and analyzed. The bad news (for assessment) is that getting to those meaningful conclusions requires a specific type of expertise and far more time and effort than are available to any assessment program.

In addition to its critique of assessment, the article contains a couple of other noteworthy points. First, it is clear that people in the assessment world have known for some time that their work was not producing results. It is also apparent that most of them assumed that the failure was not their fault. It was your fault.

After describing a 2011 article by Trudy Banta and Charles Blaich suggesting that there was little evidence that assessment led to improvements in student learning, Eubanks says:

There are two possible conclusions. One is that the faculty are generating good data, but are not using it. This way of thinking extends the diminishment of faculty expertise that began with telling them that grades do not measure learning: we’ve replaced grades with something better — assessments that do measure learning — but they still are not producing the intended results. Therefore (the argument goes) we just need to work more on our processes, so that when the faculty finally do fully adopt these changes a fountain of good educational research will spring forth.
There is another possible conclusion from the Banta & Blaich article, one that is confirmed by my decade of experience: it is not that the faculty are not trying, but the data and methods in general use are very poor at measuring learning.

If you look at a typical assessment conference program, you will see that there is an astounding amount of time devoted to dealing with reluctant faculty and doubters. So I don’t think he is misrepresenting the extent to which assessment researchers have blamed the failings of their ideas on faculty members.

It is also noteworthy that Eubanks acknowledges that assessment inherently discounts faculty expertise. (I would add that it discounts disciplinary expertise, too.) He also seems to be opening the door to a challenge to what is perhaps the single most implausible idea associated with assessment: that grades given by people with disciplinary knowledge and training don’t tell us about student learning, but measurements developed by assessors, who lack specific disciplinary knowledge, do.

Because it’s fairly obvious that assessment has not caused (and probably will not cause) positive changes in student learning, and because it’s clear that this has been an open secret for a while, one wonders why academic administrators have been so acquiescent about assessment for so long.

Here’s why: It’s no accident that the rise of learning-outcomes assessment has coincided with a significant expansion in the use of adjunct faculty, the growth of dual enrollment, and the spread of online education. Each of these allows administrators to deliver educational product to their customers with little or no involvement from the traditional faculty. If they are challenged on the quality of these programs, they can always point out that assessment results indicate that the customers are learning just as much as the students in traditional courses.

But if assessment is little more than a Rorschach test, that argument loses whatever force it had. That harried adjuncts, high-school teachers, and online robo-courses get the same assessment results as traditional courses probably tells us more about the nature of assessment than anything else.

As Upton Sinclair said, “It is difficult to get a man to understand something when his salary depends upon his not understanding it.” David Eubanks has taken a courageous position. It would be nice to see a few university administrators take similar risks.

Erik Gilbert is a professor of history at Arkansas State University. He blogs at badassessment.org.

A version of this article appeared in the February 2, 2018, issue.