Are students better thinkers and problem solvers when they finish college than they were when they started? And how can we tell? The Wall Street Journal brought some new evidence into that long-running debate this week with an analysis of individual colleges’ results from the Collegiate Learning Assessment examination, known as the CLA+.
The Council for Aid to Education, which created the test, releases national results but keeps the scores of individual institutions confidential. The Journal, using public-records requests, obtained the scores of dozens of public colleges that have given the test in recent years.
It will come as little surprise to readers of the 2011 book Academically Adrift: Limited Learning on College Campuses that the Journal’s analysis found that, over all, results on the test — designed to measure students’ gains in critical thinking between freshman and senior years — were “discouraging.”
“At more than half of schools,” the Journal found, “at least a third of seniors were unable to make a cohesive argument, assess the quality of evidence in a document, or interpret data in a table.”
Some experts question drawing broad conclusions from the CLA+ and similar tests, and there has been some movement away from using such instruments to assess student learning. A number of the colleges that are low performers in the Journal’s analysis told the newspaper they no longer use the test. In fact, Chronicle reporting found that at least a couple of the colleges that came out near the top no longer use it, either.
Others, however, think that something important can be gleaned from the CLA+. Among them is the University of New Mexico, which was near the top of colleges examined by the Journal on a “value-added” measure that seeks to account for pre-existing differences between students and differences in colleges’ student populations.
The Chronicle spoke with Greg Heileman, vice provost for teaching, learning, and innovation and a professor of electrical and computer engineering at the university, about how it uses the test. The following conversation has been edited and condensed.
Q. Tell me about how the university uses the CLA+.
A. We’ve had this push to try to understand the value-add that we provide to students. One regent in particular, General [Bradley C.] Hosmer, has been very insistent that we measure the value that we provide. He’s very data-analytically driven, and has really pushed us to not only assess our programs but assess how well our graduates do. So we’ve done this and other exams in order to make the case.
Q. You mentioned you use a variety of tools in this effort. What do you find most valuable about the CLA+ in particular?
A. What we like about the CLA+ is it doesn’t just measure recall of information; it actually measures critical-thinking and problem-solving skills and the ability to communicate effectively. That is very important to us because that is what is important for employers.
Q. Has the university changed anything in its educational approach or curriculum due to the CLA+ results?
A. We’re constantly updating and trying to improve our curriculum based on the feedback we obtain. If you look at the results from this exam — I think you have our 2014-15 data — that’s from students who started in 2010 or prior to that.
We’ve had a long history of trying to meet our students where they’re at when they come to the institution, to provide programs that help them get acclimated to college and to find initial success. And so we use this data — back then, and continue to use this data — to make continuous changes to the quality of our programs.
Q. Could you give me a concrete example of the kind of changes you’re talking about?
A. Sure: the developmental programs we have. We’re in a state that has quite a few students who come to us with additional needs, and so we’ve created an extensive set of programs in both English and mathematics to meet the students where they’re currently at, and get them ready to participate in college-level courses.
We have, for example, a math emporium that we created that quickly accelerates students up to a level of math that allows them to engage in mathematics content that’s specific to their discipline. And we’re doing the same thing in English; we have three different programs that we’ve implemented that are based upon where a student initially places in the institution.
Q. The CLA+ has been described by one observer as a “test worth teaching to.” Do you agree with that description?
A. In the sense that it attempts to measure critical thinking, which is what we want to develop in our students, I agree with that.
Q. In the last several years there’s been a swing away from using these kinds of standardized tests to assess student learning. I’m curious what you make of that, as it seems as if New Mexico is staying the course with using these kinds of assessments.
A. One of the things we want to be able to do is to compare ourselves to others. What’s most important to us is that we prepare our students, but we want also to be able to gauge how well we’re doing that. I think it’s important to have these national norms.
Q. The test now offers student-level results. Is that something your students are using?
A. I’m not sure about that. I read a few years ago that the CLA+ is now providing that data to students, and students can put that on their résumés. I don’t think that’s a selling point at the moment. Perhaps in the future it could be.
Q. It sounds as if you have a culture there of trying to gather data on how well the university is preparing students. Given that your CLA+ results look positive, I wonder if you try to use them as a selling point with prospective students or their parents.
A. Certainly, when there’s good news like this, we’ll use it to affirm our mission and what we’re doing. I wouldn’t say that we use this in particular. We just try to sell the entire quality and value of the institution.
Beckie Supiano writes about college affordability, the job market for new graduates, and professional schools, among other things. Follow her on Twitter @becksup, or drop her a line at beckie.supiano@chronicle.com.
Correction (6/8/2017, 5:49 p.m.): This article originally mischaracterized a “value-added” measure, following how it was described by the Journal. The measure is computed by the Council for Aid to Education, not the Journal, and for the years covered by the analysis, the council used ACT or SAT scores, not graduation rates, along with freshman scores, to calculate the scores seniors were expected to attain. The Chronicle asked the council about the measure before this article was published, but received clarification only after publication. The article has been updated to reflect this correction.