Colleges seek their own measures of quality, but will institutions make the data public?
It’s an annual rite for college leaders to bash the statistical rankings of institutions compiled each year by publications like U.S. News & World Report. The leaders frequently insist that it’s the formulas the rankings are based on -- not the idea of assessment in general -- that they dislike.
Soon, they may have a chance to stand by their words.
A new national survey, developed by several heavyweights in the field of higher-education assessment, seeks to measure the extent to which colleges use “good practices” that encourage learning. For example, the survey asks students how many times they were required to make a class presentation, and how often they had conversations with professors outside of class. It also asks students how many times they wrote papers that were at least 20 pages long, and whether they would attend the same institution a second time.
The creators of the National Survey of Student Engagement plan to study undergraduates at 750 colleges over the next three years and use the data to establish national benchmarks for different types of institutions. They also hope that colleges will consent to make the information public, so that it can be used by prospective students and their parents, and, possibly, the statisticians at U.S. News.
“Unless we develop measures of quality where colleges can actually provide evidence of their contribution to student learning, then this whole system [of ranking colleges] turns on resources and reputation, and reinforces the elitism of higher education,” says Russell Edgerton, director of the Education Program at the Pew Charitable Trusts. “We don’t want to be heavy-handed about pressing the down-the-road uses for the survey findings, but I would hope that it would be in the enlightened self-interest of institutions” to make the findings public someday.
Colleges that decline to participate in the survey -- or that refuse to make their results public -- may have a tough time explaining their actions, according to Alvin P. Sanoff, a higher-education consultant and former managing editor of “America’s Best Colleges,” the U.S. News guide.
“This poses a particular challenge to those who are most critical of U.S. News,” he says. “Here is, in effect, something that the critics in one way or another have asked for.”
The inspiration for the engagement survey came in February 1998, when a group of higher-education leaders met at Pew’s Philadelphia headquarters and discussed new methods of assessment that could serve as alternatives to the U.S. News rankings. Mr. Edgerton, a former president of the American Association for Higher Education, asked the National Center for Higher Education Management Systems, in Boulder, Colo., to design the survey. The design team included Peter Ewell, a senior associate at the center, and several other well-known higher-education researchers, including Alexander W. Astin, Arthur W. Chickering, and John Gardner.
George D. Kuh, a higher-education professor at Indiana University at Bloomington, is directing the survey, which will be administered to second-semester freshmen and second-semester seniors. The design team chose to survey freshmen because they are at the greatest risk of dropping out, and seniors because they should be the best judges, among students, of institutional quality.
Twelve institutions participated in a “pilot” study last spring. Mr. Kuh declines to release the findings from that study, saying “we made a deal with the schools” not to do so.
Another pilot study, involving 65 institutions, is under way this fall. In December, Mr. Edgerton will ask Pew’s Board of Directors for $3.2-million to defray the cost of surveying 250 colleges annually over the next three years.
Only four-year institutions are eligible for the survey. Mr. Edgerton says he would like to survey community-college students, but that such a survey should include different questions.
The engagement survey isn’t particularly original: Many of the questions are derived from existing student surveys, such as the College Student Experiences Questionnaire, which Mr. Kuh also administers. But Mr. Kuh insists that the data produced by the engagement survey will be of higher quality than the findings of that study and other well-known surveys, including the poll of freshmen conducted each fall by Mr. Astin, of the University of California at Los Angeles.
The latter surveys are administered directly by each institution, raising the possibility that a college could try to shape the pool of respondents in a way that would result in flattering findings about the institution. Colleges that participate in the engagement survey will be required to submit a list of 500 to 1,000 students who fit parameters set by Mr. Kuh. He will then send the survey directly to the students, who will return it directly to him.
“We want to be careful that every school is on the same playing field,” Mr. Kuh says. “We’re trying to remove any questions of institutional bias.”
To encourage colleges to sign up, the creators of the engagement survey have -- at least for now -- agreed not to make public any of the findings about individual institutions. But they expect information will be shared within consortiums, such as the American Association of State Colleges and Universities, and the moderately selective private colleges that make up the Annapolis Group, such as Wittenberg University and the University of Puget Sound.
What’s more, accrediting organizations, such as the Western Association of Schools and Colleges, say they may encourage colleges to use findings from the engagement survey in “self studies,” which are often part of the accreditation process.
“Russ is trying to create a marketplace for information about valid learning-related indicators, not just reputational and resource indicators,” says Ralph Wolff, executive director of the Western Association’s Accrediting Commission for Senior Colleges and Universities. “I think that’s an important direction for us.”
Mr. Ewell, of the center on higher-education management, says he believes participants in the engagement survey will eventually be willing to share their findings with the public. “We’re sensitizing people to the idea that this doesn’t hurt as bad as they think it will.”
The survey results might contradict the findings of the national rankings, which tend to favor big-name institutions over the less well known. For example, over the past three years, the University of North Carolina system has been conducting its own surveys of students, and publishing the results on a campus-by-campus basis. As one might expect, the flagship campus in Chapel Hill scored consistently well across a range of categories in a 1998 survey of sophomores. But a greater percentage of the students at the system’s Asheville and Greensboro campuses (92 and 91 per cent, respectively, versus 87 per cent at Chapel Hill) described the quality of instruction at their institutions as good or excellent.
“There is no campus that comes across as a hands-down winner,” says Gary T. Barnes, the system’s vice-president for program assessment and public service.
Korey Reynolds, a junior at Indiana University who completed the engagement survey last spring, says the findings -- if they were made public -- would be useful for students and parents who want to get beyond campus stereotypes. A survey of seniors at Indiana may show that professors are more accessible than the conventional wisdom holds, Ms. Reynolds says.
“Some kids end up going to a small school because they think they won’t have any contact with professors,” she says. “The availability is there -- you just need to make the effort. If those kids knew that they could get contact with individual professors, this might be a place that they would have considered.”
The colleges that already do well in the U.S. News rankings have the least to gain from full disclosure. Williams College, one of the institutions in last spring’s pilot, ranks third in the magazine’s latest list of top liberal-arts colleges, well above Grinnell and Earlham Colleges, among others. But what if the engagement survey were to find that students at Grinnell and Earlham are more satisfied with their college experiences than are students at Williams?
Rick Myers, assistant provost and director of institutional research at Williams, says he’s confident that the college would do well over all. He adds that the pluses of sharing information with students outweigh any disadvantages.
“It’s important that we put out a clear and thorough profile of what it’s like to be a Williams student,” Mr. Myers says. “There will be some measures in which individual colleges look much better than they do in U.S. News & World Report. But you want to look across a number of different measures.”
Mr. Myers says the survey’s results should not be used in compiling the U.S. News guide. He believes the data would be hard to distill into a rankings formula.
But Martha M. Garland, vice-provost and dean of undergraduate studies at Ohio State University, which is participating in the fall pilot, says she’d love to see the magazine find a way to incorporate results from the engagement survey. “The better the inputs, the better the information that they’re going to be able to put out,” she says.
Robert Morse, director of research for “America’s Best Colleges,” says that U.S. News already takes into account some factors related to teaching effectiveness, such as class size and retention rates. He says he’s eager to see the results of the engagement survey, but has doubts about whether the information will be of much use. For one thing, he wonders, will only those institutions that receive “positive feedback” from their students be willing to release their data?
The creators of the survey, he says, “still need to work out a lot of specifics before U.S. News could comment in any concrete way about the potential use of the information.”
GETTING SATISFACTION: QUESTIONS FROM A NATIONAL SURVEY
The following sample questions are taken from the National Survey of Student Engagement, which seeks to measure learning experiences and student satisfaction at individual colleges. Students are asked to answer most questions by choosing from a range -- such as “poor” to “excellent,” or “very little” to “very much.”
In your overall experience at this institution so far, about how often have you done each of the following?
- Worked with classmates outside of class to prepare class assignments.
- Talked about career plans with a faculty member or adviser.
- Worked with a faculty member on a research project.
- Had serious discussions with students of a different race or ethnicity than your own.
About how much of your course work up to now emphasized the following mental activities?
- Memorizing facts, ideas, or methods from your courses and readings so you can repeat them in pretty much the same form.
- Synthesizing and organizing ideas, information, or experiences into new, more-complex interpretations and relationships.
- Applying theories or concepts to practical problems or in new situations.
Thinking about your overall experience at this institution so far, to what extent does your college emphasize each of the following?
- Spending significant amounts of time studying and on academic work.
- Providing you with the support you need to help you succeed in your academic work and meet your personal goals.
- Helping you cope with your non-academic responsibilities (work, family, etc.).
How would you evaluate your entire educational experience at this institution?
If you could start over again, would you go to the same institution you are now attending?
http://chronicle.com Section: Students Page: A65