All year long, from every direction, surveys bombard students.
On a scale of 1 to 5, how satisfied are you with campus career services? Health-center programs? Library space? How often do you discuss course topics outside of class? How comfortable is the campus climate for diversity?
Pleas for students’ time and opinions have proliferated in the past decade, to as many as 10 per person each year, according to some estimates. Events like orientation almost always trigger follow-up surveys. How was that study-abroad info session? Your pre-major-advising appointment?
The accountability movement, of course, accelerates assessment: Colleges must prove themselves to accreditors and legislators, and, within campuses, departments contend for scarce resources. Nothing shows effectiveness like data, and nothing generates broad-based data as quickly and cheaply as an online student survey, which, given the array of tools now available, anybody can conduct.
But the megabytes of data that such surveys produce may not be reliable. That’s because students have come down with survey fatigue, the main symptom of which is nonresponse. Two decades ago, 70 percent of students would answer a survey, campus officials recall. Now, by some standards, a 20-percent response rate is decent. In this year’s National Survey of Student Engagement, more than a third of colleges had less than 30 percent of their students respond. Response rates on individual campuses were as high as 92 percent but as low as 4 percent.
Some researchers insist that students who fill out surveys are similar to those who don’t, so that to a certain extent, polls with low response rates can still be meaningful. But a few studies have found that women, for example, are more likely to take part; minority students, less likely.
The quality of responses, too, may be suffering. Databases show partial completers and straight-line respondents, students who scroll down a survey checking, say, only 3’s. Even reading and answering each question doesn’t necessarily represent much thinking.
“Students don’t respond fully, to the best of their ability, because they just want to finish,” says Eric Berlinberg, a senior and president of the student government at Colorado State University, a body that also polls students.
Oversurveying is counterproductive, says Patrick T. Terenzini, a professor emeritus of education and a senior scientist emeritus at the Center for the Study of Higher Education at Pennsylvania State University.
“This is a nontrivial threat to higher education and our ability to monitor what we’re doing,” he says. “Institutions are not getting the quality of information they need to make responsible, informed decisions.”
Still, surveys remain the most promising way to elicit that information. Without viable alternatives for gauging students’ experiences and satisfaction, some campuses are treating the fatigue. More-deliberate, creative approaches, researchers hope, will make surveys seem like less of a drag.
Frankness and Frozen Yogurt
A survey is a transaction. To get students’ attention, administrators must entice them.
Skeptical reactions make that a challenge. “Most surveys are a waste of time,” a sophomore at the University of North Carolina at Wilmington told the student newspaper. “My answers won’t actually make a difference,” said another.
To persuade students otherwise, some campus officials craft careful e-mail invitations. They describe succinctly a survey’s purpose and importance, preferably in a sans-serif font, which by some accounts can be read faster. An ideal request, according to researchers, comes from a well-regarded administrator, estimates the time a survey will take, guarantees the confidentiality of responses, sets a deadline about a month out, and encourages questions.
But even the best invitation improves responses only if students read it. More often it may get that fatal designation: spam.
“Those e-mails get deleted pretty fast,” says Don A. Dillman, a professor of sociology and deputy director for research and development of the Social and Economic Sciences Research Center at Washington State University.
He has conducted nine student-experience surveys at Washington State in the past decade. Do students change majors? Graduate on time? His experiments with methodology have yielded enviable results.
From random samples of undergraduates, a whopping 50 to 60 percent usually respond to the surveys. And students have plenty to say. One year the survey’s Web site crashed because it hadn’t allowed enough room for an open-ended question on advising. “They were writing almost essays,” Mr. Dillman says.
The strong response comes from contacting students via both postal mail and e-mail, he believes. An initial letter—with a dollar bill or two enclosed—signals the importance of a survey. That invitation refers students to a Web site; follow-up e-mails, with a link, then seem more convenient, he says.
E-mail requests, which are sent by most colleges, can’t carry cash. But last spring Harvey Mudd College attached a coupon for a free scoop of frozen yogurt to a survey on diversity; 40 percent of the students responded.
Colleges more commonly lure students with lotteries: Complete this survey for a chance to win! Some administrators fret over the expense—and the precedent—of such incentives, or argue that in the general population, they don’t significantly improve response rates. But more colleges are dangling prizes in front of the free-food crowd, especially for larger-scale projects. Colleges that administer the National Survey of Student Engagement have held drawings for participants to win gift cards, parking permits, or iPads. Last year 39 percent of the 761 participating institutions offered incentives.
Indiana University at Bloomington is trying its luck with lotteries. Its response rates to the national survey, nearly 40 percent in the year 2000, had dropped off to 24 percent by 2009.
Judith A. Ouimet, assistant vice provost for undergraduate education, went on a quest for students’ attention. She began promoting the survey more. In 2010 she gave each respondent a coupon for a free soda and entered them all in a lottery for 87 prizes, including iPods and lunch for four at the campus’s Tudor Room.
Thirty-five percent of students responded to the survey, and they didn’t seem to have barreled through faster than usual; overall answers were more or less the same. In a meta-survey—yes, a survey about surveys—Ms. Ouimet probed the motivations of respondents and nonrespondents. Several of those who didn’t respond to the original survey doubted that anybody had actually won the lottery.
So she upped the chances, offering about 1,300 prizes this past spring. She could no longer afford free sodas for all, but some respondents won iPads; $10 gift cards to Amazon, Starbucks, or Target; personal training sessions; and IU sweatshirts.
Response rates remained more or less steady, at 32 percent. Meanwhile, the prize drawings suffered a few glitches. The student who won wooden nickels redeemable at a local bagel shop turned out to work there; one who ended up with a pizza couldn’t eat it. Ms. Ouimet found other winners willing to swap. She asked for sizes from students who had won sweatshirts. “I want them to have a positive experience,” she says. “I want them to take another survey.”
When the big winners picked up their prizes, Ms. Ouimet asked for permission to photograph them for promotions of future surveys. Other winners never came. The personal training sessions, along with two-thirds of the smaller items, went unclaimed. Prizes are still stacked in her office.
‘We’ve Heard Your Voice’
An outlet for expression may motivate students more than an umbrella or a Kindle would. “My major incentive was to get my opinions heard,” one student responded to Indiana’s meta-survey. “I want to voice my opinion, but to know that some action will come of it,” said another. Among nonrespondents, many didn’t believe that the data would be used.
“The onus is on us,” Ms. Ouimet says, “to share with the students.” She would like to project survey findings, and the changes spurred by them, on the big screen in the campus cinema, before movies.
The University of North Carolina at Wilmington designed fliers for its “We’ve Heard Your Voice” campaign. In 62 versions, cartoon heads proclaim what survey respondents said and what the university did. Change No. 1: more internship listings. No. 37: more racquetball courts.
Wilmington featured the campaign in TV ads and a parent newsletter, and administrators talked about it to the student newspaper, The Seahawk. The resulting article promoted the sense of pride administrators were after. “Your response actually matters in more ways than you may think,” students were told. Those extra math tutors and cable TV channels? All you.
Amid an onslaught of surveys at the State University of New York at Albany, the “Your Voice” campaign tries to maintain a similar buzz. The administration had been sending out a dozen surveys a year until 2009, when it signed a contract with the assessment company StudentVoice. In the fall of 2010, Albany conducted 60 surveys, and it needed to keep students on board. For some surveys, response rates now rise as high as 35 to 40 percent, but for others, they hover below 20 percent.
Oregon State University has involved students in interpreting survey findings. Larry D. Roper, vice president for student affairs, took results from the National Survey of Student Engagement a few years ago and turned them over to two dozen freshmen. He asked the students, who had volunteered for a leadership program, to hold discussions in their dorms and report back with recommendations for the administration.
“It would not only generate data,” he remembers thinking, “but it would also more deeply invest them” in the survey enterprise.
Setting Limits
The case for the importance of any one survey breaks down in a barrage of them.
Yet many campuses proceed freely. “I hear people say, ‘Well, it’s just a student survey. Students are used to doing these things,’” says Mr. Dillman, of Washington State. He urges a more judicious approach: “We should view the opinions that students give us as a pretty precious resource.”
Some colleges have chosen to limit access to e-mail lists or have appointed a committee, like one at the University of Nevada at Reno, to coordinate surveys—managing costs as much as fatigue.
About 100 student-affairs administrators around the country are devoted full time to assessment, one researcher estimates. They watch the calendar, imposing embargoes around national surveys or final exams. At Indiana, Ms. Ouimet tries not to have more than one survey out at once. Timing can be crucial. Multiple sequential surveys tend to suppress response rates, researchers at Wesleyan University have found.
Committees or assessment directors also rein in other administrators eager to survey students. These gatekeepers may edit or cut questions. They often argue for contacting a representative sample of students rather than the entire undergraduate population.
But the impulse to poll remains strong, as surveys have become routine.
“People somehow think if they’re not collecting data, they’re not doing their jobs,” says Mr. Terenzini, of Penn State. He would like to see national guidelines to limit the indiscriminate interrogation of students. Annual surveys are almost never necessary, he says: Things don’t change that fast.
Often a recent survey has already asked similar questions, says Serge Herzog, director of institutional analysis at Reno. In those cases he points the survey-happy to the relevant data: “We are, in essence, trying to get as much mileage out of student responses by bothering them the least.”