Almost all graduates of Ferris State University find jobs, at least according to the statistics the university promotes to attract students.
“Ferris graduates have a 98-percent job placement rate,” the university, in Big Rapids, Mich., highlights in boldface on a Web page for recruiting international students. A general marketing page notes an overall placement rate of 97 percent, and in some disciplines, 100 percent.
Those rates are enviable. But, especially in this economy, are they even possible?
As it happens, the last year for which Ferris State reported a 98-percent job-placement rate was 2005-6, when fewer than half of its graduates responded to the university’s destination survey. Since then, the share of students responding to the survey has dropped, to 22 percent in 2009-10, the most recent year for which figures are available. The university reported a job-placement rate that year of 86 percent, although the older numbers remain online.
Like Ferris State, many colleges release placement rates based on scant information: More than a third of colleges’ reported rates in 2010 were based on responses from half of their graduates or fewer, according to the National Association of Colleges and Employers. That raises the question of whether the results are skewed by greater participation among happily employed graduates.
“This problem is endemic to many graduation surveys,” says Philip D. Gardner, director of the Collegiate Employment Research Institute at Michigan State University. “Another sample of 20 percent to 30 percent, which includes different respondents, could produce a different set of results.”
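The skew Mr. Gardner describes is easy to demonstrate with a small simulation. The sketch below is purely illustrative: the class size, true employment rate, and response probabilities are invented assumptions, not figures from any college's survey. It shows how a survey that employed graduates answer more readily than unemployed ones can report a placement rate well above the truth.

```python
import random

random.seed(0)

# Hypothetical graduating class; all parameters below are
# invented for illustration, not drawn from any real survey.
N = 1000                       # class size
TRUE_EMPLOYED_RATE = 0.70      # actual share of graduates with jobs
RESPOND_IF_EMPLOYED = 0.30     # employed graduates answer more often
RESPOND_IF_UNEMPLOYED = 0.10   # unemployed graduates tend to ignore the survey

employed = [random.random() < TRUE_EMPLOYED_RATE for _ in range(N)]
responded = [
    random.random() < (RESPOND_IF_EMPLOYED if e else RESPOND_IF_UNEMPLOYED)
    for e in employed
]

# The college only sees the respondents.
respondents = [e for e, r in zip(employed, responded) if r]
response_rate = len(respondents) / N
reported_rate = sum(respondents) / len(respondents)

print(f"response rate:             {response_rate:.0%}")
print(f"true employment rate:      {TRUE_EMPLOYED_RATE:.0%}")
print(f"reported 'placement' rate: {reported_rate:.0%}")
```

Under these assumed numbers, roughly a quarter of the class responds, and the rate computed from respondents lands in the high 80s even though only 70 percent of the class is actually employed, which is the pattern Mr. Gardner warns about.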
That’s not the only reason much job-placement data are unreliable—for prospective students comparing colleges or anybody else keeping tabs. For one, some colleges don’t collect such data at all. Some survey students immediately upon graduation, and others track employment success over several months. Some include recipients of associate and graduate degrees in their statistics, and others exclude them or report them separately. Few ask whether the jobs students land relate to their fields of study or career paths; many count any position at all, even an unpaid internship.
“The problem is that there are no standard questions or even agreed-upon standards,” says Mr. Gardner.
Some career and for-profit colleges, as well as law schools, have faced high-profile accusations of job-placement fraud, in the form of lawsuits and scrutiny from accreditors. Meanwhile, experts also question the reliability of some of the data that traditional undergraduate institutions release.
Since 2008, the federal government has required colleges to disclose any placement rates they calculate to prospective students who request the information. And institutions with certificate programs, predominantly in the for-profit sector, must report placement rates to the Education Department. State lawmakers may ask public colleges for the numbers, but their legislatures are mostly silent on how the data should be tracked, as are regional accreditors that oversee the vast majority of nonprofit four-year institutions. National accreditors are more prescriptive, but their standards vary.
At Ferris State, Kristen Salomonson, dean of enrollment services, acknowledges that response rates are low. So low, she says, that it’s “dangerous to draw conclusions” about the success of Ferris graduates.
As response rates have fallen, the university has moved away from marketing its job-placement numbers, she says. Still, their use on the Web site is fair because the full reports are also online, she says. Ferris State’s use of the data, she says, is probably not that different from what most colleges are doing.
Pursuing Graduates Online
Students and families who look at placement rates while choosing colleges may be confused. Colorado College, for instance, reports that 53 percent of the Class of 2012 is employed. Another small liberal-arts college, Colgate University, in upstate New York, reports that 72 percent of its Class of 2011, the most recent class for which data are available, had jobs.
At face value, Colgate’s graduates seem more successful. But the timing of the colleges’ surveys may explain the discrepancy. Colgate tracks students for six months after they graduate, but Colorado polls them during the rehearsal for graduation ceremonies, a time when many are still weighing their options.
“That most definitely has an effect,” says Gretchen Wardell, office coordinator at Colorado’s career center. “At grad practice each year, there are dozens of kids waiting to hear back from jobs.” The college used to try tracking students for six months after graduation, she says, but only 30 to 35 percent would respond.
Both Colorado and Colgate achieve high response rates—above 80 percent—the former by tapping a captive audience, the latter by pursuing graduates, one by one.
Staff members in Colgate’s career-services office start collecting placement-rate data six months after graduation by sending successive surveys to each graduate’s e-mail address. For those who do not respond, officials look on LinkedIn, Facebook, and other social-media sites to see if they can figure out what the graduates are doing. The staff members also ask professors, coaches, and others on campus who may know where students ended up. They stop only once they get information on 80 percent of graduates.
“It is painstaking and time-intensive, but that is how we get a decent response rate,” says Teresa Olsen, interim director of Colgate’s career center.
About a third of colleges had response rates above 75 percent in 2010, according to the National Association of Colleges and Employers. But roughly the same proportion posted rates of 50 percent or less.
It’s hard to know what an adequate response rate is for the surveys, says Mark Schneider, vice president at the American Institutes for Research: “Until you find out what that selection bias is, a good response rate is hard to gauge.” But especially when it drops below 50 percent, he thinks that mostly successful graduates are responding.
Underemployment Unknown
Even when job-placement surveys yield high response rates, they can be fuzzy on what counts as a job. Many colleges don’t ask graduates whether their jobs are related to their degrees or if they feel those jobs have career potential. Most colleges do not account for underemployment or know if a graduate is reporting an unpaid internship.
Kansas State University’s placement rate for the Class of 2011 was 92 percent, with 70 percent of graduates employed and 22 percent continuing their education.
But the data may include underemployment, says Kerri Day Keller, director of career and employment services at the university. She tries to make sure that all graduates listed as employed are in paid positions, but it is possible, she says, that some unpaid ones slip in.
Kansas State doesn’t track whether graduates have jobs related to their degrees because that can be subjective, says Ms. Keller. What about a history major who works for the Boy Scouts, she says, or an engineering major who moved to India to be an American-accent trainer?
One question colleges should always ask employed graduates in placement surveys is whether their jobs require a degree, says Andrew M. Sum, director of the Center for Labor Market Studies at Northeastern University.
Mr. Sum analyzed census data and found, for 2011, that 54 percent of college graduates under the age of 25 were either unemployed or employed in jobs that did not require a college degree. Those were the worst results he’d seen since 1995, he says. “This is bad for the country.”
A recent study by Rutgers University had similar results. Researchers interviewed 444 graduates from the Class of 2006 through the Class of 2011 and discovered that many were struggling to find full-time work. Only 51 percent were employed full time. Twenty percent were attending graduate or professional schools, 12 percent were working part time or were unemployed, and 6 percent were in the military or volunteering, according to the study. (It excluded graduates who were not looking for work.)
The researchers had decided to do the study in part because they thought colleges’ job-placement numbers were unreliable, says Carl E. Van Horn, director of the John J. Heldrich Center for Workforce Development at Rutgers.
Whether a job requires a degree is helpful information, but underemployment is still difficult to track, says Mr. Schneider, of the American Institutes for Research. Salaries are probably a better measure of student success, he says. But even when colleges’ job-placement surveys ask about salaries, some graduates do not give that information. Colgate University and Colorado College don’t track salaries at all. At Kansas State, employed graduates in most majors reported salary data at a response rate of 65 percent or higher. But only 47 percent of graduates in the university’s College of Arts and Sciences reported their salaries.
Mr. Schneider has worked for the State of Virginia on a public database, scheduled to be released in August, that will show the median salary for graduates of various programs and majors at all public and some private colleges. To develop the database, Mr. Schneider has collected figures from the state’s unemployment-insurance agency and combined them with the unique ID numbers most Virginia college students carry. Five other states are considering similar databases, he says.
Keeping Closer Track
Meanwhile, some colleges are pursuing innovative strategies to acquire—and disclose—the best data they can. Some are using the same LinkedIn and Facebook tactics that Colgate does. Others are asking students to include reliable e-mail addresses on their graduation applications, for easier follow-up.
St. Olaf College, in Northfield, Minn., has established perhaps the most comprehensive system to show where its graduates end up. The small liberal-arts college offers on its Web site a searchable database of what almost every member of the Class of 2011 is doing, minus names and other identifying information.
A search of theater majors, for instance, reveals that four have moved on to further education, one works part-time at Caribou Coffee, and another is an assistant director of the nonprofit group Fund for the Public Interest.
St. Olaf created the database to answer the question of what a liberal-arts education is worth, says Steve Blodgett, director of marketing and communications at the college. The data collection took much longer than compiling a simple job-placement rate had in previous years. Campus officials had to send e-mails and call graduates, look them up on LinkedIn and other sites, and ask faculty members where graduates had ended up. With 92 percent of the class reporting, the college found that 70 percent were employed and 28 percent were enrolled in further education.
As consumers and government officials demand information on students’ outcomes, more colleges may take on similar projects, says Mr. Van Horn, of Rutgers. Still, he points out a limitation: Most colleges get students’ responses at one point in time, without following up. So even St. Olaf’s robust database may not tell the entire story.
For instance, one 2011 graduate with a degree in psychology is listed as a clinical research assistant at the Mayo Clinic, in Rochester, Minn., which sounds pretty impressive. Ashley Enke is almost certain that graduate is her. She started at Mayo in July 2011, as an unpaid research assistant. In March 2012, the clinic hired her part time, at $14.50 an hour. But her last day there was July 6.
“In what the database shows, that does not come across,” Ms. Enke says of her listing on St. Olaf’s site. “It’s not a stable position.”
Last week, Ms. Enke moved out of her apartment in Rochester and in with her parents, in Omaha, where she plans to focus on prerequisite courses for medical school at the University of Nebraska. Her work at the Mayo Clinic helped her realize she wanted to go into medicine, she says, rather than pursue a Ph.D. in psychology. While studying, she may pick up some shifts at her mother’s coffee shop now and then.
On a national level, the Obama administration has proposed a new scorecard to compare colleges’ costs and their graduates’ earning potential. But recent experience suggests there may be snags in how earnings data are collected.
Last December, the Education Department set up a review panel under its Integrated Postsecondary Education Data System to create a way of tracking job placement for graduates of certificate programs and for-profit colleges. The panel got stuck: its members said they could not make any suggestions without further study of the data’s limitations.
Lisa Severy, president-elect of the National Career Development Association, says she is glad to see that some colleges have become more rigorous about their job-placement data. But Ms. Severy, who is also director of career services at the University of Colorado at Boulder, is concerned that without standards set by accreditors or lawmakers, the data will never be meaningful.
“Until we can come to some consensus on what’s being collected and how,” she says, “progress on this issue will continue to be haphazard.”