The United States has embarked on an effort to hold its institutions of higher education more accountable for the success of their students both while they are enrolled and after they graduate. We have made substantial progress on the first front, but efforts to measure the success of students in the labor market after graduation and make that information public have been far less successful.
The U.S. Department of Education has been narrowly focused on its regulatory authority to ensure that career and technical programs lead to the "gainful employment" of their students. As a result, it has neglected an equally crucial role—getting the information it has collected into the hands of those students and their families in an understandable format that is independent of any regulatory function.
While the department has made some moves toward making its gainful-employment data more consumer friendly, it has missed opportunities to be more useful to students. For example, in its June 2012 release of the data, the department's downloadable spreadsheet had a column labeled "debt-to-earnings annual rate denominator." In fact, this is the average earnings of graduates from thousands of programs throughout the nation, data containing valuable information that is not adequately conveyed by ratios.
Earnings reported in dollars, not ratios, are of great importance to students thinking about programs of study. Simple dollar figures should factor into decisions about how much students should reasonably pay for training and how much they should borrow to pay for it.
But while the federal government is focused on regulation, states can fill an equally vital educational function, informing students and their families of the success of graduates from a far wider range of programs, not just the career and occupationally oriented programs to which the Department of Education is limited.
About half the states can now link data that document each student's experiences (e.g., major field of study) to unemployment-insurance records that track post-college earnings and field of employment. This lets us compare the return on the investment that students and taxpayers have made in, say, a student who earned a bachelor's degree in sociology with the return on a similar student who earned a bachelor's degree in English literature on the same campus.
Perhaps even more important, these linked data let us measure the returns to students with the same credentials from different campuses. So students and policy makers can see how graduates with, say, bachelor's degrees in materials science from one college fare compared with graduates holding the same degree from another college.
Of course, higher education is about many other things besides labor-market success. For most students, their families, and state policy makers, however, it is the ultimate economic-development strategy. So they all need to know how students fare after they graduate.
Last month College Measures, a partnership between the American Institutes for Research and Matrix Knowledge Group, released data documenting the first-year earnings of graduates from programs across all of the public institutions in Tennessee. We showed not only how much variation there is in the earning power of graduates from different fields of study but also how much variation there can be in the earnings of graduates from the same field of study across institutions.
We found, for example, a difference of nearly $15,000 in the first-year earnings of bachelor's-degree holders in health professions between the University of Memphis and the University of Tennessee. A smaller gap, but still around $7,000 in first-year earnings, separates graduates of the University of Tennessee in multi/interdisciplinary studies from graduates of East Tennessee State.
And while Tennessee State graduates in health professions lagged behind those from other campuses, its graduates in multi/interdisciplinary studies were, on average, the highest paid in the state among students with that major. Those findings reinforce the need for information about specific programs—success is often not uniform across programs or across institutions.
We are at the beginning of a long process of turning administrative data into information that students and their families can use before investing time and money in pursuit of a degree that may or may not have market value. While there will be bumps along the road, and while the federal government is tied up by its focus on regulation, I expect that more and more states will fill the void by disseminating the information their students need about the economic success of graduates from programs of study across the state. That is something we should all encourage.