A Study to Measure Value of Community Colleges Falls Short

October 14, 2013

A recent report issued by the Nexus Research and Policy Center and the American Institutes for Research, "What's the Value of an Associate's Degree? The Return on Investment for Graduates and Taxpayers," tackles an important topic that is timely for both college students and policy makers. Unfortunately, the report misses the mark in providing information on which either group can take action.

In August, President Obama released a series of proposals on college affordability and accountability that include the idea of pegging eligibility for federal financial aid, which consists largely of Pell Grants and federal student loans, to the earnings of an institution's graduates. At face value, the Nexus/AIR report appears to provide valuable information about the earnings of graduates from 579 community colleges, which account for more than 80 percent of full-time-equivalent community-college enrollment, data that could be used to help address the president's goal. It is a noble attempt, but the report has serious flaws that prevent it from being useful for this purpose.

A major problem with the report is its reliance on earnings data collected by PayScale, a company that aggregates voluntary salary reports from individuals around the country. People visit the PayScale Web site and report information that includes the college they attended, their degree, and information about their current position—employer, job title, salary, bonus, benefits.

The PayScale site says it applies a "rigorous filtering process before it [self-reported data] can be added to our active database. Our patent-pending algorithm applies over 32 stringent rules to verify data validity." Even so, major problems remain with the PayScale salary data. Here are a few examples:

  • There is no way for the reader to tell if the data are representative of all graduates of any single college in the PayScale database. For example, the Web site reports that it has profiles from 4,675 individuals who reported that they attended Michigan State University, my institution. But this represents less than 1 percent of the living alumni of Michigan State. So when PayScale reports that the "typical starting salary after graduation" for graduates of Michigan State is $45,900, there is no way to tell just how representative that figure is for all MSU graduates. The site does not provide the breakdown by job category of all of the respondents, and although it gives the breakdown by major, over half of the people are listed as "bachelor's degree" without any indication of the field.

    In addition, we do not know how many of the 4,675 individuals reported their salary immediately upon graduation, so the $45,900 estimate may actually be based on a small subset.

  • There is no way to tell if people are reporting their salaries accurately. Do they round up to the nearest $5,000? $10,000? Do they inflate the figure in order to make their alma mater look a little better? Do graduates of different institutions differ in the way they report the data?

  • Starting salaries of the graduates of postsecondary programs, particularly for adult students who enroll in large numbers at many community colleges, are also affected by an individual's previous experience. A community college that enrolls associate-degree candidates who are older and have more work experience should, other things being equal, produce graduates with higher starting salaries.

  • A single median-earnings figure for a college has relatively little meaning for someone trying to decide which college to attend. Studies with more valid data on the earnings of college graduates demonstrate that the greatest variations in earnings are not among colleges, but within individual colleges, with the differences driven primarily by students' degree or certificate programs and occupational choices.

The bottom line is that there is no basis on which to judge the validity of the PayScale salary data, other than to take the company's word for it. And the company provides minimal information, at least publicly, about its methodology and the validity of its data. That is in contrast to traditional academic studies, in which the methodology is generally clearly articulated and readers can judge for themselves the quality of the study. And because the PayScale data are at the heart of the Nexus/AIR report, these problems carry over directly into the report.

For example, according to the study, the community college with the highest reported starting salary was Colorado Northwestern Community College, at $57,637. That is 39 percent above the average of the other 578 institutions in the study. Does this tell us that Colorado Northwestern is 39 percent "better" than the average community college in the nation? Not at all, because we know nothing about the validity of the earnings data from PayScale. The company's Web site shows that the earnings estimates for that college are based on the reports of just 21 individuals: four who reported annual salaries and 17 who reported hourly wages. It is impossible to tell how many of these 21 reported a starting salary. Do we want to pass judgment about that institution—or any other institution in the study—based on so few data points?

Even if the PayScale self-reports of salary information were reasonably accurate, there are still major problems with calculating a return on investment for individual community colleges.

The Nexus/AIR methodology compares the earnings of the graduates of each community college in its study with Census Bureau data on the earnings of individuals with only a high-school diploma. The problem is that this is an apples-and-oranges comparison. People who choose to attend a community college for an associate degree are not a random sample of high-school graduates. They most likely have characteristics that distinguish them from those who choose not to enroll in postsecondary education: an interest in furthering their education and in gaining more skills for the labor market, and very likely higher levels of academic ability and drive.

A failure to control for those characteristics, which are not measured in the report, can greatly bias the estimates of lifetime earnings—and the individual and public return on investment—calculated in the report. Most important, those biases will vary in their impact on the estimates from community college to community college.

To the authors' credit, the report's introduction notes the limitations of their study, stating that "the data needed to fully understand what, on average, a degree is worth to a graduate or what the tax benefit of that degree is to taxpayers, is not currently available."

But then the report goes on to do precisely what the authors say cannot be done—so precisely that they calculate exact dollar amounts (graduates of Colorado Northwestern Community College, for example, are expected to have an average net financial return of $850,903 over their lifetimes).

The precision of the numbers reported in this study, based on data of such unknown validity, is careless at best and grossly misleading at worst. And the notion that this study can be used to compare community colleges is preposterous.

Donald E. Heller is dean of the College of Education at Michigan State University. Parts of this essay were drawn from his blog there, which can be found at