Despite growing pressure from policy makers and prospective students for colleges to prove their value, the institutions have often insisted that their unique missions make simple measurements forbiddingly difficult.
Now they have documented proof.
After three years of studying ideas for measuring institutional quality, an expert panel assembled by the National Research Council delivered a 192-page report on Thursday that indicates just how hard it is to do that.
"While productivity measurement in many service sectors is fraught with conceptual and data difficulties," the 15-member panel said in its summary, "nowhere are the challenges—such as accounting for input differences, wide quality variation of outputs, and opaque or regulated pricing—more imposing than for higher education."
The panel, led by Teresa A. Sullivan, president of the University of Virginia, nevertheless identified some starting points. Its 15 recommendations begin with the notion that the productivity of higher education should be regarded as a ratio of outputs, such as degrees completed and credit hours passed, to inputs, including both labor contributions and non-labor ones, like buildings and grounds, materials, and supplies.
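In rough terms, the report's core idea can be sketched as a toy calculation, with adjusted outputs divided by inputs. All figures and the degree-completion weighting below are hypothetical illustrations, not numbers or methods from the report, whose actual quality and cost adjustments are far more involved:

```python
def productivity_ratio(credit_hours, degrees, degree_bonus_hours,
                       labor_cost, nonlabor_cost):
    """Toy productivity measure: outputs (credit hours passed, plus a
    bonus of credit-hour equivalents for each completed degree) divided
    by inputs (labor and non-labor spending, in dollars)."""
    outputs = credit_hours + degrees * degree_bonus_hours
    inputs = labor_cost + nonlabor_cost
    return outputs / inputs

# Hypothetical college: 300,000 credit hours passed, 2,500 degrees
# awarded, a bonus of 30 credit-hour equivalents per degree, and
# $60 million in combined labor and non-labor costs.
ratio = productivity_ratio(300_000, 2_500, 30, 45_000_000, 15_000_000)
print(ratio)  # adjusted credit-hour equivalents per dollar spent
```

Even this toy version shows why the panel called the measure rough: the choice of degree weighting and what counts as an input can swing the ratio substantially from one institution to the next.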
But the panel acknowledged that those are rough measures, and that their imprecision reflects how little colleges record about the quantity and quality of their work. It recommended a series of efforts to improve data collection, covering such variables as fields of study, hours spent on instruction, and job placement.
It's good advice, but not something that can be accomplished quickly, said Terry W. Hartle, senior vice president of government and public affairs at the American Council on Education.
"This report shows in great detail how difficult it will be to put sensible productivity measures in place quickly or easily," Mr. Hartle said.
Obstacles and Shortcomings
Besides Ms. Sullivan, the panel consisted largely of university administrators, economists, and policy analysts and consultants in higher education. It was assembled by the Lumina Foundation for Education and the National Research Council's Committee on National Statistics, which in February 2009 proposed a study to identify the complexities of measuring productivity and accountability in higher education. Lumina paid the $900,000 cost of the project.
Colleges are facing mounting demands, from both ends of the political spectrum, to hold down costs and demonstrate their value to students. Under the Bush administration, the education secretary, Margaret Spellings, formed a Commission on the Future of Higher Education that blasted colleges over their records on access, affordability, and accountability. President Obama outlined a plan for improving college affordability in this year's State of the Union address, though his administration has been slow to offer details of how to accomplish it.
Key obstacles include the difficulty of making meaningful comparisons across institutions in a system in which diversity of mission has long been a fundamental strength. And measuring some of the most important outcomes—like those involving student success in the job market—has been hindered by the complexity of tracking millions of students decades after their college years and overcoming the associated privacy concerns.
The National Research Council panel may have provided more detail on the dimensions of those obstacles, but it wasn't immediately clear that it provided any new path through the maze of variables.
One major shortcoming, said Grover J. Whitehurst, director of the Brown Center on Education Policy at the Brookings Institution, is the panel's emphasis on avoiding institution-by-institution comparisons. The panel apparently shied away from such direct comparisons out of concern that emphasizing simple factors, such as graduation rates, might lead colleges to artificially lower their graduation standards.
But that's an argument for finding better institution-by-institution measures, not avoiding them altogether, said Mr. Whitehurst, a former director of the Institute of Education Sciences at the U.S. Department of Education.
Clicking 42 Times
Among its recommendations, the National Research Council panel suggested that the Education Department's National Center for Education Statistics revive a plan for creating a national unit record database, which would allow the government to follow students into their careers and compile detailed reports on which colleges ultimately produce the most successful graduates.
The idea of a unit record system has been pursued by both the Obama administration and the Bush administration before it. Both administrations have recognized it as critical to gaining a true bottom-line measure of the value provided by higher education. But neither has been able to overcome strenuous objections in Congress fueled by university lobbyists, most notably those representing private institutions.
The federal government is now working with some states to get around those privacy objections. At least three states are matching job data with college-graduate data, using Social Security numbers as the means of tracking, and the Census Bureau is considering ways it might assist those efforts.
In one of those states, Washington, data studied by Mr. Whitehurst involving two community colleges near Seattle showed one performing much better than the other as measured by graduation rates, job placement, and salaries for students in nursing programs.
Those are the kinds of comparisons that policy makers and prospective students would find highly valuable, Mr. Whitehurst said. But the few institutions that have such information don't make it easily accessible on public Web sites, he said. "You've got to click 42 times to get what you want," Mr. Whitehurst said.
The National Research Council panel shows little outward concern for improving that situation, gearing most of its recommendations toward data-gathering systems that would be used almost exclusively by professional researchers, he said.
It's also unlikely that colleges, already feeling overwhelmed by the volume of data required by federal officials, would welcome the additional demands suggested by the National Research Council panel, said Mark S. Schneider, vice president of the American Institutes for Research.
"You're talking about hundreds and hundreds of more pieces of data," said Mr. Schneider, who ran the Education Department's data-collection systems as commissioner of the National Center for Education Statistics. And without a unit record system, it's questionable how much additional value would be produced by all that additional data, Mr. Schneider said.
Mr. Schneider said he agrees with his former colleague, Mr. Whitehurst, that data efforts should be driven primarily by the need to help students weigh their options. The nonprofit American Institutes for Research, working with Lumina, is nearing an agreement with six states that will make public program-by-program and institution-by-institution records of the starting wages of their students.
The strategies outlined by the National Research Council panel also don't appear designed to reflect the changing nature of higher education as students increasingly seek alternatives such as online courses and single-subject certifications, Mr. Schneider said. The report, he said, appears "too backward-looking."
Still, Mr. Hartle said, the panel provided "a terrific service," giving the Education Department a series of new steps for data collection and analysis that it could begin to consider.
"There's no doubt that this will advance the conversation," he said. "But it makes clear this is going to be a pretty long chitchat."
Correction (5/25/2012, 12:03 p.m.): This article originally stated incorrectly that the Census Bureau was sharing data, using Social Security numbers as the means of tracking, to help at least three states match job data with college-graduate data. The Census Bureau is not sharing such data, but at least three states are using their own employment data derived from Social Security numbers to track the success of college graduates, and the bureau is considering ways to help those efforts. The article has been updated to reflect this correction.