The Florida Legislature has wrapped up an ugly budget battle, and Florida State University will receive $16.7 million in new money from a pool set aside for public institutions. That’s a small fraction of the university’s overall operating budget. But Florida State saw its state appropriation drop nearly $150 million from the 2006 to the 2012 academic years, so any new money is huge.
The university nearly missed out on that windfall, though, because it was awarded under Florida’s performance-based funding model.
That model is in vogue nationwide: Some 30 states now distribute at least a portion of their higher-education money based on achievement measures, and President Obama called for broader use of the strategy as part of his plan for free community college.
The jury is still out on whether performance-based funding is effective. But as Florida, an early adopter of the practice, demonstrates, it can certainly be fickle.
If Florida State had scored a single point lower on any of the State University System of Florida Board of Governors’ 10 performance metrics — each of which is measured on a five-point scale — the university would have ranked as one of the system’s bottom three performers. Finish in the bottom three, and you’re not eligible to take home any money from the performance-funding pool.
One area that almost cost the university dearly: the first-year earnings of its graduates who find employment in Florida. The median 2012-13 Florida State graduate earned $31,600, giving the university four out of five points on that measure in the 2015 performance-funding model. Had the median graduate earned just $400 less, the university would have dropped a point and lost out on the new funds.
That’s not much of a difference, and the data are somewhat volatile from year to year. All 11 institutions in the system saw their first-year earnings figure swing by at least $500 from 2014 to 2015, and at nine of them the change exceeded $1,000.
“It’s possible that one point can make a big difference, and it can all hinge on a few students if you’re just below the next threshold,” said Jason Jones, director of institutional research for the Board of Governors.
Missing that threshold would have cost the university far more than $16.7 million, since Florida’s performance-based awards can recur annually after the initial one is paid out. Losing out on funds for a single year can thus amount to a hit of more than $100 million over the course of a decade.
Changing Behavior
For years higher-education groups have argued against tying institutional funding to graduates’ earnings data. In the Sunshine State, one of the pioneers of matching earnings data with graduates’ transcripts, that fear is becoming a reality, or at least drawing closer.
Since the creation of Florida’s performance-based funding model, in 2013, universities have watched it determine a bigger chunk of their overall budgets. For the coming academic year, the Florida Legislature allotted $2.1 billion to the state’s 11 public institutions; $400 million of that money was awarded through performance-based funding. (The number was $200 million last year.)
As more money is put up for grabs, said David Tandberg, an assistant professor of higher education at Florida State, Florida’s universities may emphasize programs that lead to greater first-year earnings. Often those are science, technology, engineering, and mathematics fields.
The University of Florida, where 55 percent of graduates receive STEM degrees, may have an earnings edge over Florida State, where that number is just 38 percent.
That’s not the only reason some universities are nervous about the earnings metric. A few campus officials cited other ways in which it might not provide an accurate picture of what graduates are making.
One concern institutions cite is that the metric excludes earnings for graduates who leave the state, along with any others who cannot be located. Those missing graduates can add up: Florida State’s data, for instance, include only 3,000 graduates, or 40 percent of its 2012-13 class.
The metric also captures a disproportionate share of certain majors. For example, 65 percent of education graduates stay in Florida, while only 22 percent of biology graduates remain in the state. Because education majors are among the lowest-paid degree recipients, they may skew the earnings figures downward.
Joseph Glover, provost at the University of Florida, said many graduates of his institution go on to make higher wages in places like New York City or Los Angeles. Those who stay in state are more likely to stay in North Florida or Gainesville, which have much lower costs of living than, say, Miami. Yet the university competes against institutions based in higher-paying regions, like South Florida.
Other university officials take issue with the time frame that’s being measured.
“Although first-year salaries for graduates may not start at as high a level as for those in STEM fields, many of our arts and humanities graduates succeed as entrepreneurial artists and arts managers, for example, or pursue graduate and professional degrees, all of which result in higher midcareer salaries,” said Sally E. McRorie, interim provost at Florida State.
The Florida Board of Governors has cited research defending the value of first-year earnings data. But on some issues, like the cost-of-living differences, the state has held off on making adjustments.
Programs on the Ropes?
In 2003 the board started a review process that signs off on the creation of new programs and periodically evaluates whether existing majors should be continued. Earnings outcomes are among the factors considered, though they rank low on the list, behind components like graduation rates, enrollment, and whether a degree program serves a general-education requirement.
The review process has led to the elimination of a few programs, but it’s unclear what, if any, effect the wage data had on those decisions.
On the other hand, the Board of Governors can cite examples of programs created as a result of the data. In 2012 a doctor-of-physical-therapy program offered jointly by the Universities of South Florida and West Florida was approved, after a labor-market analysis determined that the Pensacola area needed additional physical therapists.
Is it worth using imperfect statistics to help make difficult funding decisions? “It’s absolutely not premature,” said Mark S. Schneider, president of College Measures, a research partnership that investigates higher-education outcomes. “We will only get better at measuring outcomes, but it only gets better if stakes are involved.”
Some researchers are less enthusiastic. “Performance-based funding makes a lot of intuitive sense, it makes sense on the face, it makes political sense, but where it starts to become murky is with some of these metrics,” said Mr. Tandberg, of Florida State. “My feeling is the data are just not there yet.”
Lance Lambert writes about data and trends in higher education. You can follow him on Twitter @NewsLambert, or write to him at lance.lambert@chronicle.com.
Update (6/17/2015, 5:46 p.m.): Because of incorrect information provided by a source, the original version of this article stated erroneously that the University of North Florida had dropped plans for a doctoral program after a labor-market analysis. That passage has been removed.