It is a deeply flawed measure of college performance, but it is also one of the best we have.
Today The Chronicle published an analysis of recent changes in the six-year graduation rates at nearly 1,400 colleges. At most institutions, the rate ticked up at least modestly between 2003 and 2008. But at 35 percent of the colleges in the data set, the rate declined, in some cases steeply.
In other words, despite all the attention thrown at graduation rates during the last 15 years, many colleges’ numbers remain stagnant or worse.
But what exactly is a six-year graduation rate? Here are a few basics. (If you already live and breathe this stuff, this post isn’t for you. Go watch this instead.)
Q. What do these numbers represent?
A. In 1990, Congress passed the Student Right-to-Know Act, which requires colleges to disclose information on graduation rates and serious crimes.
In particular, the law requires colleges to report the proportion of students “completing their program within 150 percent of the normal time to completion.” For four-year colleges, that means the proportion of students who earn bachelor’s degrees within six years. In 1997 the federal government began to systematically collect those numbers through its Integrated Postsecondary Education Data System, commonly known as IPEDS.
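The "150 percent of normal time" rule is simple arithmetic, and a tiny sketch may make it concrete. (The helper name below is mine, not anything from the law or from IPEDS.)

```python
# The Student Right-to-Know Act's reporting window: 150 percent of the
# program's normal time to completion. Helper name is illustrative only.
def completion_window_years(normal_time_years: float) -> float:
    """Return the reporting window: 1.5 times the program's normal length."""
    return normal_time_years * 1.5

print(completion_window_years(4))  # four-year bachelor's program -> 6.0 years
print(completion_window_years(2))  # two-year associate program   -> 3.0 years
```

So a four-year college's "six-year graduation rate" is just the 150-percent window applied to a four-year program.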
Q. In general, how are colleges doing on that measure?
A. A little more than half of students who enter four-year colleges earn a bachelor’s degree from the same institution within six years. You tell me whether that’s good or bad.
Q. And how have the numbers changed since 1997?
A. Recent trends have been at least modestly positive. Among students who entered four-year colleges in 1996, 55.4 percent had earned bachelor’s degrees six years later. For the cohort of students who started college in 2001, the figure was 57.3 percent. That’s according to data published by the federal government last year. If you study that table, you’ll see similar small, steady increases in most categories.
Q. You called this a deeply flawed measure. What does the six-year graduation rate leave out?
A. It doesn’t cover people who begin college as part-time students. It doesn’t cover people who begin at community colleges and then transfer to four-year institutions. In fact, it doesn’t cover people who transfer at all: To get picked up in the federal data, students have to begin and end at the same institution. It doesn’t cover the nontrivial number of students who complete college seven or more years after they start. Whole swaths of higher education are rendered invisible.
Q. What could we use instead?
A. In many people’s eyes, the gold standard would be a unit-record tracking system that would follow students from institution to institution for the full length of their college careers. But Congress rejected that idea several years ago amid heavy political opposition.
The unit-record concept has not died, however. The Obama administration has given tens of millions of dollars to states to build data systems that would track students’ progress from elementary school through college. (The State Higher Education Executive Officers has recently published two reports about the best ways to create those databases.) Even if no true federal data system emerges, there will probably be a de facto national unit-record database within a decade or so.
Q. If we used a unit-record data system, would four-year colleges’ graduation rates look healthier than they do now under the IPEDS six-year graduation statistic?
A. It depends on what you want to measure. If four-year colleges received credit for graduating students who transferred in from community colleges, then their numbers would certainly look better. That’s an important topic, and most people agree that four-year colleges should be credited for playing that role.
But if we want to focus on students who begin their college careers at four-year colleges, then a unit-record system would probably make the national graduation rate look only somewhat better.
Unlike the IPEDS data, a unit-record system would capture students who begin at one four-year college and graduate from another four-year college. But including those transfer students would probably improve the national six-year graduation rate by only a few percentage points.
To see what I mean, look at the data released last week from the Beginning Postsecondary survey, a periodic federal study that tracks a sample of students through their college experiences. Among students who enrolled in four-year degree programs in 2003-4, 63.2 percent had earned a bachelor’s degree within six years. That’s better than the 57.3 percent rate I cited above from the national IPEDS data, but it’s not an enormous difference. (For more analysis of the Beginning Postsecondary data, see Kevin Carey’s Brainstorm post from last week.)
Q. Until the dawning of a unit-record-tracking age, the IPEDS six-year graduation rate will probably be the best available measure. Since we have to live with this system, are there ways we could improve it?
A. Maybe. In July, the National Postsecondary Education Cooperative published a white paper about how to improve the IPEDS graduation-rate calculations. Among other things, the report suggested that the federal government’s College Navigator Web site should display five-year rolling averages of an institution’s graduation rate, rather than focusing solely on a single cohort.
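To see what a five-year rolling average would do, here is a minimal sketch in Python. The cohort rates are hypothetical numbers chosen for illustration, and `rolling_average` is my own helper, not anything from IPEDS or College Navigator.

```python
# A five-year rolling average smooths out single-cohort swings: each
# cohort's rate is averaged with the four cohorts before it.
def rolling_average(rates, window=5):
    """Return rolling means; entries without a full window are None."""
    out = []
    for i in range(len(rates)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(rates[i + 1 - window:i + 1]) / window)
    return out

# Hypothetical six-year graduation rates for five entering cohorts:
cohort_rates = [55.4, 55.8, 56.1, 56.8, 57.3]
print(rolling_average(cohort_rates))  # last entry is roughly 56.3
```

The point of the design is that one unusually good or bad cohort moves the published figure by only a fifth of its deviation, so prospective students see a steadier picture of the institution.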