Measuring Stick

Experts explore the quality and assessment of higher education.

6-Year Graduation Rates: a 6-Minute Primer

By  David Glenn
December 6, 2010

It is a deeply flawed measure of college performance, but it is also one of the best we have.

Today The Chronicle published an analysis of recent changes in the six-year graduation rates at nearly 1,400 colleges. At most institutions, the rate ticked up at least modestly between 2003 and 2008. But at 35 percent of the colleges in the data set, the rate declined, in some cases steeply.

In other words, despite all the attention thrown at graduation rates during the last 15 years, many colleges’ numbers remain stagnant or worse.

But what exactly is a six-year graduation rate? Here are a few basics. (If you already live and breathe this stuff, this post isn’t for you. Go watch this instead.)

Q. What do these numbers represent?

A. In 1990, Congress passed the Student Right-to-Know Act, which requires colleges to disclose information on graduation rates and serious crimes.

In particular, the law requires colleges to report the proportion of students “completing their program within 150 percent of the normal time to completion.” For four-year colleges, that means the proportion of students who earn bachelor’s degrees within six years. In 1997 the federal government began to systematically collect those numbers through its Integrated Postsecondary Education Data System, commonly known as IPEDS.
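
The 150-percent rule reduces to simple arithmetic. Here is a minimal sketch; the function name and arguments are hypothetical and not part of any official IPEDS tool:

```python
from typing import Optional

# Illustrative sketch of the Student Right-to-Know 150-percent rule.
# The function and its fields are invented for illustration.

def completed_within_150_percent(normal_time_years: float,
                                 years_to_degree: Optional[float]) -> bool:
    """True if the student finished within 150% of the program's normal time."""
    if years_to_degree is None:  # never earned the degree
        return False
    return years_to_degree <= 1.5 * normal_time_years

# A four-year bachelor's program gets a 1.5 * 4 = 6-year window.
print(completed_within_150_percent(4, 6))     # True
print(completed_within_150_percent(4, 7))     # False: a seventh-year finisher
print(completed_within_150_percent(4, None))  # False: no degree at all
```

The same rule gives two-year associate-degree programs a three-year window, which is why the statistic is described generically as "150 percent of normal time" rather than "six years."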

Q. In general, how are colleges doing on that measure?

A. A little more than half of students who enter four-year colleges earn a bachelor’s degree from the same institution within six years. You tell me whether that’s good or bad.

Q. And how have the numbers changed since 1997?

A. Recent trends have been at least modestly positive. Among students who entered four-year colleges in 1996, 55.4 percent had earned bachelor’s degrees six years later. For the cohort of students who started college in 2001, the figure was 57.3 percent. That’s according to data published by the federal government last year. If you study that table, you’ll see similar small, steady increases in most categories.

Q. Why does the IPEDS six-year graduation statistic make some people rend their garments and gnash their teeth?

A. It doesn’t cover people who begin college as part-time students. It doesn’t cover people who begin at community colleges and then transfer to four-year institutions. In fact, it doesn’t cover people who transfer at all: To get picked up in the federal data, students have to begin and end at the same institution. It doesn’t cover the nontrivial number of students who complete college seven or more years after they start. Whole swaths of higher education are rendered invisible.
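
Those exclusions can be made concrete with a toy calculation. Every field name and record below is invented for illustration; the real IPEDS cohort definitions are more detailed:

```python
# Hypothetical sketch of the IPEDS six-year graduation-rate calculation.
# Denominator: first-time, full-time entrants only.  Numerator: those who
# earn a degree at the SAME institution within six years.  Part-time
# starters, transfers, and seven-plus-year finishers simply vanish.

def ipeds_six_year_rate(students):
    cohort = [s for s in students if s["first_time"] and s["full_time"]]
    grads = [s for s in cohort
             if s["graduated_here"] and s["years_to_degree"] <= 6]
    return len(grads) / len(cohort)

students = [
    # counted and graduated:
    {"first_time": True, "full_time": True, "graduated_here": True,
     "years_to_degree": 5},
    # counted, but transferred out -- a non-completer in IPEDS's eyes:
    {"first_time": True, "full_time": True, "graduated_here": False,
     "years_to_degree": None},
    # part-time starter: not in the cohort at all:
    {"first_time": True, "full_time": False, "graduated_here": True,
     "years_to_degree": 5},
]
print(ipeds_six_year_rate(students))  # 0.5 -- the part-timer is invisible
```

Note that the transfer-out student drags the rate down (she stays in the denominator), while the part-time starter affects nothing at all.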

Q. What could we use instead?

A. In many people’s eyes, the gold standard would be a unit-record tracking system that would follow students from institution to institution for the full length of their college careers. But Congress rejected that idea several years ago amid heavy political opposition.

The unit-record concept has not died, however. The Obama administration has given tens of millions of dollars to states to build data systems that would track students’ progress from elementary school through college. (The State Higher Education Executive Officers has recently published two reports about the best ways to create those databases.) Even if no true federal data system emerges, there will probably be a de facto national unit-record database within a decade or so.

Q. If we used a unit-record data system, would four-year colleges’ graduation rates look healthier than they do now under the IPEDS six-year graduation statistic?

A. It depends on what you want to measure. If four-year colleges received credit for graduating students who transferred in from community colleges, then their numbers would certainly look better. That’s an important topic, and most people agree that four-year colleges should be credited for playing that role.

But if we want to focus on students who begin their college careers at four-year colleges, then a unit-record system would probably make the national graduation rate look only somewhat better.

Unlike the IPEDS data, a unit-record system would capture students who begin at one four-year college and graduate from another four-year college. But including those transfer students would probably improve the national six-year graduation rate by only a few percentage points.

To see what I mean, look at the data released last week from the Beginning Postsecondary survey, a periodic federal study that tracks a sample of students through their college experiences. Among students who enrolled in four-year degree programs in 2003-4, 63.2 percent had earned a bachelor’s degree within six years. That’s better than the 57.3 percent rate I cited above from the national IPEDS data, but it’s not an enormous difference. (For more analysis of the Beginning Postsecondary data, see Kevin Carey’s Brainstorm post from last week.)
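
The gap between the two statistics comes down to where a degree is allowed to count. A toy unit-record sketch, with entirely invented data, shows how the same cohort yields two different rates:

```python
# Toy unit-record data: (institution entered, institution graduated from or None).
# Entirely invented; meant only to show why the two measures diverge.
cohort = [
    ("A", "A"),    # stayed and graduated -> counts in both measures
    ("A", "B"),    # transferred and graduated elsewhere -> unit-record only
    ("A", None),   # never graduated -> counts in neither
    ("B", "B"),
    ("B", None),
]

same_institution = sum(1 for start, grad in cohort if grad == start) / len(cohort)
anywhere         = sum(1 for start, grad in cohort if grad is not None) / len(cohort)

print(f"IPEDS-style (same institution): {same_institution:.0%}")  # 40%
print(f"Unit-record (any institution):  {anywhere:.0%}")          # 60%
```

In the real data the spread is much narrower than in this toy example, because most graduates finish where they started.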

Q. Until the dawning of a unit-record-tracking age, the IPEDS six-year graduation rate will probably be the best available measure. Since we have to live with this system, are there ways we could improve it?

A. Maybe. In July, the National Postsecondary Education Cooperative published a white paper about how to improve the IPEDS graduation-rate calculations. Among other things, the report suggested that the federal government’s College Navigator Web site should display five-year rolling averages of an institution’s graduation rate, rather than focusing solely on a single cohort.
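
The rolling-average idea is straightforward to sketch: average each cohort's rate with the four cohorts before it, which smooths out single-year blips. The rates below are invented:

```python
# Sketch of the white paper's rolling-average suggestion.  One invented
# single-cohort graduation rate (in percent) per entering class:
rates = [52.0, 58.0, 49.0, 55.0, 56.0, 61.0, 53.0]

def rolling_average(values, window=5):
    """Trailing moving average; the first result needs `window` cohorts of data."""
    return [round(sum(values[i - window + 1 : i + 1]) / window, 1)
            for i in range(window - 1, len(values))]

print(rolling_average(rates))  # [54.0, 55.8, 54.8]
```

Notice how the one-year dip to 49 percent barely registers in the averaged series, which is the point: a rolling average rewards sustained change, not a single unusual cohort.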

David Glenn
David Glenn joined The Chronicle of Higher Education in 2002. His work explored how faculty members are trained, encouraged, and evaluated as teachers; how college courses and curricula are developed; and the institutional incentives that sometimes discourage faculty members from investing their energy in teaching.