Do Legacy Preferences Count More Than Race?

For years, colleges and universities have justified legacy preferences as a “tiebreaker” in close admissions calls. But as The Chronicle’s Elyse Ashburn reports, a new study by Harvard University researcher Michael Hurwitz finds that legacy preferences are larger than previously thought.

The analysis, which looks at Fall 2007 applicants to 30 elite schools, concludes that after better controlling for variables than previous researchers did, legacy preferences of all kinds increase one’s chances of admission by 23.3 percentage points. More important, “primary legacy” candidates (sons and daughters of alumni, as opposed to siblings, nephews, nieces, or grandchildren) see a whopping 45.1-percentage-point increase in their chances of admission. What this means, as Ashburn explains, is that if a non-legacy applicant with a certain set of credentials has a 15 percent chance of admission, a primary legacy applicant with identical credentials would have a 60 percent chance of getting in.

Previous research by William Bowen, Martin A. Kurzweil, and Eugene M. Tobin, which examined an earlier cohort of applicants to 19 selective schools, found that legacies generally had a 19.7-percentage-point increased chance of admission. Bowen and colleagues defined legacies as the children or grandchildren of alumni and did not distinguish between the two. Moreover, they controlled for SAT scores but not for the broader range of factors (such as high school activities and teacher recommendations) that Hurwitz’s method effectively captures.

How does the 45-percentage-point increase given to primary legacies compare with other preferences, such as those for under-represented minorities? Hurwitz’s study doesn’t say. But Bowen and colleagues (using earlier data, from a smaller set of schools, and controlling just for SAT scores) found that being an under-represented minority increased one’s chances by 27.7 percentage points. If applying Hurwitz’s fuller set of controls would not dramatically change the racial numbers (a big if), then one might conclude that legacy preferences (which generally go to more advantaged applicants) have an even larger bite than racial preferences (which go to members of less advantaged racial groups). Bowen also found that colleges gave no preference to low-income students.

To be clear, comparing Hurwitz’s findings on legacies with Bowen’s findings on race is imperfect, given the different methodologies. But further research is certainly warranted to find out whether the 45-percentage-point boost provided to primary legacies is, in fact, larger than the boost provided to racial minorities.

One discordant note in the Hurwitz study is a passing reference in the conclusion to the normative issues raised by legacy preferences. Hurwitz writes: “Although the admissions advantage received by legacy applicants may strike some readers as unacceptably large, I urge readers to consider that donations from alumni are increasingly important to the well-being of this paper’s sampled colleges.” But in an otherwise heavily documented study, Hurwitz cites no research evidence to support the hypothesis (long advanced by universities) that the presence of legacy preferences increases alumni giving. In fact, new research by Chad Coffman of Winnemac Consulting (included in a 2010 Century Foundation volume that I edited) examined the nation’s top 100 national universities from 1998 to 2007 and found that “there is no statistically significant evidence of a causal relationship between legacy-preference policies and total alumni giving” once one controls for alumni wealth. Moreover, the study found that alumni giving did not decline at the seven universities that dropped legacy preferences during the period studied.

Research finds, in short, that legacy preferences are more significant than previously believed, yet their fundamental rationale (raising money) is flawed. Study by study, the case for eliminating ancestry discrimination in college admissions continues to grow.
