The Review

Measuring Faculty Productivity: Let’s Get It Right

By Michael F. Middaugh
August 28, 2011

I have been following with considerable interest the recent controversies at the University of Texas at Austin and Texas A&M over the collection of data purportedly measuring faculty productivity. That’s because I was, for almost 20 years, director of the National Study of Instructional Costs and Productivity at the University of Delaware, the tool of choice for collection of detailed data on faculty teaching loads, instructional costs, and externally financed scholarly activity. The Delaware Study, as it is known, has included more than 600 four-year institutions since its inception, and provides participating institutions with information on which categories of faculty are teaching which levels of students and at what cost.

I mention these specifics because attention to detail, along with patience, is crucial to the successful collection of data for the purpose of measuring faculty productivity. From the very start, institutions that participated in the Delaware Study were required to accept the caveat that study data should be viewed over time, not just in a single year. (Idiosyncratic data resulting from sabbaticals and other paid leaves can affect both teaching loads and instructional costs in any given year.) And when they are viewed, those data should not be used to reward or penalize academic disciplines. Rather, they should be used as a tool of inquiry for framing discussions about why individual institutional results are similar to or different from the national benchmark data.

After all, faculty do a great deal more than teach, and faculty productivity embraces far more than can be captured in a “student credit-hours taught per faculty” ratio. Colleges must consider the qualitative dimensions of out-of-classroom faculty activity, particularly in the fine arts, social sciences, and humanities, where there are few data on external support to provide context for teaching loads and instructional costs.

Because of this need for broader contextual information, the University of Delaware’s study collects data on a wide range of out-of-classroom faculty work that can affect both the amount of teaching done and associated instructional costs, including the number of undergraduate and graduate-student advisees, the number of thesis and dissertation committees served on and/or chaired, the number of course curricula designed or redesigned, and so on. It monitors, among other things, the number of manuscripts submitted and published; juried shows, commissioned performances, and invited presentations or readings; grant proposals prepared and financed; and patents applied for and awarded. Additionally, it collects information on service to the institution, the profession, and the community. As with the teaching-load and cost portion of the study, this information provides a context for more fully understanding teaching loads and associated direct expenses within each discipline.

Indeed, analysis at the disciplinary level is an important feature of any study that sets out to measure faculty productivity. In analyzing multiple years of Delaware Study data in a major study for the National Center for Education Statistics, we started with the operating hypothesis that an institution’s Carnegie classification would be a major factor: Research universities would teach less and cost more than doctoral universities, which would teach less and cost more than those that do not confer doctoral degrees, with baccalaureate institutions costing the least and teaching the most. That hypothesis did not hold up: more than 80 percent of the variation in total instructional costs at four-year institutions is explained not by Carnegie classification but by the disciplinary mix that makes up an institution’s curriculum.

An institution like the University of Delaware, for example, which is heavily invested in the hard sciences and engineering, will have a higher direct instructional expense per student than a comparable institution that is more heavily invested in the humanities and social sciences, disciplines that are less equipment-intensive and lend themselves to larger class sizes.

In Texas, though, nearly all the data elements being collected are what we call input measures—faculty salary, number of courses taught, course enrollments, student credit-hour production, average grade awarded. This limited viewpoint can tempt universities to increase productivity by ramping up the number of courses and credit-hours taught and reducing the number of faculty doing the teaching. But are we prepared to give up all of the other dimensions of faculty activity? If faculty don’t advise students, are we prepared to hire professional staff in their place? If the only research that counts is externally financed, are we prepared to give up the contributions of faculty to studio art, music, theater, and literature? I suspect not, even in Texas.

As educators, we should be far less focused on how many courses and credit-hours faculty teach, and far more concerned with measuring, through a variety of means rather than a single standardized test, how much students are learning. The good news is that hundreds of colleges across the country are embracing such measurement as an essential component of faculty productivity. I can only hope that our colleagues in Texas take note. Yes, measuring productivity is important—but let’s do it right.
