Academic program reviews — or APRs, as they are known in administrative-speak — are both a blessing and a curse.
A well-executed internal review can be a blessing when it leads to a helpful external review that allows your department to shine and be appreciated for its strengths. The curse, of course, is that someone (often the department chair) has to convene a committee (not another committee!) of faculty members (already feeling overburdened) to write a self-study before any external reviewer can be brought to campus for a “tweed on the ground” evaluation of your program.
For many years, the two of us have served as external reviewers for APRs in psychology. Between us, we have done more than 70. Although we have yet to do an APR together, we often discuss the problems we’ve encountered on our visits around the country.
To be sure, most APRs go well — often very well — and lead to a healthy and productive exchange of ideas for improvement. APRs are standard practice in higher education. Whether the state mandates the review as an accountability measure or the institution conducts one voluntarily as a good practice, most academic programs face a periodic demand to verify their claims about the quality of their teaching and research. At minimum, an APR offers an opportunity for self-reflection and self-correction. Ideally, a favorable review can be used to leverage additional resources, such as those increasingly precious faculty lines.
But some reviews don’t go so well, and are mostly memorable because of problems — we call them program-review “potholes” — that could have been avoided with some planning and forethought. In the worst-case scenario, an external reviewer persuasively argues for drastic changes in your program strategies and personnel. And yes, we have even advised administrators of clearly ineffective programs that they needed to “start over.”
We’ve come up with a list of common APR potholes, along with some advice on how to steer clear of them. We offer that list here in the hope of helping any programs facing imminent review.
The unexamined-life challenge. Both of us have visited programs that didn’t bother to do a detailed self-study before we arrived. Conducting a self-study ahead of time should be standard procedure in any program review. In fact, most reviewers would say the self-study is the whole point of the process, affording people in the department a chance to identify where they are succeeding and failing. A thorough self-study is also a means for a department to alert the administration that the review will be a serious exercise, rather than a self-congratulatory one.
The “You’ve got a friend in me” trap. Choosing the right APR visitor is essential if the department is going to receive recommendations that are actually useful. However, some departments hedge their bets by inviting allies (i.e., friends, often friends of the chair) to conduct the review, which poses a conflict of interest. In such cases, the APR loses traction from the outset because the reviewer can be criticized as biased or, at the very least, uncritical, thereby wasting the rare but important opportunity for program development.
Choose an external reviewer because that person is familiar with the context of your program (e.g., liberal-arts college, regional comprehensive, HBCU) or has special expertise on problems identified in your self-study (e.g., curriculum design, faculty recruitment, assessment), not because that person can do you the favor of painting a rosier portrait than your program deserves. Experienced reviewers should bring a variety of suggestions to the table rather than serving as an echo chamber.
The past-should-be-prologue matter. Some departments, looking to make a fresh start, resist giving the external visitor a copy of the previous APR report (there’s usually a window of five to seven years between reviews). That’s a mistake. Not only should you hand over the previous report, you should also provide a list of the responses and changes that resulted. Expect the new reviewer to ask why some recommendations were followed while others were ignored. That background information bolsters the new reviewer’s confidence in deciding what to recommend next.
The solo-tour-guide setback. In some departments, preparing the self-study is passed off to a junior faculty member because the senior professors are “busy doing more important things.” To compound matters, the assignment often comes with little or no compensation or release time.
Despite those challenges, the single-author approach can produce a fine document. Trouble is: It’s unlikely to have the galvanizing effect of a shared process. Colleagues who have no stake in the self-study are unlikely to support many of its conclusions. Things fall apart when the external reviewer discovers the lack of involvement or interest. Whenever possible, a representative sample of department members should prepare the self-study, sharing section drafts with the rest of their colleagues and encouraging feedback so that the final version surprises no one.
The Luddite scenario. Is the department website hopelessly out of date or difficult to navigate? Was it last updated when Obama began his first term? An external reviewer will go to your program’s website to evaluate its public face. One of us recently did a site visit during a major overhaul of the campus website, when the host department had virtually no digital presence. The inconvenience to the APR visitor is relatively minor, but a neglected site represents a lot of lost opportunities to interact online with prospective students and their families.
The missing loyal opposition. Even the most functional departments tend to have at least one outlier colleague who can be counted on to offer a dissenting perspective. External reviewers need to talk to every department member, happy or not. Seasoned reviewers can recognize when crabby complaints come from the legitimately disenfranchised rather than from career curmudgeons or contrarians. As reviewers, we find it helpful when the chair flags the likelihood of turmoil in the program without necessarily naming the people behind it.
However, if a faculty member is well known for truculence, we recommend a different course of action. Each of us has endured rude comments or inappropriate behavior from such faculty members on several occasions. In the best of situations, the department chair or another host has taken us aside and warned us in advance about prickly colleagues. The wise course is to be honest about faculty members who may put on an embarrassing show.
The stacked-deck sleight of hand. An important feature of any program review is interviewing current students. It is less than helpful, however, when departments interview only their best and brightest students. Those students would probably fare well in any program on any campus and will always paint the rosiest of pictures about their educational experiences.
More telling can be how the department delivers the program to students who face greater challenges or come from less traditional backgrounds (e.g., first-generation college students, adult learners, online degree earners). We think it is especially important to evaluate how the faculty relate to students who are not headed to graduate school.
The lofty-expectation problem. Many departments enter an APR with visions of (finally) getting an official channel to lobby for badly needed resources, such as additional office or research space, as well as the golden ticket: new faculty lines.
External reviewers can make the case for redistributing resources to meet real needs. But it’s unrealistic to assume that those rewards will be forthcoming when the review is over. External reviewers do take into account the fiscal challenges facing the institution, just as administrators are on guard for the typical “they need more lines” speech in an APR report. One of us once did a review and was explicitly told up front: Do not recommend any new faculty hires. Program reviews are more helpful when all parties are realistic about what can be done.
The care-and-feeding requirement. Because they want to take advantage of every available moment, departments sometimes overschedule the external reviewer. The irony: a visit planned with no time for note-taking or reflection may prompt the reviewer to ask whether the work environment itself provides a reasonable context for getting the job done and a reasonable quality of life.
The external reviewer needs a place to work (an office or vacant classroom will do), access to the internet, water and snacks for the ongoing conversations, and some “down time” to translate a lot of information into a meaningful final report.
The dusty-shelf destiny. Many departments remain skeptical of the APR process, fueled by the refrain that “nobody in the administration is going to read the report, anyway.” One of us worked with a psychology department that actually incorporated a graph of tractor production in its self-study as a means of proving this point.
Instead of assuming the higher-ups will or won’t read the report, ask for a formal response from the dean or provost who commissioned the review. That way, departmental needs will be acknowledged and any unrealistic expectations about prospective gains can be held in check.
The end result. Sometimes chairing a department can feel like a hopeless enterprise because there are too many drivers all trying to go in different directions. We call that the “caravan dilemma.” The APR process can counteract it by steering department members through some shared decision-making. If you and your colleagues rarely meet because such gatherings are viewed as intrusive, time-wasting, or simply an occasion for collective complaining, the cost is a department that doesn’t really feel like a department. Program review can (re)establish a sense of shared mission and direction.
An effective review can affirm the department’s identity and hard work, instill appropriate pride in its accomplishments, and generate a new commitment to dealing with emerging opportunities or inevitable weaknesses.
Most faculty members who have been through a successful APR recognize that the true value of the process is in those intangible results — and not in the paperwork that gets filed at the end of the journey.
Jane S. Halonen is a professor of psychology and former dean of arts and science at the University of West Florida. Dana S. Dunn is a professor and former chair of psychology at Moravian College.