This time of year is especially hectic for admissions staff. They’re visiting high schools and frequenting college fairs. They’re fielding questions from prospective students and beginning to sift through mountains of applications.
There’s a lot at stake, after all. Attracting and enrolling new students has become an existential priority as the proverbial enrollment cliff looms large — especially at public and private four-year colleges. Staff members who have the task of finding those students are also prone to burnout. Chronicle reporting last spring found that enrollment leaders experience panic attacks, anxiety, exhaustion, and “disillusionment.”
Artificial intelligence, then, is increasingly enticing to people who work in admissions and enrollment, both for identifying prospective students and tackling “administrative drudgery,” such as crafting messaging campaigns and transferring transcripts into databases that can be queried. One recent survey indicated that half of the 314 higher-ed respondents reported that their admissions departments use AI.
The heightened efficiency “has made us better admissions counselors to our students and families, because of the time we’re able to focus on them,” John Solewin, director of admissions at Rosemont College, a small, private Catholic institution in Pennsylvania, told The Chronicle.
Tools that go a step further — like reviewing students’ applications — do exist. Student Select AI, for example, can scan essays and personal statements, and render scores on an applicant’s “noncognitive” traits, like positive attitude, or their “performance” skills, like leadership and analytical thinking. (It has marketed itself as a key to a “holistic” admissions process following the U.S. Supreme Court’s decision in June to strike down race-conscious admissions.) But sources The Chronicle spoke with for this article are proceeding with caution, reaching first for what feels like lower-hanging fruit.
Using an AI tool too aggressively, or without consideration of potential risks, like biased training data, would put a lot on the line, they said. Admissions officers need to continue nurturing human connections with prospective and admitted students. Many also cited an institutional commitment to ensuring access for all.
Still, they’re keeping an open mind about the possibilities.
“We may be talking about this again in a year,” said Andrew Brewick, assistant vice provost for enrollment management at Washington State University. “Who knows what will happen between now and then?”
The Assistant
AI in admissions isn’t limited to the now-ubiquitous chatbots that can answer questions 24/7.
For some, AI tools act as marketing assistants. At Southeast Missouri State University, the admissions office uses the “co-pilot” feature of Element451, a customer-relationship management (CRM) platform, to draft its email and text campaigns to both prospective and admitted students. It’s a timesaver as the team of 19 manages daily campus tours and some 30 annual events, alongside traditional office tasks like application processing, said Lenell Hahn, the college’s admissions director.
Rosemont College’s team uses ChatGPT-4, the newest version of the OpenAI tool, to “copy edit” and “fine-tune” similar communications — an approach that Solewin said creates consistent formatting and a uniform “voice” for the roughly 1,000-student institution. It also catches grammatical errors and circumvents the need for days-long “back and forth” via email with the marketing department.
Solewin uses ChatGPT-4, too, to corral insights from across the internet on the best times and methods to communicate with potential applicants. While Solewin won’t always take its suggestions, he equates it to having “someone else in the room as a guide.”
AI can also conduct meaningful analyses to help inform decisions, sources noted. Hahn, at Southeast Missouri State, employs Element451’s “Insights Q” function to ask questions about the data that’s housed on the platform, including usage statistics for chatbot and live-chat features, details on campus events, and information about prospects and applicants. She can type a question into the search box — What is our top source of applicants this fall? or How many chatbot conversations have we had by student type? — and get an answer in seconds.
Analytics have been especially handy for Ethan Logan, vice president for enrollment and student experience at Western Kentucky University. As at other institutions, the budget team’s planning for the following fiscal year, which always begins on July 1, requires enrollment estimates six to eight months in advance (well before the next class of students is finalized).
So, Logan’s team harnesses predictive modeling, in partnership with Deloitte, to estimate the number of students — and, therefore, tuition revenue — expected for the upcoming year. With this particular analysis method, called “logistic regression,” an AI algorithm compares applicants’ data against a historical dataset (in this case, current and former students at Western Kentucky University) and places them in groups that indicate their likelihood of attending.
There’s a lot of pressure, Logan said, to provide an informed enrollment estimate that isn’t just “picking a number out of a fishbowl.” The other benefit he noted: Identifying students who are “on the fence” rather than, say, shoo-ins helps his department allocate its finite resources strategically, for example by determining who should receive messages and scholarship information.
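As a rough illustration of the approach Logan describes, the sketch below fits a toy logistic-regression model in plain Python and buckets applicants by predicted likelihood of enrolling. Every feature, applicant, and probability threshold here is hypothetical; a production system like the one built with Deloitte would draw on far richer historical data and a vetted statistical library.

```python
import math

def sigmoid(z):
    """Logistic function: maps a raw score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(rows, labels, lr=0.1, epochs=500):
    """Fit weights by plain per-sample gradient descent (a toy stand-in
    for a real library's logistic-regression fitter)."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def likelihood_group(p):
    """Bucket applicants so staff can focus outreach on the middle group."""
    if p >= 0.7:
        return "likely"
    if p >= 0.4:
        return "on the fence"
    return "unlikely"

# Hypothetical historical data: features are [visited_campus, in_state];
# the label is whether that past applicant ultimately enrolled.
history = [[1, 1], [1, 0], [0, 1], [0, 0], [1, 1], [0, 0]]
enrolled = [1, 1, 0, 0, 1, 0]
w, b = train_logistic(history, enrolled)

# Score a new (invented) applicant pool against the fitted model.
applicants = {"A1": [1, 1], "A2": [0, 1], "A3": [0, 0]}
for name, x in applicants.items():
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    print(name, likelihood_group(p))
```

The grouping step, not the raw probability, is what matters operationally: it tells staff which files get a phone call rather than just a mail merge.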
The Chronicle asked Logan how he combats algorithmic bias toward student groups that may not be well represented in the college’s historical data. Logan wrote in an email that one way is to avoid seeing predictive modeling as a panacea, and to continue investing in outreach to underrepresented student groups.
The Chronicle didn’t come across instances of colleges using AI tools to review student applications. Companies often say their partnerships with colleges are shielded by confidentiality agreements; one of them, Talent Select AI, which runs Student Select AI, declined to provide client names to The Chronicle for this reason.
Seeing Potential
Even sources who said they weren’t using AI currently or extensively were quick to note applications that they’re eyeing.
Sebastian Brown, a regional admissions counselor at the University of Oregon, said he personally sees vast potential for AI in recruitment. More specifically, he’d be interested in a machine-learning model that could scan the college’s application and enrollment data and flag high schools “in which we’ve had any statistically significant” anomalies, like high yield. That could identify high schools and students who have historically flown under the university’s radar.
During peak recruiting season, Brown averages three to four high-school visits a day — largely in California’s San Diego, Inland Empire, and Imperial Valley regions — and wants to make sure he’s investing time in the right ones. “I might have a high school that’s really popular with my institution” because of the volume of applications received each year, “but not many students will actually end up coming. I might then have a school down the street that only sends us five applications, but we enrolled five students,” he said. “That is now a very different mindset for me going into that school. Maybe we have a great partnership that we can create.”
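A minimal version of the yield-anomaly scan Brown imagines could use a one-proportion z-test against the pooled yield rate. The school names, counts, and the deliberately loose 1.5 z-score threshold below are all invented for illustration; a real model would account for year-over-year trends and multiple comparisons.

```python
import math

def flag_yield_anomalies(schools, z_threshold=1.5):
    """Flag schools whose yield (enrollments / applications) deviates
    notably from the pooled rate. `schools` maps a school name to a
    tuple of (applications, enrollments)."""
    total_apps = sum(a for a, _ in schools.values())
    total_enr = sum(e for _, e in schools.values())
    overall = total_enr / total_apps
    flags = {}
    for name, (apps, enr) in schools.items():
        rate = enr / apps
        # Standard error of a sample proportion under the pooled rate.
        se = math.sqrt(overall * (1 - overall) / apps)
        z = (rate - overall) / se
        if z >= z_threshold:
            flags[name] = "high yield"
        elif z <= -z_threshold:
            flags[name] = "low yield"
    return flags

# Hypothetical counts echoing Brown's example: a popular school with weak
# yield, a small school where every applicant enrolled, and a typical one.
data = {
    "Popular High": (400, 4),
    "Small High": (5, 5),
    "Typical High": (50, 2),
}
print(flag_yield_anomalies(data))
```

On this toy data the small school surfaces as a high-yield outlier and the popular one as a low-yield outlier, which is exactly the shift in mindset Brown describes.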
At least two other admissions officials The Chronicle spoke with were interested in similar capabilities, noting that traditional avenues for finding prospective students, like purchasing lists from the SAT or ACT, have borne less fruit in the era of test-optional policies.
Once students actually apply, Brewick, at Washington State University, could see admissions using an AI-powered optical-character-recognition (OCR) program for both administrative and analytical assistance. On the administrative side, the software could “scrape” applicants’ transcripts and funnel that information — grades, courses, GPAs, etc. — into a uniform dataset. (For now, transcripts are just uploaded into the system as images due to resource limitations, Brewick wrote in an email. The university’s main campus in Pullman alone processed more than 32,500 applications for the fall 2023 semester.)
Admissions staff could then ask questions of the dataset. Which applicants, for example, haven’t met the minimum high-school credit requirements to enroll? Those applicants wouldn’t automatically be rejected, Brewick noted, but quickly flagged for closer review.
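To make the idea concrete, here is a small sketch, using Python’s built-in sqlite3 module, of the kind of query staff could run once transcripts exist as structured rows rather than images. The schema, applicants, and three-credit minimum are all invented for illustration.

```python
import sqlite3

# Hypothetical schema for the dataset an OCR pass might produce from
# transcript images: one row per course on each applicant's transcript.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE transcript_entries (
        applicant TEXT,
        course    TEXT,
        credits   REAL,
        grade     TEXT
    )
""")
rows = [
    ("A1", "Algebra II", 1.0, "B"),
    ("A1", "English 11", 1.0, "A"),
    ("A1", "Chemistry",  1.0, "B"),
    ("A2", "English 11", 1.0, "C"),
]
conn.executemany("INSERT INTO transcript_entries VALUES (?, ?, ?, ?)", rows)

MIN_CREDITS = 3.0  # hypothetical minimum high-school credit requirement

# Flag -- not reject -- applicants whose total credits fall below the
# minimum, so staff can pull those files for closer review.
flagged = [
    applicant
    for applicant, total in conn.execute(
        "SELECT applicant, SUM(credits) FROM transcript_entries GROUP BY applicant"
    )
    if total < MIN_CREDITS
]
print(flagged)  # → ['A2']
```

The point of the flag, as Brewick notes, is triage: the query narrows 32,500 files down to the handful that need a human’s eyes first.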
Dale Leaman, executive director of undergraduate admissions at the University of California at Irvine, added that AI could be especially helpful for transfer articulations and evaluations, which he described as “one of the most time-consuming things we do.” The university receives about 22,500 transfer applications a year, and it’s not unusual for admissions staff to come across courses and institutions on a transfer student’s transcript that “we’ve never seen before,” he said. When those cases arise, it often falls on already-stretched faculty members to review those courses and determine if there’s a UCI equivalent.
Leaman said he’s talking with customer-relationship-management vendors like Slate on the feasibility of an AI tool that could compare UCI’s course catalog and descriptions against unfamiliar transfer courses to find the closest match. His office is also interested in a “self-service” tool that prospective students could use to check which of their courses UCI would accept for credit.
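No such tool is publicly documented, but the core matching step Leaman describes might resemble the sketch below, which scores word overlap (Jaccard similarity) between an unfamiliar transfer course’s description and each catalog entry. The catalog text and course identifiers are invented; a real system would draw on UCI’s actual catalog and would likely use text embeddings rather than raw token overlap.

```python
def tokens(text):
    """Lowercased bag of words for a course description."""
    return set(text.lower().split())

def jaccard(a, b):
    """Overlap of two token sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def closest_match(transfer_desc, catalog):
    """Return the catalog course whose description best overlaps the
    transfer course's description, plus its similarity score."""
    query = tokens(transfer_desc)
    scored = {cid: jaccard(query, tokens(desc)) for cid, desc in catalog.items()}
    best = max(scored, key=scored.get)
    return best, scored[best]

# Hypothetical catalog entries, invented for illustration.
catalog = {
    "MATH 2A": "single variable calculus limits derivatives and applications",
    "WRITING 39B": "rhetoric and composition critical reading and writing",
    "CHEM 1A": "general chemistry atomic structure bonding and stoichiometry",
}
course_desc = "introductory calculus covering limits derivatives and their applications"
match, score = closest_match(course_desc, catalog)
print(match, round(score, 2))  # → MATH 2A 0.5
```

A low best score would signal the case Leaman describes today — a course with no obvious equivalent — and route it to faculty review instead of auto-matching.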
For Huma Madinawala, UCI’s director of California first-year outreach, part of achieving a truly “holistic” admissions process is making the transfer process less burdensome and confusing for all involved.
It’s about “access,” she said. “To make sure students aren’t just sitting and lying in wait to be able to understand what it takes” to attend the institution.
Baby Steps
Why not use such tools right now, then? For the most part, sources said they’ve seen or heard of existing products with these capabilities. (Unsurprising, perhaps, given the hype over AI in the year since ChatGPT was unveiled).
Sources attributed their caution to a number of factors: The AI market is evolving rapidly, and they don’t want to haphazardly add tools without knowing how, or whether, they’ll work. They don’t yet have the capacity to conduct the due diligence that adopting an AI tool requires, such as validating data during the training period. And they are wary that an AI tool might fail to hand off a certain task or conversation to a human when it’s supposed to.
Logan, at Western Kentucky, places a heavy emphasis on that last concern.
“That’s what I worry about — that I lose a student before I had a chance to intervene,” he said. “We need to make sure we don’t miss that handoff.”