That’s no longer the case. After a frustrating freshman year, in which she spent hours every day trying to learn and remember enormous amounts of material, from general biology to chemistry, Abeldt ended up with a B grade-point average. Always an A student in high school, she decided to explore AI tools to see if they could help her study better.
It began with an anatomy and physiology course last summer, when she asked ChatGPT about a bone that she didn’t quite understand, and was amazed by the clear explanation.
Abeldt began building her AI arsenal. She used ChatGPT and watched YouTube videos to break down complex topics. Perplexity, an AI-powered search engine, helped her find information. Google NotebookLM turned class notes and lecture slides into study guides and podcasts, which she listened to on the way to class. Quizlet made flashcards and ChatGPT created mock exams and quizzes. Abeldt even used AI to develop a study plan to avoid procrastinating or spending too much time on any one assignment. She followed its guidance, and it worked.
In pharmacology this past spring, a course she called “insanely difficult,” AI proved crucial to doing well. She would ask for alternative presentations of slides used in class and for different ways to describe complex systems, such as the parts of an EKG and the path blood follows as it travels to the heart. On the final exam she earned an A.
The idea that when students turn to AI, they’re being lazy and cheating “completely enrages me,” Abeldt said, “because that’s not true.” She doesn’t think professors truly understand what students are doing with AI. That’s perhaps not surprising given that the news media is full of stories like “Everyone Is Cheating Their Way Through College” and “There’s a Good Chance Your Kid Uses AI to Cheat.” Abeldt knows some students abuse AI, but that’s not, she said, how she or many of her classmates use it.
“AI allows all students, despite the way they learn, to understand your course materials.”
Cheating is, of course, a major problem in colleges. Professors report a dramatic rise in AI-generated writing and other forms of misuse. Yet another, equally profound, change is taking place under the radar: Increasingly, students are turning to artificial intelligence as an all-purpose study tool, recasting how they think about learning and reshaping their relationships with classmates and professors.
For many students, AI has been a godsend, helping them overcome learning deficits or poorly taught courses. Others appreciate the tools’ efficiency. In a fast-paced world, where students might be juggling a full course load and working 20 hours a week or more, speed is everything. And while they are bothered by the idea that professors assume students just use AI to cheat, they are often unsure whether they are cheating by using AI. Even among themselves, students say, they don’t talk much about these gray areas, other than through jokes about “ChatGPTing” an assignment.
“I’ve not used AI to do assignments for me, but I’m not sure whether some of my uses would be considered cheating,” wrote a student who said they were enrolled in a graduate program in computer science at the University of Pennsylvania. “For example, I might write a long proof to a problem, then share what I’ve written with ChatGPT and ask it to point out any flaws in my logic. If it points out somewhere I’ve gone wrong, then I revise my proof. We’ve been told we can’t ask AI to do assignments for us (obviously), but I’m not sure whether this counts as unauthorized assistance.”
Students, just like older adults, also have conflicting feelings about these artificial-intelligence tools. They know generative AI can produce unreliable information. They fear that, in their pursuit of expediency, their attention spans are shrinking; some refuse AI altogether for these and other reasons.
What’s clear, though, from interviews and surveys, including responses from more than 100 students to The Chronicle’s invitation, is that it’s common for students to treat AI as a one-stop shop. They feed articles and lecture notes into chatbots to generate summaries and study guides. They use AI to brainstorm ideas for research papers. They road-test essays by asking AI to find weak spots in their thinking.
Recent research shows that AI use is expanding rapidly.
- A survey of 1,529 college students by Tyton Partners found that 42 percent of students used generative-AI tools daily or weekly in the spring of 2025, up from 36 percent the previous spring.
- A survey by Primary Research Group of 1,022 college students found that those in technical majors, such as business, computer science, and engineering, use AI most frequently. The survey also found that generative AI is either supplementing or replacing traditional search engines about 25 percent of the time.
- A study by OpenAI of 1,200 college-aged ChatGPT users pinpoints the ways in which students commonly use AI: starting papers and projects, summarizing long texts, brainstorming, exploring topics, editing writing, solving math problems, conducting research, tutoring themselves, and creating lesson plans.
This generative-AI use is layered on top of an already robust array of tech tools that students have relied on for a while. Google searches years ago replaced trips to the library. YouTube has long been valued for its explanatory videos. TikTok is another favorite for quick explainers. Quizlet has been a go-to app for digital flashcards. Grammarly is widely used to make writing sound more precise and professional. Now many of these tools have incorporated AI.
A theme running through students’ commentary, as they described their AI use, was the idea of technology as an aid and tutor. It can keep them on track, streamline the “fluff” in their assignments, compensate for mediocre teaching or unavailable instructors, and help them juggle a busy schedule.
“AI has completely changed how I approach learning. It’s helped me study more efficiently, stay organized, and access explanations or resources that would normally cost a lot of money or time,” wrote Kahlil J. Davis, who is studying fine arts at the University of Texas at Austin. “When a task feels overwhelming, like I’m trying to figure out where to start or how to structure something, AI can lower that barrier and help me build momentum to just work without all the struggle.”
In other words, the tools are filling a gap that students feel in what their professors are providing. That echoes concerns expressed in a Chronicle focus group of 15 college students, supported by Ascendium Education Group. These young people said they turn to tech like YouTube and TikTok to learn about careers and fill gaps in their knowledge. They also said they wished for better communication and outreach from professors, and worried about their study habits and motivation.
Academics studying students’ use of AI have seen this shift in how students learn and what it reveals about what they think of school. They often come to class less prepared, believing they can get by on AI summaries. Instead of turning to a classmate or asking the professor when they have a question, they may plug it into ChatGPT. Seeking out a professor during office hours — an intimidating prospect for many students — becomes less necessary if they can turn to AI. And studying for tests and exams may mean marshaling material from various sources — lecture notes, AI summaries, slides — and uploading them to create a study plan.
“Things are changing in front of our eyes in the classroom,” said Linda Dowling-Hetherington, who teaches in the College of Business at University College Dublin and has been studying the shift in students’ relationship to AI. “Some colleagues are still going in and teaching the way they always did, and in actual fact, students are learning very differently in many ways.”
“I have a full-time job, a part-time job, work 50 hours a week, and don’t have the privilege of time to pore over every single article in detail,” wrote Savannah Haley, who is pursuing a graduate degree in library science at the University of North Texas. “ChatGPT has leveled the playing field for those students who have to work for a living.”
In an interview, Haley described feeling overwhelmed by the sheer number of assignments she has: 15 articles a week in one course alone. Many of the assignments are dense, scholarly texts that can stretch to 100 pages or more. She has tried to talk to her professors about the heavy reading requirements and her need to work to cover her expenses. But “they just don’t understand. They think you can still just go rent an apartment somewhere super cheap, and just rough it. That’s not possible anymore.”
As an undergraduate at the University of Texas at Austin majoring in English and minoring in classics, Haley said she was wary of AI. She didn’t need it, and didn’t trust it. Her courses were discussion-based, and if you didn’t read the original texts for, say, a course on John Milton or Jane Austen, you simply wouldn’t be able to participate.
But graduate school is quite different. All of her classes are online. Discussions are often limited to “robotic” discussion-board posts. The purpose of doing all of the readings remains unclear. So she runs articles through AI, reads the summaries to determine which ones are most relevant to the assignment at hand, and focuses on those.
She still doesn’t trust AI for certain things. In one class exercise she was asked to compare the search function of a database made by librarians with Gemini for tracking down copies of a book in nearby libraries. The database was clearly better. She would also never use AI to write for her — a common theme among many of the Chronicle respondents. She trusts her own voice more.
For some students, AI has been a means of academic survival, even as they worry about its long-term effects on their ability to think.
Anna Swenson, a rising senior at Ball State University double majoring in marketing and international business, said that she had struggled in many of her courses because of subpar teaching. Sometimes professors read passages aloud from articles, with no elaboration. One gave wrong or confusing answers to questions she and other students had. At other times she couldn’t quite understand what a professor was saying because they spoke softly or had a thick accent. Meeting professors or tutors during office hours is difficult because she works full time.
“I kind of stayed away from [AI], because I feel like, why would we go to college and pay for college if we’re not going to actually learn anything?” she recalled in an interview. “But then there are situations where I feel like the professors don’t teach, they just read from a book. So I have to look up different videos and try to find different resources that can help me understand the topics in class. And sometimes ChatGPT can point me towards those resources, and sometimes they can explain things to me.”
In a course on global financial management, Swenson said, she did poorly on a practice exam. So she transcribed it into an AI tool and asked it to explain the problems and terms that confused her in ways she could understand. She also uses it for “busy work” such as checking grammar and spelling.
But like many students who wrote in, she is disconcerted by a future in which people outsource their thinking to AI, and worries about some of her own behaviors. She no longer attends class as often as she feels she should. She admitted to feeling guilty for taking various shortcuts, and worrying that she was “not using my brain as much.”
“I feel like if I want to truly make a difference in the world, I don’t want to do it through a search engine or through knowledge on the internet,” she said. “I want to be able to represent myself and my beliefs as I think about them.”
“Students are often fiercely divided on AI depending on their social circles and beliefs: Artistic students and left-leaning students will likely be more anti-AI, while right-leaning students and technology-focused or business-focused students are more likely to be pro-AI,” Ezri Perrin, a student at Webster University, wrote to The Chronicle. “We are not all uniformly glomming onto AI just because we are young.”
Perrin, a rising sophomore and biology major, said they consider generative AI to be unethical because it harms the environment, produces biased results, and is untrustworthy. “I do not trust AI to do my schoolwork for me, nor to help me with my studies,” Perrin wrote in a follow-up email, “because I would have to spend extra time fact-checking whatever it spat out, and then correcting errors, which would take more time than if I were to simply do the assignment or make the flashcards myself.”
Students who like studying with classmates — in an AI-free environment — sometimes find themselves at odds with a shifting campus culture.
Shelby Foster, a humanities major who just finished her first year at McMaster University, said she had been looking forward to the college experience. Foster, who grew up in a small, rural community, imagined being among like-minded students driven by a desire to learn from each other. Instead, she found “a wall, where they didn’t want to share or be shared with.”
Foster doesn’t know how much of the isolation she felt was due to AI use, but she once approached a student in her English class to ask if they could edit each other’s papers. Foster had liked what the classmate had to say in small-group tutorials. The student apologized, said she didn’t have time, and directed Foster to Grammarly, an app that uses AI to correct spelling and grammar and to suggest other edits.
“The conversation just kind of ended there,” recalled Foster. “I thought that was kind of heartbreaking. I was like wow, that could have been a friend.”
After a few experiences like that, she became less inclined to seek out other students to collaborate with. Foster believes that plenty of students crave the kinds of connections she does, but are lured to AI by the promise of efficiency.
She recalled a close friend who proudly told her how she had used AI to create graphic designs for a project. But when Foster said, you know, plenty of people on campus love graphic design and I’m sure they would have loved to do this for you, her friend realized that she should have talked to other people first. Foster recalled her friend saying of the design she made with AI: “There’s almost too much of myself reflected back at me.”
“We’re losing an important aspect of the college experience,” Foster said. “And that’s networking and building relationships and learning from the person next to you, instead of just the professor in front of you.”
Other students expressed frustration at seeing classmates abuse AI. Juliana Fiumidinisi, a rising sophomore at Fairfield University, is a fan of AI as a study aid. It has helped her outline and organize papers, and served as a math tutor, breaking down problems step by step.
But she feels that many people in her generation lack self-control and turn to technology for instant gratification. She recalled studying hard for a psychology test, taking handwritten notes, making flashcards, and doing everything “the old way,” she said. “I didn’t use AI at all because I really wanted to know the content.”
“Then I went to take the test, and it was an open browser, and people were literally just copying and pasting the questions from the test into ChatGPT,” she said. “I was very upset, because I spent a lot of time studying. It took hours out of my week to study.”
“So that was a really frustrating moment for me.”
Dowling-Hetherington, at University College Dublin, along with a colleague, Virginia R. Stewart, has been interviewing professors and students, both graduate and undergraduate, in the College of Business.
While faculty members spoke almost exclusively about academic-integrity concerns and how AI would change the way they assess students, what students said about how they used AI to learn surprised the researchers.
Some professors, for example, had wondered why students rarely asked questions in class. It turns out that rather than raising their hands, students would ask ChatGPT. Similarly, students in a focus group said, when professors raised a question in class, one of them would enter it into ChatGPT and then share the answer with classmates through a WhatsApp group chat.
Dowling-Hetherington and Stewart asked students why they didn’t just stay after class and ask the lecturer if they were confused about something. One student talked about social anxiety and said AI was a safer space.
Sometimes AI created conflicts among students, such as when some in a group wanted to use AI, despite a professor’s prohibition, and others did not. And international students, because of language challenges, were more likely to rely on AI as a study aid. Many students also talked about how they would log into ChatGPT a few minutes before each class to refresh their memories of what was last discussed.
These new strategies will require professors to reconsider how they have always done things and what they expect of students, the researchers said.
Stewart noted, for example, that many students have decided it’s not worth doing the reading in advance of class. Instead, they run assigned case studies through AI to get the highlights. They come to class, she said, “looking for extra information, and then afterwards, I think, going back and putting all the materials together in some kind of recipe and coming up with a study guide.”
“If your class presumes people are prepared,” she said, “it’s dead in the water.”
So, if students are talking less in class and turning to AI as a tutor and study guide, what is the role of the professor? Students told the researchers that they didn’t want feedback from AI, particularly if it was critical. They wanted to hear directly from their professors about the problems in their work. “That human interaction, the human touch, is still really, really important,” said Dowling-Hetherington.
But students are also sending mixed signals. “We asked our focus groups, what is the role of the lecture or the teacher or the professor? And it just blew my mind when one was like, ‘Well, to tell stories about their experience,’” recalled Stewart. Her reaction at the time: But this is what case studies are for, and students aren’t reading them. “You’re going to talk about and analyze and get into the protagonist, because storytelling is so effective,” she remembered thinking. “But I guess the written word is not being attended to.”
Faculty members are going to need to think differently about what they want their students to learn, Stewart said. Is it how to negotiate with the help of a bot? Will it be to demonstrate cognitive mastery? Is it acceptable for students to use AI as their information-retrieval system and demonstrate expertise by showing how to apply that knowledge to new situations?
Dowling-Hetherington said she came away feeling positive overall about the conversations with students. “It was a breath of fresh air,” she said, “to hear them talking about how they can learn and how they’re adapting their ways of learning.”
Another academic, Rahul Divekar, an assistant professor of experience design at Bentley University, conducted an experiment with 20 students to understand how large language models change the way they learn.
Students were asked to explore a question on a topic they knew little about, such as how the internet works or how a power grid works. In one instance they were asked to use ChatGPT; in another, their assigned research tool was Google.
Divekar and his fellow researchers asked students to contrast the experiences. Which was more difficult to use, ChatGPT or Google? Did they learn anything unexpected? How confident were they in each tool? And what contribution — if any — did they themselves make to the final product?
Students liked ChatGPT because its responses were tailored to what they were looking for. With Google they often had to jump from site to site looking for the explanations they needed so they could take their research to the next level. ChatGPT also was more comfortable to use, they said, because they could make requests like, “explain this as if you were talking to an eighth grader.”
Among the drawbacks students saw in AI was that it was likely to be an echo chamber, whereas Google would point them to a variety of opinions and points of view. They also weren’t entirely sure they could trust AI’s results. And they worried about its effect on their thinking. Divekar heard comments such as: Instead of absorbing information, I was just getting it to write for me.
Yet students did feel a sense of ownership if they ended up putting in some effort, Divekar found, particularly those who said they spent time verifying what ChatGPT was telling them.
Like the Dublin researchers, Divekar has been thinking about how AI use is shaping students’ social-emotional skills and their ability to learn. And also like them, he is beginning to assess students’ learning using a combination of methods, such as presentations, written reports, and group discussions. If students can present on a topic deeply and correctly, he’s not concerned that they used ChatGPT to help them get there. “The product can still be an indicator of the process.”
Meanwhile, students, with little guidance from educators, are figuring out how they want to use AI in their own lives.
For her part, Abeldt, the Kansas State student, has been happy with her approach. She ended her sophomore year with all A’s and one high B, she said, a notable improvement over her first year in college.
Abeldt did have one bad experience, in which a professor accused her of using AI to write a discussion post in the form of a paper. The objective was to watch and analyze a mediation case on video, and write about it. The professor told her that an AI detector had flagged her piece for being about 30 percent AI written when, Abeldt said, she had not used any AI.
Rather than fight, she rewrote the paper, she said. She felt that her professor didn’t really understand what the detector was doing, such as picking up quotes she had cited from the video. To make sure she wasn’t flagged again, she changed phrases such as “she did an exceptional job of fulfilling her job as a mediator” to simpler wording like “the mediator did a good job mediating.”
“I felt like I was having to dumb myself down,” she said. The experience confirmed her belief that many professors think students use AI only with bad intentions. She is on the pre-nursing track in college and imagines using AI in her future career. Health-care providers already use it, she noted, to make diagnoses and predict risks. A blended future, in which humans work alongside AI, doesn’t alarm her.
“No matter how smart AI becomes, empathy and intuition are human qualities that can’t be replicated,” Abeldt said. “I see AI as a supplement, not a replacement. And I think if we can find a balance between efficiency and human compassion, that’s when everything will fall into place.”