Taking a college course is a journey, and each student ends up charting a unique path through the assigned materials — bits of lectures that resonate, chance conversations with classmates, the parts of a textbook actually consumed.
With more courses happening online, colleges now can track those individual journeys more precisely. Such tracking is known as “learning analytics,” and it’s how administrators at Utah State University created a single graphic that depicts all the student activity from a recent online course. When I met one of the top officials from the university at the big ed-tech conference held by Educause, this image is what he was excited to show me, as if it held the solution to a longstanding riddle he was working to decode. He called it “the spider graphic.”
There’s certainly something powerful about the idea of having all of this information about student behavior revealed. Before the advent of computers, exactly what materials students looked at, and how long they spent reviewing each item, was unknown to professors, and seemed unknowable. Now such information about dozens of students can be displayed in a single picture, and one with a kind of strange beauty in its curving, interwoven lines.
Promoters of learning analytics talk of using these data-infused pictures to build more effective courses. They talk about teaching with an engineering mind-set, not just a curatorial one. If it’s done right, proponents argue, the average student will learn more than ever before — which will not only help individual learners expand their minds, but also improve an institution’s retention rate. And the best part is that colleges may already have the data, captured by the course-management systems they installed years ago. That’s why learning analytics has suddenly become one of the hottest topics in teaching with technology.
“It takes a lot of mystery out of why students succeed and why students fail,” said Robert W. Wagner, executive vice provost and dean at Utah State, and the fan of the spider graphic. “It gives you more information, and when you can put that information into the hands of faculty who are really concerned about students and completion rates and retention, the more you’re able to create better learning and teaching environments.”
But how much can big data actually reveal about something as personal and subjective as learning? And even if spider graphics can yield valuable insights, will professors know how to read them?
The story of Utah State’s graphic turns out to show the promises and limitations of learning analytics, and it highlights the issues raised when the type of data visualizations now common in business are brought to the college classroom.
Data Replaces Guessing
Courtney Stewart is the professor who designed the course depicted in that colorful graphic. He’s an assistant professor at Utah State’s School of Teacher Education and Leadership, and he’s been teaching online and blended courses for about seven years.
His students consistently give him above-average reviews in their course evaluations, but he has long been frustrated by how little he knows about how students experience his online classes. Without the ability to see students in person, he has to make guesses. For instance, when he saw students not following basic instructions on how to submit one particular assignment — information that was outlined at length in a video lecture — he figured that those students had skipped the video. “I do all this work to make these PowerPoints and these videos, but I was really worried about student engagement,” he says.
So, working with the university’s center for teaching and learning technology, he designed a course that gave students a variety of options, and then made an effort to analyze what students picked and how they did in the class. The course is titled “Instructional Strategies for Diverse Learners.”
For each lesson, Mr. Stewart created several different delivery options. Students could read a text-based version of his lecture. They could listen to an audio lecture. Or they could watch a narrated PowerPoint. And when students did their homework, they could respond in whatever format they wanted to: submitting either a written essay, an audio reflection, or a video response. He would give feedback to each student in kind — so if a student submitted a video reflection on the week’s assignment, Mr. Stewart would reply with his own video critiquing the student’s submission.
The professor wanted to see whether students stuck with one delivery method or mixed it up. But he also just wanted to test more basic assumptions about his course’s design.
One of the biggest surprises he found: Only half the students ever used the home page he had so carefully built for the course. Instead, many students just jumped to the homework, and only clicked to a reading assignment or lecture if they didn’t know the answer to a question.
Now that Mr. Stewart knows that, he is considering some design changes. One idea is to put course material on the assignment pages, where he knows all students visit.
How do the students feel about having every move of their learning tracked? At least one of Mr. Stewart’s students, Kade Hendricks, is fine with it — and in fact he says it might motivate him and other students to stay on task. Like many online students, Mr. Hendricks works full time and only has time for coursework on weekends or after he puts his kids to bed. So he admits to looking for shortcuts in many of the courses he takes. “If students know that the teacher can pull all the data, they might put in more effort,” he said. The important thing, he added, is to make sure students know they’re being watched, for ethical reasons.
Mr. Stewart is still sorting through the data. And it turns out that the very chart that most excited Utah State administrators — the “spider graphic” — is the hardest for him to use. “It looks pretty, and it’s a really cool picture,” he said. “But I don’t quite know how that has meaning for my practice.” Will online teaching soon require skills in data science?
Kevin Reeve, director of teaching and learning technology for Utah State, says that the long-term goal is to create “dashboards” that give professors easy-to-understand views of the data — and to work with professors to find out what information is most useful to them. The dashboard metaphor seems to highlight how this technology might change teaching. If in the past professors could take in what they needed to know by simply looking out the windshield, now they also need to consult a range of gauges and dials.
A New Frontier
Mr. Reeve describes Utah State’s work as “charting a new frontier” when it comes to teaching, and that means that norms haven’t been developed yet. Some preliminary research done at other colleges may be able to give him and other administrators some guidance, though.
Isaac Chuang, senior associate director of digital learning at the Massachusetts Institute of Technology, says that one early lesson has been that the simplest results can be the most helpful to faculty members. “Forget about the fancy graphics,” he argues. Even a list of numbers can reveal meaningful patterns.
One example of that at MIT: using course-management-system data to detect when homework questions are duds. It helps that MIT is a major player in MOOCs, the free online courses where tens of thousands of students take part in a self-service education system. In one quantum-mechanics course, the same materials were taught in a MOOC as in an in-person version at MIT, and officials crunched the numbers on what percentage of students got each question right on the problem sets.
One clear outlier quickly emerged: Almost everyone got the answer right on a problem designed to test one of the most difficult concepts. It turned out that an inadvertent clue in the question made it perhaps the easiest one on the problem set. “It was worded badly by mistake, and the instructor was aghast,” said Mr. Chuang, noting that the question was promptly rewritten.
MIT has since expanded its experiments with learning analytics, and starting a couple of months ago some professors in on-campus courses began receiving similar data on how students did in problem sets.
How did those professors react? “A big Huh, and Why?, and What can I use this for?” explained Mr. Chuang at a session at a recent education-technology conference. “I think part of the challenge is addressing that gap of understanding the potential of analytics,” he added. To do that, MIT started a Digital Learning Lab and hired postdoctoral fellows who serve as “ambassadors to a revolution” to help professors interpret the numbers.
Josh Coates, chief executive of Instructure, which makes the course-management system used at Utah State, says he doesn’t expect teachers to become data scientists in order to interpret course data. “But most educational institutions do have administrators, or maybe a teacher or two who are data scientists, who can use that data to build up their corpus of methods.”
His key point was that just as every professor develops his or her own teaching style, instructors will find unique ways to incorporate learning analytics. Or as he put it, “What methods work is ultimately going to be very individualized.”
Mr. Chuang was blunt with his colleagues at the conference about the typical faculty reaction to an older experiment with learning analytics, which flagged struggling students at the fifth week of classes, an approach many colleges have tried. “The number one thing our faculty would like is an ease in the burden of teaching so they can go back to research,” he said. “So easing the burden of having to deal with failing students really helped a great deal.”
One reason Mr. Stewart, of Utah State, wanted so much data is that he’s planning to present it as part of his tenure portfolio, as evidence of how much care and experimentation he puts into his teaching. But there is the chance that over time such analytics could become a way for administrators to grade the teaching ability of their faculty. That would be a mistake, argues Mr. Stewart, since each course is so unique that he can’t imagine a metric from course data that could compare professors. “For me this data only has meaning to me as the instructor.”
This is the time for skeptics, boosters, and those in between to talk about what can be learned from the digital breadcrumbs left by students. Part of that may involve deciding which trails to follow and which to avoid.
Jeffrey R. Young writes about technology in education and leads the Re:Learning project. Follow him on Twitter @jryoung; check out his home page, jeffyoung.net; or try him by email at jeff.young@chronicle.com.
Jeffrey R. Young was a senior editor and writer focused on the impact of technology on society, the future of education, and journalism innovation. He led a team at The Chronicle of Higher Education that explored new story formats. He is currently managing editor of EdSurge.