Terri L. Renner had long wondered what made some of her students succeed more than others.
Maybe, thought Renner, a senior lecturer in the School of Public and Environmental Affairs at Indiana University at Bloomington, it was a question of preparation. Because her course in health-care finance involves accounting, she expected that students who previously took classes in that subject would do well in hers — but a quick review showed no clear correlation. So what, exactly, did prepare students best for rigorous quantitative courses?
This wasn’t an idle question. As enrollments in Indiana’s program in health-care management and policy more than tripled in the past 10 years, the number of students struggling in the program’s quantitative courses also climbed. Faculty members had added prerequisites to some courses, but did those have the intended effect?
To find answers, Renner, who is on the school’s undergraduate-curriculum committee, turned to a resource that was hiding in plain sight: the trove of data points that her institution collects about its students.
With help from colleagues who specialize in data analytics, Renner has spent the past two years analyzing data on student demographics, prior coursework, SAT scores, and grades. In one experiment, she crunched the data on thousands of students who had taken certain finance or health-care-economics courses, comparing those who had been grandfathered in under the old rules with those who had to take the prerequisites.
The results were similar to what she found with her early, informal survey: There was no significant difference in performance between students who had taken the prerequisites and those who hadn’t.
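In practice, that kind of check boils down to a two-group comparison of course grades. Here is a minimal sketch in Python of what such an analysis might look like, assuming a hypothetical roster file with a grade-points column and a flag for whether the prerequisite was required; it is not Renner's actual data or code.

```python
import pandas as pd
from scipy import stats

# Hypothetical roster: one row per student, with a 0-4 grade in the
# quantitative course and a 0/1 flag for whether the prerequisite was required.
roster = pd.read_csv("health_finance_roster.csv")

has_prereq = roster["took_prereq"] == 1            # 1 = had to take the prerequisite
with_prereq = roster.loc[has_prereq, "grade_points"]
grandfathered = roster.loc[~has_prereq, "grade_points"]

# Welch's t-test: is there a meaningful difference in average performance?
t_stat, p_value = stats.ttest_ind(with_prereq, grandfathered, equal_var=False)

print(f"with prereq: {with_prereq.mean():.2f}, grandfathered: {grandfathered.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A fuller analysis would also account for factors such as prior GPA, instructor, and course section, which a simple two-group test ignores.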
Her research has raised important questions about the value of prerequisites. Those questions are all the more challenging, says Renner, because instructors’ teaching styles, course content, and level of rigor can vary widely.
But because she was able to test her ideas on a large scale, using lots of data, her department can use her findings to engage in deeper discussions around course requirements, what students are learning, and how they can transfer knowledge from one course to the next.
“We have been choosing prerequisites based on intuition and not a lot of fact,” she says. “What this has done for us is give us a real opener. Things are not always what we expect.”
None of this would have been possible without the support she received to run experiments on the data. When it comes to learning analytics, she notes, “there are a lot of data available, but they don’t exist in the form that makes them easy to analyze.”
Renner’s experience highlights both the potential and the challenge of learning analytics for faculty members. These data are everywhere on college campuses. Course transcripts, demographics, financial aid, socioeconomic data, student surveys — all are handled and parsed with increasing fluidity and sophistication by administrators, deans, student advisers, and institutional researchers.
But how many professors have access to, or even consider exploring, the kinds of data that could inform their teaching? An instructor may pop open the dashboard of her course-management system to see who turned in homework. But deeper dives into learning analytics driven by faculty curiosity are rarer, say advocates for this kind of work.
Learning-management systems and digital courseware can support fine-tuned analyses of student behaviors. Every keystroke can be recorded and evaluated to see what it says about student learning. Clickers, apps, and other technologies can be incorporated into research about teaching.
In many cases, though, professors find these data puzzling. Others are curious but don’t have the time to explore. And still others may conclude that the information on the dashboards of their learning-management systems doesn’t answer the questions they’d like to ask.
“If those fancy visualizations don’t make meaning, or they’re not interpretable for faculty members to take action,” says Marsha C. Lovett, associate vice provost for teaching innovation and learning analytics at Carnegie Mellon University, then “they’re just pretty pictures.”
Professors often rely on a combination of intuition, experience, standard practice, and cues from their students to figure out what works and what doesn’t in their classrooms. But now colleges are increasingly aiming to do for instructors what they’ve done for advisers and others — use data and analytics to do their jobs more effectively.
Renner worked with colleagues at Indiana’s Center for Learning Analytics and Student Success, who help academics interested in this kind of research. She participated in a fellowship program, which she credits with helping her understand her students better and design appropriate interventions.
Other large universities, like Carnegie Mellon, the University of California at Davis, and the University of Central Florida, also have centers where experts in learning science and data analysis work alongside faculty members to conduct research on their teaching.
Institutions with leaner budgets, like Montgomery County Community College, in Pennsylvania, or Colorado Technical University, experiment on a smaller scale, by helping instructors design and test classroom interventions.
Getting faculty members to see the value of this kind of research, advocates say, is a critical next step.
“The typical faculty at a campus like ours, it’s like having blinders on. They don’t know what happens to students before they come in the classroom and probably don’t know what happens when they leave the classroom,” says Dennis Groth, vice provost for undergraduate education at Indiana.
With analytics, he says, “people say, ‘I can look at how students change from our major to another major, or what happens to students based on the grade they receive.’”
Through the Indiana center’s learning-analytics fellows program, begun in 2015, faculty members submit proposals for research questions that can be answered with data, with the goal of acting on their findings to improve student success.
Those whose proposals are accepted then work with experts in institutional research, data, and analytics as they pursue their projects. They meet regularly with other fellows to share their work, and submit a final report that is made public.
“The actual data is so interesting and fascinating that faculty come back for a second, third, and fourth year,” says George Rehrey, director of Indiana’s learning-analytics center.
Research areas fall into four broad categories, he says: student choice, demographics, preparation, and performance.
Some of this research involves collecting qualitative data, through student surveys, in addition to more traditional metrics, like grades. In one project, to run for several years, faculty members in Indiana’s College of Arts and Sciences and the School of Informatics, Computing and Engineering used a seven-question survey to measure “grade surprise” in large introductory courses — those unrealistic expectations students have for their grades — and the consequences for their self-confidence and how they go on to perform in the classroom. Potentially, professors could use the results to work with students to better set expectations for grades, leading to stronger academic outcomes.
Another faculty member, Ben Motz, is hoping to give instructors more and better information on their students before they even meet in class.
Motz, a research scientist in the department of psychological and brain sciences, helped redesign something called the student-profile report, which gives faculty members a quick overview of students enrolled in a course. Because it included only a few pieces of information, such as first-generation status, average SAT score of the group, and the most common prior coursework, it was of limited value. And it was generated only on request.
A new version that Motz is testing presents more complete information on prior coursework and grades and is more easily searchable. A professor teaching a math-heavy course, for example, could see that only 40 percent of students enrolled got A’s or B’s in a prior math class, and develop his lesson plan accordingly. The goal, says Motz, is that such reports would be automatically generated for every instructor in every course, well in advance.
He is now working with Indiana’s University Information Technology Services to test an intervention system designed to give students nudges, through an app called IU Boost, to improve their performance. In a pilot effort this fall, the app sent push notifications to students when they had not submitted an assignment that was due within four hours. Researchers saw significant increases in the number of times students looked at assignment pages and decreases in missed assignments, compared with a control group.
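The rule driving those notifications, as described, is straightforward: flag any student with an unsubmitted assignment due within the next four hours. Below is a hedged sketch of that selection logic in Python; the data fields and structure are invented for illustration and are not IU Boost's actual code.

```python
from datetime import datetime, timedelta, timezone

FOUR_HOURS = timedelta(hours=4)

def students_to_nudge(assignments, now=None):
    """Return (student_id, assignment_id) pairs matching the pilot's rule:
    the assignment is due within the next four hours and nothing has been
    submitted. Each item in `assignments` is a dict with student_id,
    assignment_id, due_at (a timezone-aware datetime), and submitted (bool),
    a stand-in for whatever the learning-management system actually exposes."""
    now = now or datetime.now(timezone.utc)
    return [
        (a["student_id"], a["assignment_id"])
        for a in assignments
        if not a["submitted"] and now <= a["due_at"] <= now + FOUR_HOURS
    ]

# Example: one unsubmitted assignment due in two hours triggers a nudge.
soon = datetime.now(timezone.utc) + timedelta(hours=2)
print(students_to_nudge([{"student_id": 1, "assignment_id": "hw3",
                          "due_at": soon, "submitted": False}]))
```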
One of the big challenges to using student data more effectively, Motz says, is helping instructors figure out what is available and how to use it. Learning-analytics dashboards can seem a bit random, he says. “There are severe problems of giving information to faculty and just seeing what works.”
Zhongzhou Chen, an assistant professor of physics at Central Florida, has dealt with the limitations of existing data on student learning by building better research tools into his coursework. Chen, who has a background in education research, teaches a large introductory-physics course in which students have struggled.
An instructor using a typical learning-management system could have a hard time figuring out why some students are struggling. You might see if a student opens a page, he says, but is he reviewing content that he already knows? Or maybe he is just staring at the page and not learning anything. “You can collect as much data on those traditional things you want,” Chen says, “but the level of insight you’ll gain is pretty low.”
He is building his own course content into an online system through which he can track student behaviors with greater precision. Each online lesson includes a pre-test to determine what students already know; instructional materials such as readings, problem sets, and videos; and follow-up quizzes.
It also maps student behaviors with more clarity. For example, if two students answer a question incorrectly, Chen wants to know if it’s because they didn’t try or because they didn’t understand the material.
To figure that out, he measures how much time a student spends on each piece of content, and then on answering a related question. Although he can’t read students’ minds, he can study their online behavior as a proxy for what they’re thinking.
If someone skims the reading and answers the question in a few seconds, that tells him the student wasn’t trying. But if she spends a long time on both things, that probably means she struggled to grasp the material.
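In code, that proxy can be as simple as a pair of time thresholds applied to the activity log. The sketch below illustrates the reasoning; the field names and cutoff values are assumptions, not Chen's actual system.

```python
# Illustrative thresholds (assumptions, not Chen's actual values).
MIN_STUDY_SECONDS = 120   # less time than this on the reading suggests skimming
MIN_ANSWER_SECONDS = 20   # less time than this on the question suggests guessing

def interpret_attempt(study_seconds, answer_seconds, correct):
    """Rough proxy for what an answer says about a student's effort."""
    if correct:
        return "understood"
    if study_seconds < MIN_STUDY_SECONDS and answer_seconds < MIN_ANSWER_SECONDS:
        return "low effort"   # skimmed the material and rushed the answer
    if study_seconds >= MIN_STUDY_SECONDS:
        return "struggled"    # spent real time on the content but still missed it
    return "unclear"

print(interpret_attempt(study_seconds=30, answer_seconds=8, correct=False))    # low effort
print(interpret_attempt(study_seconds=600, answer_seconds=90, correct=False))  # struggled
```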
Collectively, Chen says, these behaviors help him understand where students might be mentally checking out of class and where they might be in over their heads. And he can design his class time accordingly.
“Big data don’t necessarily give you better educational research,” he says. “It’s better-quality big data.”
Chen acknowledges that he’s unusual. Average instructors are not trained in data science and education research, nor do they necessarily want to design courseware. But Central Florida has also fostered a culture of research around teaching on campus, keeping in mind not just people like Chen but also instructors who need support in mining and interpreting data, as well as in designing their studies.
One of the biggest challenges facing faculty members who want to study what’s happening in their classrooms is the sheer complexity of the educational environment, says Charles Dziuban, director of Central Florida’s Research Initiative for Teaching Effectiveness.
A classroom is not a laboratory, and coming up with a simple cause-and-effect experiment is usually not possible. Enrollment changes every semester; different instructors teach the same material in different ways; and it’s often not clear which activities, assignments, or evaluations have what effects on students. Vacuuming up a whole bunch of data from your learning-management system isn’t going to help untangle those complex interactions and give neat and clean answers about what works.
But, says Dziuban, a well-designed research project can model what’s happening, allowing the instructor to make informed decisions about what is taking place in the classroom.
He points to one faculty member, Marino Nader, who experimented with a “killer” engineering course by flipping his classroom so that students studied concepts online before class and did group problem-solving activities in class. Nader compared their scores, attitudes, and behaviors — for example, by tracking when and how long they watched the videos — to the performance of students in a more traditional lecture class, Dziuban says, to evaluate the effectiveness of this new form of teaching.
In many cases, students taking the flipped model performed better than those receiving lectures. But Nader found that a small cohort actually fared worse under the new model, and that advanced students were sometimes bored. He is now studying how the different components of the flipped model work together.
Dziuban frames the value of this kind of research in terms of student success. “If you don’t turn the light on what is happening in your educational culture and climate,” he says, “you’re only attending to half of your mission.”
The University of California at Davis is engaged in similar work through its Center for Educational Effectiveness, which blends a traditional teaching-and-learning center with instructional design and learning analytics.
Marco Molinaro, an assistant vice provost who heads the center, says he’s seen a steady increase in interest in the scholarship of teaching and learning. A recent campuswide conference on the topic drew about 160 people, including 120 scholars.
Equally significant, he says, has been an increase in openness among professors to evidence-based teaching reforms. “Before, it was, ‘How can you fix the students before they come to class?’” he says. “Now I hear a lot more questions about ‘What can I do with students once they’re in my class?’”
Institutions with fewer resources are finding creative ways to support faculty research on teaching.
At Colorado Tech, where most faculty members are adjuncts and most students are working adults, instructors can apply for small grants, of up to $1,000, to conduct research, write a paper on their teaching, or present at a conference, says Connie Johnson, the provost and chief academic officer.
Colorado Tech is also partnering with other universities, including Central Florida, to study learning behaviors. That’s another way to stretch their research capacity, says Johnson.
The Central Florida project, done in conjunction with an adaptive-learning vendor, Realizeit, looked at how students across a range of courses completed assignments over a semester. The study grouped them into four behavioral types: tortoise, frog, hare, and kangaroo. The first two make consistent progress throughout the semester, although they differ in pacing, as the animal names suggest. Hares and kangaroos, by contrast, operate at their own pace. Hares rush ahead and get coursework done early. Kangaroos wait until near the end of the term to do most of their work.
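The study's actual grouping method isn't described here, but the idea can be illustrated with a toy rule-based classifier over a student's weekly completion record; every feature and threshold below is a placeholder assumption, not the researchers' definition.

```python
def classify_pacing(weekly_done, early_cut=0.75, late_cut=0.25, hop_cut=0.2):
    """Toy pacing classifier. `weekly_done` is the fraction of the term's
    coursework a student completed in each week (values sum to 1.0).
    The features and cutoffs are placeholder assumptions, not the study's."""
    weeks = len(weekly_done)
    first_half = sum(weekly_done[: weeks // 2])

    if first_half >= early_cut:
        return "hare"        # front-loads the work and finishes early
    if first_half <= late_cut:
        return "kangaroo"    # leaves most of the work for the end of the term

    active_weeks = sorted(w for w in weekly_done if w > 0)
    typical_step = active_weeks[len(active_weeks) // 2]   # median active week
    if typical_step >= hop_cut:
        return "frog"        # consistent progress, but in bigger hops
    return "tortoise"        # consistent progress in small, steady steps

print(classify_pacing([0.1] * 10))                                 # tortoise
print(classify_pacing([0, 0, 0, 0, 0, 0.1, 0.2, 0.2, 0.2, 0.3]))   # kangaroo
```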
One eye-opening finding, says Johnson, was that kangaroos go through more material than those who finish early. This, she says, helped some faculty members understand those students better, particularly if the instructors were inclined to see them as lazy. Such research, in short, can help mitigate the prejudice instructors might have toward those types of learners.
Montgomery County Community College has done work to improve advising, degree planning, and other retention efforts. The next frontier, says Celeste Schwartz, vice president for information technology and chief digital officer, is to study classroom activities. To that end, a data and innovation team is working with a small group of English and ESL instructors, using information from the college’s predictive-analytics system to design activities for students identified as having a higher risk of dropping out.
In this experiment, faculty members teach a regular course and a course in which they try out certain interventions. They might spend part of a class having students work together on specific questions, for example, to help them form bonds with one another, on the idea that the act of working together fosters persistence. Their goal is to see if these activities have any effect on grades or retention.
As faculty members dive into the data on their students, though, they might well encounter potential ethical dilemmas.
If, for example, a study finds that first-generation students in STEM classes struggle more than average, could an instructor or a department then be inclined to steer them away from these courses or majors? These kinds of questions come up frequently with predictive-analytics programs as well, which are increasingly being used to find early-warning signs of students at risk.
At the same time, as they work with faculty members on these challenges, experts note that learning analytics can also potentially reduce the risk of stereotyping, especially if it digs into the connections among course design, teaching, and student success. In other words, used thoughtfully, this kind of research could uncover certain biases built into teaching.
Molinaro, at UC Davis, tells the story of one instructor who discovered that students for whom English is a second language were underperforming compared with native speakers. The instructor read up on the challenges facing such learners, then worked with graduate students in linguistics to review his course material, watching out for idioms or unnecessarily complex sentence structures — the use of double negatives in a quiz question, for example. In other words, he redesigned his course to remove barriers to learning, rather than concluding that the problem lay with the students.
“What I’ve noticed over the years,” says Molinaro, of evidence-based teaching reform, “is that you have to build awareness. But then you have to understand what the data is showing you and what you might do about it.”
At Montgomery County, David Kowalski, executive director of institutional research, says the data and innovation team has regular conversations with the faculty members on the project they’re working on together. They talk about things like the risk of stereotyping. There’s no process you can put in place to eliminate the possibility of bias, he says. “It’s all about reflection and being aware and having people who can check you as well.”
If faculty members plan to publish on their research, they will also need to seek approval from an institutional review board for their work, adding another layer of protection for students. Indiana, for example, has IRB approval for the fellows program, and then tailors that approval, as needed, for specific projects.
There’s also a broader challenge of building a research-informed teaching culture on campus. “It’s not part of standard operating procedure for faculty members,” says Lovett, who is director of the Eberly Center for Teaching Excellence and Educational Innovation at Carnegie Mellon. “And that’s really where we need to go.”
Carnegie Mellon is helping faculty members move in that direction with tools similar to those Chen has developed for his course at Central Florida. By working with instructional designers and other learning experts to determine which skills and concepts she is trying to teach, and then organizing the lesson plans and questions around them, Lovett says, a professor can more clearly see where students may be getting confused and why.
Her center is also helping design more-complex projects, looking at students’ sense of belonging and at their pathways through college. That helps professors understand if and when students who may be underrepresented in certain fields have classroom experiences that are different from those in the majority, and why.
For Terri Renner, at Indiana, research on her teaching is now part of her life. She is doing a third fellowship to dig deeper into the effectiveness of prerequisite courses to help with curriculum planning and development at the School of Public and Environmental Affairs.
It’s not easy work, and she admits she hasn’t yet figured out how to get struggling students — who tend to be among the most disengaged — to use an online math camp she created to help them. Inconsistent grading and teaching in previous courses, she says, also make it difficult to study how prior work shapes success in subsequent classes.
But she feels positive about the effect of being a research fellow. “Just listening to what other people are doing sometimes prompts an idea,” she says. “I love the world I’m in — trying to solve problems, trying to help kids. It’s my favorite thing to do.”
Beth McMurtrie is a senior writer for The Chronicle of Higher Education, where she focuses on the future of learning and technology’s influence on teaching. In addition to her reported stories, she is a co-author of the weekly Teaching newsletter about what works in and around the classroom. Email her at beth.mcmurtrie@chronicle.com and follow her on LinkedIn.