You’re reading the latest issue of Teaching, a weekly newsletter from a team of Chronicle journalists. Sign up here to get it in your inbox on Thursdays.
This week:
- I describe how the California State University system got professors to pay attention to its data on student outcomes, presented in a tool meant to improve teaching.
- Beth explains new research showing that students think they learn more in a well-polished lecture than in an active-learning classroom, when in reality the reverse is true.
- I share more examples of what professors can do when they can’t make it to class.
Delving Into Data
Jeff Gold knows that showing professors how different groups of students fare in their courses can be transformative. As part of a push to raise graduation rates while closing equity gaps, the California State University system, where Gold is assistant vice chancellor for student success, has built a set of interactive dashboards that drill down to the level of the individual course.
“Some of our faculty consider themselves and believe in their hearts they are champions for equity,” Gold says. When the dashboards reveal, for instance, that “first-generation students are less than half as likely to get an A in their course,” professors will change the way they teach.
The problem? Not many of the system’s professors were looking at the dashboards.
To change that, Cal State tried a “nudge,” a low-cost, low-touch intervention that encourages, but does not require, a particular action. Over the years, I’ve watched nudging catch on as a way to help students through complex tasks on the road to a college degree. (You can read my latest story on the strategy’s promise and limitations here.) So I was intrigued by the idea of a college trying to nudge professors, rather than students, to do something that might help improve graduation rates.
On two of its campuses, the system tested a personalized email sent to professors from their provost. The semi-customized message included an infographic designed to encourage recipients to click through by posing questions about students in their own department, like “How many of my electrical-engineering majors come from historically underserved populations?” and “How do graduation rates for electrical-engineering majors compare with other majors?” along with a button they could click to learn each answer.
The messages captured professors’ attention. In the first round, 46 percent of recipients opened the messages and 12 percent clicked through. The second time, 80 percent of recipients opened the messages and 29 percent clicked through. And instructors looked at the messages more than once: On average, they visited them three times within a two-week period. That kind of engagement exceeded the system’s expectations.
Once professors see the data, the thinking goes, they’ll be more aware of who’s succeeding in their classes — and who isn’t. That information can be eye-opening for individual instructors. And it might also spark change at the department level. The four-year-graduation-initiative committee of the mechanical-engineering department at California Polytechnic State University at San Luis Obispo, for instance, reviewed the data and has been discussing what it might do to close student-performance gaps, said Jim Widmann, the department chair, in an email.
“As engineers,” he wrote, “the faculty typically respond to data, and it creates conversations that might otherwise not occur.”
Does your college provide course-level data on student performance? Can you compare your department with others? If so, how does that affect conversations about teaching — and teaching itself? Tell me about it at beckie.supiano@chronicle.com, and your example may be featured in a future newsletter.