John R. Barker paces the front of the lecture hall, gesturing at slides with a laser pointer and explaining to a room full of undergraduates how scientists use data to make predictions about global climate change.
At the moment, Mr. Barker, a professor of atmospheric science at the University of Michigan, is facing a climate crisis of his own: The atmosphere in this lecture hall is dead.
The students are supposed to be following along with the slides on their computers while taking notes using a program called LectureTools. It was designed to collect data on how students are reacting to lectures—in theory, giving professors a window into what is going on in the heads of their students.
Today the data collection seems to be going poorly. Few students appear to have LectureTools open on their monitors, and even fewer are using the program to take notes. One student is watching a soccer match. Another is surfing message boards on Reddit. Several are wearing earbuds.
The majority watch Mr. Barker with inscrutable expressions. Occasionally he asks, “Are there any questions about this?” Silence. Are the students learning anything? He does not know.
This lack of awareness has become unacceptable in some corners of higher education. Colleges face mounting pressure to show that students walk away with more than millstones of debt. Traditional universities, especially prestigious ones like Michigan, face less scrutiny than newer institutions that run big online programs and operate like upstart businesses.
But traditional universities may also be less well equipped to adapt to a culture of accountability. At research universities in particular, professors face little pressure to use technology to measure and modify the classroom experiences they deliver to students.
Echo360
A student uses LectureTools, a program that collects data on students’ reactions, during a lecture at the U. of Michigan’s College of Engineering.
LectureTools is supposed to help Michigan’s professors become more data-driven in their teaching. The software is the creation of Perry J. Samson, one of Mr. Barker’s colleagues in the department of atmospheric, oceanic, and space sciences. Mr. Samson’s idea was to invent a system that could spur his colleagues to squeeze data out of the thin air of the lecture hall—data they might use to become better teachers.
On this drizzly April morning, from where Mr. Samson sits in the back of the lecture hall, it doesn’t seem as if his system is changing anything.
If data-driven teaching is the future of higher education, traditional universities are at a disadvantage.
In virtual classrooms, the subtlest gestures are preserved in digital amber. Colleges that are largely online, like the University of Phoenix and Southern New Hampshire University’s College of Online and Continuing Education, sit atop vast deposits of data describing students’ interactions with instructors, peers, readings, and quizzes.
Those data can be mined for insights about teaching techniques that are not working and concepts that students are failing to grasp. They also can be used to design software that adapts on the fly to the needs of individual students, an approach that many advocates see as online education’s trump card against traditional instruction.
At Michigan, however, many undergraduate courses operate as they have on campuses for centuries. Classroom discussions, if they happen at all, are ephemeral. Professors rely on grades, student evaluations, and other old-fashioned methods to figure out whether they are any good at teaching the things they have devoted their lives to knowing.
The university is doing what it can to become more data-driven. Over the past two decades, Michigan has built an infrastructure aimed at making it easier for administrators, researchers, and professors to use data to do their jobs better.
Laura M. Patterson, chief information officer, remembers arriving on campus in 1993 to take a job as registrar. At the time, the university was still using a decentralized, ink-and-paper filing system.
“I had a staff of about 86 people who typed information onto paper records and then made photocopies of 40,000 records every term and delivered those photocopies out to every school and college,” says Ms. Patterson. “Then they’d go in filing cabinets.”
But things were beginning to change. That same year, Michigan created a central data warehouse that has become a giant digital filing cabinet for all of the data collected by the university’s 19 schools and colleges. And soon universitywide management software vastly increased the amount of data flowing into that central warehouse.
More recently, Michigan has piped in data from its learning-management system that not only identify students and the courses they are taking, but also indicate how frequently they log in to the system, download digital course materials, and submit online assignments.
All of this has allowed the university to keep pace with colleges that are using student data collected outside the classroom to predict which students might need help and nudge them in the right direction.
Now Michigan is building an “early warning” tool, called Student Explorer, that keeps advisers apprised, throughout each term, of how their students are doing relative to others in the same course, says Stephanie Teasley, a research professor at Michigan’s School of Information. If most students have submitted 10 assignments, and one student has turned in only five, her adviser will know something is amiss.
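The comparison Ms. Teasley describes is simple enough to sketch in a few lines of Python. This is purely an illustration of that kind of rule; the function, names, and threshold below are hypothetical and are not taken from Student Explorer itself.

```python
# Illustrative sketch of an "early warning" comparison: flag any student
# whose assignment count falls well below the course's typical count.
# The function name, data, and 0.6 threshold are hypothetical examples,
# not details of the actual Student Explorer tool.
from statistics import median

def flag_students(submissions: dict[str, int], ratio: float = 0.6) -> list[str]:
    """Return students whose submission count is below `ratio` of the course median."""
    typical = median(submissions.values())
    return [name for name, count in submissions.items() if count < ratio * typical]

counts = {"ann": 10, "ben": 10, "cal": 5, "dee": 9}
print(flag_students(counts))  # → ['cal']  (5 submissions against a typical 9.5)
```

A real system would weigh more signals than a single count, but the core idea is the same: compare each student against the behavior of the rest of the course, then surface the outliers to an adviser.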
Timothy McKay, a physics professor, has built another tool, called ECoach, that provides similar real-time feedback directly to students. ECoach takes information about them, collected via a voluntary survey at the beginning of the semester, and combines it with academic and demographic data from the warehouse to generate advice based on students’ backgrounds and where they stand in their courses. The tool is now being used in several large, introductory courses.
But collecting data on what happens inside lecture halls is a trickier feat, and harder to manage centrally. Michigan’s leaders believe the university needs to become more data-savvy in order not to be left behind. Whether that belief will extend to the teaching mission remains in the hands of the faculty.
Data storage at Michigan has been centralized, but the authority over teaching techniques remains distributed. Deans, departments, and professors are largely left on their own, and many prefer to devote their time and brainpower to winning research grants and publishing articles rather than to reworking their teaching methods with new, unfamiliar technologies.
They are under little external pressure to do so. While for-profit institutions and community colleges face intense scrutiny of their programs and teaching methods—especially online—research flagships and their lecture halls remain in good standing with regulators, accreditors, and prospective students.
Hunter R. Rawlings III, president of the Association of American Universities, says this holds true for its membership of top research institutions. “Most of the pressure, where there is some, at the AAU schools is self-imposed,” he says.
U. of Michigan
Martha Pollack, provost at the U. of Michigan at Ann Arbor, says the university is “committed to using data to understand learning.”
Martha E. Pollack, the Michigan provost, says the university is “committed to using data to understand learning.” But she also says she is not about to try to make professors do anything they don’t want to do: “Orders don’t work at a university.”
In lieu of mandates, the provost has tried to create a hospitable environment for professors to experiment with data-driven teaching—in the hope that they can persuade colleagues to follow suit.
Whether data-driven teaching actually spreads at Michigan may depend on whether professors like Mr. Samson are able to get others to use such technology, and use it well.
Mr. Samson, who has been at Michigan since 1979, wears a white goatee and glasses and an expression of low-level amusement. He describes himself as a “lazy” ex-hippie who skipped Woodstock because he didn’t want to deal with the traffic.
He is not shy about admitting where teaching falls on the list of priorities for most of his peers: a distant third, after publishing articles and landing research grants. “Instructors want to do the right thing,” he says. “They’re just busy guys, and they don’t sense that the bean-counting is heavily weighted toward the teaching.”
In 1993, while the university was devising its data warehouse, Mr. Samson was conducting his own experiments in data collection. He co-founded a digital weather service, called the Weather Underground, that helped members of certain computer networks get data from the National Weather Service in real time instead of having to wait for a report on the news.
With LectureTools, Mr. Samson hopes to explore another frontier of data collection: the lecture hall. He does not believe Michigan’s lecture halls are going to be decommissioned anytime soon. But he does not think they are doing students any good.
Echo360
Perry Samson is a professor of atmospheric science at the U. of Michigan and the inventor of LectureTools. “Instructors want to do the right thing,” he says, and his system could make that easier for them.
“I think that universities are doing students a disservice,” he says, “because in order to make our ends meet, we have these large intro courses that are just terrible environments for learning.”
LectureTools is supposed to improve those environments by helping professors trawl their lecture halls for data. This involves getting everybody literally on the same page, with students opening LectureTools and following along with the professor’s slides on their own screens.
Students can take notes in the margins of the slides. They can respond to questions the professor builds into the lecture. They can click a button that says, “I’m confused.” Every time they do, they produce data that begin to make sense of those Sphinxlike stares.
Mr. Samson is not the first professor to try to snatch data out of the ether in traditional classrooms. For several years, Arizona State University has been teaching some of its math courses by sitting students in front of computers and tracking their progress with electronic tutoring software; a professor walks the aisles, offering human intervention when necessary.
Other professors have tried collecting real-time data in traditional classrooms by using student-response systems, often called “clickers” because of the remote controls that come with some products. Clickers essentially allow professors to keep a running tab on the collective comprehension of their students during lectures.
By aggregating signals of confusion (like the “I’m confused” button) and apathy (like declining to take notes) from the classroom, Mr. Samson says, LectureTools might be able to identify at-risk students more effectively than counting log-ins and assignment submissions. “We have a lot more data on what students actually do in class,” he says.
Other features are in development. In the future, the software might intervene pre-emptively if a sleep-inducing lecture seems to be in progress. Or if a professor uploads too many slides and not enough surveys or quizzes, LectureTools could nudge him with a message, says Mr. Samson.
Rather than disrupt traditional higher education, Mr. Samson wants to disrupt the lives of professors as little as possible. “We don’t have time to go to meetings in the center for teaching and learning,” he says. “We have research to do.”
He makes no apology for this, nor does he believe that universities should leave instruction to adjunct instructors and teaching assistants. Every professor wants to be a good teacher, he says, but there are structural aspects of research universities—the pressure to win research grants, the size and format of many undergraduate courses, everyday inertia—that can make it difficult for them to improve.
Technology does not have to destroy those structures in order to improve undergraduate instruction, says Mr. Samson; it just has to help professors work more efficiently within them. And a system of hints and reminders is more efficient than trusting professors to read and apply the lessons of education research. “If you can get the information when you need it,” he says, “you don’t need to read the damn article.”
The professor has had some success getting his colleagues to try using LectureTools in large introductory courses. In the spring, the software was being used in about 40 classrooms at Michigan, he says.
Adoption elsewhere has been scattered. In 2012, Mr. Samson sold LectureTools to Echo360, an education-technology company, which has started marketing it to professors at other universities. The program is being used in at least one classroom at 1,100 institutions, according to Mr. Samson, who has kept his title of chief executive of LectureTools. But only 80 are using the software in 10 or more courses.
Those data are, in any case, superficial. Just as traditional universities rarely mandate that professors use specific technologies to teach, they do not require professors who adopt a certain technology to use it in a particular way. At Michigan, the question of whether such tools are making professors more data-driven can be answered only one classroom at a time.
When John Barker lectures about climate change, he knows what he’s talking about. He has published papers, edited journals, and won several awards from his department, including two for excellence in teaching.
And yet, when it comes to teaching, Mr. Barker does not flatter himself. “I consider myself to be kind of an average professor,” he says. “I’m sort of typical. We have a lot of things going on. Teaching is kind of not at the forefront.”
Several years ago, when Mr. Samson started proselytizing for LectureTools among his Michigan colleagues, he pegged Mr. Barker as a member of a crucial demographic: the silent majority of faculty members willing, if not necessarily eager, to try out a new teaching tool so long as it does not put them out too much. “I figured,” says Mr. Samson, “if I could get this guy to start using it, I could get anybody to use it.”
On this April morning, Mr. Barker is not relying heavily on LectureTools. His lecture has a lot of slides, and some of them are so jam-packed with graphs and figures that the fine print is difficult to read. Occasionally he pauses to ask if anybody is lost. But nobody speaks up, and he does not use the LectureTools polling feature to solicit anonymous answers.
“This is the problem,” whispers Mr. Samson. “John still teaches in the same way when he just uses his slides. It’s not as engaging. But the hope is that, having the tools, we can push him to use them.”
Mr. Barker is not a novice with LectureTools. This is his third year using the program. He used to treat the software as a back channel for students who wanted to ask questions and receive answers from his teaching assistant while the lecture was in progress. These days he relies on the program mainly to collect answers to in-class assignments, which he occasionally uses to break up the lectures.
At the end of each class, LectureTools sends Mr. Barker an email message with a summary of participation data: how many students logged in, who took notes, which slides students flagged as unhelpful.
Mr. Barker says he has tweaked slides on the basis of student feedback, but he does not pore over the data his course now produces. After all, they paint an incomplete picture. “The way I view LectureTools is, it’s a resource,” he says. “I don’t require students to use it.”
Professors who are accustomed to academic freedom can be hesitant to forcibly standardize the experience of their students. Mr. Barker is one of them. Mr. Samson is another.
“I don’t like the word ‘make,’” says the LectureTools founder. “Just like faculty, it’s hard to make students do something. It’s on us to help students find value in using this.”
The global climate changes gradually and unevenly. So does higher education.
In teaching, the most noticeable changes will happen where “there’s pressures coming from both the top and the bottom toward more innovative teaching methods,” says Charles Henderson, a physics professor at Western Michigan University who studies research-based instructional strategies.
“That’s really the only way large-scale change is ever going to happen,” he says, “if it happens at all.”
Of course, students are hardly enrolling at Michigan on the basis of professors’ use of classroom data to inform their teaching. The draw of a university, especially one highly visible in popular culture, is much broader than that. “They’re not picking between the University of Michigan and the University of Phoenix,” says Mr. Henderson.
As for pressure from the top, the administration at Michigan has opted to use carrots, not sticks, to steer instructors toward innovative teaching techniques. Ms. Pollack, the provost, has appointed a task force to support faculty members who are using data to shape their teaching.
The task force has created a series of grants that will offer as much as $3-million to professors who propose “large-scale changes to instruction and/or infrastructure” that enable their colleagues to “implement new learning approaches for sustainable and replicable adoption.” It has also made smaller grants available to professors with “shovel ready” projects that put teaching-and-learning tactics under a microscope.
Ms. Pollack says she hopes that framing data-driven teaching as a research opportunity will harness the instincts of professors. “The faculty here are very smart, and they’re very competitive,” she says. “When they see experiments that work, they want to be on the cutting edge, too. So if you created an environment that’s hospitable to experiments, and those experiments bear fruit, then other people come along.”
The provost acknowledges that online colleges have an advantage over traditional universities when it comes to capturing “click by click” data from classroom exchanges. But she does not think that universities necessarily need data that fine-grained in order to become as evidence-driven as they need to be.
“I still think there is an enormous amount of data that you can capture and analyze” without turning classrooms into controlled laboratories, says Ms. Pollack. “My goal is not to ensure that every single faculty member changes the way they teach. My goal is to have a group of people who are excited about innovation and who are trying out new sorts of things.”