A week before classes begin at the University of Maryland University College, students can start poking around the online course materials. Some do, looking over the syllabus and getting a feel for the subject, but others don’t bother. It turns out that with just a little number crunching of that pre-course behavior, university officials can make some surprisingly accurate predictions about who will flourish and who will flounder.
“We know the day before the course starts which students are highly unlikely to succeed,” says Marie Cini, provost at UMUC, where most of the 60,000 undergraduates take courses online.
Colleges are increasingly awash in information and so-called clickstream data about their students — much of it ripe to be mined and analyzed. Data is becoming ubiquitous thanks to advances in analytics software, a slew of new personalized-learning and student-success companies, and course-management platforms that collect and analyze students’ online interactions. The promise is that colleges can use such data to improve retention and help students graduate.
But as more colleges experiment, they’re facing complex questions about what to do with the findings the data-crunching reveals.
What, if anything, should students be told about the judgments institutions are making about them from the data footprints they’re leaving behind? Should companies be able to profit from that data? And should students have the right to opt out of being monitored?
Just as a new medical finding can create standards by which doctors provide care to their patients, does having such information establish a new standard of care for colleges?
“We are entering a new era of data and data responsibility,” says Mitchell Stevens, an associate professor in Stanford University’s Graduate School of Education who has long pushed for ethical standards around educational data that go beyond legal issues of privacy or security. In an era of ubiquitous data, he says, colleges need to decide: “Are we acting responsibly as educators? What values are we trying to pursue and preserve?”
Those were also some of the questions Mr. Stevens put front and center this month at a private convening of several dozen academics and a smattering of ed-tech company and foundation leaders.
Sponsored by Stanford and Ithaka S+R, a nonprofit education consultancy, the meeting at the Asilomar Conference Grounds, in California, was designed, in the words of its website, to produce “succinct statements to inform institutional, national, and global policies regarding the research, application, and representation of adult student data.”
The group has no formal authority, but it is building on precedent. In 2014, in the wake of MOOC mania, Mr. Stevens and others met at Asilomar and established principles to guide research using student data while respecting the rights of students and the “humanity of learning.” Those principles were prompted by concerns that colleges and MOOC providers were collecting reams of digital data about the hundreds of thousands of people signing up for MOOCs. But since most of those students weren’t enrolled in the institutions offering the courses, there were no clear guidelines governing how the data could be used.
Flood of Data
This month’s meeting, Asilomar II, was focused more directly on how institutions treat their own students — and the flood of data they generate every time they log in to a course-management system to turn in an assignment or answer a quiz in a digital textbook equipped with “personalized learning” software.
The group’s findings won’t be formally released until August, but Mr. Stevens and his co-organizer, Martin Kurzweil, director of the Educational Transformation Program at Ithaka S+R, provided The Chronicle with a rough summary of the conclusions. Among them:
Student data collected into analytics programs should be thought of as a joint venture, where the students, institutions, instructors, and — where they are involved — third parties all need to have a shared understanding of how the information is used, including when it is developed as a revenue source for colleges or companies.
Data-analytics programs and products should be designed with “transparency,” especially in cases where an algorithm in the analytical software makes decisions about what happens next to a student; those decisions should be explainable and appealable.
Educators using data analytics have a responsibility to take action based on what they learn from their data analysis, a principle that the group called “informed improvement.”
Decisions and discussions about the ethical use of data analytics need to be under “continuous consideration” that, ideally, is embedded in an explicit governance process.
Colleges relying on data analytics, and particularly tools that use information to predict student outcomes, should ensure that students have “open futures.” As Mr. Stevens put it: “Education should create opportunity. It shouldn’t foreclose it.”
Participants at Asilomar II noted that the explosion in data analytics will also create new ethical issues around students’ transcripts and other official academic records that colleges have traditionally stewarded. With today’s predictive-analytical tools, colleges could conceivably calculate a student’s likelihood of graduation and make that prediction part of the academic record, says Mr. Kurzweil, “but should you?”
The group also spent a lot of time considering ways to ensure that student data remains useful to academic researchers. Today, says Mr. Kurzweil, too many institutions are organized in ways that actually inhibit such research. Many still keep their data in silos that can’t or don’t connect with one another, he says, or they use commercial products where the key information about student learning is controlled by a proprietary “black box” algorithm that’s off limits to outside researchers. At many institutions, he says, instructors still have “absolute sovereignty” in their classrooms, so there is little compelling them to collect or share data that could inform educational research.
‘It’s Evolving’
The debate over data and privacy recently played out at the Open University, in England. The largely online institution tracks its 126,000 undergraduates daily on how far they’re progressing in their courses and other metrics. Last year officials there considered an opt-out policy after instituting a major effort to use learning-analytics data to intervene more actively with its students.
The internal debate, says Sharon Slade, a senior lecturer and regional manager in the faculty of business and law who took part, was “a tussle between the obligation to act and recognizing the student as an adult.”
After about a year of spirited debate, officials decided on a policy of disclosure instead.
But even so, Ms. Slade is quick to acknowledge that the disclosure policy goes only so far. “Do we describe in the policy the complete range of activities? Probably not,” she allows, “because it’s evolving.” (The Open University’s policy, along with other relevant documents, can be found on Asilomar’s resources page.)
Many of those at the Asilomar meeting were chosen because they or their institutions have been wrestling with these issues.
Take UMUC and what it knows about students who don’t log in early, for example. Ms. Cini says the university has been experimenting with various kinds of automatic alerts or “nudges” to students to see what works best to encourage them. “You don’t want to just start throwing data at people,” she says. “It can be very overwhelming.”
The institution developed the insights about those who log on from its work with Civitas Learning, one of several new data-analytics companies. It is also part of the PAR Framework, a consortium of institutions that collect and share data about students’ progress. And in some of the university’s economics courses, students use an adaptive-learning product called Waymaker, developed by a company called Lumen Learning.
Ms. Cini says UMUC is already abiding by the “informed improvement” principle articulated by the group; based on what it has learned from the PAR Framework, officials have been redesigning a number of courses to reduce withdrawal rates. Having the data does create a different standard of care, she says. “If you know this, you have to do something.”
Yet one thing UMUC doesn’t do is explicitly explain to its students how deeply their data is being mined. Unlike the 11-page Open University policy, UMUC says only: “Unless otherwise notified, information provided to the university may be shared among offices within the university and with the University System of Maryland and outside entities as necessary or appropriate in the conduct of legitimate university business and consistent with applicable law.”
Ms. Slade, at the Open University, says she believes in the usefulness of data analytics, but she wishes her institution had gone with the opt-out provision. “They are adults,” she says. “We shouldn’t treat them as children.”
Students at the Open University were consulted extensively during that decision-making process. Many of them didn’t realize the extent to which their data was being collected, says Ms. Slade, and some were “horrified by it.” The president of the student association, Ruth Tudor, says, however, that once students were made aware of the potential support that could be offered, they were “satisfied that they would not be pushed in any one direction.” She credits the university’s consultative process with easing anxieties.
For Lumen Learning, the company that developed the Waymaker product, the ethical challenge is less about disclosure to its more than 6,000 student users at UMUC and 11 other institutions — each student is asked to sign a consent form — and more about how to ensure that the tool is put to the best use. Studies of the product show that when professors send customizable prompts to students saying things like, “nice job,” the students perform better.
Knowing that, says Julie Curtis, Lumen’s vice president for strategy and communication, “we feel this obligation to let instructors know and strongly encourage them” to use it. Until now, she says, Lumen has made the prompts optional but has emphasized their importance during instructor training. But now she wonders if following one of the principles of Asilomar II — that, once a practice is found to work, it should be used — means that Lumen and other companies should set the feature as a default.
Goldie Blumenstyk writes about the intersection of business and higher education. Check out www.goldieblumenstyk.com for information on her new book about the higher-education crisis; follow her on Twitter @GoldieStandard; or email her at goldie@chronicle.com.