The battle for the future of higher ed has landed—at least for the time being—on a concept few in academe had even heard of a year ago: the Massive Open Online Course, or MOOC. The idea of offering free courses online to tens of thousands of students has suddenly become the latest, greatest way to “fix” higher ed, promoted by education-technology entrepreneurs and bemoaned by traditional academics.
Some of the country’s richest and most elite universities, including Harvard, Princeton, and Stanford, have been at the forefront of experiments with the format, and their moves have led some in higher ed to wonder if they’re missing out on something big if they don’t join in. That seemed to be the thinking of some members of the governing board at the University of Virginia last month, when they ousted the president, Teresa A. Sullivan, for not moving fast enough to position the university for the future (only to reinstate her two weeks later).
In the midst of the turmoil at the University of Virginia, I suggested in a New York Times op-ed that colleges could take advantage of MOOC’s, perhaps by “ultimately shedding their lowest-quality courses (and their costs) and replacing them with the best courses offered by other institutions through loose federations or formal networks.” I received plenty of criticism about that column from faculty and higher-ed administrators. Perhaps the most thoughtful response came in a blog post on Innovations last week by Siva Vaidhyanathan, a professor of media studies and law at the University of Virginia.
As Vaidhyanathan correctly pointed out, I don’t see MOOC’s as a panacea. But unlike Vaidhyanathan, I can imagine how the format might reduce costs, improve learning, increase access, and maybe produce revenue for a few universities. The problem is that MOOC’s probably can’t do all four things at any one institution—and that’s the reason they are not “the” solution to the myriad problems facing higher ed.
We often refer to “American higher education” as if it’s a monolith, but in fact there is no single American higher-education “system.” The issues facing the University of Virginia are quite different from those facing, say, the 31 colleges that make up the Minnesota State Colleges and Universities system. I picked that public system because I heard recently that it spends some 25 percent of its budget on remedial education. My bet is that the University of Virginia spends nothing, or close to nothing, on getting its students ready for college work.
As more MOOC’s are developed, institutions with high remedial costs could use them to replace—or at least to supplement—their noncredit-bearing courses.
A common theme in the comments on the Vaidhyanathan post was whether MOOC’s should be considered “education” or just “information.” The assumption seemed to be that the current methods of teaching on college campuses were working just fine. Again, maybe so at the University of Virginia, where some of the best scholars are teaching some of the best students. But we know from a 2011 book, Academically Adrift, that American higher education is “characterized by limited or no learning” for a large proportion of students.
“You can’t assume that in sending off a student to a typical college that they’re going to get a rigorous education,” one of the book’s authors, Richard Arum, told me recently. “You can’t trust these institutions to police themselves.”
Taken alone, MOOC’s might not improve learning, but coupled with some face-to-face teaching, they could be a worthy experiment. Various studies have found that students who have taken all or part of a class online performed better, on average, than those who took the same course through traditional face-to-face instruction.
MOOC’s might also play a role in improving access and graduation rates, and ultimately in reaching President Obama’s goal of making the United States the nation with the highest proportion of college graduates by 2020. The University of Virginia’s six-year graduation rate is 93 percent—again, an outlier, given that the average rate at four-year public universities in the United States is 56 percent.
While MOOC’s don’t carry credit, they can be used as part of an evaluation to gain credit through prior learning. Students who receive credits for prior learning are 2½ times as likely to graduate as those who do not earn such credits, according to the Council for Adult and Experiential Learning.
One criticism of MOOC’s seems credible, at least for now: There is no business plan to produce revenue. A few moneymaking ideas have been floated, from charging for the credential to selling access to corporate recruiters.
I took a MOOC from Coursera this past spring, and recently received an e-mail message about study groups and social meet-ups planned for the courses being offered this fall. Right now it’s free to join those face-to-face meetings, but I could imagine some older students like me, who are no longer in a college setting, paying for the chance to meet others in their courses or for an opportunity to meet the professor.
Like so many debates about the future of higher ed, the discussion about MOOC’s has quickly devolved into an all-or-nothing argument. Unless the format answers all of higher ed’s problems, or matches or surpasses what we do currently, critics say, it’s a failure. But thousands of students around the world have completed the MOOC’s offered so far, with many of them performing as well as students on the residential home campuses where the courses were created.
There’s a lot we don’t know about the students who take MOOC’s and their reasons for doing so, but the format has clearly captivated a group of learners, and there must be something of value we can take from that in navigating the future of higher ed.