Far be it from me to oppose the democratization of education. I’m a big fan of the idea, which in its most recent manifestation focuses heavily on the potential of technology to bring more educational opportunities to more people than ever before in history. But what do we really mean by the ubiquitous “democratization” phrase? How does aspirational talk about using technology to upend convention translate into concrete action? I’ve been mulling over two divergent strands in educational philosophy that seem to be emerging, each with quite different implications for how techno-reformers ought to proceed.
The first strand came into relief for me last month at the British Council’s Going Global 2012 conference in London, when I chaired a session on how technology is changing postsecondary education. There were six discussion tables, three led by Americans and three led by Indians. The Indian academics focused above all on the acute need to reach far greater numbers of students with high-quality coursework. They cited problems from huge unmet demand for university places to faculty shortages to lack of collaboration between institutions.
Two examples: Ravi Singh, director of the regional services division of the Indira Gandhi National Open University, called for a network of online courses aimed at students already enrolled in conventional universities. Online courses would be accepted at a range of institutions and would be “seamlessly integrated” into those universities’ existing curricular frameworks. Shiva Prasad, dean of academic programs at the Indian Institute of Technology, Bombay, offered a purely online model, featuring top professors from around the world and premised on the idea that it will be impossible to build enough brick-and-mortar campuses in India to meet current and projected demand.
In brief, the Indian academics at this forum were pragmatic, emphasizing quality and scale. They focused on technology as a means of broadening access, and perhaps improving pedagogy – but not radically personalizing students’ educational choices, rethinking the notion of what a degree is, or putting up for grabs the nature of knowledge creation itself.
This second strand of edu-thinking was very much in evidence a couple of weeks later in – surprise – Silicon Valley. At a Palo Alto workshop on Redesigning Higher Education, hosted by the Institute for the Future, a range of participants – including representatives from Udemy, BioCurious, Open Study, the Khan Academy, and Science Hack Day – spent a day discussing their initiatives to use technology to disrupt longstanding higher ed conventions. Presentation after presentation laid out the contours of the emerging higher-education landscape, from learning through gaming to peer-to-peer online study networks to open platforms for course creation.
So what’s not to like? Actually, I did like a good deal of what I heard. We’re in a period of extraordinary ferment in postsecondary education, and entrepreneurial ventures like these are surely going to play a significant role in the future shape of the enterprise. At the same time, to generalize a bit, the Indians I heard in London accepted the following premise: university education, at least in part, involves seasoned, well-educated people determining what less seasoned, less well-educated people should know, then proceeding to teach them. By contrast, at the Palo Alto gathering there was much more talk of “hacking college.” This worldview, though certainly not universal, reflects a certain disdain for hierarchies of knowledge and expertise. Instead, it embraces a radically individualistic approach to higher ed, one based on the idea that it’s always a great thing when students create their own education, piecing together courses and educational materials, free from the confines of convention.
This makes me fret. Yes, it’s hard to be against independent thinking and study. It’s true that technology may make a lot of conventions obsolete, from where students learn to how they learn. However, what I don’t think will become obsolete, or should become obsolete, is the hierarchical notion that senior instructors, and the academic traditions they come from, should help determine what students learn and measure whether they have done so.
There may be some learners who can figure it out on their own. Indeed, I had the impression during parts of the Palo Alto conference that some of these blue-sky ventures have in mind as a prototypical student the scary-smart 17-year-old who may be left disengaged by a regular college education. But that just doesn’t describe the typical student, who in my view needs a ton of guidance about what to study, why it’s important, and how it can best be learned. Lots of students who want a college education, whether at the elite or mass-access end of the spectrum, either want to study a curriculum that has been proven, need support because they don’t have the tools to educate themselves, or both.
The edupunk aesthetic certainly has some appeal – why not blow up some traditional assumptions about the structure of education? And it is so varied that it wouldn’t be fair to say its proponents are all blind to questions of giving students the guidance they need to succeed. But its anarchistic edge risks leaving students to their own devices so much that oversight and quality control go out the window. (During one panel discussion in Palo Alto, I suggested that futuristic proposals for postsecondary education should include some assessment of whether students had actually learned anything. One participant responded that such thinking amounted to turning universities into prisons.)
Tech-driven innovation doesn’t have to mean an academic state of nature. There’s no reason we can’t use disruptive tools to teach an established body of knowledge, in a particular course sequence, to more people, more effectively. It’s a mistake, I think, to assume that the tech revolution inevitably goes hand-in-hand with the idea that students hack their own knowledge. Sure, we may see a growing blend of institution-based and self-guided learning, and that will be fine for some postsecondary learners. But if taken too far, the hacker ethos so popular in the Valley risks betraying its proponents’ promise of truly democratizing student learning.