Jose Ferreira, the chief executive of the adaptive-learning company Knewton, has been endlessly mocked by many involved with educational technology (including me) for making exaggerated claims about the power of his products.
Among his choice pronouncements:
- “We think of [our product] like a robot tutor in the sky that can semi-read your mind and figure out what your strengths and weaknesses are, down to the percentile.”
- “We can take the combined data power of millions of students — all the people who are just like you — [who] had to learn a particular concept before, that you have to learn today — to find the best pieces of content, proven most effective for people just like you, and give that to you every single time.”
These claims are absurd on their face. But they are also dangerous in ways that are not always obvious. To better understand what I mean, it’s worth making an analogy to another hot technology, driverless cars — and more specifically, Tesla’s so-called “autopilot” feature, which has been linked to three crashes and one fatality.
Unless you are a self-driving car enthusiast, you probably don’t know that the National Highway Traffic Safety Administration has proposed a scale of 0 to 4 for classifying the degree to which a car can be considered self-driving.
Most of us have some experience with Level 1, in which a single automated system, like cruise control, antilock brakes, or stability control, can take over one function of the car at a time.
Level 2 is when two or more systems work in coordination. For example, a Level 2 system could enhance cruise control with features like collision detection and automatic braking to help manage sudden heavy-traffic situations. Anybody who has driven while drowsy can appreciate the value of such innovations. But they still require hands on the wheel and eyes on the road.
Level 4, the fully self-driving car, is estimated to be anywhere from 10 to 25 years away.
(Level 3 is where the car takes over most of the driving but still requires the driver’s occasional attention. Because Level 3 may invite dangerous driver inattention, both Volvo and Ford have announced that they will wait until they can deliver a Level 4 car.)
Tesla’s cars are considered to be Level 2. In fact, they share many of the same off-the-shelf assistive-driving components with other high-end cars on the road today. The differences between Tesla’s cars and their competitors’ are small but disturbing. For example, most Level 2 cars will issue urgent warnings, and eventually slow to a stop, when drivers take their hands off the wheel for more than 30 seconds. Tesla eschewed that standard; some drivers have reported that the car would allow several minutes of hands-free driving before warning them.
Equally concerning is the fact that Tesla has marketed its Level 2 features not as “assistive driving,” as Audi does, but as “autopilot.” Tesla’s chief executive, Elon Musk, has told reporters that “autopilot is almost twice as good as a person.” His wife posted a video of herself dancing while driving, hands-free, eyes off the road, on a busy highway.
Consumer Reports has slammed Musk and his company for giving Tesla drivers a “false sense of security.” Now, three crashes and one fatality later, the company has put out a statement that its autopilot feature “is by far the most advanced driver assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility.”
Compare this to the way in which Knewton’s Ferreira dances around the role of the teacher in the same NPR story in which he waxes poetic about robot tutors in the sky:
Ferreira says he’s merely trying to complement and support teachers, not replace them. A hundred-plus years ago, he notes, hospital operating rooms didn’t have much technology. Now there’s tons. That technology “didn’t replace doctors or medical personnel,” he says. Teachers who are used to staying up until midnight crafting lessons, he asserts, can now “press a few buttons and say, OK, Knewton, show me all content in the world ranked by data that’s the best to add to my content plan.”
What is the role of the teacher in this picture? Lesson planning is important pedagogical work. It’s fine for assistive technology to be a helper and time saver in that work, but the hyperbolic language about replacing planning that keeps teachers “up until midnight crafting lessons” with pressing “a few buttons” suggests more — particularly when combined with “robot tutor in the sky.” At best, we are getting confusing and contradictory signals about the product’s value.
The truth is that Knewton’s mixed messages are different from those of many other adaptive-learning vendors in degree rather than kind. They will all tell you that they respect teachers and are not trying to replace them. Many will then quickly move on to tout the ability of their products to fill vital teaching functions without explaining the relationship between those two claims.
The very best adaptive learning products on the market are closely analogous to Level 2 autonomous cars. They combine several types of assistive technology that can be very useful in certain circumstances but are not anywhere close to being full human replacements. When it comes to teaching with these technologies, professors still need to keep their eyes on the road and their hands on the wheel.
The over-the-top marketing by some vendors encourages irresponsible behavior that can lead to students getting hurt.
Michael Feldstein is a partner at MindWires Consulting, co-publisher of the e-Literate blog, and co-producer of e-Literate TV.