
ProfHacker

Teaching, tech, and productivity.

Against the 3A’s of EdTech: AI, Analytics, and Adaptive Technologies in Education

By Maha Bali November 29, 2017


[Feature image: Nick Sousanis cartoon showing rules of culture]

I’ve had/have the privilege of being a member of the expert panel working on the NMC Horizon Report (the one that’s still in progress, the process of which I plan to write/co-write about soon, inshallah). One of the key things about the report is that we are asked to suggest what technologies we think will be highly influential in education in the far, medium and near terms. You can see the list of topics and expert panelists’ comments here. I’ve had technical problems and haven’t been able to post many comments myself, but needed space to clarify my votes on some things, and that’s what this post is for.

I want to clarify why I voted for certain things as influencing higher education in the future, but that I don’t actually endorse these things. I can see evidence around me that AI (Artificial Intelligence), Analytics and Adaptive Learning are being pushed in the education field, but I have a strong aversion to all of them, mainly on ethical grounds. I agree with all of Audrey Watters', Chris Gilliard’s, Autumm Caines’ and Benjamin Doxtdator’s critiques on these topics (also: it’s scary how my Google docs app immediately recommended their websites when I started inserting the links here), and it’s difficult to pick just one post by any of them on these issues, but if you’re not following these people, you should be.

But I’d like to explain this a little more from my personal perspective, in a way that’s focused on pedagogical considerations (so I am not focused on surveillance capitalism or the hegemony of Silicon Valley, which others above speak much more eloquently about). I have struggled in the past to explain my views on this to a subset of computer scientists I know (my past professors and others in edtech) who don’t normally think critically about education, and perhaps the ones who do think critically about education can give me feedback on how to make this clearer.

Misunderstanding of Teacher Roles

In some of the discourses surrounding these technologies, there tends to be a reduction of what a teacher’s role is. I have watched people make arguments for AI by basically listing 10 functions of a teacher and ticking them off as doable by a computer. This is highly problematic, because those lists, in the first place, limit a teacher’s role to a set of cognitive and routine tasks that indeed could be done by a computer, and completely ignore the educational research suggesting the importance of having an adult or more knowledgeable peer support a young person’s process of knowledge construction. They also completely ignore the socio-emotional role of a teacher: motivating the learner, promoting their love of a subject, recognizing their interest in something, and providing moral support, whether in things related to learning or to life. And they are completely unaware of the potential role a caring teacher can play in fostering a person’s critical consciousness, beyond transmitting knowledge.

Not every teacher does this with every student, of course, and those who have to teach larger numbers of students cannot do so, but instead of investing in cold technologies to teach e.g. refugees remotely, can’t we instead invest in helping re-train adult refugees to teach and mentor younger refugees, thereby making those communities more sustainable? Don’t refugees need emotional support and a human touch more than knowledge delivered via devices they may or may not have? Don’t adult refugees want to work and support their communities? Why invest in technology when the money can be put to better use investing in people?

Two of my pet peeves that use AI are plagiarism-detection systems and automatic grading of writing. They reduce both of these things to their technical bare bones. Teaching writing is NOT about getting the grammar of the sentences right in a paragraph within a five-paragraph essay. Teaching writing is about helping learners express themselves clearly and effectively to other human beings. What value is there in a machine giving students feedback? It’s bad enough if our students’ writing gets read by no one but a teacher; at the very least, let another human being see it. I understand there are teachers responsible for larger classes who cannot humanly read that much writing. So give them teaching assistants, or do more peer assessment. But for goodness’ sake, why would anyone want to write for a machine?

Plagiarism-detection tools are no better. They prime students to figure out technical workarounds to plagiarism, rather than to develop the attitude of valuing their own ideas and words while being respectful of, and appreciative of, how others’ ideas and words influence their own. A good teacher can promote these attitudes despite plagiarism-detection software, but the software itself does not do this.

Collecting Data: Bias, Surveillance & Control

All three of these technologies (particularly if the AI is of the machine learning variety) rely heavily on collecting data from the learner - usually ubiquitously (meaning, without learners recognizing that their behaviors are being mined as data all the time, without giving explicit permission, and without knowing what will be done with the data and by whom). This data is collected based on some tech person’s decision about which data is relevant in order to categorize learners in certain ways - often indicators that don’t necessarily help you toward a particular conclusion, but that maybe make policymakers feel good. For example, learning analytics may look at how many times someone has watched a video and for how long. It doesn’t usually tell you whether they were concentrating while watching it, multitasking (though I’m sure someone will come up with ways to check if they’re doing something else on the computer, or suggest putting a camera to track if they’ve moved away from it - shudder), or having technical problems that make them restart the video many times. The choices of which data to collect are usually made by the techies and policymakers, not teachers or students. Do teachers really need to know this stuff, and should they be asked to look at this data to make decisions about their learners? Or should they instead, say, actually look into their students’ eyes while teaching them, and maybe, you know, ask them what they’re thinking, or feeling, or how they felt about those readings?

These kinds of tools make it seem like surveillance of learners in order to control them is acceptable and even desirable. It should not be, even if education already does so in some ways; actual learning is not helped by this kind of approach. Using technology to reproduce neoliberal approaches to education won’t make learning better. It will make it worse in all the worst ways, particularly reproducing inequalities.

I don’t know if I need to re-hash issues of bias within algorithms. Yes, even when a machine is learning, rather than following a prescribed set of steps, it learns in biased ways. We (well, the developers) train the algorithm to learn from data, and that data has built-in biases, which the resulting algorithm (having now learned from it) reproduces back to us when confronted with new data - and we pretend its output is more “objective” than we are. We all know about algorithmic bias (e.g. Safiya Noble’s work on Google). And if you haven’t read Cathy O’Neil’s Weapons of Math Destruction, you really should (here’s a list of podcast interviews and her blog). Using such approaches in education is likely to reinforce bias against minorities and underprivileged students. Tools that would allow the teacher to predict a student’s grade or performance from early in the semester are likely to disproportionately label minority and underprivileged students, or anyone whose learning habits deviate from the “norm”. And wouldn’t labeling these students to the teacher possibly bias the teacher further against them, or lower the teacher’s expectations of them? I realize these biases exist with or without technology, and that the technology is intended to “help” the students - but that does not discount the dangers of using these tools uncritically, and the ways in which they are likely to go horribly wrong.
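The “garbage bias in, garbage bias out” mechanism is easy to see in a deliberately minimal sketch. Everything here is invented - the groups, the grades, and the trivial “model” (which just memorizes the historical mean per group) stand in for any real predictive system:

```python
# Toy illustration: a "predictive" model trained on biased historical data
# reproduces that bias for new students. All data and groups are invented.
from statistics import mean

# Historical records: (group, final_grade). Suppose group "B" students were
# graded more harshly in the past - the bias baked into the training data.
history = [
    ("A", 85), ("A", 90), ("A", 88), ("A", 92),
    ("B", 70), ("B", 68), ("B", 72), ("B", 75),
]

# "Training": the model simply learns the mean historical grade per group.
model = {
    group: mean(grade for g, grade in history if g == group)
    for group in {g for g, _ in history}
}

def predict(group):
    """Predict a new student's grade from group membership alone."""
    return model[group]

# Two equally capable new students get very different "predictions",
# purely because of the bias already present in the historical data.
print(predict("A"))  # 88.75
print(predict("B"))  # 71.25
```

No real system is this crude, but the failure mode is the same: the model’s “objective” output is just the historical bias, laundered through training data.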

Stripping Learners of Agency

From conversations with people working on adaptive learning, I gather that some of these tools use a pre-built algorithm that the developers came up with, and others use machine learning. In either case, someone makes decisions about which data points to collect and which adaptations to “give” learners at certain stages. This completely strips learners of their agency to control their own learning! Shouldn’t education be about fostering learners’ metacognitive capacity to know when they need to go faster or slower, or seek additional information? Feeding them what the machine has decided they need at a certain point in time is an approach where we are letting the machine program the person, not the other way around, as Seymour Papert would have said. For a discussion on decolonizing learning analytics, check out this blogpost by Paul Prinsloo, and then do a Scholar search for his publications on the topic of ethics in learning analytics.
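A pre-built adaptive “algorithm” of this kind can be as simple as a few developer-chosen thresholds - a hypothetical sketch (the rules, names, and cutoffs are all invented, not drawn from any real product):

```python
# Toy adaptive-learning rule: developer-chosen thresholds decide what the
# learner sees next; the learner has no say in the interpretation.
def next_activity(quiz_score, video_rewatches):
    if quiz_score < 60:
        return "remedial drill"        # the system decides: slow down
    if video_rewatches > 3:
        return "easier video"          # rewatching is read as confusion...
    return "advance to next unit"      # ...though it might be a buffering glitch

# A strong student whose video kept stalling is routed to "easier" material.
print(next_activity(85, 5))  # easier video
```

Every branch encodes a developer’s guess about what the data means; the learner’s own judgment about going faster, slower, or elsewhere never enters into it.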

Some Critical Questions

This post is getting long. I’ll just end by saying that many of these technology solutions need to address these questions:

  1. Which educational problem are you trying to solve? (sometimes this problem is non-existent, and these tools aren’t justifiable)

  2. What human solutions to this problem exist? Why aren’t you investing in those?

  3. What harm could come (to students) from using this tool? What harm could come to teachers? To society?

  4. In what ways might this tool disproportionately harm less privileged learners and societies? In what ways might it reproduce inequality?

  5. How much have actual teachers and learners on the ground been involved in or consulted on the design of these tools?

I was talking to an undergraduate computer science student yesterday about my ethical objections to surveillance used in personalized learning, and he told me “ethics have evolved”. I told him, “Actually, ethics still exist, but many computer scientists don’t listen to what others are telling them.” I rarely see AI, adaptive learning or analytics tools questioned in this way by those who create and implement them. And they need to start listening to all of us who are doing so and be held accountable.

What are your thoughts on the 3A’s? Tell us in the comments!

Feature image: Copyright Nick Sousanis, used & edited with permission: Games Culture 1, from p. 12 of “Possibilities,” created for Game Show Detroit in 2006.
