The message, tucked in a routine fall-planning email to Oakland University students, took Tyler Dixon by surprise.
Along with wearing masks and social distancing, students living on campus would be expected to wear a coin-size “BioButton” attached to their chests with medical adhesive. It would continuously measure their temperature, respiratory rate, and heart rate, and tell them whether they’d been in close contact with a button wearer who’d tested positive for Covid-19. In conjunction with a series of daily screening questions, the button would let them know if they were cleared for class.
Dixon, a senior and resident adviser, said the late-July email was the first he and any of his friends at the university north of Detroit had heard of the BioButton. “No one I spoke to liked the idea of having to wear something on their body to be on the campus,” he said. “They wondered how secure the information was and who would have access to it.”
A friend worried about what would happen if he went to a Black Lives Matter protest where violence broke out. Would he be tracked down and disciplined? Would sleeping on the opposite side of a thin dorm-room wall from an infected student force someone unnecessarily into quarantine?
Dixon posted a petition on Change.org urging Oakland to give students the choice to opt out. Angry responses to the BioButton requirement flooded in from students and parents. The college was invading their privacy, they wrote. They’d rather quit than wear the button; the college was turning Communist.
“I went to bed with 100 signatures, and when I woke up, it had blown up, and a guy from a far-right talk show wanted to give me an award,” Dixon says.
Oakland isn’t the only institution seeing this kind of pushback. The pandemic has prompted many colleges to quickly roll out surveillance tools that could help limit the spread of the virus, or mitigate its effects on learning, as students are sent out of the classroom and into private quarters. Some students, required to flash Covid-free badges to enter classrooms or rotate their laptops for online test proctors to scan their bedrooms, have grown weary of feeling watched. And some are leery of how the information that’s being collected will be used, whether it could leak out, and whether there’s a process to destroy it when the pandemic is over.
That wariness isn’t limited to students. Colleges scrambling to keep students healthy and educationally on track have erected a mass-surveillance structure that won’t just disappear, and may have lasting effects on the student experience. “There’s a tendency with tracing technologies for them to linger after their initial purpose fades,” says Sarah E. Igo, a professor of history at Vanderbilt University who studies surveillance and privacy. “It should be clear that these are temporary, extraordinary measures. We have to pay as much attention to how we kick them off as put them up.”
Dixon knows no one at Oakland has any reason to misuse his health data. But even seemingly secure government and business systems can be hit by sweeping cyberattacks, he says. “We’re living in insane times.”
Oakland officials say they regret that the information about the BioButton was shared before they could educate people about what it did and didn’t do. Only the wearers would have access to their specific data, and the close-contact alerts were based on Bluetooth recognition, not GPS location tracking. In other words, the device doesn’t track a student’s specific location; it just registers whether it has come within Bluetooth range (about 15 feet) of another BioButton. Given the backlash, the university agreed to “strongly encourage” rather than mandate its use.
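For readers curious about the mechanics, here is a minimal sketch in Python of how Bluetooth-proximity detection of this kind typically works. Every constant, name, and threshold below is an illustrative assumption; the BioButton’s actual algorithm has not been described beyond what appears above.

```python
# A minimal sketch of Bluetooth-proximity contact detection, NOT the
# BioButton's actual implementation. All constants are assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

TX_POWER_DBM = -59        # assumed signal strength at 1 meter (per-device calibration)
PATH_LOSS_EXPONENT = 2.0  # assumed free-space propagation
CLOSE_CONTACT_M = 4.6     # roughly the "15 feet" described above

def estimate_distance_m(rssi_dbm: float) -> float:
    """Rough distance estimate from received signal strength (log-distance model)."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

@dataclass
class Sighting:
    peer_id: str      # a rotating anonymous identifier, not a name or location
    rssi_dbm: float
    seen_at: datetime

def close_contacts(sightings: list[Sighting], window: timedelta) -> set[str]:
    """Peers seen inside the close-contact radius during the recent window."""
    cutoff = datetime.now() - window
    return {
        s.peer_id
        for s in sightings
        if s.seen_at >= cutoff and estimate_distance_m(s.rssi_dbm) <= CLOSE_CONTACT_M
    }
```

Note that nothing in the sketch involves GPS coordinates: the only inputs are signal strengths from nearby devices, which is the distinction Oakland was trying to draw.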
David A. Stone, a professor of philosophy and chief research officer at Oakland, led the team that selected and evaluated the BioButton. As he sees it, handing over health information is a relatively small price to pay if it means halting the spread of a virus that has ravaged the nation.
“When you consider the hundreds of thousands of people who have died in this pandemic, is it too much to ask to share your heart rate or temperature?” he asks. He says the wearable technology seemed the least invasive way to catch symptoms early and give students tools to know if they might have early signs of, or potential exposure to, Covid-19.
Other campuses, fearing the kinds of privacy objections Oakland faced, have concluded that the solutions being heavily marketed in the early months of the pandemic could create more problems than they solve. The University of Maryland at College Park considered, but decided against, using technologies that track someone’s temperature or location. One company offered an Internet-connected thermometer that could help the campus predict where the virus was spreading, but some faculty members feared that the company would sell the personal data it collected.
“Heaven forbid that the thermometer notices you’re spiking a fever,” and all of a sudden you start getting direct mail about NyQuil or Clorox wipes, says Neil Jay Sehgal, an assistant professor of health policy and management at Maryland.
Some might wonder why Gen-Z college students, who post the minutiae of their daily lives on social media, are concerned about privacy.
There’s a difference between posting information yourself — often the carefully curated version of a life you want to convey — and having a proctoring service require you to scan your bedroom before a test for cheat sheets or open books, says Chris Gilliard, an English professor at Macomb Community College, in Warren, Mich., who studies privacy and inequality.
“For a long time, we’ve believed the myth that students didn’t care about these issues. Now, it’s impossible to ignore the way they’re pushing back,” he says.
At some colleges, including the City University of New York and the University of Illinois at Urbana-Champaign, students have circulated petitions demanding that online proctoring systems be kicked out of their classrooms.
After about 1,000 students at Urbana-Champaign protested against the systems, the university announced last month that it would no longer use Proctorio software after the summer 2021 term. That doesn’t mean anti-cheating software is out the window. A campus spokesman said the short-term license signed with Proctorio last March, as a Covid-related emergency measure, isn’t being extended, but that the university will look at other remote-proctoring options.
Some colleges have argued that remote learning has left them no other way to safeguard the integrity of exams. But critics say that’s a cop-out.
“A lot of the technology being implemented are things schools did in the past or wanted to do but didn’t have license to,” Gilliard says. “The pandemic served as a convenient excuse to supercharge these technologies.”
And they have a particular incentive now, he says. “Surveillance is really about power and control, and universities are looking for certainty in very uncertain times. There wasn’t a safe way to return students to campus.” But instead of keeping campuses closed and taking the political heat, Gilliard says, “institutions have looked for a technological fix where there isn’t one.”
Menlo College, in Atherton, Calif., isn’t claiming that its latest technology tool is such a cure-all. But it hopes to help students with a smartphone app that listens for signs of anxiety and depression.
With fewer than 900 students, the private college in Silicon Valley prides itself on the ability to offer personal attention, but Covid-19 left students dispersed and feeling isolated. So Menlo collaborated with a start-up, Ellipsis Health, to encourage students to try an app that uses machine learning to flag people whose speech matches the vocal patterns of people who are depressed. Students start out by recording themselves speaking for two to three minutes. Then, each time they log in to the app, they’re asked a series of questions. Based on how they’re scored for anxiety and depression, they might be urged to unwind with a meditation tape or to call a crisis hot line.
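The flow the college describes amounts to a simple triage pipeline: a model scores each check-in, and the score picks the nudge. The Python sketch below shows the shape of that logic only; the score scale, thresholds, and function names are invented for illustration and are not Ellipsis Health’s actual model.

```python
# A hypothetical sketch of score-based triage, not Ellipsis Health's model.
# Assumes an (invented) model returning anxiety/depression risk scores in [0, 1].

CRISIS_THRESHOLD = 0.8    # assumed cutoff
ELEVATED_THRESHOLD = 0.5  # assumed cutoff

def recommend(anxiety_score: float, depression_score: float) -> str:
    """Map a check-in's risk scores to the tiers of support described above."""
    risk = max(anxiety_score, depression_score)
    if risk >= CRISIS_THRESHOLD:
        return "suggest calling a crisis hot line"
    if risk >= ELEVATED_THRESHOLD:
        return "suggest a guided meditation"
    return "no prompt; feedback stays with the student"
```

The design point the students negotiated sits outside the code: whatever such a function returns is shown only to the student, never to a counselor.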
College officials stress that a machine, not a person, is listening in, and the student is the only one who gets the individual feedback.
Ellipsis and the college worked with student leaders to fine-tune an approach that raised as few privacy flags as possible. “They were really receptive to what students wanted and felt comfortable with,” says Lina Lakoczky-Torres, an entrepreneurship major who serves as wellness representative for the college’s student government. “It makes it feel like it’s our baby as much as theirs.”
Students didn’t want any mental-health counselors listening in, she said, and they wanted to add their own questions to assess their mental health, like to what extent they were stressed by posts and “likes” on social media. “There’s a lot of fear-mongering about technology, but this comes from a place of wanting to help,” Lakoczky-Torres says.
Students have bought into the technology, she said, because they played a role in developing it and felt they were in control of the data it was collecting. When that’s not the case, and students suspect that their personal lives are being probed by companies more concerned about profit than their well-being, they’re likely to rebel.
One of their biggest targets is automated online proctoring, also one of the fastest-growing forms of student surveillance. The technology, in use on many campuses well before the pandemic, has ballooned with the mass migration to online classes. In April, an Educause poll found that 54 percent of higher-education institutions were using online or remote proctoring services, while another 23 percent were planning on, or considering, using them. And recently, McGraw-Hill, a major academic publisher, bundled remote-proctoring and browser-locking capabilities with its digital textbooks.
The software, which faculty members can customize, typically scans students’ rooms, locks their computer browsers, and monitors eye and head movements through their webcams as they take tests.
Critics complain that using such software signals to students that faculty members don’t trust them. Some students also say the possibility of being flagged for “suspicious” activity adds to the stress of taking a test, sometimes causing panic attacks.
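Those flags often trace back to blunt threshold rules. This Python sketch is a guess at the general shape of one, not any vendor’s actual code; the five-second limit is an invented placeholder.

```python
# A hypothetical threshold rule for off-screen gaze, not any vendor's real logic.
from datetime import datetime, timedelta

LOOK_AWAY_LIMIT = timedelta(seconds=5)  # invented placeholder threshold

def flag_gaze_events(
    samples: list[tuple[datetime, bool]]
) -> list[tuple[datetime, datetime]]:
    """samples: time-ordered (timestamp, eyes_on_screen) readings from a webcam model.

    Flags every continuous off-screen span longer than LOOK_AWAY_LIMIT. A rule
    like this cannot tell staring at the ceiling while thinking from reading
    notes on the wall, which is exactly the false positive students report.
    """
    flags, away_start = [], None
    for ts, on_screen in samples:
        if not on_screen and away_start is None:
            away_start = ts
        elif on_screen and away_start is not None:
            if ts - away_start > LOOK_AWAY_LIMIT:
                flags.append((away_start, ts))
            away_start = None
    return flags
```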
“I got flagged quite a few times for moving, or taking a second and looking away while thinking,” says Olivia Eskritt, a second-year student at St. Clair College in Windsor, Ontario, whose class used the software Respondus.
Before beginning a test, students had to pick up their laptops and rotate them around their rooms to show that they hadn’t posted cheat sheets on the walls, she says. They also had to record themselves talking so the system would recognize if someone else began feeding answers to them. “My mom has walked into the room while I’m in the middle of the test, and I’m like ‘Oh no, you’re going to get me in trouble!’” Eskritt worried, meanwhile, that her dad would set off the cheating software with his booming, ex-military voice while Zooming into a work call nearby.
Black and brown students face even more concerning barriers, critics say — one of the complaints made by students protesting at the University of Illinois at Urbana-Champaign. Studies have shown that facial-recognition software sometimes has trouble identifying the faces of dark-skinned students.
Alivardi Khan, who recently graduated from Brooklyn Law School, found that out the hard way.
He tweeted: “The @ExamSoft software can’t ‘recognize’ me due to ‘poor lighting’ even though I’m sitting in a well lit room. Starting to think it has nothing to do with lighting. Pretty sure we all predicted their facial recognition software wouldn’t work for people of color. @DiplomaPriv4All”
Khan says he spent much of the week before the New York State Bar Exam trying to get ExamSoft, the proctoring system, to recognize him. “I tried sitting in front of a window when the sun was shining in, then I went into a bright bathroom with light shining off white tiles,” he says. Eventually, after he got help from a customer-service rep, the system recognized him.
Even though Brooklyn Law School gave him a room in which to take the bar exam, Khan took along a lamp just in case. Being forced to sit still for so long caused the room’s automated light to turn off. “I had to flail my arms to make it come back on,” he says, creating another potential flag for cheating. “We had a 15-minute break between sections, and I used it to call ExamSoft’s customer service.” All in all, a pretty stressful experience, he says.
Britt Nichols, ExamSoft’s chief revenue officer, says that poor lighting can cause problems recognizing anyone’s face, but that there’s no evidence the problem is worse for those with dark skin.
“Every once in a very small blue moon it doesn’t recognize your face,” he says. “Some people assume there is something nefarious at play,” he added, when the problem could be a weak internet connection.
Students with disabilities, too, have complained that something like a facial tic or other unexpected movements could cause them to be flagged. Some have reported that the browser-lockdown feature can limit the use of tools that convert text to speech.
Proctoring services say instructors have the option to account for special needs by, say, turning off the camera or by allowing students a short break during an exam. But realistically, faculty members who are struggling with the technological demands of online courses might find it difficult to make such individual accommodations.
Some faculty members have made clear that they have no intention of using anti-cheating software.
Derek A. Houston, an adjunct professor of educational leadership at Southern Illinois University at Edwardsville, said he was alarmed to learn that the state’s Public Higher Education Cooperative had published a request for proposals worth $44 million over five years to fund two online-proctoring programs. Houston wanted to signal to his employer, his students, and higher education more broadly that he feels online proctoring sets the wrong tone.
His message on Twitter: “You will not have to worry about this sort of unnecessary surveillance. We will build within the classroom mutual trust and expectations. My goal is collective growth, and surveillance is the antithesis of that.”
Students and faculty members aren’t the only ones resisting. In December, a group of Democratic senators wrote to three online-proctoring companies demanding to know how they were protecting student privacy and ensuring that students, including those with disabilities or dark skin, aren’t falsely accused of cheating.
In response to such concerns, the proctoring companies have argued that doing away with their tools will cause widespread cheating.
In an interview, the founder and chief executive of Proctorio, Mike Olsen, says much of the criticism of proctoring software is based on misconceptions.
“We don’t kick anyone out of an exam if anyone’s talking or they get up” to go to the bathroom, he says. The system will just flag the interruption for a faculty member to review later. If someone has a shaky internet connection, they can be disconnected for up to two minutes and return to the exam, but allowing someone to be offline for longer than that, he says, introduces too much risk for cheating. That also raises equity issues, since disadvantaged students with spotty Wi-Fi are more likely to have prolonged outages.
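Olsen’s two-minute rule is easy to picture in code. The sketch below, in Python, assumes one plausible handling of a longer outage (ending the attempt and flagging it for instructor review); the article confirms only the two-minute window itself.

```python
# A minimal sketch of the reconnect grace window Olsen describes. Only the
# two-minute figure comes from his account; the rest is assumption.
from datetime import datetime, timedelta

GRACE_WINDOW = timedelta(minutes=2)

def on_reconnect(disconnected_at: datetime, reconnected_at: datetime) -> str:
    """Decide what happens when a student's connection comes back."""
    if reconnected_at - disconnected_at <= GRACE_WINDOW:
        return "resume"               # within the window: back into the exam
    return "end_and_flag_for_review"  # assumed handling of longer outages
```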
Fairness challenges will arise even without his software, Olsen says. Some students get upset when their professors tell them they’re using the honor system, he says, because they know that some of their classmates will intercept the answers from online tutoring tools, like the subscription-based Chegg, that not everyone can afford.
He advises instructors to explain to students if they need to use certain features, like cameras, that might make some uncomfortable. “Maybe accreditation requires a certain level of exam security — communicate that. Students just want to know why.”
Of course, many of the tools that colleges are using today to keep close watch on their students long predate the pandemic and are likely to outlive it. Data analytics allow colleges to track students’ movements across campuses: how many times they visit the library, how often they skip meals, what time of day they typically do their homework. Bluetooth sensors in some classrooms connect with apps on students’ phones, marking them as present.
In a 2018 opinion piece for The Washington Post, Mitchell E. Daniels Jr., president of Purdue University, pointed out that the university’s technology infrastructure, designed to support student success, campus services, and research, produces, as a byproduct, “a massive amount of fascinating information.”
“Forget that old ominous line, ‘We know where you live,’” he wrote. “These days, it’s, ‘We know where you are.’”
The quandary Daniels then posed is one many more are pondering now: “Many of us will have to stop and ask whether our good intentions are carrying us past boundaries where privacy and individual autonomy should still prevail.”
It’s a question that frequently comes up in discussions of location tracking and facial-recognition tools. In September, some Brown University students were alarmed to receive emails from the administration incorrectly accusing them of living in Providence when they had said they’d be attending remotely. The students were accused of violating the code of student conduct, which requires campus residents to adhere to strict Covid-19 testing requirements, and were threatened with disciplinary measures.
The factors used to locate the students included “evidence of having accessed private university electronic services or secure networks from the Providence area; indications of having accessed buildings on our campus directly; and/or reports from other community members,” a Brown spokesman, Brian E. Clark, wrote in an email to The Chronicle. When more details emerged the next day revealing that the students weren’t, in fact, nearby, the university withdrew the charges and apologized to them.
The pandemic isn’t the first crisis that has unleashed a flood of security technologies. After a series of school shootings, “there was a rush and urgency to deploy new technology to prevent mass violence,” said Elizabeth Laird, director of equity in civic technology for the Center for Democracy & Technology. She’s seeing a similar response to the Covid pandemic: tools that once would have been considered too intrusive are now being tolerated, if not exactly welcomed. But what happens, she asks, when the urgent need for them is over?
“It’s in moments of crisis that you’re most likely to sacrifice your civil rights,” she said. “But the problem is that once you sacrifice them, it’s hard to get them back.”
Katherine Mangan writes about community colleges, completion efforts, student success, and job training, as well as free speech and other topics in daily news. Follow her @KatherineMangan, or email her at katherine.mangan@chronicle.com.