The internet and the technology companies powering it have shown their dark side recently. Racism and sexism have flourished, mostly unchecked, on social media. Algorithms used by Facebook and Twitter have been blamed for the spread of fake news. And as phones, cars, and household devices scoop up their users’ data, the expectation of privacy has practically evaporated.
Under each of those phenomena lie ethical quandaries. Is technological development outpacing our ability to tease out its implications? If so, is higher education responsible for the problem?
Jim Malazita, an assistant professor of science and technology studies at Rensselaer Polytechnic Institute, believes higher education has played a role. He thinks there’s something about how the STEM disciplines are taught — science, technology, engineering, and mathematics — that discourages students from considering ethical questions as they learn the skills they need to work for big technology companies. But if colleges and universities are contributing to the problem, then they can also help fix it.
With funding from the National Endowment for the Humanities, Malazita is piloting an initiative to inject discussions of ethics and politics into introductory computer-science courses at Rensselaer, in New York. He is pushing back against the idea that programmers should focus purely on technical work and leave the softer questions about how their products are used to social scientists. He hopes his students will see it as their job to build socially responsible technology.
He spoke to The Chronicle about his course and the history of ethics in STEM education. The interview has been edited for clarity and length.
Q. How is what you’re trying to do different from the way ethics and computer science are usually taught?
A. Rarely will you talk to a STEM student who says ethics aren’t important. But by the time they’re done with their education, they’re like, It’s the job of the people around me to make sure this technology is doing the right thing.
Rather than pairing computer science with a suite of courses to make computer science ethical, what if we get humanists into core computer-science classes to get students to think about the ethics and politics of computer science as part of their core skill set?
How can we teach you Python and coding, but at the same time always talk about coding as a political practice?
Q. What will that look like in your course?
A. We’re using data sets about various social issues, such as race and violence in New York City, and a Unesco database about education funding. We’re saying, Here are these data sets you’re going to have to crunch through using Python. What do these algorithms leave out? What can’t you account for?
We’re thinking through teaching how to use code and the way the code shapes the way you think about the database. Every language you learn has a bias to it, so let’s acknowledge that.
Q. What’s an example of a type of problem you might have your students solve that helps them understand their work as programmers more politically?
A. The data set about gun violence in New York City is already used by computer-science faculty in the classroom. But the way the problems are framed is: Walk through the data set, parse up where gun violence is and where it’s not. And then based on those findings, tell me where you would rather live and rather not live in New York City.
We use the data set, but with readings about gun violence. We ask what’s the problem with asking the question in this way. How can we use this data to understand the phenomenon of gun violence rather than “these parts of New York City are good and these parts are bad”?
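To make the contrast concrete, here is a minimal sketch of the two framings Malazita describes. It is an illustration only: the use of pandas, the incident records, the column names, and the population figures are hypothetical stand-ins, not the actual class materials.

# A minimal sketch of the two framings; all data here are hypothetical stand-ins.
import pandas as pd

# Hypothetical slice of an incident-level data set.
incidents = pd.DataFrame({
    "borough": ["Bronx", "Brooklyn", "Manhattan", "Bronx", "Queens"],
    "year": [2015, 2015, 2016, 2016, 2016],
})

# Framing 1: "Where would you rather live?" -- rank boroughs by raw counts.
counts = incidents.groupby("borough").size().sort_values(ascending=False)
print(counts)

# Framing 2: ask what the table leaves out. Raw counts ignore population size,
# reporting and policing practices, and the history behind the numbers.
# Normalizing by (hypothetical) population is one small step toward studying
# the phenomenon rather than labeling neighborhoods good or bad.
population = pd.Series({"Bronx": 1_470_000, "Brooklyn": 2_650_000,
                        "Manhattan": 1_660_000, "Queens": 2_330_000})
rate_per_100k = counts / population * 100_000
print(rate_per_100k.round(2))

Even the normalized rate inherits whatever the underlying records leave out, which is the kind of limit the accompanying readings are meant to surface.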
Q. You mentioned you’ve been getting some pushback.
A. I’ve had to do a lot of social work with computer-science faculty. The faculty were like, This sounds cool, but will they still be able to move on in computer science? We’re using different, messier data sets. Will they still understand the formal aspects of computing?
Q. What do you tell faculty members to convince them that this is a good use of your students’ time?
A. I use a couple of strategies that sometimes work, sometimes don’t. It’s surprisingly important to talk about my own technical expertise. I only moved into social science and humanities as a Ph.D. student. As an undergraduate, my degree was in digital media design. So you can trust me with this content.
It also helps to cast it in terms of improving the retention of women and underrepresented minorities in computer science. These questions have an impact on all students, but especially on women and underrepresented minorities, who are used to having their voices marginalized. The faculty want those numbers up.
Q. Is there a precedent for teaching ethics as part of STEM education?
A. In early computational education, businesses asked the instructors of engineering and STEM classes to teach philosophy in their classes. They thought that if students learned philosophy, they’d be better able to market products and understand what people need.
They read the same texts as humanities students — Plato, Hobbes, Locke — but they would read them differently. The thinking was, Now that I’ve read Plato, I know how people work, and I can create systems that take advantage of how people work.
By the 1960s, during the Vietnam era, there was a pushback against technology. You start seeing these movements of new engineers who say, Hey, we need to teach things like technology and society. This is part of a newer generation of writings. Rachel Carson’s work is an example.
In the 1970s, there was a huge backlash from deans who said, You’re diluting the core of engineering. Students are starting to become less-efficient employees. We need to reconstruct engineering education. It’s not their role to make sure it has positive impact. That’s the role of their managers.
Q. What do you hope your students will do with what you’re teaching them?
A. I tell my students, I’m not training you for your first job, I’m training you for your second job. Chances are, their entry-level job will want them not to think about social issues and just to be programmers.
But once they move into design, that’s wide open for them to bring back in this knowledge.
If my students get a second job at Facebook, I want them to go in and ask: How is Facebook working from a technological perspective, and how are those technical choices framing the social issues and discussions that get posted there?
Nell Gluckman is a senior reporter who writes about research, ethics, funding issues, affirmative action, and other higher-education topics. You can follow her on Twitter @nellgluckman, or email her at nell.gluckman@chronicle.com.