Digital media platforms like Google and Facebook may disavow responsibility for the results of their algorithms, but those results can have tremendous — and disturbing — social effects. Racist and sexist bias, misinformation, and profiling are frequently unnoticed byproducts of those algorithms. And unlike public institutions (like the library), Google and Facebook have no transparent curation process by which the public can judge the credibility or legitimacy of the information they propagate.
That misinformation can be debilitating for a democracy — and in some instances deadly for its citizens. Such was the case with the 2015 killings of nine African-American worshipers at Emanuel A.M.E. Church in Charleston, S.C., victims of a vicious hate crime. In a manifesto, the convicted gunman, Dylann Roof, wrote that his radicalization on race began after the shooting death of Trayvon Martin, an African-American teen, and the acquittal of his killer, George Zimmerman. Roof typed “black on White crime” into a Google search; the results, he wrote, confirmed the patently false notion that black violence against white Americans is a crisis. His source? The Council of Conservative Citizens, an organization that the Southern Poverty Law Center describes as “unrepentantly racist.” As Roof himself writes of his race education via Google, “I have never been the same since that day.”
Roof’s Google search results did not lead him to an authoritative source of violent-crime statistics. FBI statistics show that most violence against white Americans is committed by other white Americans, and that most violence against African-Americans is committed by other African-Americans. His search did not lead him to any experts on race from the fields of African-American studies or ethnic studies at universities, nor to libraries, books, or articles about the history of race in the United States and the invention of racist myths in the service of white supremacy. Instead it delivered him misinformation, disinformation, and outright lies that bolstered his already racist outlook and violent antiblack tendencies.
Online search can oversimplify complex phenomena. The results, ranked by algorithms treated as trade secrets by Google, are divorced from context and lack guidance on their veracity or reliability. Search results feign impartiality and objectivity, even as they fail to provide essential information and knowledge we need: knowledge traditionally acquired through teachers, professors, books, history, and experience.
It’s impossible to know the specifics of what influences the design of proprietary algorithms, other than that human beings are designing them, that profit models are driving them, and that they are not up for public discussion. It’s time we hold these platforms accountable and perhaps even imagine alternatives — such as regulation of search engines — that uphold the public interest.
Safiya U. Noble is an assistant professor in the department of information studies at the University of California at Los Angeles. Her forthcoming book, Algorithms of Oppression: Data Discrimination in the Digital Age, will be published by New York University Press in the fall.