In an effort to curb harassment on college campuses, 72 women’s and civil-rights groups from across the nation recently announced a campaign to enlist the federal government to shut down applications like Yik Yak, which they claim foster an environment of exclusion and hate. For those unfamiliar, Yik Yak is a social-media app, described by many as an anonymous version of Twitter. It requires no user name or log-in information, and users, thanks to geolocative technology, engage only with others in the vicinity. People are able to create their own yak, comment on other people’s yaks, and “upvote” or “downvote” content.
I agree that college administrators (and researchers) need to pay more attention to what is happening on forums like Yik Yak, but shutting them down will not alleviate the larger problem of deep-seated misogyny, racism, and homophobia on college campuses. As a 2013 study that I conducted with Andrea Press demonstrates, the sexism that circulates on forums like Yik Yak is not a new phenomenon. Closing Yik Yak’s window will likely open the door for a similar app waiting to take advantage of the displaced network of users.
Yik Yak should work with colleges to identify users who spew particularly hateful or defaming speech, and colleges should care that students on their campuses are doing this. However, my research demonstrates that harassment via Yik Yak is rare. Users attest that “the community” does a good job of regulating what they qualify as derogatory speech.
Given these findings, organizations need to stop focusing on "shutting down" easily replaceable technology, and instead use forums like Yik Yak to better understand the broader cultural problems within their communities. I don’t mean to diminish the hurt and isolation students feel when they witness extremely racist, sexist, or damaging content, but focusing solely on harassment and explicitly derogatory content obscures two more pressing problems: the content that persists and the protests hidden from view.
Take, for example, an analysis of the screenshots used in a recent article about Yik Yak published in The Chronicle. It is clear from the vote counts associated with each comment that the wider student body at the University of Mary Washington does not agree with blatantly sexist rhetoric. My findings mirror this analysis. Yakkers in my study do not post yaks to harass their peers simply because the space is anonymous. Rather, students who actively yak are much more inclined to craft posts they think will garner upvotes from their peers, even relying on formulaic text they know will be well received.
Analyzing these kinds of posts points to a larger problem that faculty and staff should consider. For example, yakkers described to me how “racist” comments are continuously downvoted; yet yaks bemoaning TAs who don’t speak “good enough” English are routinely on the hot list. One of my respondents described this as “PC racism.”
Unfortunately, because of the programmatic functions designed to curb cyberbullying, the voices of those who disagree with the opinion that TAs should be native English speakers are removed. Yaks that score more than four downvotes automatically disappear. While this helps wipe away content that the majority of users qualify as “racist,” it also removes opinions that challenge the status quo, and allows “PC racism” to persist. Similarly, comments that challenge “PC sexism” are frequently downvoted and removed.
I urge researchers to pay more attention to the type of content that gets deleted. While Yik Yak representatives contend that defaming speech constitutes the majority of what gets downvoted, my own findings indicate that the voices of the disenfranchised are also routinely silenced. As one respondent described the process, “my voice was literally erased from the conversation.” And as a group of LGBTQ Yik Yak users recently lamented on the app, content about the LGBTQ community is regularly downvoted by other users who do not “agree” with its sentiments.
In an analog era, studying deleted content would be an impossible task, but deleted yaks are erased only from the interface. If researchers were able to gain access to what I call black-hole data — deleted content stored on proprietary servers — we could use big-data analytics to find patterns in what type of expression is hidden from view. By exposing these inequalities, researchers could bring to the surface points of view not currently being heard by the majority on college campuses.
Moreover, disenfranchised voices have taken the content of Yik Yak offline in an effort to engage like-minded supporters who might also be shut out. In fact, a recent flier posted at the College of William & Mary used a copy of a particularly offensive yak to encourage the student body to challenge hurtful gender stereotypes and embrace their own identities.
Shutting down Yik Yak might hide insidious remarks from view, but it also shuts down the opportunity for faculty, administrators, and students to better understand the broader problems of racism and sexism that exist on their campuses. The popularity of Yik Yak stems from the sense of camaraderie users describe when they engage in the space. Perhaps studying what fosters this sense of community might provide the opportunity for more students to engage in meaningful, institutionally sponsored dialogue. Incorporating Yik Yak into the solution could end up solving the problem it created in the first place.