Research Ethics

Intellectual Piecework

Increasingly used in research, platforms like Mechanical Turk pose new ethical dilemmas

By Nathan Schneider
February 16, 2015
Rochelle LaPlante talks with her 6-year-old daughter about a school project. Ms. LaPlante has been working from home using Mechanical Turk since 2012. “Some days are great, and there are some days that I make $5,” she says.
David Zentz for The Chronicle

Nowadays, Alfredo García studies graffiti. His dissertation in sociology at Princeton University is an ethnography of a changing Miami neighborhood, which means he spends his time chatting up strangers, arranging interviews, and climbing ladders with street artists. But during his second year at Princeton, in 2012, he tried something else. With a bit of out-of-pocket money, he surveyed 420 people about how looking at different kinds of fake Facebook profiles affected their views about Islam. He obtained clear statistical findings and produced a paper that is now under review at a respected journal. In the process he didn’t meet a single one of his subjects; not one of them was a Princeton undergraduate required to take surveys for a class.

Mr. García used Mechanical Turk, an Amazon.com-owned platform that describes itself as “artificial artificial intelligence.” What it offers has been called crowd-work, or digital piecework, or crowdsourcing—thousands of people around the world sitting at their computers and doing discrete tasks for pay. Each of Mr. García’s subjects earned a quarter for filling out a survey less than 10 minutes long—$1.50 an hour, that is.

In addition to being the go-to place to have scans of receipts transcribed, or websites checked for not-safe-for-work content, Mechanical Turk has quietly become an indispensable part of how academic research in many fields is done. Mr. García talks about it with a nonchalance that belies the magnitude of the shift of which he is a part. “It’s the new norm,” he says.

After submitting his paper, he pivoted back toward the in-person research methods he is using in Miami. But others in his department learned about Mechanical Turk from him and are using it in their dissertations, alongside more traditional (and more expensive) techniques.

Maria Abascal, for instance, was amazed when her Mechanical Turk survey on perceptions of skin color drew 600 responses within four hours. “I think it’s an absolutely wonderful resource,” she says, “especially for graduate students who may not have a lot of time or money for their projects.” She now makes a point of paying at least New Jersey’s minimum wage.

The platform is also having an influence on the methods of these up-and-coming sociologists. Because of it, experiment-based research designs are increasingly common in their department—akin to the methods in economics and psychology departments, where Mechanical Turk is even more popular. The platform’s low cost and accessibility, however, are possible thanks to a barely regulated virtual labor market.

Stimulated by shrinking university research budgets and chronic underemployment everywhere, academic researchers are among Mechanical Turk’s chief employers; they have also collaborated with workers in an effort to make the platform fairer and more useful. It is at once an opportunity, a sweatshop, and a game. But this many-sidedness means that crowd-work raises questions of methodology and ethics that are especially thorny—and, for some, inconvenient.

The Granger Collection, New York

Amazon’s Mechanical Turk derived its name from “The Turk,” an 18th-century chess-playing “automaton” depicted here in an 1845 wood engraving. In actuality the Turk’s moves were guided by a real person hidden inside the apparatus.

Mechanical Turk went online in late 2005. It was a so-called Jeff project, a priority of Amazon’s founder, Jeff Bezos. The name came from the Turk, a celebrated 18th-century chess-playing robot that was eventually revealed to rely on a human chess master hiding inside. Amazon promotes Mechanical Turk, like its namesake, as essentially a piece of technology—just one among its panoply of web services. In an age when machines are taking over more and more human tasks, it simplifies those remaining ones that really do require human effort, with an interface that makes the workers inside seem, as much as possible, like just part of the software.

After nine years, the mturk.com website is still, officially, in beta testing. It betrays little evidence of having been updated in all that time; despite Mr. Bezos’ initial enthusiasm for it, the platform has since undergone only basic maintenance and modest tweaks. (Amazon responded to interview requests with its usual penchant for silence in the media.) While newer crowd-work platforms continue to proliferate—CrowdFlower, Clickworker, CloudCrowd, and so on—Mechanical Turk remains the standard, especially for researchers looking for a large and diverse pool of subjects.

It offers potential employers—or “requesters”—access to a pool of more than 500,000 “workers”—who often call themselves “Turkers”—though the active population appears to be closer to 10,000. The hundreds of thousands of discrete jobs available at any given time are called HITs, or “human intelligence tasks,” which can take as little as a few seconds and pay just a few cents. Amazon recommends, but doesn’t require, a pay rate of at least 10 cents per minute—more than a dollar shy of the U.S. minimum hourly wage of $7.25. Featured “case studies” on the website include projects by Darpa and the U.S. Army Research Lab.
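
For readers who want a concrete picture of the mechanics, here is a minimal sketch of how a requester might post a survey HIT programmatically. It uses Amazon’s boto3 Python SDK, which postdates the period this article describes; the survey URL, pay rate, and other parameters are illustrative assumptions, not details from any study mentioned here.

    import boto3

    # Connect to the requester sandbox, where no real money changes hands.
    # (Assumes AWS credentials are already configured locally.)
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # An ExternalQuestion embeds a survey hosted elsewhere in an iframe.
    # The URL is a placeholder, not a real survey.
    question_xml = (
        '<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">'
        "<ExternalURL>https://example.com/my-survey</ExternalURL>"
        "<FrameHeight>600</FrameHeight>"
        "</ExternalQuestion>"
    )

    response = mturk.create_hit(
        Title="Short academic survey (about 5 minutes)",
        Description="Answer a brief questionnaire for a research study.",
        Keywords="survey, research, questionnaire",
        Reward="1.50",                     # dollars per assignment; $18/hour if it takes 5 minutes
        MaxAssignments=100,                # how many distinct workers may respond
        AssignmentDurationInSeconds=1800,  # time each worker has to finish
        LifetimeInSeconds=86400,           # how long the HIT stays listed
        Question=question_xml,
    )
    print("Posted HIT:", response["HIT"]["HITId"])

Nothing in the listing distinguishes a 25-cent academic survey from a commercial transcription job; the Reward string above is the entire wage negotiation.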

One of the first researchers to use Mechanical Turk was Siddharth Suri, who now works at Microsoft Research New York City. As a graduate student in computer science at the University of Pennsylvania, Mr. Suri began studying behavior on social networks using undergraduate students. He ran into a problem, however, when he later took a job at Yahoo Research. “I wanted to do behavioral experiments, but we had no undergraduates and no classrooms,” he says. “Necessity was the mother of invention.” Starting in 2008, he turned to Mechanical Turk. Not only did it allow him to continue his research, but he could do more of it, and more quickly.

“The biggest benefit to moving your experiments online is that it allows you to iterate faster,” Mr. Suri says. He believes that, years from now, social scientists will look back and recognize how much more rapid the advance of knowledge became with the advent of crowd-work. “I think it’s going to be the turning point where things went to the next level.”

Soon, Mr. Suri and his fellow pioneers were publishing data about Mechanical Turk itself. They established that Turkers respond to surveys at least as thoroughly and honestly as conventional subjects, and they are more diverse. They also cost much less to entice, and paying them generously doesn’t necessarily produce better results.

Andy Baio, Waxy.org, “The Faces of Mechanical Turk.”

Faceless no more: “Turkers” who take surveys online are starting to demand recognition and better treatment.

Though Turkers come from all over the world, they live predominantly in the United States and India, the two countries where Amazon pays them with actual money; others can receive only gift cards to Amazon.com, what some Turkers call the “company store.” Turkers in the United States skew female, and are more likely to Turk part-time and partly for fun, while Indians, skewing male, are somewhat more likely to depend on the Mechanical Turk income and less likely to enjoy the work. The early papers’ main point, though: This is a pretty good source of data.

Researchers now make habitual use of Mechanical Turk for both collecting and processing data; one can hire Turkers to fill out a survey, and then again to comb through the results and look for patterns. One can’t, however, perform studies that depend on nonadult participants, or physical measurement, or a subject’s full attention. Longitudinal studies are possible but not straightforward. While Mechanical Turk surveys tend to be more representative of the U.S. population than the usual group that shows up for in-person surveys, they’re less representative than expensive, large-scale probability samples.
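
Collection has a second half: once a HIT has run, the requester pulls each submission and approves or rejects it. Below is another minimal sketch against the boto3 SDK, with a placeholder HIT ID; note that rejection, the outcome Turkers most dread, is a single API call with no appeal process attached.

    import boto3

    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    hit_id = "EXAMPLE_HIT_ID"  # placeholder; in practice, saved from the create_hit response

    # Fetch work that workers have submitted but that hasn't been reviewed yet.
    submitted = mturk.list_assignments_for_hit(
        HITId=hit_id,
        AssignmentStatuses=["Submitted"],
    )

    for assignment in submitted["Assignments"]:
        answers_xml = assignment["Answer"]  # the worker's responses, as XML
        # A real pipeline would parse and validate answers_xml here.
        mturk.approve_assignment(
            AssignmentId=assignment["AssignmentId"],
            RequesterFeedback="Thank you for your time.",
        )
        # The alternative, mturk.reject_assignment(...), withholds the pay
        # and dents the worker's approval rating, with no appeal.

The asymmetry Turkers complain about is visible in the API itself: there are calls for requesters to judge workers, and nothing that runs the other way.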

The list of Mechanical Turk’s uses is long. Mr. Suri has experimented with corralling Turkers to generate maps for disaster-stricken regions, based on raw data posted to social networks. It’s a fast and easy way to cull lots of information into something useful for first responders. But as the Harvard Law School professor Jonathan Zittrain has suggested, these studies also carry troubling prospects. Similar techniques could be used by a repressive government to identify faces in a crowd of protesters—and it could be done in such bits and pieces that Turkers wouldn’t know what they were helping to do. As convenient as it might be to consider the workers mere cogs in a value-free system, they contend otherwise.

Rochelle LaPlante has a different name for what she does all day depending on whom she’s talking to. For those able to wrap their heads around what it might mean, she’s a digital worker. For others, she’s a freelancer or a transcriber, since transcribing audio is something she spends a lot of her time doing. But, aside from a few other gigs, she’s basically a Turker.

Ms. LaPlante, 33, first learned about Mechanical Turk in 2007, when she had a full-time job in social work for the city of Seattle. A friend who worked for Amazon told her about the platform, and she started doing HITs in her free time. She left social work in 2009 and moved to Los Angeles, and since 2012 she has been working online to help cover her family’s expenses. Her tasks are as varied as validating information posted online, tagging someone’s photographs, and taking academic surveys.

“Some days are great, and there are some days that I make $5,” says Ms. LaPlante. The bad days are often bad because requesters can take a worker’s work and then “reject” it, leaving the worker with no pay and a bad rating. There is no due process in the event of disputes. Amazon refuses to get involved.

It helps, at least, that she’s not alone. Ms. LaPlante is a moderator of mturkgrind, one of the several online forums where Turkers interact. They share information about lucrative HITs, circulate warnings about bad requesters, and carry out the sort of workaday banter normally associated with office water coolers. Once Ms. LaPlante became involved in the social life surrounding Mechanical Turk, she saw her income rise. Getting decently compensated for Turking, however, requires a lot of uncompensated work.

Turkers report spending about half their time doing academic surveys, though it varies widely from person to person. Even when filling out a survey, however, they’re not just doing that. Serious Turkers may have two or three screens in front of them, with multiple tasks under way in different browser tabs. They could be chatting with others—including those taking the same survey—in a chat room, Facebook group, or forum. They’re adept at using browser plug-ins to help them work more efficiently. “It’s fun to think about, to gamify it,” Ms. LaPlante says. But this kind of workflow can also incline one toward a crippling anxiety about minute costs and benefits.

“You go to the grocery store and see a candy bar, and you think, ‘Is that worth two surveys?’” she says.

Horror stories abound. The most common ones involve having surveys rejected for no apparent reason. And some surveys are just unpleasant. One that became particularly notorious on the forums began by asking whether the respondent was religious. Turkers tend to be more secular than the general population, as is Ms. LaPlante; but even those who answered “no” at the outset were bombarded with questions about the details of their religious life. “Two hours of religious questions for people who were not religious,” she recalls. “Pressuring you from all these different angles—it went on and on.”

On the whole, though, Turkers tend to like surveys—a recent study found that for U.S.-based Turkers they were the most popular type of task. They break up the tedium of duller kinds of batch work, and they also tend to pay slightly better. Some Turkers say they appreciate the opportunity to take part in producing research. Inevitably, too, they see a lot of the same material over and over again, which raises questions for researchers about whether the results are reliable. Ms. LaPlante, for instance, is well-practiced with the Trolley Problem, a thought experiment often used and reused in survey design. In the rush to get to the next HIT, Turkers may provide a prefab answer without internalizing the subtleties that the researcher meant to convey.

“Some, if not most, are a copy and paste of every other survey,” a 37-year-old male Turker in upstate New York said in response to a Mechanical Turk survey for this article. (The respondents were paid $1.50 to fill out a 16-question survey, with an advertised rate of $18 per hour. In keeping with Amazon policy, they were not asked to give their names.) A 34-year-old woman in Hawaii wrote, “I do not have to spend much time thinking of an answer if I have seen it a hundred times before.” The journal Behavior Research Methods published a paper in 2013 that described such “nonnaïveté” as a “commons dilemma” and proposed a series of tips for how researchers could avoid it. For example: Avoid the Trolley Problem.

Institutional review boards may or may not be of help. Their policies range from ignoring crowd-work altogether to making practical suggestions about how to keep participants’ personal data secure and confidential. Pomona College’s IRB, notably, explicitly waives its $15-per-hour minimum rate for Mechanical Turk—according to Jessica Borelli, an assistant professor of psychology, for fear that paying so much more than the platform’s norm might constitute “participant coercion.”

A 2013 U.S. Department of Health and Human Services document on IRBs acknowledges, “Current human subjects regulations, originally written over 30 years ago, do not address many issues raised by the unique characteristics of Internet research.” IRBs, anyway, exist to protect subjects; they’re not always equipped for overseeing labor relations or ensuring digital security.

Turkers aren’t waiting for academe to catch up. Last summer they drafted a set of Guidelines for Academic Requesters, which appears on an advocacy website called Dynamo. They’ve been calling on scholars who use the platform to commit to following their guidelines, and so far nearly 50 have signed on—a small fraction of the researcher population. They’re now designing a badge that approved researchers can affix to their surveys.

The original impetus for the guidelines came when a researcher posted fabricated reviews of requesters as a means of studying how the Turker community functions. Such deception, the guidelines stress, is off limits. Many of the other stipulations stem from the Turkers’ extreme vulnerability on the platform: Don’t pay below minimum wage. If you reject someone’s work, make sure it’s for a good reason. Identify yourself and provide reliable information about what it will take to complete your study.

One other guideline, which Turkers talk about again and again: Communicate. Answer questions, respond to grievances. When requesters are unresponsive, the guidelines suggest a process of escalation—through the IRB, and then through the requester’s colleagues and administrators. Despite Amazon’s attempt to create a platform that disguises human labor as software, it seems, the people inside want recourse to that basic human right of asking a question, of submitting a complaint.

“Turkers,” the workers on Mechanical Turk, have begun a letter-writing campaign to Jeff Bezos, Amazon’s billionaire founder. Among other things, they want a mechanism of recourse against unscrupulous employers.


The most widely publicized campaign on Dynamo so far has been even more basic. It’s a letter-writing campaign from Turkers to Jeff Bezos, asking him to recognize their humanity in myriad ways—like creating better forms of recourse in case of unfair requesters or simply presenting them publicly as skilled, valuable, real-live human beings.

Siddharth Suri, Rochelle LaPlante, and several dozen others involved in the study and practice of crowd-work gathered in New York last November for “Digital Labor: Sweatshops, Picket Lines, Barricades,” a conference spread out among the New School’s scattered buildings west of Union Square. Its instigator was Trebor Scholz, an East German émigré and associate professor of media studies, who sees organizing conferences as an art form in the spirit of the 1960s Fluxus movement. Social scientists sat in panels lasting two and a half hours alongside Turkers, artists, and activists. (I was a speaker as well, on labor history.) Mr. Scholz believes that understanding something like Mechanical Turk means thinking about a lot more than surveys.

“Artists were really among the first to bring these issues to the fore,” he says. Starting in 2007, for instance, a pair of artists instigated Ten Thousand Cents, in which 10,000 Turkers earned one cent apiece for drawing a tiny part of a $100 bill, not knowing what the final product would be. At the New School conference, a London-based artist named Byron Peters presented “Songs for Non-Work,” a compilation of audio from Turkers whom he paid not to work for a minute at a time.

Stefanie Li

Trebor Scholz, an associate professor of media studies at the New School, assigns Mechanical Turk labor to his students, who have recoiled at the conditions. “They were shocked—not only by how low the pay was, but also how hard it is,” he says.

Mr. Scholz assigns Mechanical Turk labor as an exercise for his students, who have recoiled at the conditions. “They were shocked—not only by how low the pay was, but also how hard it is,” he says. They see requesters reject work that they feel sure was done correctly. They experience an environment nearly devoid of protections, rigged to deliver work without the trouble of a worker.

“The crowdsourcing industry is wiping away a century of labor struggles,” Mr. Scholz says. “A democracy shouldn’t tolerate workplaces like that.”

Researchers at the conference trying to understand the new world of crowd-work described their frustrations with the opacity of Amazon’s system. The company doesn’t disclose much about the platform’s internal workings. “It’s hard to contextualize what you’re doing,” said Mark Graham, an associate professor at the Oxford Internet Institute—in part because Mechanical Turk allows only U.S.-based requesters. “I’ve yet to hear of a sensible methodology.”

In another session, a Turker named Manish Bhatia, on video chat from India, described related difficulties. “Amazon doesn’t make anything clear to us,” he said. “As workers we are helpless.” Amazon has stopped approving new accounts from India on the platform, and he said existing accounts sell for as much as $600 on the black market.

The Turkers, however, tended to be more forgiving of the platform than were the scholarly critics. When someone at the conference suggested publishing a statement denouncing crowd-work, Ms. LaPlante said, “Workers would hate that.” Turkers don’t want to encourage more of the polemics against Amazon that periodically appear, ostensibly on their behalf—partly out of fear that the company might tire of the bad publicity and pull the plug on Mechanical Turk altogether. Amazon’s meager investment over the years has aroused fears that the platform may be dying. Part of what has been keeping it alive has been the ingenuity of Turkers and researchers working together.

Among the imbalances built into Mechanical Turk is that requesters can reject or even block workers, but workers can do nothing of the sort with misbehaving requesters. One group of researchers has tried to right that imbalance. As a “tactical media art project” that grew out of a graduate seminar at the University of California at Irvine, and after extended consultation with Turkers, Lilly Irani and Six Silberman developed a browser plug-in called Turkopticon. With it, Turkers can rate requesters and see one another’s ratings, as if that were a feature of the mturk.com website. Mr. Silberman, a Ph.D. student at Irvine, and Ms. Irani, an assistant professor of communication at the University of California at San Diego, have described the project as “interrupting worker invisibility.”

Turkopticon has become a fixture of the ecosystem that surrounds Mechanical Turk, which also includes the various forums, several Reddit feeds, and private Facebook groups. The newest addition is Dynamo—home of the academic guidelines and the letter-writing campaign—thanks to a team led by Niloufar Salehi, a Stanford computer-science Ph.D. student. On the one hand, these are all ways of turning an isolated, alienating workplace into something better. On the other, they let Amazon off the hook.

“Turkopticon’s existence sustains and legitimizes AMT by helping safeguard its workers,” Ms. Irani and Mr. Silberman have acknowledged. “Ideally, however, we hoped that Amazon would change its systems design to include worker safeguards. This has not happened.”

Others are trying to do what Amazon has not. At the University of Oxford, a website called Prolific Academic is being designed with ethical research in mind. It has a firm minimum wage of $7 per hour, and workers will be able to rate researchers. Longitudinal studies and prescreening are basic features, not hacks. But its challenge, as with so many crowd-work platforms, will be to attract a critical mass of workers. Over the course of the New School conference, a number of those taking part discovered that they’d been thinking about the same thing: What if crowd-workers could create, and run, a platform of their own?

On the final afternoon of the conference, people involved in many aspects of the crowd-work scene—from Turkers like Ms. LaPlante to academic activists like Mr. Silberman, Ms. Irani, and Ms. Salehi—gathered for a power-mapping session, placing colorful sticky notes on the wall to chart out the structure, and the possible futures, of this new economy.

They debated how important Mechanical Turk will continue to be, as opposed to competing platforms. They differed on whether crowd-work points to a liberating new future of independent labor, or a dystopia in the making.

By the end there was more argument than consensus, and more helplessness than hope. The room held a sense of unsatisfied urgency not explainable by the methodological and ethical quandaries of Mechanical Turk alone. The trouble, rather, seemed to stem from a shared intuition: that the pursuit of justice on Mechanical Turk concerns more than the platform itself, that the platform represents a microcosm of the future of work as a whole.

Nathan Schneider is a freelance journalist and the author, most recently, of Thank You, Anarchy: Notes From the Occupy Apocalypse (University of California Press).
