Researchers at Carnegie Mellon University have found that “crowd-sourced” articles written piecemeal by dispersed writers stack up well against those drafted by one author.
“I am pleasantly surprised,” said Aniket Kittur, an assistant professor at the university’s Human-Computer Interaction Institute and one of the lead researchers on the project. The research team developed a framework it calls CrowdForge to split up and recombine complex, creative human tasks such as writing.
Articles created with CrowdForge rated well not only against those written by individual authors, Mr. Kittur said, but also against entries available on the same topics in a portion of Wikipedia devoted to short, clear entries.
CrowdForge starts with “small slices at a time and turns them into a complex artifact,” said Mr. Kittur. The framework provides guidelines for how to break down a project, assign portions to writers, and reassemble the pieces. The system also includes a method to evaluate the quality of the created product.
In the experiments that led to the creation of CrowdForge, Mr. Kittur took large writing projects and broke them into smaller tasks, which were then posted to Amazon’s Mechanical Turk community, an online group of participants willing to work on small online jobs. Those who signed up could pick from tasks including creating an outline for an article, writing facts about a topic, combining those facts into prose, merging lines of prose into paragraphs, and finally turning paragraphs into a complete article. Because many of the small tasks can be completed separately and simultaneously, contributors with only limited amounts of time can still take part, Mr. Kittur said.
CrowdForge assigned each of these simple tasks to several people, Mr. Kittur explained.
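Because the small tasks are independent, the workflow resembles a partition/map/reduce pipeline. A minimal sketch of that idea is below; the function names, the hard-coded outline, and the simulated worker outputs are all assumptions for illustration, not CrowdForge's actual implementation:

```python
# Hypothetical sketch of a partition/map/reduce writing pipeline
# in the spirit of CrowdForge. Real systems would dispatch each
# step to human workers; here the "workers" are stub functions.

def partition(topic):
    # One worker drafts an outline: the article's sections.
    return [f"{topic}: history", f"{topic}: uses", f"{topic}: impact"]

def map_task(section):
    # Several workers each contribute a fact about the section
    # (the same task is given to more than one person).
    return [f"fact about {section} #{i}" for i in range(2)]

def reduce_task(section, facts):
    # Another worker merges the collected facts into a paragraph.
    return f"[{section}] " + " ".join(facts)

def crowdforge(topic):
    sections = partition(topic)
    paragraphs = [reduce_task(s, map_task(s)) for s in sections]
    return "\n\n".join(paragraphs)

article = crowdforge("New York City")
print(article)
```

Each `map_task` call stands in for the redundant assignments described above: giving the same fact-gathering task to several workers and pooling the results before a single reduce step combines them.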
Once the work was completed, Mr. Kittur had members of the Amazon community who had not been involved in creating the articles rate the work, using a national rubric for grading papers. The rubric covered standard elements of writing, including “flow and content,” Mr. Kittur said.
“We are just starting to realize the potential of crowd-sourcing,” he added.
Mr. Kittur worked with Robert E. Kraut, a professor of human-computer interaction, and Boris Smus, a master’s student at the institute.