
Crowdsourced Publishing: Automating the Scientific Editorial Process

Posted by David Rothschild on Feb 10, 2011 1:14:00 PM

Many recent trends point to the standard journalistic editorial process getting a major overhaul to streamline and automate publishing.

One interesting ongoing experiment is ‘My Boss is a Robot,’ a project out of Carnegie Mellon that will test the bounds of how scientific journalism can be automated.


“…the goal of our experiment: to create an automated system for producing quality journalism using Mechanical Turk’s army of untrained workers. Here’s a rough guide to how we plan on getting there. We’re going to keep the story simple. The subject will be a newly-released scientific paper and the story length will be roughly 500 words. We’re aiming for a standard-issue piece of science journalism, not a long-form essay or in-depth investigation.”

Mechanical Turk is Amazon’s crowd-sourcing product that allows individuals, developers, or organizations to assign small, detailed tasks to an army of online users. Mechanical Turk has been used for a number of projects and goals in the past, from creating cheap labor for business endeavors to aiding in humanitarian efforts.
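To give a flavor of what “assigning small, detailed tasks” looks like from the requester’s side, here is a minimal sketch. The field names mirror Amazon’s CreateHIT operation, but the helper function, task text, and reward amount are all hypothetical — this just assembles a request as a plain dictionary and never contacts the real service:

```python
# Sketch of the kind of request a Mechanical Turk requester assembles.
# Field names mirror Amazon's CreateHIT operation; this builds the
# request as a plain dict and does not call the actual API.

def build_hit_request(title, description, reward_usd, question_text,
                      max_assignments=3):
    """Assemble the parameters for one small, well-defined task (a 'HIT')."""
    return {
        "Title": title,
        "Description": description,
        "Reward": f"{reward_usd:.2f}",       # rewards are strings in USD
        "MaxAssignments": max_assignments,   # how many workers see the task
        "AssignmentDurationInSeconds": 600,  # time each worker is allowed
        "Question": question_text,
    }

hit = build_hit_request(
    title="Summarize one paragraph of a scientific paper",
    description="Read the paragraph and restate its main finding in one sentence.",
    reward_usd=0.05,
    question_text="Paragraph: ... Your one-sentence summary:",
)
print(hit["Reward"])  # → 0.05
```

The point is the shape of the interaction: each unit of work is tiny, cheap, and precisely specified, which is exactly what makes it distributable to untrained workers.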

One standout example involved a famous computer scientist, Jim Gray, who was lost at sea in 2007.  Amazon aided in the effort to find him by allowing Mechanical Turk users to scan satellite images of the ocean area where he was last seen and identify objects that looked like his boat.

So will Mechanical Turk’s process work in the world of scientific journalism?

One of My Boss is a Robot’s creators, Jim Giles, admits he isn’t sure whether the experiment will be successful:

“Are we confident that our experiment will work? Absolutely not. In fact, we’d be surprised if it did. But we think it will be interesting to try. Equally importantly, we hope that chronicling our progress here will trigger debate about the future of journalism and of crowd-sourcing.”

Essentially, at the crux of this experiment, and of any other type of crowd-sourced journalism, is breaking the editorial process down into smaller tasks. Editors perform a variety of tasks on a daily basis, including copy editing, fact checking, verifying sources, and scanning issues for plagiarism.

In the world of crowd-sourcing, all of these jobs are split into smaller tasks and assigned to a wide variety of individuals with specialized goals (or un-specialized, depending on how you view it).
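That decomposition can be pictured as a simple pipeline: one large editorial job becomes many independent micro-tasks, each small enough to hand to a single worker. A toy sketch — the task names and structure here are invented for illustration, not taken from the My Boss is a Robot project:

```python
# Toy model of splitting one editorial job into crowd-sized micro-tasks.

def split_into_microtasks(paragraphs):
    """Turn one article into many small, independent editing tasks."""
    tasks = []
    for i, text in enumerate(paragraphs):
        # Per-paragraph jobs that different workers can do in parallel.
        tasks.append({"type": "copy_edit",  "paragraph": i, "text": text})
        tasks.append({"type": "fact_check", "paragraph": i, "text": text})
    # One whole-document job at the end of the pipeline.
    tasks.append({"type": "plagiarism_scan",
                  "paragraph": None,
                  "text": " ".join(paragraphs)})
    return tasks

article = ["The paper reports a new result.", "The authors ran three trials."]
tasks = split_into_microtasks(article)
print(len(tasks))  # → 5
```

Each resulting task could, in principle, be posted as its own Mechanical Turk assignment and completed by a different person — which is both the promise and the quality-control risk of the approach.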

Humans have always looked for ways to compartmentalize work. The transition from hunter-gatherer societies to agriculture to modern-day society shows a steady progression toward specialization and automation.

Internet technologies and specifically crowd-sourcing programs represent the next step in this progression. 

Although these technologies mark a clear path towards increased productivity and automation, they also need to be kept in check to prevent a decline in quality and potential instances of duplicate content.

Particularly in scientific journalism, where rising instances of plagiarism regularly disrupt research, an automated process has the potential to compound these problems.

As professionals are taken out of the process, there is more room for errors and lapses in quality. To keep quality up in this world of tiny tasks, specialized technologies need to be introduced into the standard ‘assembly line.’

Plagiarism software is one example of a specialized technology that can augment a new generation of streamlined editing and publishing by ensuring that the content getting passed along is original.
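At its simplest, such software compares overlapping word sequences between a submission and known sources. A toy illustration of that core idea, using n-gram (“shingle”) overlap — real plagiarism checkers are far more sophisticated than this sketch, and the example texts are invented:

```python
def shingles(text, n=3):
    """Set of overlapping n-word sequences ('shingles') in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(candidate, source, n=3):
    """Fraction of the candidate's shingles that also appear in the source."""
    cand = shingles(candidate, n)
    if not cand:
        return 0.0
    return len(cand & shingles(source, n)) / len(cand)

original = "the experiment tests whether crowd workers can produce quality journalism"
copied   = "the experiment tests whether crowd workers can write good stories"
print(round(overlap_score(copied, original), 2))  # → 0.62
```

A high overlap score flags a passage for human review; in a crowd-sourced pipeline, that automated flag substitutes for the editor who would otherwise have noticed the borrowed phrasing.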


Giles, Jim. “The Experiment.” 3 February 2011.

Mims, Christopher. “Can This Journalist Be Replaced by Software and Mechanical Turk?” Technology Review, 2 February 2011.