Comparing three of the most popular crowdsourcing platforms

For our comparison we analysed CrowdFlower, Amazon Mechanical Turk and Clickworker, three of the most popular crowdsourcing platforms. We considered only a few selected aspects, not the full capability of the platforms, so keep in mind that we primarily focused on example tasks similar to sentiment annotation or named entity recognition.

Selected, important features for comparing the platforms

Custom Templates: Most crowdsourcing platforms offer a range of pre-defined templates for the various kinds of jobs, but in many cases those are not a perfect fit for your specific needs. Hence it is an important feature to be able to design your own templates, using some kind of markup language for the design and a programming language for the logic.

Private Crowd: The possibility to use our own expert team and exclude all external workers from the job is important.

Free Jobs for Private Crowd: Since we want to be able to use our own expert team, there may be no need to pay them extra. So the platform should provide the option of offering jobs to private contributors without payment.

Entrance Test: When using a non-expert crowd, it is very important to require contributors to qualify for a task. This is usually done in the form of test questions.

Random Test Questions: Random test questions are necessary to keep contributors focused after they have successfully passed the entrance test. Pre-annotated questions are mixed into the running tasks without contributors noticing. If a contributor fails too many of them, they are excluded from working on the job.
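To make the gold-question mechanism concrete, here is a minimal Python sketch of the idea: pre-annotated "gold" questions are shuffled into the task stream, and contributors whose accuracy on them drops too low are gated out. All names, the 10% ratio and the 70% threshold are illustrative assumptions; the platforms implement this server-side.

```python
import random

def mix_in_gold(tasks, gold_questions, ratio=0.1, seed=42):
    """Mix pre-annotated gold questions into a task stream.

    `ratio` is a hypothetical fraction of gold questions; real
    platforms choose and insert them on the server side.
    """
    rng = random.Random(seed)
    n_gold = max(1, int(len(tasks) * ratio))
    mixed = list(tasks) + rng.sample(gold_questions, n_gold)
    rng.shuffle(mixed)
    return mixed

def passes_quality_gate(answers, gold_answers, min_accuracy=0.7):
    """Keep a contributor only while their accuracy on gold questions
    stays above the threshold."""
    checked = sum(1 for q in answers if q in gold_answers)
    hits = sum(1 for q, a in answers.items() if gold_answers.get(q) == a)
    return checked == 0 or hits / checked >= min_accuracy
```

A contributor who answers one of two gold questions wrongly (50% accuracy) would already fall below the 70% default and be excluded.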

Contributor Filtering: On more complex tasks that require certain skills or properties, we want to be able to include certain contributors and/or exclude others.

  • Language: The requirement for contributors to speak a certain language is a necessity for multilingual, text-based tasks.
  • Geography: You may want to exclude or include certain nations from your task. If language filtering is not available, this can also serve, to a limited extent, as a substitute for it.
  • Knowledge: Some platforms offer crowdworkers the possibility to add interests and skills to their profile, such as a degree in computer science or an interest in the stock market. Be aware that these properties are difficult to verify; they are certainly no replacement for test questions.
  • Experience: Crowdworkers like to hone their skills, gather experience and get rewarded as a result. Platforms often provide a level-based system that indicates a contributor’s experience and accuracy. You may want to allow only the most experienced workers on your job, in exchange for paying more per task.
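Conceptually, all four filters boil down to a predicate over a contributor profile. A hedged sketch, assuming a simple profile dict (the field names "language", "country", "level" and "skills" are hypothetical; every platform exposes and names these differently):

```python
def eligible(contributor, *, languages=None, countries=None,
             min_level=0, required_skills=()):
    """Return True if a contributor profile passes the job's filters.

    Illustrative only: profile keys are assumptions, not any
    platform's actual schema.
    """
    if languages and contributor.get("language") not in languages:
        return False
    if countries and contributor.get("country") not in countries:
        return False
    if contributor.get("level", 0) < min_level:
        return False
    # Self-reported skills: useful as a filter, but hard to verify.
    return all(s in contributor.get("skills", ()) for s in required_skills)
```

In practice you would set these filters in the job configuration rather than evaluate them yourself; the predicate just makes the semantics explicit.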

Requirements for the API

To effectively use crowdsourcing in automated tasks, like gathering sentiment annotations for Twitter messages, it is important to have a sophisticated API. One wants to be able to programmatically submit data (preferably in real time), create tasks, edit templates and gather results.

Authentication: When performing API requests, there needs to be some way of authenticating yourself as the owner of a job or account.
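As an illustration, CrowdFlower-style REST APIs authenticate by attaching an API key as a query parameter, while MTurk instead signs requests with AWS credentials. A minimal sketch with a placeholder endpoint (not a real URL):

```python
from urllib.parse import urlencode

# Placeholder base URL; substitute the real platform endpoint.
API_BASE = "https://api.example-crowd.com/v1"

def authed_url(path, api_key, **params):
    """Build a request URL authenticated via an API key query
    parameter. Keep the key out of logs and source control."""
    params["key"] = api_key
    return f"{API_BASE}/{path}?{urlencode(params)}"
```

For example, `authed_url("jobs.json", my_key, page=2)` would list the second page of your jobs on such an API.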

Tasks: Is it possible to create and update whole jobs with the API? It would be very inconvenient if we had to create tasks manually on the platform’s website when doing automated crowdsourcing.

Data: A very important feature is being able to submit data to jobs, preferably in real time. When talking about data in this context, think of, e.g., single tweets or a set of them.
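For instance, CrowdFlower’s upload endpoint accepted data rows as newline-delimited JSON, one object per row. A sketch of serializing tweets into such a body (the field names are illustrative, and the actual column names become available in the task template):

```python
import json

def tweets_to_upload_body(tweets):
    """Serialize tweets as newline-delimited JSON: one JSON object
    per data row, the shape a row-upload endpoint typically expects."""
    return "\n".join(
        json.dumps({"tweet_id": t["id"], "text": t["text"]})
        for t in tweets
    )
```

Streaming single tweets in near real time then just means POSTing a one-line body per incoming message.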

Settings: The API may not yet be fully developed, hence it is important to know whether all settings are configurable via API calls.

Results: Being able to generate and download a report on the results of your job is indispensable. If possible, we also want to be able to get results for single questions.
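Once per-question judgments are downloaded, they are typically reduced to one label per question. A minimal majority-vote sketch with an agreement-ratio confidence (platform reports include similar aggregate fields, though e.g. CrowdFlower weights votes by contributor trust rather than counting them equally):

```python
from collections import Counter

def aggregate(judgments):
    """Reduce raw judgments ({question_id: [label, ...]}) to a
    majority label plus a simple agreement-based confidence."""
    results = {}
    for question_id, labels in judgments.items():
        (label, votes), = Counter(labels).most_common(1)
        results[question_id] = {
            "label": label,
            "confidence": votes / len(labels),  # fraction agreeing
        }
    return results
```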

Templates: The possibility to create tasks with API calls is a great thing, but in order to be flexible and automated we also need a way of defining the design and logic of our interface, e.g. by sending a string of code or entire files.

External Hosting: When hosting jobs on your own website or creating surveys with tools like Qualtrics, you may still want to use external crowds. Support for this can range from providing a link to your website up to full integration into the platform, including usage of its statistical tools.

Pricing: Most existing crowdsourcing platforms have different pricing models depending on your needs. There could be monthly fees and/or a markup of a fraction of what you pay each crowdworker.
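For budgeting, the two models can be compared with simple arithmetic. A sketch; the 20% markup and the fee values below are placeholders, not actual platform rates:

```python
def job_cost(n_rows, judgments_per_row, pay_per_judgment,
             platform_markup=0.20, monthly_fee=0.0):
    """Estimated total cost: worker pay, plus a hypothetical
    percentage markup the platform charges on top, plus any
    monthly fee."""
    worker_pay = n_rows * judgments_per_row * pay_per_judgment
    return worker_pay * (1 + platform_markup) + monthly_fee
```

E.g. 1,000 tweets with 3 judgments each at $0.02 per judgment costs $60 in worker pay, so $72 under the assumed 20% markup, before any monthly fee.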

CrowdFlower is a company specialized in crowdsourcing and data mining, founded in 2007 and based in San Francisco. They do not just provide crowdworker networks for performing microtasks but offer a lot more, like integrated machine learning and cloud solutions. A job or task is created as one big project that is divided into pages, each containing one or more questions. Contributors are paid per finished page.

Amazon Mechanical Turk is a crowdsourcing marketplace owned by Amazon. Requesters can offer jobs, which are called HITs (Human Intelligence Tasks) here. HITs are not to be seen as bigger projects but really just small tasks, like annotating one tweet, so you may end up creating thousands of them.
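Because each HIT is one micro-task, a real job means generating one parameter set per data item. A sketch of the parameter dicts one would pass to boto3’s MTurk `create_hit` call, with placeholder titles and rewards and a truncated Question payload:

```python
# Sample tweets; in practice these would come from a Twitter stream.
tweets = [{"text": "AAPL beats earnings"}, {"text": "Oil prices slide"}]

def hit_params_for(tweet, reward="0.02"):
    """Keyword arguments shaped like those of boto3's mturk
    create_hit; values here are illustrative placeholders."""
    return {
        "Title": "Label the sentiment of one tweet",
        "Description": "Read a single tweet and pick positive, neutral or negative.",
        "Reward": reward,                      # US dollars, as a string
        "MaxAssignments": 3,                   # independent judgments per tweet
        "AssignmentDurationInSeconds": 120,
        "LifetimeInSeconds": 86400,
        "Question": f"<HTMLQuestion>...{tweet['text']}...</HTMLQuestion>",
    }

# One HIT per tweet: thousands of small tasks instead of one big project.
batch = [hit_params_for(t) for t in tweets]
```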

Clickworker was founded in 2005. It is a crowdsourcing platform with currently 1 million users and offers different kinds of service types. You can either design your job yourself in the self-service marketplace or design it together with the sales team if you pay for full service. With the self-service marketplace you can create simple task types like ’Text’, ’Translation’, ’Survey’ and ’Sentiment Analysis’. For more complex tasks like web research, categorization & tagging, product data management and surveys, the full service is needed.

Table of comparison


| Feature | CrowdFlower | Amazon Mechanical Turk | Clickworker |
| --- | --- | --- | --- |
| Custom Templates | Yes, with own templating language CML or using HTML/CSS/JS | Yes, via rich text editor or code mode, i.e. the full range of possibilities | No, only predefined templates are available |
| Private Crowd | Yes, private and/or public channels are allowed | Yes, by using specific qualifications as flags | Yes, private and/or public teams are allowed |
| Free Jobs for Private Crowd | Yes, if only private channels are selected | Partly, you still have to pay the basic fee to Amazon per task | |
| Entrance Test | Yes, freely define questions as test questions | Yes, with the help of qualification tests | Only for enterprise models and for surveys with external tools |
| Random Test Questions | Yes, automatically inserted randomly, if enough test questions are available | Not available by default but can be implemented | Only for enterprise models and for surveys with external tools |
| Contributor Filtering | Yes, filtering on language, geography and experience is possible | Yes, via language, geography, experience and ‘Premium Qualifications’ | Yes, language (not for surveys), geography, skill and qualification are possible |
| API | Full API support for Authentication, Tasks, Data, Settings, Results and Templates | Full API support for Authentication, Tasks, Data, Settings, Results and Templates | Full API support for Authentication, Tasks, Data, Settings and Results |
| External Hosting | Yes | Yes | Yes |
| Pricing information | Pricing CrowdFlower | Pricing MTurk | Pricing Clickworker |


After carefully reviewing the crowdsourcing platforms, we can state that all of them have particular benefits and drawbacks. None of them is the easy-to-use, highly flexible and low-cost service that one might prefer above all others.

Even though CrowdFlower is easy to use, it still provides a high level of flexibility. In addition, its contributor quality management is by far the most sophisticated. The downside is a high monthly fee that has to be paid if you want to go beyond the free service. On the other hand, a free researcher pricing model that provides such high functionality is not to be taken for granted.

Amazon Mechanical Turk offers the highest flexibility, but the creation of tasks can quickly get very complex. You have to implement most of the quality-management logic yourself and sometimes apply ‘dirty’ workarounds. On the upside, there is no monthly fee for using the service.

Clickworker seems to be specialised in text creation and translation tasks, at least in the free pricing model. They offer customized solutions of any kind for your specific task, but this might cost a lot, since they do all the work for you. Their customer support is great; they answered our questions very quickly without exception.

For our test run, which we will present in the next post of this series, we decided to go with CrowdFlower.

For our purpose, a test run for stock market tweets and sentiment annotation, their service seemed to fit best. It is easy to use, and the extensive quality control is well suited to our needs. Additionally, the 25,000 data rows included in the free model are enough to work with the data of the test run.

Stay tuned – more details on crowdsourcing platforms to follow.

This blog post was written by SSIX partners Alfonso Noriega, Sebastian Strumegger, Sophie Reischl at Redlink GmbH.
For the latest updates, like us on Facebook, follow us on Twitter and join us on LinkedIn.