The 12th International Semantic Web Conference
and the 1st Australasian Semantic Web Conference
21-25 October 2013, Sydney, Australia

Microtask Crowdsourcing to Solve Semantic Web Problems

Tutorial webpage


  • Gianluca Demartini (eXascale Infolab, University of Fribourg, Switzerland)
  • Elena Simperl (Web Science and Internet Group, University of Southampton, United Kingdom)
  • Maribel Acosta (Institute AIFB, Karlsruhe Institute of Technology, Germany)


Microtask crowdsourcing platforms, among the most popular instances of social computing technologies, are increasingly used to support massively collaborative projects in semantic content management. In this tutorial we will introduce the most popular approaches to microtask crowdsourcing for Semantic Web problems, as a means of realizing hybrid human-machine content management architectures. We will explain the core notions and technologies, including Amazon Mechanical Turk, CrowdFlower, and special-purpose tools built on top of these platforms. We will address questions related to quality assurance, resource management, and workflow design, and discuss a series of technical and socio-economic challenges and open issues related to the application of microtask crowdsourcing in given Semantic Web scenarios.

Tuesday - Oct 22 - SMC


Microtask Crowdsourcing

[Full Day]

9:00 - 10:30

Introduction of the tutorial topics and presenters

Microtask crowdsourcing fundamentals:

  • Introduction of the core concepts and definitions behind microtask crowdsourcing, including task design, interface and experience design, quality assurance, resource management, incentives and motivators.
  • Overview of well-known microtask crowdsourcing platforms
  • Examples of hybrid human-machine systems for semantic technologies, databases, and information retrieval.
10:30 - 11:00 Morning Tea
11:00 - 12:45

Amazon’s Mechanical Turk hands-on:

  • Use of the Mechanical Turk web interface, which lets requesters crowdsource different types of microtasks without writing code
  • Use of the Mechanical Turk SDK, with examples of programmatic task creation, status review, and result collection
  • Use of CrowdFlower as an alternative to Mechanical Turk
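The programmatic workflow covered in the hands-on session (create a task, publish it, later collect results) can be sketched roughly as follows. This is an illustrative sketch, not the tutorial's own material: it assumes today's boto3 MTurk client rather than the SDK current in 2013 (which exposed equivalent operations), and the task text, reward, and thresholds are made-up example values.

```python
# Sketch of programmatic HIT (Human Intelligence Task) creation.
# Assumes the boto3 MTurk client; all task content below is illustrative.

# A HIT's question is expressed as XML against the public QuestionForm schema.
QUESTION_XML = """<?xml version="1.0" encoding="UTF-8"?>
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>same_entity</QuestionIdentifier>
    <QuestionContent>
      <Text>Do "IBM" and "International Business Machines" refer to the same company?</Text>
    </QuestionContent>
    <AnswerSpecification>
      <SelectionAnswer>
        <Selections>
          <Selection><SelectionIdentifier>yes</SelectionIdentifier><Text>Yes</Text></Selection>
          <Selection><SelectionIdentifier>no</SelectionIdentifier><Text>No</Text></Selection>
        </Selections>
      </SelectionAnswer>
    </AnswerSpecification>
  </Question>
</QuestionForm>"""

def build_hit_params(question_xml, reward="0.05", assignments=3):
    """Assemble the keyword arguments for a create_hit() call."""
    return {
        "Title": "Entity resolution microtask",
        "Description": "Decide whether two names refer to the same entity.",
        "Reward": reward,                      # payment per assignment, in USD
        "MaxAssignments": assignments,         # redundant judgments, for quality control
        "AssignmentDurationInSeconds": 300,    # time a worker has per assignment
        "LifetimeInSeconds": 86400,            # how long the HIT stays on the market
        "Question": question_xml,
    }

# With AWS credentials configured, the HIT would then be published via:
#   import boto3
#   mturk = boto3.client("mturk", region_name="us-east-1")
#   response = mturk.create_hit(**build_hit_params(QUESTION_XML))
#   hit_id = response["HIT"]["HITId"]  # kept for status review and result collection
```

Setting `MaxAssignments` above 1 is what makes redundant judgments (and hence majority voting over them) possible.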
12:45 - 13:45 Lunch
13:45 - 15:30

Microtask management and quality control:

  • Main components of a microtask crowdsourcing platform
  • Extensions for complex tasks, task design patterns, time and performance estimation, work assignment
  • Quality control, including the effect of UI design on quality, qualification tests, and master workers
  • Quality assessment: majority voting, machine learning techniques, manual assessment through the crowd
  • Payment strategies
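Of the quality assessment strategies listed above, majority voting is the simplest to illustrate: each task is assigned to several workers, and their redundant answers are aggregated. A minimal sketch (the function name and tie-breaking rule are our own choices, not a standard):

```python
# Majority voting over redundant worker answers for a single microtask.
from collections import Counter

def majority_vote(answers):
    """Return (winning_answer, agreement_ratio) for one task.

    Ties are broken by whichever answer Counter ranks first,
    i.e. the one encountered earliest among the most frequent.
    """
    counts = Counter(answers)
    answer, votes = counts.most_common(1)[0]
    return answer, votes / len(answers)

# Three workers judged the same yes/no task; two of them agree.
label, agreement = majority_vote(["yes", "yes", "no"])
# label is "yes"; agreement is 2/3
```

The agreement ratio is a cheap confidence signal: tasks with low agreement can be re-posted to more workers or escalated to manual assessment, which is one way the assessment techniques in this session combine.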
15:30 - 16:00 Afternoon Tea
16:00 - 17:30

Applications in semantic content management:

  • Discussion of two extended examples of how to use microtask crowdsourcing for entity resolution and Linked Data curation
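The entity resolution example above is a natural fit for a hybrid human-machine pipeline: a machine stage decides the easy record pairs and routes only the uncertain ones to the crowd. A minimal sketch of such a triage step, using standard-library string similarity and illustrative thresholds of our own choosing:

```python
# Hybrid human-machine triage for entity resolution (illustrative sketch).
from difflib import SequenceMatcher
from itertools import combinations

def triage_pairs(records, low=0.4, high=0.9):
    """Split candidate record pairs into machine-decided and crowd-bound sets.

    Pairs with similarity >= high are auto-matched, pairs <= low are
    auto-rejected, and the uncertain middle band goes to the crowd.
    The thresholds are made-up example values, not tuned ones.
    """
    auto_match, auto_distinct, to_crowd = [], [], []
    for a, b in combinations(records, 2):
        sim = SequenceMatcher(None, a.lower(), b.lower()).ratio()
        if sim >= high:
            auto_match.append((a, b))
        elif sim <= low:
            auto_distinct.append((a, b))
        else:
            to_crowd.append((a, b))  # becomes a yes/no microtask
    return auto_match, auto_distinct, to_crowd
```

Each pair in the crowd-bound set would then be turned into a yes/no microtask, and the redundant answers aggregated, e.g. by majority voting.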

Current research directions in microtask crowdsourcing:

  • The future of crowd work
  • Improving the quality of work by motivating the crowd
  • Ethics
  • Crowd training and management
  • Crowd career trajectories


  • Summary of the tutorial and discussion with participants on open issues and potential new applications of microtask crowdsourcing to semantic technologies.