Experiences surveying the crowd: reflections on methods, participation, and reliability (Conference Paper)

abstract

  • Crowdsourcing services such as Amazon's Mechanical Turk (MTurk) provide new venues for recruiting participants and conducting studies; hundreds of surveys may be offered to workers at any given time. We reflect on the results of six related studies we performed on MTurk over a two-year period. The studies used a combination of open-ended questions and structured hypothetical statements about story-like scenarios to engage the efforts of 1,252 participants. We describe the method used in the studies and reflect on what we have learned about identified best practices. We analyze the aggregated data to profile the types of Turkers who take surveys and to examine how the characteristics of the surveys may influence data reliability. The results point to the value of participant engagement, identify potential changes in MTurk as a study venue, and highlight how communication among Turkers influences the data that researchers collect.

name of conference

  • Proceedings of the 5th Annual ACM Web Science Conference

published proceedings

  • Proceedings of the 5th Annual ACM Web Science Conference

author list (cited authors)

  • Marshall, C. C., & Shipman, F. M.

citation count

  • 46

complete list of authors

  • Marshall, Catherine C.; Shipman, Frank M.

publication date

  • 2013