CrowdSelect Conference Paper


  • © 2016 ACM. Crowdsourcing allows many people to complete tasks of varying difficulty with minimal recruitment and administration costs. However, the lack of participant accountability may entice people to complete as many tasks as possible without fully engaging in them, jeopardizing the quality of responses. In this paper, we present a dynamic and time-efficient solution to the task assignment problem in crowdsourcing platforms. Our proposed approach, CrowdSelect, offers a theoretically proven algorithm that assigns workers to tasks in a cost-efficient manner while ensuring high accuracy of the overall task. In contrast to existing works, our approach makes minimal assumptions about workers' probability of error, and completely removes the assumptions that this probability is known a priori and that it remains consistent over time. Through experiments on real Amazon Mechanical Turk traces and synthetic data, we find that CrowdSelect achieves significantly higher accuracy than state-of-the-art algorithms, providing up to a 17.5% gain in answer accuracy over previous methods, even when more than 50% of workers are malicious.

author list (cited authors)

  • Qiu, C., Squicciarini, A. C., Carminati, B., Caverlee, J., & Khare, D. R.

citation count

  • 9

editor list (cited editors)

  • Mukhopadhyay, S., Zhai, C., Bertino, E., Crestani, F., Mostafa, J., Tang, J., ... Sondhi, P.

publication date

  • October 2016

publisher

  • ACM