
Incentivizing High Quality Crowdwork

Chien-Ju Ho (Cornell University), Aleksandrs Slivkins, Siddharth Suri, and Jennifer Wortman Vaughan (Microsoft Research)


Publisher: Association for Computing Machinery
Copyright: © 2016 ACM, Inc.
ISSN: 1551-9031
DOI: 10.1145/2904104.2904108

Abstract

We study the causal effects of financial incentives on the quality of crowdwork. We focus on performance-based payments (PBPs): bonus payments awarded to workers for producing high-quality work. We design and run randomized behavioral experiments on the popular crowdsourcing platform Amazon Mechanical Turk with the goal of understanding when, where, and why PBPs help, identifying properties of the payment, the payment structure, and the task itself that make them most effective. We provide examples of tasks for which PBPs do improve quality. For such tasks, the effectiveness of PBPs is not too sensitive to the threshold for quality required to receive the bonus, while the magnitude of the bonus must be large enough to make the reward salient. We also present examples of tasks for which PBPs do not improve quality. Our results suggest that for PBPs to improve quality, the task must be effort-responsive: the task must allow workers to produce higher-quality work by exerting more effort. We also give a simple method to determine a priori whether a task is effort-responsive. Furthermore, our experiments suggest …
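
To make the payment scheme concrete, here is a minimal sketch of a performance-based payment rule of the kind the abstract describes: a flat base payment plus a bonus awarded only when measured quality clears a threshold. The function name, the 0–1 quality scale, and all parameter values are illustrative assumptions, not taken from the paper.

```python
def pbp_payment(quality: float,
                base_pay: float = 0.50,
                bonus: float = 0.50,
                threshold: float = 0.80) -> float:
    """Performance-based payment rule (illustrative sketch).

    Every worker receives the base pay; the bonus is awarded only if
    the measured quality of the work meets or exceeds the threshold.
    All parameter values here are placeholders, not the paper's.
    """
    return base_pay + (bonus if quality >= threshold else 0.0)

# A worker scoring 0.85 on a 0-1 quality scale earns the bonus;
# a worker scoring 0.60 receives only the base pay.
print(pbp_payment(0.85))  # 1.0
print(pbp_payment(0.60))  # 0.5
```

In a rule of this shape, `threshold` and `bonus` correspond to the two payment parameters the abstract says the experiments vary: the quality bar required to receive the bonus and the magnitude of the bonus itself.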

Journal

ACM SIGecom Exchanges (Association for Computing Machinery)

Published: Mar 16, 2016
