Let’s Agree to Disagree: Fixing Agreement Measures for Crowdsourcing

Checco, A. (orcid.org/0000-0002-0981-3409), Roitero, K., Maddalena, E., et al. (2 more authors) (2017) Let’s Agree to Disagree: Fixing Agreement Measures for Crowdsourcing. In: Dow, S. and Kalai, T. (eds.), Proceedings of the Fifth AAAI Conference on Human Computation and Crowdsourcing (HCOMP-17), 23–26 October 2017, Québec City, Québec, Canada. AAAI Press, Palo Alto, California, pp. 11-20.

Metadata

Authors/Creators: Checco, A. (orcid.org/0000-0002-0981-3409); Roitero, K.; Maddalena, E.; et al. (2 more authors)
Copyright, Publisher and Additional Information: © 2017, Association for the Advancement of Artificial Intelligence (www.aaai.org).
Keywords: crowdsourcing; inter-rater agreement; reliability (a brief background sketch of inter-rater agreement follows the metadata below)
Dates:
  • Accepted: 25 June 2017
  • Published: October 2017
Institution: The University of Sheffield
Academic Units: The University of Sheffield > Faculty of Social Sciences (Sheffield) > Information School (Sheffield)
Depositing User: Symplectic Sheffield
Date Deposited: 25 Oct 2017 14:13
Last Modified: 16 Apr 2018 11:23
Published Version: https://aaai.org/ocs/index.php/HCOMP/HCOMP17/paper...
Status: Published
Publisher: AAAI Press
Refereed: Yes
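
For readers unfamiliar with the topic named in the keywords, the following is a minimal sketch of a classical inter-rater agreement measure, Cohen's kappa for two raters. It is offered only as background to the keyword "inter-rater agreement": it is not the corrected measure proposed in the paper, and the raters, labels, and function name are illustrative.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters labelling the same items.

        kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
        agreement rate and p_e is the agreement expected by chance
        given each rater's marginal label frequencies.
        """
        assert len(rater_a) == len(rater_b), "raters must label the same items"
        n = len(rater_a)
        # Observed agreement: fraction of items where both raters match.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance agreement from the raters' marginal label frequencies.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
        return (p_o - p_e) / (1 - p_e)

    # Example: two hypothetical workers judging five items as
    # relevant ("r") or non-relevant ("n").
    print(cohens_kappa(["r", "r", "n", "r", "n"],
                       ["r", "n", "n", "r", "n"]))  # ≈ 0.615

Measures of this family are what the paper critiques in the crowdsourcing setting; see the published version linked above for the authors' proposed fix.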
