Qarout, R.K., Checco, A. orcid.org/0000-0002-0981-3409 and Bontcheva, K. orcid.org/0000-0001-6152-9600 (2018) Investigating stability and reliability of crowdsourcing output. In: CEUR Workshop Proceedings. SAD 2018 and CrowdBias 2018, 05 Jul 2018, Zürich, Switzerland. CEUR Workshop Proceedings , pp. 83-87.
Abstract
This research investigates the reliability of crowdsourcing platform output and its consistency over time. We study the effects of interface design and task instructions, and identify critical differences between two platforms that are widely used for research, data collection, and evaluation. Our findings will help uncover data-reliability problems and inform changes to crowdsourcing platforms that can mitigate the inconsistency of human contributions.
Metadata
| Item Type: | Proceedings Paper |
|---|---|
| Authors/Creators: | Qarout, R.K.; Checco, A.; Bontcheva, K. |
| Copyright, Publisher and Additional Information: | © 2018 The Author(s). Reproduced in accordance with the publisher's self-archiving policy. |
| Keywords: | Crowdsourcing; task design; platforms |
| Dates: | 2018 |
| Institution: | The University of Sheffield |
| Academic Units: | The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Computer Science (Sheffield); The University of Sheffield > Faculty of Social Sciences (Sheffield) > Information School (Sheffield) |
| Funding Information: | EUROPEAN COMMISSION - HORIZON 2020, grant number 732328 |
| Depositing User: | Symplectic Sheffield |
| Date Deposited: | 23 Jan 2019 14:31 |
| Last Modified: | 24 Jan 2019 06:34 |
| Published Version: | http://ceur-ws.org/Vol-2276/paper10.pdf |
| Status: | Published |
| Publisher: | CEUR Workshop Proceedings |
| Refereed: | Yes |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:141502 |