Crowdsourcing for web genre annotation

Asheghi, NR, Sharoff, S and Markert, K (2016) Crowdsourcing for web genre annotation. Language Resources and Evaluation, 50 (3). pp. 603-641. ISSN 1574-020X

Metadata

Item Type: Article
Authors/Creators:
  • Asheghi, NR
  • Sharoff, S
  • Markert, K
Copyright, Publisher and Additional Information:

© The Author(s) 2016. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Keywords: Genres on the web; Reliability testing; Annotation guidelines; Crowdsourcing
Dates:
  • Published: September 2016
  • Published (online): 9 January 2016
Institution: The University of Leeds
Academic Units: The University of Leeds > Faculty of Arts, Humanities and Cultures (Leeds) > School of Languages Cultures & Societies (Leeds) > Translation Studies (Leeds)
Depositing User: Symplectic Publications
Date Deposited: 31 May 2020 15:28
Last Modified: 31 May 2020 15:29
Status: Published
Publisher: Springer Verlag
Identification Number: 10.1007/s10579-015-9331-6