Knowledge distillation for quality estimation

Gajbhiye, A., Fomicheva, M., Alva-Manchego, F. et al. (4 more authors) (2021) Knowledge distillation for quality estimation. In: Zong, C., Xia, F., Li, W. and Navigli, R., (eds.) Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 01-06 Aug 2021, Bangkok, Thailand (virtual conference). Association for Computational Linguistics (ACL), pp. 5091-5099. ISBN 9781954085541

Metadata

Copyright, Publisher and Additional Information: © 2021 The Association for Computational Linguistics. Licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
Dates:
  • Published (online): 28 July 2021
  • Published: August 2021
Institution: The University of Sheffield
Academic Units: The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Computer Science (Sheffield)
Funding Information:
  • Funder: European Commission - HORIZON 2020; Grant number: 825303
Depositing User: Symplectic Sheffield
Date Deposited: 29 Jul 2021 13:07
Last Modified: 29 Jul 2021 13:08
Status: Published
Publisher: Association for Computational Linguistics (ACL)
Refereed: Yes
Identification Number: https://doi.org/10.18653/v1/2021.findings-acl.452