DisCo: Distilled Student Models Co-training for Semi-supervised Text Mining

Jiang, W., Miao, Q., Lin, C. et al. (4 more authors) (2023) DisCo: Distilled Student Models Co-training for Semi-supervised Text Mining. In: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing. The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP), 06-10 Dec 2023, Singapore. ACL, Stroudsburg, PA, pp. 4015-4030. ISBN 979-8-89176-060-8

Metadata

Copyright, Publisher and Additional Information: ACL materials are Copyright © 1963–2023 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed under a Creative Commons Attribution 4.0 International License.
Dates:
  • Accepted: 7 October 2023
  • Published: December 2023
Institution: The University of Leeds
Academic Units: The University of Leeds > Faculty of Engineering & Physical Sciences (Leeds) > School of Computing (Leeds)
Depositing User: Symplectic Publications
Date Deposited: 30 Oct 2023 17:13
Last Modified: 21 Dec 2023 10:13
Published Version: https://aclanthology.org/2023.emnlp-main.244/
Status: Published
Publisher: ACL