Parallel data-local training for optimizing Word2Vec embeddings for word and graph embeddings

Moon, G.E., Newman-Griffis, D. (orcid.org/0000-0002-0473-4226), Kim, J. et al. (3 more authors) (2020) Parallel data-local training for optimizing Word2Vec embeddings for word and graph embeddings. In: 2019 IEEE/ACM Workshop on Machine Learning in High Performance Computing Environments (MLHPC), 18 Nov 2019, Denver, CO, USA. IEEE, pp. 44-55. ISBN 978-1-7281-5986-7

Metadata

Copyright, Publisher and Additional Information: © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works. Reproduced in accordance with the publisher's self-archiving policy.
Keywords: gradient methods; graph theory; matrix algebra; multiprocessing systems; neural nets; parallel algorithms; sampling methods; stochastic processes; text analysis; unsupervised learning
Dates:
  • Published: 9 January 2020
Institution: The University of Sheffield
Academic Units: The University of Sheffield > Faculty of Social Sciences (Sheffield) > Information School (Sheffield)
Published Version: http://dx.doi.org/10.1109/mlhpc49564.2019.00010
Status: Published
Publisher: IEEE
Refereed: Yes
Identification Number: https://doi.org/10.1109/mlhpc49564.2019.00010