Weeks, M., Hodge, V., O'Keefe, S. orcid.org/0000-0001-5957-2474 et al. (2 more authors) (2003) Improved AURA k-Nearest Neighbour approach. In: Mira, J. and Alvarez, J.R. (eds.) Artificial Neural Nets Problem Solving Methods, Part II. 7th International Work-Conference on Artificial and Natural Neural Networks, 03-06 Jun 2003, Menorca. Lecture Notes in Computer Science. Springer-Verlag, Berlin, pp. 663-670. ISBN 3-540-40211-X
The k-Nearest Neighbour (kNN) approach is a widely used technique for pattern classification. Ranked distance measurements to a known sample set determine the classification of unknown samples. Though effective, kNN, like most classification methods, does not scale well with increasing sample size, because the unknown query must be compared against every other sample in the data space. In order to make this operation scalable, we apply AURA to the kNN problem. AURA is a highly scalable, associative-memory-based binary neural network intended for high-speed approximate search and match operations on large unstructured datasets. Previous work has applied AURA methods to this problem as a scalable but approximate kNN classifier. This paper continues that work by using AURA in conjunction with kernel-based input vectors, in order to create a fast, scalable kNN classifier whilst improving recall accuracy to levels similar to standard kNN implementations.
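For readers unfamiliar with the baseline method, the standard kNN classification described above can be sketched as follows. This is a minimal illustration of conventional kNN (not the paper's AURA-based method): the exhaustive distance computation against every stored sample is precisely the scaling bottleneck the paper addresses. The function name and toy data are illustrative only.

```python
import numpy as np

def knn_classify(query, samples, labels, k=3):
    """Classify a query point by majority vote among its k nearest samples."""
    # Euclidean distance from the query to every stored sample:
    # this all-pairs comparison is why standard kNN scales poorly.
    dists = np.linalg.norm(samples - query, axis=1)
    nearest = np.argsort(dists)[:k]          # indices of the k closest samples
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)  # majority class wins

# Toy two-class example
samples = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
labels = ["A", "A", "B", "B"]
print(knn_classify(np.array([0.2, 0.1]), samples, labels, k=3))  # → A
```

AURA replaces this exhaustive ranked-distance search with an associative-memory match, trading exactness for speed; the kernel-based input vectors introduced in this paper recover much of the lost recall accuracy.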
|Copyright, Publisher and Additional Information:||Copyright © 2003 Springer-Verlag. This is an author-produced version of a chapter published in Lecture Notes in Computer Science. This paper has been peer-reviewed but does not include the final publisher proof-corrections or journal pagination. The original publication is available at http://springerlink.metapress.com/openurl.asp?genre=article&issn=0302-9743&volume=2687&spage=663|
|Institution:||The University of York|
|Academic Units:||The University of York > Computer Science (York)|