Luo, S, Yuan, W, Adelson, E et al. (2 more authors) (2018) ViTac: Feature Sharing between Vision and Tactile Sensing for Cloth Texture Recognition. In: 2018 IEEE International Conference on Robotics and Automation (ICRA), 21-25 May 2018, Brisbane, Australia. Institute of Electrical and Electronics Engineers, pp. 2722-2727. ISBN 978-1-5386-3081-5
Abstract
Vision and touch are two important sensing modalities for humans, and they offer complementary information about the environment. Robots could also benefit from such multi-modal sensing. In this paper, addressing texture recognition from combined tactile images and vision for the first time (to the best of our knowledge), we propose a new fusion method named Deep Maximum Covariance Analysis (DMCA) to learn a joint latent space for sharing features between vision and tactile sensing. Features of camera images and of tactile data acquired with a GelSight sensor are learned by deep neural networks. However, the learned features are high-dimensional and redundant due to the differences between the two sensing modalities, which deteriorates perception performance. To address this, the learned features are paired using maximum covariance analysis. Results on a newly collected dataset of paired visual and tactile data of cloth textures show that a recognition accuracy of greater than 90% can be achieved using the proposed DMCA framework. In addition, we find that the perception performance of either vision or tactile sensing alone can be improved by employing the shared representation space, compared to learning from unimodal data.
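The pairing step the abstract describes rests on classical maximum covariance analysis: given paired, centered feature matrices from the two modalities, take the SVD of their cross-covariance matrix and project each view onto the corresponding singular vectors, so the projected features maximize cross-modal covariance. The sketch below illustrates this textbook formulation on synthetic data; the function name `mca_projections` and the toy feature dimensions are illustrative assumptions, not the authors' implementation (which applies the analysis to deep-network features).

```python
import numpy as np

def mca_projections(X, Y, k):
    """Maximum covariance analysis (textbook form, not the paper's code).

    X: (n, p) features from one modality (e.g. vision),
    Y: (n, q) features from the other (e.g. tactile), rows paired.
    Returns k-dimensional projections of each view whose paired
    components have maximal cross-covariance.
    """
    Xc = X - X.mean(axis=0)               # center each view
    Yc = Y - Y.mean(axis=0)
    C = Xc.T @ Yc / (X.shape[0] - 1)      # (p, q) cross-covariance matrix
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    Wx, Wy = U[:, :k], Vt[:k].T           # top-k paired directions
    return Xc @ Wx, Yc @ Wy               # shared k-dim latent features

# Toy data: both views are noisy linear images of a shared latent factor.
rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 3))                                   # shared factor
X = Z @ rng.normal(size=(3, 16)) + 0.1 * rng.normal(size=(200, 16))
Y = Z @ rng.normal(size=(3, 24)) + 0.1 * rng.normal(size=(200, 24))
Lx, Ly = mca_projections(X, Y, k=3)
# Paired columns of Lx and Ly are strongly correlated, recovering
# the shared structure despite the two views' different dimensions.
```

In the paper's pipeline the inputs `X` and `Y` would be deep features rather than raw data, and either modality alone can then be mapped into the shared space for recognition.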
Metadata
| Item Type: | Proceedings Paper |
|---|---|
| Authors/Creators: | Luo, S; Yuan, W; Adelson, E et al. (2 more authors) |
| Copyright, Publisher and Additional Information: | © IEEE 2018. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
| Keywords: | Visualization, Tactile sensors, Cameras, Task analysis, Surface topography |
| Dates: | |
| Institution: | The University of Leeds |
| Academic Units: | The University of Leeds > Faculty of Engineering & Physical Sciences (Leeds) > School of Civil Engineering (Leeds); The University of Leeds > Faculty of Engineering & Physical Sciences (Leeds) > School of Computing (Leeds) |
| Funding Information: | EPSRC, grant EP/N010523/1 |
| Depositing User: | Symplectic Publications |
| Date Deposited: | 19 Jan 2018 11:52 |
| Last Modified: | 19 Nov 2018 14:04 |
| Status: | Published |
| Publisher: | Institute of Electrical and Electronics Engineers |
| Identification Number (DOI): | 10.1109/ICRA.2018.8460494 |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:126359 |