Martinez-Hernandez, U., Boorman, L. orcid.org/0000-0001-5189-0232 and Prescott, T. orcid.org/0000-0003-4927-5390 (2017) Multisensory Wearable Interface for Immersion and Telepresence in Robotics. IEEE Sensors Journal, 17 (8). pp. 2534-2541. ISSN 1530-437X
Abstract
The idea of being present in a remote location has inspired researchers to develop robotic devices that allow humans to experience the feeling of telepresence. These devices require multiple modes of sensory feedback to provide a more realistic telepresence experience. In this paper, we develop a wearable interface for immersion and telepresence that gives humans the capability both to receive multisensory feedback from vision, touch, and audio, and to remotely control a robot platform. Multimodal feedback from the remote environment is based on the integration of sensor technologies coupled to the sensory system of the robot platform. Remote control of the robot is achieved through a modularised architecture, which allows the user to visually explore the remote environment. We validated our approach with multiple experiments in which participants, located at different venues, were able to successfully control the robot platform while visually exploring, touching, and listening to a remote environment. In our experiments, we used two different robotic platforms: 1) the iCub humanoid robot and 2) the Pioneer LX mobile robot. These experiments show that our wearable interface is comfortable, easy to use, and adaptable to different robotic platforms. Furthermore, we observed that our approach allows humans to experience a vivid feeling of being present in a remote environment.
Metadata
| Field | Value |
|---|---|
| Item Type | Article |
| Authors/Creators | Martinez-Hernandez, U.; Boorman, L.; Prescott, T. |
| Copyright, Publisher and Additional Information | © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works. Reproduced in accordance with the publisher's self-archiving policy. |
| Keywords | Mobile robots; Cameras; Robot vision systems; Visualization |
| Dates | Published: 2017 |
| Institution | The University of Sheffield |
| Academic Units | The University of Sheffield > Faculty of Science (Sheffield) > Department of Psychology (Sheffield) |
| Funding Information | European Commission (FP6/FP7), WYSIWYD, grant 612139; Arts and Humanities Research Council (AHRC), grant AH/M002950/1 |
| Depositing User | Symplectic Sheffield |
| Date Deposited | 20 Apr 2017 08:27 |
| Last Modified | 23 Mar 2018 03:56 |
| Published Version | https://doi.org/10.1109/JSEN.2017.2669038 |
| Status | Published |
| Publisher | Institute of Electrical and Electronics Engineers |
| Refereed | Yes |
| Identification Number | 10.1109/JSEN.2017.2669038 |
| Open Archives Initiative ID (OAI ID) | oai:eprints.whiterose.ac.uk:115079 |