Sani, M.F., Ascione, R. and Dogramadzi, S. orcid.org/0000-0002-0009-7522 (2021) Mapping surgeons hand/finger movements to surgical tool motion during conventional microsurgery using machine learning. Journal of Medical Robotics Research, 06 (03n04). ISSN 2424-905X
Abstract
Purpose: Recent developments in robotics and artificial intelligence (AI) have led to significant advances in healthcare technologies, enhancing robot-assisted minimally invasive surgery (RAMIS) in some surgical specialties. However, current human–robot interfaces lack intuitive teleoperation and cannot mimic the surgeon's hand/finger sensing required for fine-motion micro-surgery. These limitations make teleoperated robotic surgery less suitable for, e.g., cardiac surgery, and difficult for established surgeons to learn. We report a pilot study showing an intuitive way of recording and mapping the surgeon's gross hand motion and fine synergic motion during cardiac micro-surgery, as a step towards enhancing future teleoperation.
Methods: We set out to develop a prototype system able to train a deep neural network (DNN) by mapping wrist, hand and surgical tool real-time data acquisition (RTDA) inputs during mock-up heart micro-surgery procedures. The trained network was used to estimate the tool's pose from refined hand joint angles. The network's outputs were the surgical tool orientation and jaw angle, acquired by an optical motion capture system.
Results: Based on the surgeon's feedback during mock micro-surgery, the developed wearable system, with lightweight sensors for motion tracking, did not interfere with the surgery or instrument handling. The wearable motion tracking system used 12 finger/thumb/wrist joint-angle sensors to generate meaningful datasets as inputs to the DNN, with new hand joint angles added as necessary based on comparison of the estimated tool pose against the measured tool pose. The DNN architecture was optimized for the highest estimation accuracy, i.e. the ability to determine the tool pose with the least mean squared error. This novel approach showed that the surgical instrument's pose, an essential requirement for teleoperation, can be accurately estimated from the recorded surgeon's hand/finger movements, with a mean squared error (MSE) of less than 0.3%.
Conclusion: We have developed a system to capture fine movements of the surgeon’s hand during micro-surgery that could enhance future remote teleoperation of similar surgical tools during micro-surgery. More work is needed to refine this approach and confirm its potential role in teleoperation.
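The abstract does not give the paper's actual architecture, hyperparameters, or data; the sketch below is only a minimal NumPy illustration of the general idea it describes — a small feed-forward network regressing tool orientation and jaw angle (4 outputs) from 12 joint-angle inputs, trained by gradient descent to minimize MSE. The layer sizes, learning rate, and synthetic data are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in data: 12 joint angles in, 4 outputs
# (3 tool-orientation angles + jaw angle). In the paper, inputs came
# from wearable sensors and targets from optical motion capture.
X = rng.uniform(-1.0, 1.0, size=(2000, 12))
W_true = rng.normal(size=(12, 4))
Y = np.tanh(X @ W_true)  # synthetic nonlinear hand-to-tool mapping

# Tiny two-layer MLP trained by full-batch gradient descent on MSE.
H = 32  # hidden width (assumed, not from the paper)
W1 = rng.normal(scale=0.3, size=(12, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.3, size=(H, 4));  b2 = np.zeros(4)
lr = 0.05

mse_first = None
for step in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # estimated tool pose
    err = pred - Y
    mse = float(np.mean(err ** 2))
    if mse_first is None:
        mse_first = mse
    # Backpropagate the MSE gradient through both layers.
    g2 = 2.0 * err / len(X)
    dW2 = h.T @ g2; db2 = g2.sum(axis=0)
    gh = (g2 @ W2.T) * (1.0 - h ** 2)  # tanh derivative
    dW1 = X.T @ gh; db1 = gh.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"MSE: {mse_first:.4f} -> {mse:.4f}")
```

Training should drive the MSE well below its initial value on this toy mapping, which mirrors the paper's criterion of selecting the architecture that estimates tool pose with the least mean squared error.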
Metadata
| Item Type: | Article |
| --- | --- |
| Authors/Creators: | |
| Copyright, Publisher and Additional Information: | © 2021 The Author(s). This is an Open Access article published by World Scientific Publishing Company. It is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 (CC BY-NC) License, which permits use, distribution and reproduction in any medium, provided that the original work is properly cited and is used for non-commercial purposes. See: https://creativecommons.org/licenses/by-nc/4.0 |
| Keywords: | Robot-assisted surgery; minimally invasive surgery; machine learning; hand tracking; real-time low-cost hand tracking; feature extraction |
| Dates: | |
| Institution: | The University of Sheffield |
| Academic Units: | The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Automatic Control and Systems Engineering (Sheffield) |
| Funding Information: | European Commission Directorate-General for Research and Innovation, grant number 732515 |
| Depositing User: | Symplectic Sheffield |
| Date Deposited: | 05 Jan 2022 11:34 |
| Last Modified: | 15 Mar 2022 13:16 |
| Status: | Published |
| Publisher: | World Scientific Publishing Company |
| Refereed: | Yes |
| Identification Number: | 10.1142/s2424905x21500045 |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:181959 |
Download
Filename: Mapping_Surgeons_Hand_Finger_Movements_t.pdf
Licence: CC-BY-NC 4.0