Journal article in IEEE Transactions on Biomedical Engineering, 2016

Unsupervised Trajectory Segmentation for Surgical Gesture Recognition in Robotic Training

Abstract

Dexterity and procedural knowledge are two critical skills that surgeons need to master to perform accurate and safe surgical interventions. However, current training systems do not provide an in-depth analysis of surgical gestures that would allow these skills to be precisely assessed. Our objective is to develop a method for the automatic and quantitative assessment of surgical gestures. To reach this goal, we propose a new unsupervised algorithm that can automatically segment kinematic data from robotic training sessions. Without relying on any prior information or model, this algorithm detects critical points in the kinematic data that define relevant spatio-temporal segments. By associating these segments, we obtain an accurate recognition of the gestures involved in the surgical training task. We then perform an advanced analysis and assess our algorithm using datasets recorded during real expert training sessions. Comparing our approach with manual annotations of the surgical gestures, we observe 97.4% accuracy for the learning purpose and an average matching score of 81.9% for the fully automated gesture recognition process. Our results show that trainees' workflow can be followed and that surgical gestures can be automatically evaluated against an expert database. This approach aims to improve training efficiency by minimizing the learning curve.
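
The abstract only sketches the pipeline at a high level: critical points are detected in the kinematic data, the resulting spatio-temporal segments are then associated to recognize gestures. As an illustration of the segmentation step only, the Python sketch below cuts a tool-tip trajectory at local minima of its speed profile. This heuristic, the function name segment_trajectory, and the min_gap parameter are assumptions made for illustration; they are not the criterion or implementation used in the paper.

```python
# Illustrative sketch only: split a tool-tip trajectory at candidate
# "critical points", here assumed to be local minima of the speed profile.
import numpy as np

def segment_trajectory(positions, timestamps, min_gap=10):
    """Return (start, end) index pairs covering the trajectory.

    positions  : (N, 3) array of tool-tip coordinates
    timestamps : (N,) array of sample times in seconds
    min_gap    : minimum number of samples between two boundaries
    """
    # Finite-difference velocity and its magnitude (speed profile).
    velocity = np.diff(positions, axis=0) / np.diff(timestamps)[:, None]
    speed = np.linalg.norm(velocity, axis=1)

    # Candidate critical points: local minima of the speed profile.
    is_min = (speed[1:-1] < speed[:-2]) & (speed[1:-1] < speed[2:])
    candidates = np.where(is_min)[0] + 1

    # Enforce a minimum spacing so noisy minima do not over-segment.
    boundaries = []
    for idx in candidates:
        if not boundaries or idx - boundaries[-1] >= min_gap:
            boundaries.append(int(idx))

    # Stitch boundaries into consecutive (start, end) segments.
    cuts = [0] + boundaries + [len(positions) - 1]
    return list(zip(cuts[:-1], cuts[1:]))
```

The subsequent step described in the abstract, associating the recovered segments and matching them against expert annotations to recognize gestures, is not shown here.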
Main file
Unsupervised Trajectory Segmentation for Surgical Gesture Recognition in Robotic Training.pdf (4.72 MB)
Origin: Publisher files authorized on an open archive

Dates and versions

lirmm-01217023, version 1 (18-10-2015)

Identifiers

Cite

Fabien Despinoy, David Bouget, Germain Forestier, Cédric Penet, Nabil Zemiti, et al.. Unsupervised Trajectory Segmentation for Surgical Gesture Recognition in Robotic Training. IEEE Transactions on Biomedical Engineering, 2016, 63 (6), pp.1280-1291. ⟨10.1109/TBME.2015.2493100⟩. ⟨lirmm-01217023⟩
