Unsupervised Trajectory Segmentation for Surgical Gesture Recognition in Robotic Training

Abstract: Dexterity and procedural knowledge are two critical skills that surgeons need to master to perform accurate and safe surgical interventions. However, current training systems do not provide an in-depth analysis of surgical gestures that would allow these skills to be assessed precisely. Our objective is to develop a method for the automatic and quantitative assessment of surgical gestures. To reach this goal, we propose a new unsupervised algorithm that can automatically segment kinematic data from robotic training sessions. Without relying on any prior information or model, this algorithm detects critical points in the kinematic data that define relevant spatio-temporal segments. By associating these segments, we obtain an accurate recognition of the gestures involved in the surgical training task. We then perform an advanced analysis and assess our algorithm using datasets recorded during real expert training sessions. Comparing our approach with manual annotations of the surgical gestures, we observe 97.4% accuracy for the learning purpose and an average matching score of 81.9% for the fully automated gesture recognition process. Our results show that trainees' workflow can be followed and that surgical gestures can be automatically evaluated against an expert database. This approach should improve training efficiency by shortening the learning curve.
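The record does not detail the authors' segmentation algorithm, so the sketch below is only a loose illustration of the general idea of detecting critical points in kinematic data: it flags local minima of tool-tip speed as candidate segment boundaries. The function name, parameters, sampling rate, and the SciPy-based heuristic are all assumptions for illustration, not the method published in the paper.

```python
import numpy as np
from scipy.signal import argrelextrema, savgol_filter

def candidate_segment_boundaries(positions, dt, smooth_window=15, polyorder=3):
    """Return indices of candidate critical points in a Cartesian tool trajectory.

    positions: (N, 3) array of tool-tip positions sampled at a fixed period dt.
    Local minima of the speed profile (pauses, direction changes) are a common
    heuristic for gesture boundaries; this is illustrative only.
    """
    # Smooth each coordinate to limit noise amplification when differentiating.
    smoothed = savgol_filter(positions, smooth_window, polyorder, axis=0)
    # Finite-difference velocity and its magnitude (speed).
    velocity = np.gradient(smoothed, dt, axis=0)
    speed = np.linalg.norm(velocity, axis=1)
    # Local minima of speed serve as candidate spatio-temporal cut points.
    minima = argrelextrema(speed, np.less, order=10)[0]
    return minima

if __name__ == "__main__":
    # Synthetic trajectory at a hypothetical 100 Hz sampling rate.
    t = np.linspace(0, 10, 1000)
    traj = np.stack([np.sin(t), np.cos(2 * t), 0.1 * t], axis=1)
    cuts = candidate_segment_boundaries(traj, dt=0.01)
    print("candidate boundaries at samples:", cuts)
```

In the paper's pipeline, such candidate segments would then be associated and matched against an expert database to recognize gestures; that step is not reproduced here.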
Document type:
Journal article
IEEE Transactions on Biomedical Engineering, Institute of Electrical and Electronics Engineers, 2016, 63 (6), pp. 1280-1291. DOI: 10.1109/TBME.2015.2493100


https://hal-lirmm.ccsd.cnrs.fr/lirmm-01217023
Contributor: Fabien Despinoy
Submitted on: Sunday, October 18, 2015 - 16:54:08
Last modified on: Saturday, December 1, 2018 - 15:46:02
Long-term archiving on: Tuesday, January 19, 2016 - 10:31:05


Identifiers

HAL Id: lirmm-01217023
DOI: 10.1109/TBME.2015.2493100

Citation

Fabien Despinoy, David Bouget, Germain Forestier, Cédric Penet, Nabil Zemiti, et al. Unsupervised Trajectory Segmentation for Surgical Gesture Recognition in Robotic Training. IEEE Transactions on Biomedical Engineering, Institute of Electrical and Electronics Engineers, 2016, 63 (6), pp. 1280-1291. DOI: 10.1109/TBME.2015.2493100. HAL Id: lirmm-01217023


Metrics

Record views: 1448
File downloads: 624