Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot

Abstract: Advances in brain-computer interface (BCI) technology allow people to actively interact with the world through surrogates. Controlling real humanoid robots with a BCI as intuitively as we control our own body remains a challenge for current research in robotics and neuroscience. To interact successfully with the environment, the brain integrates multiple sensory cues to form a coherent representation of the world. Cognitive neuroscience studies show that multisensory integration may yield a gain over a single modality and ultimately improve overall sensorimotor performance. For example, reactivity to simultaneous visual and auditory stimuli may be higher than to the same stimuli delivered in isolation or in temporal sequence. Yet little is known about whether audio-visual integration can improve the control of a surrogate. To explore this issue, we provided human footstep sounds as auditory feedback to BCI users while they controlled a humanoid robot. Participants were asked to steer their robot surrogate and perform a pick-and-place task through an SSVEP-based BCI. We found that audio-visual synchrony between the footstep sounds and the humanoid's actual gait reduced the time required to steer the robot. Thus, auditory feedback congruent with the humanoid's actions may improve the BCI user's motor decisions and enhance the feeling of control over the robot. Our results shed light on the possibility of improving robot control by providing multisensory feedback to the BCI user.
Document type:
Journal article
Frontiers in Neurorobotics, Frontiers, 2014, 8 (20), pp.001-008. <http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4060053/>. <10.3389/fnbot.2014.00020>

https://hal-lirmm.ccsd.cnrs.fr/lirmm-01058974
Contributor: François Keith
Submitted on: Thursday, August 28, 2014 - 21:35:41
Last modified on: Friday, June 9, 2017 - 10:42:30
Document(s) archived on: Saturday, November 29, 2014 - 10:56:30

File

2014_frontiers_tidoni-audio-vi...
Publisher files allowed on an open archive

Identifiers

Collections

IDH | LIRMM | MIPS

Citation

Emmanuele Tidoni, Pierre Gergondet, Abderrahmane Kheddar, Salvatore Aglioti. Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot. Frontiers in Neurorobotics, Frontiers, 2014, 8 (20), pp.001-008. <http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4060053/>. <10.3389/fnbot.2014.00020>. <lirmm-01058974>

Metrics

Record views: 205
Document downloads: 318