From Perception to Semantics: An Environment Representation Model Based on Human-Robot Interactions

Yohan Breux 1 Sébastien Druon 1 René Zapata 1
1 EXPLORE - Robotique mobile pour l'exploration de l'environnement
LIRMM - Laboratoire d'Informatique de Robotique et de Microélectronique de Montpellier
Abstract: In order to be autonomous, a robot needs some kind of representation of its surrounding environment. From a general point of view, basic robotic tasks (such as localization, mapping, or object handling) can be carried out with only very simple geometric primitives, usually extracted from raw sensor data. But whenever an interaction with a human being is involved, robots must have an understanding of concepts expressed in human natural language. In most approaches, this is done through a prebuilt ontology. In this paper, we try to bridge the gap between data-driven methods and semantics-based approaches by introducing a three-layer environment model based on “instances”: observations, built from sensor data, of concepts stored in a knowledge graph. We focus on our original object-oriented ontology construction and illustrate the flow of our model in a simple showcase.
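The abstract describes a layered architecture in which raw perception is grouped into "instances" that point into a semantic knowledge graph. As a rough, hypothetical illustration only (the class names, fields, and linking scheme below are assumptions for this sketch, not the authors' implementation), one might model the three layers like this:

    # Minimal sketch of the three-layer idea: perception-level observations are
    # grouped into an instance, and the instance is linked to a concept node of
    # a knowledge graph. All names below are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import Dict, List, Set

    @dataclass
    class Concept:
        """Semantic layer: one node of the knowledge graph (ontology)."""
        name: str                                        # e.g. "mug"
        parents: Set[str] = field(default_factory=set)   # e.g. {"container"}
        properties: Dict[str, str] = field(default_factory=dict)

    @dataclass
    class Observation:
        """Perception layer: a geometric primitive extracted from sensor data."""
        sensor: str                  # e.g. "rgbd_camera"
        centroid: List[float]        # 3-D position in the map frame
        descriptor: List[float]      # appearance/shape feature vector

    @dataclass
    class Instance:
        """Intermediate layer: the observations of one physical object,
        linked to the concept it is believed to instantiate."""
        instance_id: int
        concept: Concept
        observations: List[Observation] = field(default_factory=list)

        def add_observation(self, obs: Observation) -> None:
            self.observations.append(obs)

    # Example: a "mug" concept in the ontology, observed once by the robot.
    mug = Concept("mug", parents={"container"}, properties={"graspable": "true"})
    inst = Instance(instance_id=0, concept=mug)
    inst.add_observation(Observation("rgbd_camera", [1.2, 0.4, 0.8], [0.1, 0.9, 0.3]))
    print(f"Instance {inst.instance_id} is a '{inst.concept.name}' "
          f"with {len(inst.observations)} observation(s)")

In such a scheme, a natural-language query ("bring me the mug") would be resolved against the concept layer, while localization and grasping would operate on the geometric observations attached to the matching instance.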
Document type: Conference paper

https://hal-lirmm.ccsd.cnrs.fr/lirmm-01926183
Citation

Yohan Breux, Sébastien Druon, René Zapata. From Perception to Semantics: An Environment Representation Model Based on Human-Robot Interactions. RO-MAN: Robot and Human Interactive Communication, Aug 2018, Nanjing, China. pp.672-677, ⟨10.1109/ROMAN.2018.8525527⟩. ⟨lirmm-01926183⟩
