Conference paper, Year: 2020

Using Oscillatory Neural Network for Pattern Recognition and Mobile Robot Control

Abstract

Neuro-inspired computing employs brain-inspired hardware to build more efficient and adaptive intelligent systems. By mimicking the human brain and nervous system, such architectures are excellent candidates for solving complex and large-scale associative learning problems. In the framework of the EU H2020 NeurONN project, we implement and explore energy-efficient neuromorphic computing based on oscillatory neural networks (ONN) [1], using metal-insulator-transition (MIT) devices to emulate "neurons" and 2D-material memristors to emulate "synapses", to achieve a truly neuro-inspired computing paradigm enabling AI at the edge [2-4]. As a proof of concept, we have implemented a digital version of the ONN on a programmable logic device (FPGA), using its logic resources to build an ONN of 60 neurons. The ONN is a Hopfield-type neural network and works as an associative memory: during the learning stage, the synapse weight values are computed from several reference images.

To demonstrate ONN functionality in real applications, we present two demonstrators. The first implements a pattern recognition application: the ONN learns several digits and then tries to recognize fuzzy digits in an image. The test bench includes a camera for image acquisition; the acquired images are scaled down to the network's input size and sent to the neural network, which associates the fuzzy digit with one of the digits it has learned.

For the second demonstrator, we have designed an ONN that allows a mobile robot to make obstacle-avoidance decisions (see figure below). The robot is equipped with three proximity sensors whose readings are encoded as a 60-pixel image. Each sensor reading is encoded into one of 6 values, giving a total of 216 possible sensor combinations on which to base a decision. The ONN learns 8 reference images, each corresponding to encoded proximity-sensor data, and achieved a 100% success rate in making obstacle-avoidance decisions by associating the received sensor images with its memorized reference images.
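The abstract describes the ONN as a Hopfield-type associative memory whose synapse weights are computed from reference images during a learning stage. The sketch below illustrates that learning-and-recall principle with a conventional (non-oscillatory) sign-based Hopfield network; the actual system encodes information in oscillator phases on MIT devices and memristive synapses, so this is only an illustration of the associative-memory behaviour, and all function names are hypothetical.

```python
import numpy as np

N = 60  # network size, matching the 60-neuron ONN implemented on the FPGA

def hebbian_weights(patterns):
    """Compute symmetric Hopfield synapse weights from bipolar (+1/-1)
    reference patterns, with no self-connections (zero diagonal)."""
    W = np.zeros((N, N))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, probe, steps=20):
    """Relax a noisy 60-pixel probe toward the closest stored pattern."""
    state = probe.copy()
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

# Usage: store a few 60-pixel digit templates, then present a "fuzzy" digit.
digits = [np.random.choice([-1, 1], size=N) for _ in range(3)]  # placeholder templates
W = hebbian_weights(digits)
noisy = digits[0].copy()
noisy[:6] *= -1                      # flip a few pixels to simulate a fuzzy input
restored = recall(W, noisy)
print(np.array_equal(restored, digits[0]))  # True if the digit was recovered
```

With only a few stored patterns, a 60-neuron network of this kind typically restores a lightly corrupted pattern to the nearest memorized one, which is the associative behaviour exploited in both demonstrators.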
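For the robot demonstrator, each of the three proximity sensors is quantized to 6 values and the three readings are encoded as a 60-pixel image, giving 6 x 6 x 6 = 216 possible sensor states. The exact encoding is not specified in the abstract; the sketch below assumes, purely for illustration, a thermometer-style code with a 20-pixel band per sensor.

```python
import numpy as np

LEVELS = 6              # quantization levels per proximity sensor (from the abstract)
PIXELS_PER_SENSOR = 20  # assumed band width: 3 sensors * 20 pixels = 60-pixel image

def encode_sensors(levels):
    """Hypothetical thermometer encoding of three 6-level sensor readings
    into a bipolar 60-pixel vector for the 60-neuron associative memory."""
    bands = []
    for lvl in levels:                      # each lvl in 0..5
        band = -np.ones(PIXELS_PER_SENSOR, dtype=int)
        active = (lvl + 1) * PIXELS_PER_SENSOR // LEVELS
        band[:active] = 1                   # closer obstacle -> more active pixels
        bands.append(band)
    return np.concatenate(bands)

image = encode_sensors([0, 3, 5])           # one of the 216 possible sensor states
assert image.shape == (60,)
```

Each of the 8 learned reference images corresponds to one encoded sensor configuration, and associative recall maps any of the 216 possible encoded states onto the closest memorized image, from which the avoidance decision is taken.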

Dates and versions

lirmm-03023088, version 1 (25-11-2020)

Identifiers

  • HAL Id: lirmm-03023088, version 1

Cite

Madeleine Abernot, Thierry Gil, Aida Todri-Sanial. Using Oscillatory Neural Network for Pattern Recognition and Mobile Robot Control. SOPHI.A SUMMIT, Nov 2020, Sophia Antipolis, France. ⟨lirmm-03023088⟩