Neuromorphic Sensory Integration for Combining Sound Source Localization and Collision Avoidance

Thorben Schoepe, Daniel Gutierrez-Galan, Juan Pedro Dominguez-Morales, Angel Jimenez-Fernandez, Alejandro Linares-Barranco, Elisabetta Chicca

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic

11 Citations (Scopus)

Abstract

Animals combine various sensory cues with previously acquired knowledge to safely travel towards a target destination. In close analogy to biological systems, we propose a neuromorphic system which decides, based on auditory and visual input, how to reach a sound source without collisions. The development of this sensory integration system, which identifies the shortest possible path, is a key achievement towards autonomous robotics. The proposed neuromorphic system comprises two event-based sensors (the eDVS for vision and the NAS for audition) and the SpiNNaker processor. Open-loop experiments were performed to evaluate the system's performance. In the presence of acoustic stimulation alone, the heading direction points towards the sound source with a Pearson correlation coefficient of 0.89. When visual input is introduced into the network, the heading direction always points to the direction of null optical flow closest to the sound source. Hence, the sensory integration network is able to find the shortest path to the sound source while avoiding obstacles. This work shows that a simple, task-dependent mapping of sensory information can lead to highly complex and robust decisions.
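
The decision rule described in the abstract (steer toward the direction of null optical flow that lies closest to the estimated sound-source direction) can be illustrated with a minimal sketch. This is not the paper's method, which implements the mapping as a spiking network running on SpiNNaker with event-based sensor input; the sketch below only assumes a set of discretized candidate headings and a per-heading optical-flow magnitude, and all names (choose_heading, flow_magnitude, sound_azimuth_deg, flow_threshold) are hypothetical.

import numpy as np

def choose_heading(candidate_headings_deg, flow_magnitude, sound_azimuth_deg,
                   flow_threshold=0.05):
    """Pick the collision-free heading closest to the sound source.

    candidate_headings_deg : 1-D array of candidate heading angles (degrees).
    flow_magnitude         : optical-flow magnitude per candidate heading;
                             near-zero flow is taken to mean "no obstacle".
    sound_azimuth_deg      : estimated azimuth of the sound source (degrees).
    """
    candidate_headings_deg = np.asarray(candidate_headings_deg, dtype=float)
    flow_magnitude = np.asarray(flow_magnitude, dtype=float)

    # Headings with (almost) no optical flow are treated as collision-free.
    free = flow_magnitude < flow_threshold
    if not np.any(free):
        return None  # no safe heading available

    free_headings = candidate_headings_deg[free]
    # Among the collision-free headings, take the one closest to the source.
    return free_headings[np.argmin(np.abs(free_headings - sound_azimuth_deg))]

# Example: an obstacle straight ahead (high flow around 0 degrees) pushes the
# chosen heading to the nearest free direction on the side of the sound source.
headings = np.arange(-90, 91, 15)
flow = np.where(np.abs(headings) <= 15, 1.0, 0.0)
print(choose_heading(headings, flow, sound_azimuth_deg=10))  # -> 30.0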
Original language: English
Title of host publication: 2019 IEEE Biomedical Circuits and Systems Conference (BioCAS)
Publisher: IEEE
Number of pages: 4
DOIs
Publication status: Published - 2019
Externally published: Yes