TY - CONF
T1 - Neuromorphic Sensory Integration for Combining Sound Source Localization and Collision Avoidance
AU - Schoepe, Thorben
AU - Gutierrez-Galan, Daniel
AU - Dominguez-Morales, Juan Pedro
AU - Jimenez-Fernandez, Angel
AU - Linares-Barranco, Alejandro
AU - Chicca, Elisabetta
PY - 2019
Y1 - 2019
N2 - Animals combine various sensory cues with previously acquired knowledge to safely travel towards a target destination. In close analogy to biological systems, we propose a neuromorphic system which decides, based on auditory and visual input, how to reach a sound source without collisions. The development of this sensory integration system, which identifies the shortest possible path, is a key step towards autonomous robotics. The proposed neuromorphic system comprises two event-based sensors (the eDVS for vision and the NAS for audition) and the SpiNNaker processor. Open-loop experiments were performed to evaluate the system performance. In the presence of acoustic stimulation alone, the heading direction points to the direction of the sound source with a Pearson correlation coefficient of 0.89. When visual input is introduced into the network, the heading direction always points at the direction of null optical flow closest to the sound source. Hence, the sensory integration network is able to find the shortest path to the sound source while avoiding obstacles. This work shows that a simple, task-dependent mapping of sensory information can lead to highly complex and robust decisions.
DO - 10.1109/BIOCAS.2019.8919202
M3 - Conference contribution
BT - 2019 IEEE Biomedical Circuits and Systems Conference (BioCAS)
PB - IEEE
ER -