Live Demonstration: Neuromorphic Sensory Integration for Combining Sound Source Localization and Collision Avoidance

T. Schoepe, D. Gutierrez-Galan, J. P. Dominguez-Morales, A. Jimenez-Fernandez, A. Linares-Barranco, E. Chicca

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review



The brain is able to solve complex tasks in real time by combining different sensory cues with previously acquired knowledge. Inspired by the brain, we designed a neuromorphic demonstrator which combines auditory and visual input to find an obstacle-free direction closest to the sound source. The system consists of two event-based sensors (the eDVS for vision and the NAS for audition) mounted on a pan-tilt unit, and a spiking neural network implemented on the SpiNNaker platform. By combining the different sensory information, the demonstrator is able to point at the sound source direction while avoiding obstacles in real time.
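The fusion behaviour described above, steering toward the sound source while avoiding visually detected obstacles, can be illustrated with a toy sketch. This is not the paper's spiking SpiNNaker implementation; the function name, the sector-based obstacle representation, and all parameters are hypothetical simplifications.

```python
def nearest_free_direction(source_az, blocked, step=5, max_dev=90):
    """Return the azimuth (degrees) closest to the sound source
    that does not fall inside any blocked sector.

    Toy illustration only -- not the authors' spiking network.
    blocked: list of (lo, hi) azimuth intervals occupied by obstacles.
    """
    def is_free(az):
        # A candidate direction is free if no obstacle sector contains it.
        return all(not (lo <= az <= hi) for lo, hi in blocked)

    # Search outward from the sound-source azimuth in small steps,
    # alternating left and right, and return the first free direction.
    for dev in range(0, max_dev + 1, step):
        for cand in (source_az - dev, source_az + dev):
            if is_free(cand):
                return cand
    return None  # no free direction within the search cone

# Example: source straight ahead (0 deg), obstacle spanning -10..10 deg
print(nearest_free_direction(0, [(-10, 10)]))  # -> -15
```

The same priority is implicit in the demonstrator: the auditory estimate sets the goal direction, and the visual input vetoes headings that would cause a collision.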
Original language: English
Title of host publication: 2020 IEEE International Symposium on Circuits and Systems (ISCAS)
Number of pages: 1
ISBN (Electronic): 978-1-7281-3320-1
ISBN (Print): 978-1-7281-3321-8
Publication status: Published - 2020
Event: IEEE International Symposium on Circuits and Systems (ISCAS) 2020 - Seville, Spain
Duration: 10-Oct-2020 – 21-Oct-2020


Conference: IEEE International Symposium on Circuits and Systems (ISCAS) 2020
