Specialized visual sensor coupled to a dynamic neural field for embedded attentional process

M. Rasamuel*, L. Khacef, L. Rodriguez, B. Miramond

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

1 Citation (Scopus)

Abstract

Machine learning has recently taken the leading role in machine vision through deep learning algorithms, which have brought the best results in object detection, recognition and tracking. Nevertheless, these systems are computationally expensive, since they need to process whole images from the camera to produce such results. Consequently, they require significant hardware resources, which limits their use in embedded applications. On the other hand, biological systems offer a more efficient mechanism: the brain relies on an attentional process to focus on the relevant information in the environment, and hence processes only a sub-part of the visual field at a time. In this work, we implement a brain-inspired attentional process through dynamic neural fields, integrated with two types of specialized visual sensors: frame-based and event-based cameras. We compare the results obtained in terms of tracking performance and power consumption in the context of embedded recognition and tracking.
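The entry does not include implementation details, but dynamic neural fields of the Amari type are the standard substrate for this kind of attentional selection: lateral excitation and surround inhibition make the field converge on a single activity bump over the most salient stimulus. The sketch below is a minimal, illustrative 2D field driven by a toy saliency map; the field size, kernel shape, time constant, resting level and input encoding are all assumptions and are not taken from the paper.

```python
import numpy as np

# Minimal 2D dynamic neural field (Amari-style) sketch -- illustrative only,
# not the authors' implementation; all parameters below are assumptions.
SIZE = 64          # field resolution (assumed)
TAU = 10.0         # time constant (assumed)
H = -0.5           # resting level (assumed)
DT = 1.0           # Euler integration step (assumed)

def mexican_hat_kernel(size, sigma_exc=2.0, sigma_inh=6.0, a_exc=1.0, a_inh=0.7):
    """Difference-of-Gaussians lateral interaction kernel (assumed parameters)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    d2 = xx**2 + yy**2
    return a_exc * np.exp(-d2 / (2 * sigma_exc**2)) - a_inh * np.exp(-d2 / (2 * sigma_inh**2))

KERNEL = mexican_hat_kernel(SIZE)

def step(u, stimulus):
    """One Euler step of tau * du/dt = -u + w * f(u) + I + h."""
    f_u = 1.0 / (1.0 + np.exp(-4.0 * u))  # sigmoidal output nonlinearity
    # Lateral interaction via FFT-based circular convolution with the kernel
    lateral = np.real(np.fft.ifft2(np.fft.fft2(f_u) * np.fft.fft2(np.fft.ifftshift(KERNEL))))
    du = (-u + lateral + stimulus + H) / TAU
    return u + DT * du

# Usage: feed a saliency/event map; the field forms a bump on the strongest target.
u = np.zeros((SIZE, SIZE))
stimulus = np.zeros((SIZE, SIZE))
stimulus[20, 30] = 2.0                    # a single salient location (toy input)
for _ in range(100):
    u = step(u, stimulus)
print("Attention focus at:", np.unravel_index(np.argmax(u), u.shape))
```

The FFT-based convolution is only one possible way to compute the lateral interaction; the bump that emerges indicates which sub-part of the visual field would be passed on to downstream recognition, which is the cost-saving idea the abstract describes.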
Original language: English
Title of host publication: SAS 2019 - 2019 IEEE Sensors Applications Symposium, Conference Proceedings
Publisher: IEEE
Number of pages: 6
ISBN (Print): 978-1-5386-7713-1
DOIs
Publication status: Published - 6-May-2019
Externally published: Yes
Event: 2019 IEEE Sensors Applications Symposium (SAS) - Sophia Antipolis, France
Duration: 11-Mar-2019 - 13-Mar-2019

Conference

Conference: 2019 IEEE Sensors Applications Symposium (SAS)
Country/Territory: France
City: Sophia Antipolis
Period: 11/03/2019 - 13/03/2019
