Efficient Spiking and Artificial Neural Networks for Event Cameras

Alexander Kugele

Research output


Abstract

Event cameras have become attractive alternatives to regular frame-based cameras in many scenarios, ranging from consumer electronics and surveillance to autonomous driving.
Their novel sensor paradigm of asynchronously detecting brightness changes in a scene makes them faster, more energy-efficient and less susceptible to global illumination.
Processing these event streams calls for algorithms that are as efficient as the camera itself, while remaining competitive with frame-based computer vision on tasks such as object recognition and detection.
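As a purely illustrative aside (not part of the thesis), each event is commonly represented as a tuple of pixel coordinates, timestamp and polarity. The following minimal NumPy sketch, with a hypothetical `events_to_frame` helper and an assumed array layout, shows how such a sparse stream could be accumulated into a dense, frame-like input for conventional networks:

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate an (N, 4) event array with columns (x, y, t, polarity)
    into a signed 2D histogram, a common dense input for ANNs.
    This layout and helper are illustrative assumptions, not the thesis code."""
    frame = np.zeros((height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    polarity = events[:, 3]              # +1 for brightness increase, -1 for decrease
    np.add.at(frame, (y, x), polarity)   # unbuffered scatter-add of each event
    return frame

# Hypothetical usage: five events on a 4x4 sensor
events = np.array([
    [0, 0, 0.001,  1.0],
    [1, 2, 0.002, -1.0],
    [1, 2, 0.003, -1.0],
    [3, 3, 0.004,  1.0],
    [0, 0, 0.005,  1.0],
])
print(events_to_frame(events, height=4, width=4))
```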

This thesis contributes methods to obtain efficient neural networks for classification and object detection in event streams.
We adopt ANN-to-SNN (artificial neural network to spiking neural network) conversion for sequential data such as videos and event streams, improving the state of the art in accuracy and energy efficiency.
We propose a novel hybrid SNN-ANN architecture, in which a mixed SNN-ANN network is trained using surrogate gradients (sketched below).
These hybrid networks are more efficient, even when compared to trained and converted SNNs.
To detect objects from only a small number of events, we propose a filter and a memory, both of which improve results during inference.
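As a hedged illustration of the surrogate-gradient idea mentioned above (not the thesis implementation; the neuron model, `SpikeFn`, `beta` and all other names are assumptions of this sketch), a leaky integrate-and-fire layer can be made trainable with backpropagation by keeping a hard spike in the forward pass and substituting a smooth surrogate for its derivative in the backward pass:

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()                          # spike where the membrane crosses threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2   # fast-sigmoid-style stand-in for the step's derivative
        return grad_output * surrogate


def lif_step(x, v, beta=0.9, threshold=1.0):
    """One time step of a leaky integrate-and-fire neuron with soft reset (illustrative parameters)."""
    v = beta * v + x                                    # leaky integration of the input current
    spike = SpikeFn.apply(v - threshold)
    v = v - spike * threshold                           # subtract the threshold after a spike
    return spike, v


# Hypothetical usage: run a small population for a few time steps
v = torch.zeros(8)
for _ in range(5):
    spike, v = lif_step(torch.rand(8), v)
```

In a hybrid SNN-ANN network of the kind described above, spiking layers trained this way would handle the sparse event input, while conventional ANN layers would perform the downstream dense processing.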

Our networks advance the state of the art in event stream processing and contribute to the success of event cameras.
Given suitable neuromorphic hardware, our spiking neural networks enable event cameras to be used in scenarios with a limited energy budget.
Our proposed hybrid architecture can guide the design of novel hybrid neuromorphic devices that combine efficient sparse and dense processing.
Original language: English
Qualification: Doctor of Philosophy
Awarding institution:
  • Rijksuniversiteit Groningen
Supervisor(s)/advisor:
  • Chicca, Elisabetta, Supervisor
  • Pfeiffer, Michael, Co-supervisor, External person
Award date: 19 Dec 2023
Place of publication: [Groningen]
Status: Published - 2023
