Efficient Spiking and Artificial Neural Networks for Event Cameras

Alexander Kugele

Research output: Thesis › Thesis fully internal (DIV)


Abstract

Event cameras have become attractive alternatives to regular frame-based cameras in many scenarios, from consumer electronics and surveillance to autonomous driving.
Their novel sensor paradigm of asynchronously detecting brightness changes in a scene makes them faster, more energy-efficient and less susceptible to global illumination.
Processing these event streams calls for algorithms that are as efficient as the camera itself, while remaining competitive with frame-based computer vision on tasks like object recognition and detection.

This thesis contributes methods to obtain efficient neural networks for classification and object detection in event streams.
We adopt ANN-to-SNN (artificial neural network to spiking neural network) conversion for sequential data such as videos and event streams, improving the state of the art in accuracy and energy efficiency.
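Not taken from the thesis itself, but the core idea behind rate-based ANN-to-SNN conversion can be sketched as follows: an integrate-and-fire neuron with soft reset, driven by a constant input, fires at a rate that approximates a clipped ReLU activation, so a trained ANN's units can be replaced by spiking neurons. The function name `if_rate` and the parameters are illustrative assumptions.

```python
import numpy as np

def if_rate(a, T=100, theta=1.0):
    """Firing rate of an integrate-and-fire neuron with soft reset,
    driven by constant input `a` for T discrete time steps.

    The returned rate approximates min(max(a, 0), 1), i.e. a clipped
    ReLU -- the correspondence exploited by rate-based ANN-to-SNN
    conversion."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += a                # integrate the input
        if v >= theta:        # threshold crossing -> emit a spike
            v -= theta        # soft reset: subtract threshold
            spikes += 1
    return spikes / T
```

For example, `if_rate(0.3)` is close to 0.3, while negative inputs never spike and large inputs saturate at one spike per step, mirroring the clipped activation.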
We propose a novel hybrid SNN-ANN network architecture, in which mixed SNN and ANN layers are trained jointly using surrogate gradients.
These hybrid networks are more efficient, even when compared with directly trained and converted SNNs.
To detect objects from only a small number of events, we propose a filter and a memory, both of which improve results during inference.
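Not from the thesis itself — a minimal sketch of the surrogate-gradient idea mentioned above, assuming a discrete-time leaky integrate-and-fire (LIF) neuron: the forward pass uses the non-differentiable Heaviside spike function, while the backward pass substitutes a smooth fast-sigmoid derivative (as in SuperSpike-style training). All names and constants here are illustrative.

```python
import numpy as np

def heaviside(v, theta=1.0):
    # Forward pass: emit a spike where the membrane potential
    # crosses the threshold (non-differentiable step function).
    return np.asarray(v >= theta, dtype=float)

def surrogate_grad(v, theta=1.0, beta=10.0):
    # Backward pass: replace the zero-almost-everywhere derivative
    # of the step with a smooth fast-sigmoid derivative, peaking
    # at 1.0 on the threshold and decaying away from it.
    return 1.0 / (beta * np.abs(v - theta) + 1.0) ** 2

def lif_step(v, x, tau=0.9, theta=1.0):
    # One discrete-time LIF update: leaky integration of input x,
    # spike generation, and soft reset by threshold subtraction.
    v = tau * v + x
    s = heaviside(v, theta)
    v = v - s * theta
    return v, s
```

During training, gradients would flow through `surrogate_grad` wherever `heaviside` was applied in the forward pass, which is what makes end-to-end optimization of the spiking layers possible.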

Our networks advance the state-of-the-art in event stream processing and contribute to the success of event cameras.
Given suitable neuromorphic hardware, our spiking neural networks enable event cameras to be used in scenarios with a limited energy budget.
Our proposed hybrid architecture can guide the design of novel hybrid neuromorphic devices that combine efficient sparse and dense processing.
Original language: English
Qualification: Doctor of Philosophy
Awarding Institution:
  • University of Groningen
Supervisors/Advisors:
  • Chicca, Elisabetta, Supervisor
  • Pfeiffer, Michael, Co-supervisor, External person
Award date: 19-Dec-2023
Place of Publication: [Groningen]
Publisher:
DOIs:
Publication status: Published - 2023
