Fall detection by egocentric vision

Research output: Thesis › Thesis fully internal (DIV)


Abstract

Xueyi Wang's thesis explores fall detection from several complementary angles. The research underscores the importance of egocentric vision-based fall detection systems in both academic and industrial contexts, given their potential to advance research, foster interdisciplinary collaboration, and deliver real-world impact.

Chapter 1 serves as the thesis's introduction, offering a comprehensive overview of the research background, objectives, and methodology. It explains the rationale for choosing egocentric vision for fall detection and sets the stage for the entire study.

Chapter 2 provides an extensive literature review of fall detection systems, tracing the evolution of sensor technology and algorithms over the past six years. The review emphasizes the challenges posed by the absence of benchmark datasets and the potential of sensor fusion. It also explores broader applications of autonomous health monitoring systems beyond fall detection.

Chapter 3 proposes a methodology for fall detection using RGB video streams from an egocentric camera. The approach leverages transfer learning and a classification model, demonstrating promising results in detecting falls, particularly in indoor settings.
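The transfer-learning idea described above can be sketched in general terms: a pretrained backbone is kept frozen as a feature extractor, and only a small classification head is trained on its output. In the minimal sketch below, a fixed random projection stands in for the pretrained CNN, and the toy "fall"/"no-fall" data and all hyperparameters are hypothetical; none of this reproduces the thesis's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "backbone": a fixed random projection standing in for a
# pretrained CNN feature extractor (hypothetical stand-in).
W_backbone = rng.normal(size=(64, 32)) / np.sqrt(64)

def backbone(x):
    # Frozen feature extractor: weights are never updated.
    return np.tanh(x @ W_backbone)

# Toy binary task: "fall" vs "no fall" clips encoded as 64-d vectors.
X = rng.normal(size=(200, 64))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# Train only the classification head (logistic regression) on the
# frozen features, by plain gradient descent.
feats = backbone(X)
w = np.zeros(32)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
    w -= 0.5 * feats.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
acc = float(np.mean((p > 0.5) == y))  # training accuracy of the head
```

Freezing the backbone keeps the number of trainable parameters small, which is the usual reason transfer learning works with limited labelled fall data.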

Chapter 4 presents a practical fall detection and activity classification system that utilizes an RGB wearable camera and a late decision fusion technique. The effectiveness of this approach is confirmed through external validation, with the neck-worn camera proving advantageous for enhanced performance and usability.
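Late decision fusion, in its generic form, combines the per-class scores of independently trained classifiers at decision time rather than merging their inputs or features. The weights and probability values in this sketch are illustrative only and are not taken from the thesis.

```python
import numpy as np

def late_fusion(prob_a, prob_b, w_a=0.5):
    """Late decision fusion: weighted average of two models'
    class-probability vectors, then argmax (weights hypothetical)."""
    fused = w_a * np.asarray(prob_a) + (1.0 - w_a) * np.asarray(prob_b)
    return int(np.argmax(fused)), fused

# Two classifiers disagree on a clip; the fused decision follows
# the more confident model.
camera_probs = [0.30, 0.70]   # e.g. video-stream model: leans "fall"
motion_probs = [0.55, 0.45]   # e.g. motion-feature model: leans "no fall"
label, fused = late_fusion(camera_probs, motion_probs)
```

Fusing at the decision level keeps each modality's pipeline independent, so one sensor can be replaced or dropped without retraining the others.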

Chapter 5 compares event-based approaches (RSNN and LSTM) with a traditional CNN on RGB frames for online fall detection. While the event-based methods are somewhat less effective, their lower computational complexity makes them suitable for embedded fall detection solutions.
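The recurrent spiking networks (RSNNs) mentioned above are built from spiking units such as the leaky integrate-and-fire (LIF) neuron, whose sparse binary activity is what gives event-based models their low computational cost. A minimal sketch of one discrete-time LIF update, with hypothetical constants, is:

```python
import numpy as np

def lif_step(v, input_current, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """One Euler step of a leaky integrate-and-fire neuron, the basic
    unit of a recurrent spiking network (constants are hypothetical)."""
    v = v + (-v + input_current) / tau  # leaky integration of input
    spike = v >= v_thresh               # emit a binary spike event
    v = np.where(spike, v_reset, v)     # reset membrane after a spike
    return v, spike

# Three neurons driven by constant currents of different strength:
# only the strongly driven neuron crosses threshold and spikes.
v = np.zeros(3)
spikes = []
for t in range(50):
    v, s = lif_step(v, input_current=np.array([0.0, 0.5, 2.0]))
    spikes.append(s)
total = np.sum(spikes, axis=0)  # spike counts per neuron
```

Because downstream computation is only triggered by spikes, inactive neurons cost almost nothing, which is the efficiency argument for embedded deployment.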

Additionally, a new dataset, the largest among those collected from egocentric cameras, is introduced in the thesis. This dataset, comprising 10,278 clips, covers various falls and non-fall activities, incorporating multi-modal data captured under diverse conditions, providing a more realistic representation of real-world scenarios. The dataset's details, including its comprehensive design and collection process, are provided for future investigations in fall detection and related fields.
Original language: English
Qualification: Doctor of Philosophy
Awarding Institution:
  • University of Groningen
Supervisors/Advisors:
  • Karastoyanova, Dimka, Supervisor
  • Azzopardi, George, Supervisor
Award date: 4-Mar-2024
Place of Publication: [Groningen]
Publication status: Published - 2024
