Fall Detection Research with Egocentric Cameras
Overview
Fall-related injuries are a leading cause of accidental death and hospitalization, especially among the elderly. Traditional fall detection systems rely on wearable sensors or fixed ambient cameras, which suffer from limited coverage, privacy concerns, or user discomfort.
This project explores a non-intrusive approach to fall detection based on body-worn egocentric cameras. Over the course of five years (2019–2024), we built a comprehensive research pipeline spanning dataset creation, novel algorithm design, and multi-modal fusion:
- EGOFALLS Dataset – One of the first large-scale visual-audio fall detection benchmarks from an egocentric perspective, now open-sourced and adopted by multiple international research teams.
  Wang, X. (2024). Egofalls: A visual-audio dataset and benchmark for fall detection using egocentric cameras. International Conference on Pattern Recognition (ICPR).
- Non-intrusive First-Person Vision Fall Detection – An end-to-end fall detection system using body-worn egocentric cameras, combining optical flow and deep learning to achieve accurate, privacy-preserving fall detection without requiring external infrastructure.
  Wang, X., Talavera, E., Karastoyanova, D., & Azzopardi, G. (2023). Fall detection with a non-intrusive and first-person vision approach. IEEE Sensors Journal.
- Event Camera + Spiking Neural Network – A visual-audio fusion framework combining neuromorphic event cameras with spiking neural networks, achieving robust detection across diverse environmental and lighting conditions.
  Wang, X., Risi, N., Talavera, E., Chicca, E., Karastoyanova, D., & Azzopardi, G. (2023). Fall detection with event-based data: A case study. International Conference on Computer Analysis of Images and Patterns (CAIP), pp. 33-42.
- Literature Survey – A comprehensive survey of elderly fall detection systems, providing a structured taxonomy of existing approaches.
  Wang, X., Ellul, J., & Azzopardi, G. (2020). Elderly fall detection systems: A literature survey. Frontiers in Robotics and AI, 7, 71.
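The motion-based idea behind the first-person vision system can be illustrated with a toy sketch. Here a simple frame-difference magnitude stands in for the dense optical flow used in the papers (which feed flow into a deep network), and the threshold value is purely illustrative, not one reported in the publications:

```python
import numpy as np

def motion_magnitude(prev_frame, next_frame):
    """Mean absolute per-pixel change between two grayscale frames.

    A crude stand-in for dense optical flow: large, sudden motion
    in an egocentric view is a cue for a fall.
    """
    return np.abs(next_frame.astype(float) - prev_frame.astype(float)).mean()

def detect_fall(frames, threshold=40.0):
    """Flag a fall if any consecutive frame pair exceeds the motion threshold.

    `threshold` is an arbitrary illustrative value, not from the papers.
    """
    return any(
        motion_magnitude(a, b) > threshold
        for a, b in zip(frames, frames[1:])
    )

# Synthetic example: a static scene followed by one abrupt change.
rng = np.random.default_rng(0)
still = [np.full((64, 64), 100, dtype=np.uint8)] * 5
jolt = [rng.integers(0, 255, (64, 64), dtype=np.uint8)]
print(detect_fall(still))         # no motion -> False
print(detect_fall(still + jolt))  # abrupt change -> True
```

In the actual system, this hand-set threshold is replaced by a learned classifier operating on optical-flow and appearance features.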
EGOFALLS Dataset
EGOFALLS is a visual-audio dataset and benchmark for fall detection using egocentric cameras, covering 12 activity types: 4 kinds of falls and 8 kinds of non-falls.
Data Collection
| Item | Detail |
|---|---|
| Duration | 2018–2022 |
| Location | Groningen, Netherlands |
| Equipment | OnReal G1 (RGB), CAMMHD Bodycams (RGB and Infrared) |
| Subjects | 14 (12 male, 2 female), age 20–60 |
| Camera Position | Neck and Waist |
| Environment | Indoor and Outdoor |
| Modalities | Vision and Audio |
Dataset Structure
The dataset follows a hierarchical structure, with one directory per subject. Each subject's directory holds that participant's video clips, grouped by recording camera (C1 and C2 refer to camera 1 and camera 2).
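A subject/camera hierarchy like this can be indexed with a few lines of code. Note that the directory and file names below (`Subject01`, `C1`, `.mp4`) are assumptions for illustration only, not the dataset's actual naming scheme:

```python
from pathlib import Path

def index_clips(root):
    """Index video clips under a hypothetical subject/camera layout.

    Assumed layout (illustrative, NOT the dataset's documented scheme):
        root/Subject01/C1/fall_001.mp4
    Returns {subject_name: {camera_name: [clip paths]}}.
    """
    index = {}
    for subject_dir in sorted(Path(root).iterdir()):
        if not subject_dir.is_dir():
            continue
        cameras = {}
        for cam_dir in sorted(subject_dir.iterdir()):
            if cam_dir.is_dir():
                cameras[cam_dir.name] = sorted(cam_dir.glob("*.mp4"))
        index[subject_dir.name] = cameras
    return index
```

Such an index makes it straightforward to iterate over all clips of one modality or one camera position when building train/test splits.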
Related Publications
Wang, X. (2024). Egofalls: A visual-audio dataset and benchmark for fall detection using egocentric cameras. International Conference on Pattern Recognition (ICPR).
Wang, X., Talavera, E., Karastoyanova, D., & Azzopardi, G. (2023). Fall detection with a non-intrusive and first-person vision approach. IEEE Sensors Journal.
Wang, X., Risi, N., Talavera, E., Chicca, E., Karastoyanova, D., & Azzopardi, G. (2023). Fall detection with event-based data: A case study. International Conference on Computer Analysis of Images and Patterns (CAIP), pp. 33-42.
Wang, X., Talavera, E., Karastoyanova, D., & Azzopardi, G. (2021). Fall detection and recognition from egocentric visual data: A case study. International Conference on Pattern Recognition (ICPR), pp. 431-443.
Wang, X., Ellul, J., & Azzopardi, G. (2020). Elderly fall detection systems: A literature survey. Frontiers in Robotics and AI, 7, 71.
