Automated human fall recognition from visual data

Albawendi, SG (ORCID: https://orcid.org/0000-0002-3439-7777), 2019. Automated human fall recognition from visual data. PhD, Nottingham Trent University.


Abstract

Falls are one of the greatest risks for older adults living alone at home. This research presents a novel vision-based fall detection approach to support independent living for older adults in an indoor environment. The aim of the research was to investigate appropriate methods for detecting falls by analysing the motion and shape of the human body.

Several techniques for automatically detecting falls have been proposed. Existing technologies can be classified into three main groups of fall detectors: ambient device-based, wearable sensor-based and computer vision-based techniques. Ambient device-based techniques use vibration or pressure sensors to capture sound and vibration for detecting the presence and position of a person. Although these devices are inexpensive and do not disturb the user, their detection rate is rather low and many false alarms are generated. Wearable devices use sensors such as accelerometers and gyroscopes to capture body-movement information and detect falls. However, older adults often forget to wear them, and wearable sensors are considered invasive as they require the user to wear and carry various uncomfortable devices. Much work has been undertaken to investigate the use of visual-based sensors for fall detection using single, multiple and omnidirectional cameras.

The research reported in this thesis uses a single camera to detect a moving object using a background subtraction algorithm. The next step is to extract robust features that describe the change in human shape and discriminate falls from other activities such as lying down and sitting. These features are based on motion, change in human shape, projection histograms and the temporal change of head position. Features extracted from the human silhouette are finally fed into various machine learning classifiers for fall detection evaluation.
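The background subtraction step described above can be sketched as follows. This is a minimal illustrative sketch, not the thesis's implementation: it assumes a running-average background model and a fixed difference threshold (both common choices; the specific `alpha` and `thresh` values here are assumptions), operating on small grayscale frames represented as 2D lists of intensities.

```python
# Illustrative sketch of silhouette extraction by background subtraction.
# The running-average model and the threshold value are assumptions,
# not the method tuned in the thesis.

def update_background(background, frame, alpha=0.05):
    """Running-average background model: B <- (1 - alpha)*B + alpha*F."""
    return [[(1 - alpha) * b + alpha * f
             for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

def extract_silhouette(background, frame, thresh=30):
    """Binary foreground mask: pixel is foreground if |F - B| > thresh."""
    return [[1 if abs(f - b) > thresh else 0
             for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

background = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
frame      = [[10, 200, 10], [10, 200, 10], [10, 10, 10]]
mask = extract_silhouette(background, frame)
# the mask marks the column where the frame differs strongly from the model
```

In a real pipeline the background model would be updated only on pixels classified as background, so the moving person is not absorbed into the model.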

The ability to distinguish a fall action depends mainly on the quality of the classifier inputs; therefore, the features extracted from the human silhouette play a key role in the effectiveness and robustness of fall detection. In this research, the timed Motion History Image (tMHI) method is applied for motion segmentation. In addition, the motion information is combined with other features extracted from the ellipse fitted around the human body to discriminate actual falls from other activities.
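The two feature sources named above can be sketched in pure Python. The tMHI update follows the standard motion-template rule (stamp moving pixels with the current timestamp, zero out pixels older than the history duration), and the ellipse features are derived from second-order central image moments, which is the usual way to fit an equivalent ellipse to a binary silhouette. The specific function names and the moment-based fitting are assumptions for illustration, not the thesis's exact formulation.

```python
import math

def update_tmhi(mhi, motion_mask, timestamp, duration):
    """timed Motion History Image update: set pixels where motion occurred
    to the current timestamp; zero out pixels older than (timestamp - duration)."""
    out = []
    for mrow, srow in zip(mhi, motion_mask):
        row = []
        for m, s in zip(mrow, srow):
            if s:
                row.append(timestamp)
            elif m < timestamp - duration:
                row.append(0.0)
            else:
                row.append(m)
        out.append(row)
    return out

def ellipse_features(mask):
    """Orientation (radians) and major/minor axis ratio of the ellipse
    fitted to a binary silhouette via second-order central moments."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                m00 += 1; m10 += x; m01 += y
    cx, cy = m10 / m00, m01 / m00
    mu20 = mu02 = mu11 = 0.0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                mu20 += (x - cx) ** 2
                mu02 += (y - cy) ** 2
                mu11 += (x - cx) * (y - cy)
    theta = 0.5 * math.atan2(2 * mu11, mu20 - mu02)  # major-axis angle
    common = math.hypot(mu20 - mu02, 2 * mu11)
    l1 = (mu20 + mu02 + common) / 2  # variance along the major axis
    l2 = (mu20 + mu02 - common) / 2  # variance along the minor axis
    ratio = math.sqrt(l1 / l2) if l2 > 0 else float("inf")
    return theta, ratio
```

For an upright (tall, narrow) silhouette the orientation is near π/2; a fall drives it towards 0 while the axis ratio stays high, which is why these two quantities are informative fall cues.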

Fall detection methods can be divided into two main categories: threshold-based methods and machine learning-based methods. This research presents threshold-based methods to distinguish between Activities of Daily Living (ADL) and falls. A fall event is detected when the measured feature values exceed pre-determined thresholds. Results show that falls can be distinguished from ADL with an accuracy of 99.82% on our recorded dataset. In addition, various machine learning methods were compared to evaluate their ability to accurately detect falls. Experimental results show the efficiency and reliability of the proposed fall detection approach, with a high fall detection rate of 99.60% and a low false alarm rate of 2.62% on the UR Fall Detection dataset. Additionally, a set of experiments was conducted on our recorded dataset; the results indicate that the proposed approach achieves a fall detection rate of 99.94% and a false alarm rate of 0.02%.
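The threshold-based decision rule can be sketched as a simple conjunction of per-feature tests. Everything here is an illustrative assumption: the feature names, the requirement that all cues fire together, and the threshold values are placeholders, not the pre-determined thresholds tuned in the thesis.

```python
import math

# Hedged sketch of a threshold-based fall test. Threshold values are
# illustrative assumptions, not the thesis's tuned parameters.

def is_fall(orientation, axis_ratio, head_drop, motion_coeff,
            angle_thresh=math.radians(45), ratio_thresh=1.8,
            head_thresh=0.5, motion_thresh=0.6):
    """Flag a fall when the fitted ellipse has tilted far from vertical,
    the silhouette is elongated, the head position has dropped sharply,
    and a large, fast motion was observed."""
    tilted    = abs(abs(orientation) - math.pi / 2) > angle_thresh
    elongated = axis_ratio > ratio_thresh
    dropped   = head_drop > head_thresh
    fast      = motion_coeff > motion_thresh
    return tilted and elongated and dropped and fast

# A near-horizontal, elongated silhouette with a sharp head drop and
# fast motion is flagged; an upright, slow-moving one is not.
is_fall(0.1, 2.5, 0.7, 0.9)            # fall-like input
is_fall(math.pi / 2, 2.5, 0.1, 0.2)    # upright, slow input
```

Requiring all cues to agree reduces false alarms from fall-like ADL such as lying down quickly, at the cost of missing falls where one cue is weak; a machine learning classifier over the same features, as evaluated in the thesis, can learn softer combinations.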

Item Type: Thesis
Creators: Albawendi, S.G.
Date: February 2019
Divisions: Schools > School of Science and Technology
Record created by: Linda Sullivan
Date Added: 13 Aug 2019 08:38
Last Modified: 13 Aug 2019 08:38
URI: https://irep.ntu.ac.uk/id/eprint/37256
