Position-aware indoor human activity recognition and fall detection

Shahid, M.M., Machado, P. (ORCID: https://orcid.org/0000-0003-1760-3871), Bird, J.J. (ORCID: https://orcid.org/0000-0002-9858-1231), Yahaya, S.W. (ORCID: https://orcid.org/0000-0002-0394-6112), Inoue, S., Lotfi, A. (ORCID: https://orcid.org/0000-0002-5139-6565) and Ihianle, I.K. (ORCID: https://orcid.org/0000-0001-7445-8573), 2026. Position-aware indoor human activity recognition and fall detection. IEEE Sensors Journal. ISSN 1530-437X

Full text: 2558309_Ihianle.pdf (Post-print, 4MB)

Abstract

With increasing life expectancy, particularly in developed nations, the proportion of elderly individuals is rising rapidly, necessitating advanced systems for continuous monitoring and timely intervention to support independent living and enhance safety in assisted care environments. Falls are among the leading causes of injury-related hospitalisations and deaths in this demographic, highlighting the urgent need for intelligent fall detection systems. However, most existing solutions struggle with real-world deployment due to incomplete anomaly modelling and a lack of contextual location awareness. This paper introduces a novel position-aware indoor activity recognition and fall detection approach that uses spatial and motion data to detect falls with high accuracy and contextual relevance. The system integrates Ultra-Wideband (UWB) positioning technology with a Multilayer Perceptron (MLP) model to achieve indoor localisation. Furthermore, accelerometer and gyroscope data are used for activity monitoring and are processed using a hybrid deep learning architecture that combines a Variational Autoencoder (VAE), Convolutional Neural Networks (CNN), and Long Short-Term Memory (LSTM) networks. This architecture takes advantage of temporal and spatial feature extraction for improved fall detection. The localisation module achieves over 96% accuracy. For activity recognition, the VAE-CNN-LSTM model achieves fall detection accuracy exceeding 97%. A late fusion decision layer combines spatial and activity-level insights to enable precise detection and localisation of fall events within indoor environments. The proposed system is validated in a real-world smart home setting and demonstrates strong performance in terms of accuracy, scalability, and adaptability.
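The abstract describes the hybrid activity-recognition model only at a high level. The sketch below is a minimal PyTorch illustration of how a VAE encoder, a 1-D CNN, and an LSTM can be combined over windowed accelerometer and gyroscope input; the window length, channel count, latent size, layer widths, and two-class output are illustrative assumptions, not the configuration reported in the paper.

# Minimal PyTorch sketch of a VAE encoder + CNN-LSTM classifier for
# IMU-based fall detection. All sizes below (128-sample windows, 6 channels
# for tri-axial accelerometer + gyroscope, 16-d latent code, layer widths)
# are illustrative assumptions, not the authors' reported configuration.
import torch
import torch.nn as nn

WINDOW, CHANNELS, LATENT, N_CLASSES = 128, 6, 16, 2

class VaeCnnLstm(nn.Module):
    def __init__(self):
        super().__init__()
        # VAE encoder: compresses a whole IMU window into a latent code.
        self.enc = nn.Sequential(nn.Flatten(),
                                 nn.Linear(WINDOW * CHANNELS, 64), nn.ReLU())
        self.fc_mean = nn.Linear(64, LATENT)
        self.fc_logvar = nn.Linear(64, LATENT)
        # 1-D CNN over the time axis extracts local motion patterns.
        self.cnn = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, 5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, 5, padding=2), nn.ReLU(),
        )
        # LSTM models temporal dependencies across the CNN feature map.
        self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        # Classifier over the concatenated latent code and sequence summary.
        self.head = nn.Linear(LATENT + 64, N_CLASSES)

    def forward(self, x):                        # x: (batch, WINDOW, CHANNELS)
        h = self.enc(x)
        mean, logvar = self.fc_mean(h), self.fc_logvar(h)
        # Reparameterisation trick; the KL term built from (mean, logvar)
        # would be added to the training loss and is omitted here for brevity.
        z = mean + torch.exp(0.5 * logvar) * torch.randn_like(mean)
        feats = self.cnn(x.transpose(1, 2))      # (batch, 64, WINDOW // 2)
        _, (hn, _) = self.lstm(feats.transpose(1, 2))
        fused = torch.cat([z, hn[-1]], dim=1)
        return self.head(fused), mean, logvar

# Quick shape check on random data: logits for fall / no-fall per window.
x = torch.randn(8, WINDOW, CHANNELS)
logits, mean, logvar = VaeCnnLstm()(x)
print(logits.shape)                              # torch.Size([8, 2])

In this sketch the latent code and the LSTM summary are concatenated before the classifier; the paper's late fusion decision layer additionally combines these activity-level outputs with the UWB/MLP localisation result, which is not shown here.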

Item Type: Journal article
Publication Title: IEEE Sensors Journal
Creators: Shahid, M.M., Machado, P., Bird, J.J., Yahaya, S.W., Inoue, S., Lotfi, A. and Ihianle, I.K.
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 12 January 2026
ISSN: 1530-437X
Identifiers:
DOI: 10.1109/JSEN.2026.3651213
Other: 2558309
Rights: © 2026 IEEE. This article has been accepted for publication in IEEE Sensors Journal. This is the author's version which has not been fully edited and content may change prior to final publication. Citation information: DOI 10.1109/JSEN.2026.3651213. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Divisions: Schools > School of Science and Technology
Record created by: Melissa Cornwell
Date Added: 02 Feb 2026 10:58
Last Modified: 02 Feb 2026 10:58
URI: https://irep.ntu.ac.uk/id/eprint/55161
