Tangible fidgeting interfaces for mental wellbeing recognition using deep learning applied to physiological sensor data

Woodward, K. ORCID: 0000-0003-3302-1345, 2021. Tangible fidgeting interfaces for mental wellbeing recognition using deep learning applied to physiological sensor data. PhD, Nottingham Trent University.



The momentary assessment of an individual's affective state is critical to the monitoring of mental wellbeing and the ability to instantly apply interventions. This thesis introduces the concept of tangible fidgeting interfaces for affective recognition from design and development through to evaluation. Tangible interfaces expand upon the affordance of familiar physical objects as the ability to touch and fidget may help to tap into individuals' psychological need to feel occupied and engaged. Embedding digital technologies within interfaces capitalises on motor and perceptual capabilities and allows for the direct manipulation of data, offering people the potential for new modes of interaction when experiencing mental wellbeing challenges.

Tangible interfaces present an ideal opportunity to digitally enable physical fidgeting interactions along with physiological sensor monitoring to unobtrusively and comfortably measure non-visible changes in affective state. This opportunity motivated an investigation of the factors involved in designing more effective intelligent solutions, using participatory design techniques to engage people in designing solutions relevant to themselves.

Adopting an artificial intelligence approach using physiological signals creates the possibility to quantify affect with high levels of accuracy. However, labelling is an indispensable pre-processing stage before classification and can be extremely challenging with multimodal sensor data. New techniques are introduced for labelling at the point of collection, coupled with a pilot study and a systematic performance comparison of five custom-built labelling interfaces.

When classifying labelled physiological sensor data, individual differences between people limit the generalisability of models. To address this challenge, a transfer learning approach has been developed that personalises affective models using few labelled samples. This personalisation, which improves cross-domain performance, is completed on-device, automating the traditionally manual process and saving time and labour. Furthermore, monitoring wellbeing trajectories over long periods is constrained by the limited size of the training dataset, a shortcoming that may hinder the development of reliable and accurate machine learning models. A second framework has been developed to overcome the limitation of small training datasets using an image-encoding transfer learning approach.
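The personalisation idea above can be sketched in miniature: keep a pretrained feature extractor frozen and fine-tune only a small classification head on a handful of labelled samples from the target user. The sketch below is illustrative only, not the thesis's implementation; the frozen random projection stands in for a deep network pretrained on other participants' data, and the ten labelled "sensor windows" are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained feature extractor: a fixed projection whose
# weights stay frozen during personalisation. In practice this would be
# a deep network trained on physiological data from other participants.
W_frozen = rng.normal(size=(12, 8))

def extract_features(x):
    """Frozen 'pretrained' layer: raw sensor window -> feature vector."""
    return np.tanh(x @ W_frozen)

def personalise_head(x_few, y_few, epochs=300, lr=0.5):
    """Fine-tune only a small classification head (logistic regression,
    trained by gradient descent) on a few labelled target-user samples."""
    feats = extract_features(x_few)
    w = np.zeros(feats.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid activation
        grad = p - y_few                            # cross-entropy gradient
        w -= lr * feats.T @ grad / len(y_few)       # update head weights only
        b -= lr * grad.mean()
    return w, b

def predict(x, w, b):
    """Binary affect label from frozen features + personalised head."""
    return (extract_features(x) @ w + b > 0).astype(int)

# Hypothetical few-shot personalisation set: 10 labelled windows of
# 12 sensor readings each, with a simple stand-in affect label.
x_few = rng.normal(size=(10, 12))
y_few = (x_few[:, 0] > 0).astype(int)

w, b = personalise_head(x_few, y_few)
acc = (predict(x_few, w, b) == y_few).mean()
```

Because only the small head is trained, the whole update is cheap enough to run on-device, which is the property the thesis exploits when automating personalisation.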

This research presents the first attempt to develop tangible interfaces that use artificial intelligence to build a real-world continuous affect recognition system, while also offering real-time feedback that can act as an intervention. This exploration of affective interfaces has many potential applications to help improve quality of life for the wider population.

Item Type: Thesis
Creators: Woodward, K.
Date: March 2021
Rights: The copyright in this work is held by the author. You may copy up to 5% of this work for private study, or personal, non-commercial research. Any re-use of the information contained within this document should be fully referenced, quoting the author, title, university, degree level and pagination. Queries or requests for any other use, or if a more substantial copy is required, should be directed to the author.
Divisions: Schools > School of Science and Technology
Record created by: Linda Sullivan
Date Added: 29 Oct 2021 13:39
Last Modified: 29 Oct 2021 13:39
URI: https://irep.ntu.ac.uk/id/eprint/44547
