Bird, JJ ORCID: https://orcid.org/0000-0002-9858-1231, Faria, DR, Manso, LJ, Ayrosa, PPS and Ekárt, A, 2021. A study on CNN image classification of EEG signals represented in 2D and 3D. Journal of Neural Engineering, 18 (2): 026005. ISSN 1741-2560
Text: 1640811_Bird.pdf - Post-print (941kB)
Abstract
Objective: The novelty of this study lies in the exploration of multiple new approaches to pre-processing brainwave signal data, wherein statistical features are extracted and then formatted as visual images based on the order in which dimensionality reduction algorithms select them. These data are then treated as visual input for 2D and 3D convolutional neural networks (CNNs), which further extract 'features of features'.
Approach: Statistical features derived from three electroencephalography (EEG) datasets are presented in visual space and processed in 2D and 3D space as pixels and voxels respectively. Three datasets are benchmarked: mental attention states and emotional valences recorded from the four TP9, AF7, AF8 and TP10 electrodes of the 10–20 system, and an eye state dataset recorded from 64 electrodes. Seven hundred and twenty-nine features are selected through three methods of selection in order to form 27 × 27 images and 9 × 9 × 9 cubes from the same datasets. CNNs engineered for the 2D and 3D representations learn to convolve useful graphical features from the data.
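As a rough illustration of the reshaping step described in the approach, the following Python/NumPy sketch (an assumption for illustration only, not code from the paper) lays a 729-value feature vector, already ordered by a selection algorithm, out as a 27 × 27 pixel grid or a 9 × 9 × 9 voxel cube.

```python
import numpy as np

def features_to_image(features):
    """Reshape a 729-value feature vector into a 27 x 27 'image'.

    The vector is assumed to already be ordered by a feature-selection
    algorithm (e.g. One Rule, Relative Entropy or Symmetrical Uncertainty),
    so that position in the grid reflects selection order.
    """
    features = np.asarray(features, dtype=np.float32)
    if features.size != 729:
        raise ValueError("expected exactly 729 selected features")
    return features.reshape(27, 27)

def features_to_cube(features):
    """Reshape the same 729-value feature vector into a 9 x 9 x 9 voxel cube."""
    features = np.asarray(features, dtype=np.float32)
    if features.size != 729:
        raise ValueError("expected exactly 729 selected features")
    return features.reshape(9, 9, 9)

# Example with a synthetic feature vector:
vec = np.random.rand(729)
img = features_to_image(vec)    # shape (27, 27)  -> input for a 2D CNN
cube = features_to_cube(vec)    # shape (9, 9, 9) -> input for a 3D CNN
```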
Main results: Under a 70/30 train/test split, the strongest feature-selection methods for classification accuracy are One Rule for the attention state and Relative Entropy for the emotional state, both in 2D. For the eye state dataset, the 3D representation with features selected by Symmetrical Uncertainty is best. Finally, 10-fold cross-validation is used to train the best topologies. The final best 10-fold results are 97.03% for attention state (2D CNN), 98.4% for emotional state (3D CNN), and 97.96% for eye state (3D CNN).
Significance: The findings of the framework presented in this work show that CNNs can successfully convolve useful features from a set of pre-computed statistical temporal features derived from raw EEG waves. The high performance of the K-fold-validated algorithms argues that the features learnt by the CNNs hold useful knowledge for classification in addition to the pre-computed features.
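To make the CNN stage concrete, below is a minimal Keras sketch of 2D and 3D networks that accept such 27 × 27 images and 9 × 9 × 9 cubes. The layer counts, filter numbers and class count are illustrative assumptions and not the topologies evaluated in the paper.

```python
from tensorflow.keras import layers, models

def build_2d_cnn(n_classes):
    """A small 2D CNN taking one 27 x 27 single-channel feature image."""
    return models.Sequential([
        layers.Input(shape=(27, 27, 1)),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])

def build_3d_cnn(n_classes):
    """A small 3D CNN taking one 9 x 9 x 9 single-channel feature cube."""
    return models.Sequential([
        layers.Input(shape=(9, 9, 9, 1)),
        layers.Conv3D(32, (3, 3, 3), activation="relu"),
        layers.Conv3D(64, (3, 3, 3), activation="relu"),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])

# Example: an assumed three-class problem with integer-encoded labels.
model = build_2d_cnn(n_classes=3)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```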
| Item Type: | Journal article |
|---|---|
| Publication Title: | Journal of Neural Engineering |
| Creators: | Bird, J.J., Faria, D.R., Manso, L.J., Ayrosa, P.P.S. and Ekárt, A. |
| Publisher: | IOP Publishing |
| Date: | April 2021 |
| Volume: | 18 |
| Number: | 2 |
| ISSN: | 1741-2560 |
| Identifiers: | DOI: 10.1088/1741-2552/abda0c; Other: 1640811 |
| Rights: | This is the Accepted Manuscript version of an article accepted for publication in Journal of Neural Engineering. IOP Publishing Ltd is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The Version of Record is available online at https://doi.org/10.1088/1741-2552/abda0c. |
| Divisions: | Schools > School of Science and Technology |
| Record created by: | Jeremy Silvester |
| Date Added: | 27 Jan 2023 16:11 |
| Last Modified: | 27 Jan 2023 16:11 |
| URI: | https://irep.ntu.ac.uk/id/eprint/48101 |