Using Mismatch Negativity to Detect Selective Auditory Attention by Convolutional Neural Network
Keywords: Selective Attention, Mismatch Negativity, Electroencephalography, Tensor Decomposition, Feature Extraction, Convolutional Neural Network
Background: The brain continuously processes mixtures of concurrent sounds, separating the auditory streams and selecting the attended one, among other operations. Recent studies show that a listener's attention can be decoded by analyzing electroencephalography (EEG). Method: In this study, a new method for classifying EEG signals is introduced: 40 subjects listened to two concurrent speech signals in a dichotic scenario and attended to one of them. The mismatch negativity (MMN) component, one of the important event-related potentials (ERPs), plays a crucial role in the attentional process and can be extracted by the non-negative Tucker decomposition method. Linear and nonlinear features are then extracted, most of which are significant according to an analysis of variance (ANOVA) test. Finally, a combination of the selected significant features is used to train and test a convolutional neural network (CNN) classifier. The proposed auditory attention detection method is compared with three recent, widely used methods as baseline systems. Results: The proposed method detects the attended speaker from the MMN with a classification accuracy of 98.21%. To demonstrate a practical application of the proposed method, six near-ear electrodes are selected to detect the attended speech. The resulting classification accuracy of 71.75% shows that using only these electrodes in hearing aids could be a promising strategy for detecting attentional behavior. Comparison with Existing Methods: Compared with three conventional auditory attention detection methods, the proposed approach achieves higher accuracy with the MMN as input data. Conclusion: The results open a new perspective for designing neural-based brain-computer interfaces (BCIs) that exploit selective auditory attention.
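The ANOVA-based screening step mentioned in the Method can be illustrated with a minimal sketch. This is not the authors' actual pipeline: the feature values, trial counts, and significance threshold below are hypothetical, and only the per-feature one-way ANOVA test and the selection of significant features are shown.

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical setup: 50 trials per attention condition, 2 candidate features.
# Feature 0 is made discriminative between conditions; feature 1 is pure noise.
rng = np.random.default_rng(0)
n_trials = 50
attended = np.column_stack([rng.normal(1.0, 0.5, n_trials),   # discriminative
                            rng.normal(0.0, 1.0, n_trials)])  # noise
unattended = np.column_stack([rng.normal(0.0, 0.5, n_trials),
                              rng.normal(0.0, 1.0, n_trials)])

# One-way ANOVA per feature across the two conditions.
pvals = np.array([f_oneway(attended[:, j], unattended[:, j]).pvalue
                  for j in range(attended.shape[1])])

# Keep only features significant at an assumed alpha = 0.05;
# these would then be fed to the CNN classifier.
selected = pvals < 0.05
print(pvals, selected)
```

The same per-feature test scales to any number of extracted linear and nonlinear features; only the columns that pass the significance threshold would be concatenated into the classifier's input.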