A deep neural network-based transfer learning to enhance the performance and learning speed of BCI systems


Brain-Computer Interfaces, Volume 8 (1-2): 12 – Apr 3, 2021

Abstract

Brain–computer interfaces (BCIs) suffer from low classification accuracy when the number of electroencephalography (EEG) trials is small. Moreover, when training a BCI for a new subject, there is no clear protocol for reusing the knowledge captured by other, previously trained BCIs. To overcome this, we have proposed a new parallel deep neural structure combining a long short-term memory (LSTM) network and a multi-layer perceptron (MLP). Furthermore, subject-to-subject transfer learning is exploited to improve both performance and learning speed. First, the proposed combinatorial classifier is trained over different subjects; then, for each new case, a copy of this learned network is fine-tuned on the EEG features of the new subject. The proposed method is assessed on an EEG dataset of motor imagery movements and compared to support vector machines. It provides superior classification results and significantly speeds up the learning process of the deep network.
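The sketch below illustrates the general idea described in the abstract: a parallel LSTM + MLP classifier pre-trained on source subjects and then copied and fine-tuned on a new subject's EEG trials. It is a minimal illustration only; the layer sizes, input shape (timesteps × channels/features), two-class motor-imagery setup, and training hyperparameters are assumptions, not the authors' exact configuration.

```python
# Minimal sketch (assumed shapes and sizes) of a parallel LSTM + MLP classifier
# with subject-to-subject transfer learning via copy-and-fine-tune.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_parallel_lstm_mlp(n_timesteps=64, n_features=22, n_classes=2):
    """Two parallel branches (LSTM over the EEG time course, MLP over the
    flattened trial) merged before a softmax output."""
    eeg_in = layers.Input(shape=(n_timesteps, n_features), name="eeg_sequence")

    # Branch 1: LSTM over the temporal dimension of the EEG features.
    lstm_branch = layers.LSTM(64)(eeg_in)

    # Branch 2: multi-layer perceptron over the flattened trial.
    mlp_branch = layers.Flatten()(eeg_in)
    mlp_branch = layers.Dense(128, activation="relu")(mlp_branch)
    mlp_branch = layers.Dense(64, activation="relu")(mlp_branch)

    merged = layers.Concatenate()([lstm_branch, mlp_branch])
    out = layers.Dense(n_classes, activation="softmax")(merged)

    model = models.Model(eeg_in, out)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Subject-to-subject transfer learning:
# 1) Pre-train on pooled trials from the source subjects (X_source, y_source
#    are placeholders for data you would supply).
source_model = build_parallel_lstm_mlp()
# source_model.fit(X_source, y_source, epochs=50, batch_size=32)

# 2) Copy the learned weights and fine-tune on the few trials available
#    for the new (target) subject.
target_model = build_parallel_lstm_mlp()
target_model.set_weights(source_model.get_weights())
# target_model.fit(X_target, y_target, epochs=10, batch_size=16)
```

Starting the target network from the source weights, rather than from random initialization, is what yields both the accuracy gain with few trials and the faster convergence reported in the abstract.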


References (47)

Publisher
Taylor & Francis
Copyright
© 2021 Informa UK Limited, trading as Taylor & Francis Group
ISSN
2326-2621
eISSN
2326-263X
DOI
10.1080/2326263X.2021.1943955


Journal

Brain-Computer Interfaces, Taylor & Francis

Published: Apr 3, 2021

Keywords: Transfer learning; deep neural networks (DNN); brain–computer interface (BCI); long short-term memory (LSTM); electroencephalography (EEG)
