Compressing Subject-specific Brain-Computer Interface Models into One Model by Superposition in Hyperdimensional Space
Open access
Date
2020
Type
Conference Paper
Abstract
Accurate multiclass classification of electroencephalography (EEG) signals remains a challenging task in the development of reliable motor imagery brain-computer interfaces (MI-BCIs). Deep learning algorithms have recently been used in this area to deliver compact and accurate models. Reaching a high level of accuracy requires storing subject-specific trained models, which cannot be achieved with an otherwise compact model trained globally across all subjects. In this paper, we propose a new methodology that closes the gap between these two extreme modeling approaches: we reduce the overall storage requirements by superimposing many subject-specific models into one single model such that it can be reliably decomposed, after retraining, into its constituent models, while providing a trade-off between compression ratio and accuracy. Our method makes use of the unexploited capacity of trained models by orthogonalizing parameters in a hyperdimensional space, followed by iterative retraining to compensate for the noisy decomposition. It can be applied to various layers of deep inference models. Experimental results on the 4-class BCI competition IV-2a dataset show that our method exploits unutilized capacity for compression and surpasses the accuracy of two state-of-the-art networks: (1) it compresses the smallest network, EEGNet [1], by 1.9× and increases its accuracy by 2.41% (74.73% vs. 72.32%); (2) using the relatively larger Shallow ConvNet [2], it achieves 2.95× compression as well as 1.4% higher accuracy (75.05% vs. 73.59%). © 2020 EDAA
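The core idea the abstract describes can be illustrated with a minimal NumPy sketch (all names, dimensions, and the use of random ±1 keys here are illustrative assumptions, not the paper's actual implementation): each subject-specific model's flattened parameters are bound with a random key, the bound vectors are summed into a single model, and unbinding with a subject's key yields a noisy decomposition — that subject's parameters plus crosstalk from the other superimposed models, which the paper compensates for with iterative retraining.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000       # illustrative hyperdimensional parameter count
n_models = 5     # number of subject-specific models to superimpose

# stand-ins for flattened subject-specific parameter vectors
models = rng.standard_normal((n_models, d))

# one random +/-1 binding key per subject; in high dimension such
# keys are quasi-orthogonal, which keeps crosstalk between models low
keys = rng.choice([-1.0, 1.0], size=(n_models, d))

# superposition: bind each model with its key, sum into one vector
combined = (keys * models).sum(axis=0)

# noisy decomposition: unbinding with subject k's key recovers that
# subject's parameters plus zero-mean crosstalk from the other models
k = 2
retrieved = keys[k] * combined

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

cos_right = cosine(retrieved, models[k])           # high: correct key
cos_wrong = cosine(keys[0] * combined, models[k])  # near zero: wrong key
```

With n_models superimposed models of comparable norm, the cosine similarity between the retrieved and the original parameter vector is roughly 1/√n_models before any retraining; the retraining step described in the abstract then recovers the accuracy lost to this crosstalk noise.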
Persistent link
https://doi.org/10.3929/ethz-b-000387117
Publication status
published
Book title
2020 Design, Automation and Test in Europe Conference and Exhibition (DATE)
Publisher
IEEE
Subject
Hyperdimensional computing; Motor imagery; Convolutional Neural Networks
Organisational unit
03996 - Benini, Luca / Benini, Luca
Funding
780215 - Computation-in-memory architecture based on resistive devices (EC)
Related publications and data
Is part of: https://doi.org/10.3929/ethz-b-000661764
Notes
Due to the coronavirus (COVID-19) pandemic, the conference was conducted virtually.