Toward Superposition of Brain-Computer Interface Models

[Image: Emotiv EPOC 14-channel mobile EEG headset]

Description

A brain–computer interface (BCI) is a device that enables communication and control without movement. It aims to recognize a human's intentions from spatiotemporal neural activity, typically recorded by a large set of electroencephalogram (EEG) electrodes. What makes this particularly challenging, however, is the susceptibility to errors in recognizing those intentions. The recent success of deep learning networks, which build on the artificial neural networks of the past, is finding ever-expanding applications and suggests their use for a highly accurate brain–computer interface [2]. To cope with the high inter-subject variance of EEG signals, existing brain–computer interfaces train one model per subject. The main goal of this project is to provide a framework for superposing a number of trained BCI models into a single model [1], ultimately enabling use cases beyond model compression.
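
The underlying idea can be illustrated with a small, self-contained sketch. This is not the training procedure of Cheung et al. [1], only a toy NumPy illustration of the binding trick behind model superposition: several already-trained weight matrices are combined into one matrix using random ±1 context keys, and an individual model is approximately recovered by re-applying its key. All names and shapes below are made up for the example.

 import numpy as np
 
 rng = np.random.default_rng(0)
 
 def superpose(weights, keys):
     """Bind each weight matrix to its +/-1 context key (element-wise)
     and sum everything into a single matrix."""
     return sum(W * c for W, c in zip(weights, keys))
 
 def retrieve(W_super, key):
     """Unbind one model by re-applying its key; the remaining models
     appear as zero-mean interference noise."""
     return W_super * key
 
 # Toy setting: three per-subject "models", each reduced to one weight matrix.
 shape = (64, 32)
 models = [rng.standard_normal(shape) for _ in range(3)]
 keys = [rng.choice([-1.0, 1.0], size=shape) for _ in range(3)]
 
 W_super = superpose(models, keys)
 W_hat = retrieve(W_super, keys[0])
 
 # The recovered weights are positively correlated with the stored model
 # (about 1/sqrt(number of stored models) in this toy case), since the
 # interference from the other models behaves as zero-mean noise.
 corr = np.corrcoef(W_hat.ravel(), models[0].ravel())[0, 1]
 print(f"correlation with stored model 0: {corr:.2f}")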


Status: Completed

Philipp Rupp

Supervision: Michael Hersche, Abbas Rahimi
Date: 6/2019


Prerequisites

  • Machine Learning
  • Python Programming


Character

40% Theory
60% Programming

Professor

Luca Benini


References

  • [1] Cheung et al., Superposition of many models into one
  • [2] Schirrmeister et al., Deep learning with convolutional neural networks for EEG decoding and visualization