By presenting visual stimuli, visual evoked potentials (VEPs) can be recorded from an individual using just a few electrodes placed on the scalp. From these signals, commands can be extracted for, for example, controlling a wheelchair (2D).
This would make it possible for a paralyzed person to steer a wheelchair in four different directions using brain signals alone. There is no reason to believe that this method cannot be extended to a drone (3D).
The aim of this project is to design the signal acquisition, feature extraction, and classification parts of this BCI system. The interface to the drone will be developed in collaboration with DTU Space.
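To make the feature extraction and classification steps concrete, here is a minimal Python sketch assuming a steady-state VEP (SSVEP) paradigm, where each command is associated with a visual stimulus flickering at a distinct frequency and classification amounts to finding the dominant stimulus frequency in the EEG spectrum. The frequencies, sampling rate, and command mapping are all illustrative assumptions, not part of the project specification (the project itself calls for MATLAB).

```python
import numpy as np

# Hypothetical SSVEP setup: four flicker frequencies, one per command.
STIMULUS_FREQS = {8.0: "left", 10.0: "right", 12.0: "forward", 15.0: "backward"}
FS = 256  # sampling rate in Hz (assumed)

def classify_ssvep(eeg, fs=FS, freqs=STIMULUS_FREQS):
    """Return the command whose stimulus frequency carries the most
    spectral power in a single-channel EEG epoch."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2          # power spectrum
    bins = np.fft.rfftfreq(len(eeg), d=1.0 / fs)      # frequency axis
    # Power at the FFT bin nearest each stimulus frequency
    powers = {cmd: spectrum[np.argmin(np.abs(bins - f))]
              for f, cmd in freqs.items()}
    return max(powers, key=powers.get)

# Synthetic demo: a 10 Hz sinusoid buried in noise should map to "right".
t = np.arange(0, 2, 1.0 / FS)
epoch = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(len(t))
print(classify_ssvep(epoch))
```

A real system would add bandpass filtering, multi-channel spatial filtering, and a more robust classifier (e.g., canonical correlation analysis), but the structure — epoch the signal, extract spectral features, map to a command — stays the same.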
Goals: The successful completion of this project should lead to:
1. A published article describing the work.
2. A collaboration bridge with DTU Space.
3. A product that can be demonstrated at a fair such as Open House (Åbent Hus) at DTU and/or KU.
Allowed number of students: 1-2
Experience with programming in MATLAB and a basic understanding of EEG and signal processing are required.