18 May 2013

Controlling Robots with Thoughts

Facial grimaces generate strong electrical activity across the head, and the same happens when Angel concentrates on a symbol, such as a flashing light, on a computer monitor. In both cases, electrodes placed on the head pick up this activity as EEG signals. The signals are then interpreted by a processor, which in turn sends a message to the robot, making it move in a predefined way.
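
The article does not describe the processing itself, but flashing-light interfaces of this kind are usually decoded by measuring how strongly each flicker frequency shows up in the EEG and mapping the strongest one to a predefined robot command. The sketch below illustrates that idea on synthetic data; the sampling rate, flicker frequencies and command names are assumptions for illustration, not details of this project.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz
FLICKER_HZ = {8.0: "open_gripper", 10.0: "close_gripper", 12.0: "rotate_joint"}  # assumed mapping

def band_power(signal, freq, fs=FS, bandwidth=0.5):
    """Power of the signal in a narrow band around `freq` (simple FFT estimate)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= freq - bandwidth) & (freqs <= freq + bandwidth)
    return spectrum[mask].sum()

def interpret(eeg_window):
    """Pick the flicker frequency with the strongest response and map it to a command."""
    powers = {f: band_power(eeg_window, f) for f in FLICKER_HZ}
    strongest = max(powers, key=powers.get)
    return FLICKER_HZ[strongest]

def send_to_robot(command):
    """Placeholder for the message the processor would send to the robot controller."""
    print(f"robot <- {command}")

# Synthetic one-second EEG window: a 10 Hz response plus noise, standing in for
# the activity evoked when the user attends to the 10 Hz flashing light.
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(FS)
send_to_robot(interpret(window))
```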


The user can control the system through movements of the eyes, eyebrows and other parts of the face: raising the eyebrows, for example, selects which of the robot's joints to move. The user can also focus on one of several lights on the screen; the robot's movement then depends on which light is selected and on the type of activity it evokes in the brain. A sketch of this control scheme follows below.
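
Read as a control scheme, this amounts to two event types: eyebrow movements that change which joint is active, and light selections that trigger a predefined movement of that joint. A minimal sketch of such a mapping, with the joint names, light labels and step sizes invented purely for illustration:

```python
JOINTS = ["shoulder", "elbow", "wrist", "gripper"]          # assumed joint list
LIGHT_TO_MOTION = {"left_light": -10, "right_light": +10}   # assumed step in degrees

class FaceAndGazeController:
    """Toy controller: an eyebrow raise cycles through the joints,
    and the selected flashing light nudges the currently active joint."""

    def __init__(self):
        self.joint_index = 0

    def on_eyebrow_raise(self):
        # Eyebrow movement selects the next joint to control.
        self.joint_index = (self.joint_index + 1) % len(JOINTS)
        return JOINTS[self.joint_index]

    def on_light_selected(self, light):
        # The chosen light determines the predefined movement of the active joint.
        joint = JOINTS[self.joint_index]
        delta = LIGHT_TO_MOTION[light]
        return f"move {joint} by {delta} degrees"

controller = FaceAndGazeController()
print(controller.on_eyebrow_raise())                 # -> "elbow"
print(controller.on_light_selected("right_light"))   # -> "move elbow by 10 degrees"
```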

More information: