Scientists led by Eduardo Iáñez of Miguel Hernandez University have, for the first time, combined several desirable features into a single brain-computer interface that is noninvasive, spontaneous, and asynchronous.
About that asynchronicity: it turns out that, because of the bandwidth limitations of recording brain activity through EEG -- electrodes placed on the scalp, outside the head -- previous attempts at noninvasive brain-computer interfaces required that users direct the computer only during certain time slots.
Iáñez and colleagues' approach gets around this limitation by using four different models, each with assumptions that are sometimes the opposite of the others'. This way, however a subject's brain happens to be wired up, all the computer has to figure out is whether they mean "left" or "right" in order to direct a robot arm in two dimensions.
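The multi-model idea can be sketched as a majority vote among binary classifiers: even if some models' assumptions fit a given subject badly, the ensemble still settles on "left" or "right". This is a hypothetical illustration, not the authors' actual pipeline; the toy models and feature vector below are invented placeholders.

```python
# Hypothetical sketch: combine several binary "left"/"right" classifiers
# by majority vote. In a real BCI each model would be a trained classifier
# over EEG features; here they are simple rules for illustration.

def majority_vote(predictions):
    """Return 'left' or 'right' from a list of per-model predictions."""
    left = sum(1 for p in predictions if p == "left")
    right = len(predictions) - left
    return "left" if left > right else "right"

# Four toy models with partly opposing assumptions about the features.
models = [
    lambda f: "left" if f[0] > f[1] else "right",   # compares two channels
    lambda f: "right" if f[0] > f[1] else "left",   # the opposite assumption
    lambda f: "left" if f[0] > 0.5 else "right",    # threshold on channel 0
    lambda f: "left" if f[1] < 0.5 else "right",    # threshold on channel 1
]

features = [0.8, 0.3]  # e.g. band power on two scalp electrodes (made up)
decision = majority_vote([m(features) for m in models])
print(decision)  # three of four models say "left" here
```

Whatever the individual models' biases, the subject-specific question the computer must answer is reduced to a single binary choice.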
From Iáñez, E., Furió, M. C., Azorín, J. M., Huizzi, J. A., and Fernández, E. 2009. Brain-Robot Interface for Controlling a Remote Robot Arm. In Proceedings of the 3rd International Work-Conference on the Interplay Between Natural and Artificial Computation: Part II: Bioinspired Applications in Artificial and Natural Computation (Santiago de Compostela, Spain, June 22 - 26, 2009). J. Mira, J. M. Ferrández, J. R. Álvarez, F. Paz, and F. J. Toledo, Eds. Lecture Notes in Computer Science. Springer-Verlag, Berlin, Heidelberg, 353-361.
This paper describes a technique based on electroencephalography (EEG) to control a robot arm. This technology could eventually allow people with severe disabilities to control robots that can help them in daily living activities. The EEG-based Brain-Computer Interface (BCI) developed consists of registering the brain's rhythmic activity through electrodes placed on the scalp in order to differentiate a cognitive process from the rest state and use it to control one degree of freedom of the robot arm. The paper describes the processing and classification algorithms and analyzes their parameters with the objective of finding the optimal configuration that obtains the best results.
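The pipeline the abstract describes -- extract rhythmic features from a window of scalp EEG, then classify task vs. rest -- can be sketched minimally as band-power estimation plus a threshold. Everything numeric here is an assumption for illustration (sampling rate, frequency band, threshold, and the direction of the task/rest comparison), not the paper's configuration.

```python
# Hypothetical sketch of a generic EEG task-vs-rest pipeline:
# estimate power in one frequency band over a signal window and
# threshold it. All parameters are illustrative assumptions.
import math

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal, lo, hi, fs=FS):
    """Power in the [lo, hi] Hz band via a naive DFT (fine for short windows)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def classify(window, threshold=1.0):
    """Label a window by alpha-band (8-12 Hz) power; direction is illustrative."""
    return "task" if band_power(window, 8, 12) > threshold else "rest"

# A synthetic 1-second window dominated by a 10 Hz (alpha-band) oscillation:
alpha = [math.sin(2 * math.pi * 10 * i / FS) for i in range(FS)]
print(classify(alpha))       # strong alpha power -> "task"
print(classify([0.0] * FS))  # flat signal -> "rest"
```

A real system would use a proper FFT, multiple channels, and a trained classifier rather than a fixed threshold; the sketch only shows the shape of the computation.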
And from José L. Sirvent, José M. Azorín, Eduardo Iáñez, Andrés Úbeda and Eduardo Fernández, P300-Based Brain-Computer Interface for Internet Browsing, Trends in Practical Applications of Agents and Multiagent Systems, Advances in Soft Computing, 2010, Volume 71/2010, 615-622:
This paper describes the implementation of a Brain-Computer Interface (BCI) for controlling Internet browsing. The system uses electroencephalographic (EEG) signals to control the computer via evoked potentials under the P300 paradigm. This way, using visual stimuli, the user is able to control Internet navigation via a virtual mouse and keyboard. The system has been developed under the BCI2000 platform. This paper also shows the experimental results obtained by different users.
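The P300 selection principle behind such spellers can be sketched briefly: each candidate (say, a key on the virtual keyboard) is flashed repeatedly, the EEG epochs following each flash are averaged, and the candidate whose average shows the largest positive deflection around 300 ms after the stimulus is chosen. The epoch data and timing index below are synthetic stand-ins, not the paper's data.

```python
# Hypothetical sketch of P300-based selection: average the post-stimulus
# epochs for each candidate and pick the one with the largest amplitude
# at the sample corresponding to ~300 ms. Data here is synthetic.

def average_epochs(epochs):
    """Point-wise mean of a list of equal-length epochs."""
    n = len(epochs)
    return [sum(e[i] for e in epochs) / n for i in range(len(epochs[0]))]

def select_target(epochs_by_candidate, p300_index):
    """Pick the candidate with the largest averaged amplitude at ~300 ms."""
    return max(epochs_by_candidate,
               key=lambda c: average_epochs(epochs_by_candidate[c])[p300_index])

# Two candidates, two flashes each; index 2 plays the role of ~300 ms.
epochs = {
    "A": [[0, 0, 5, 0, 0], [0, 1, 6, 1, 0]],  # attended key: P300 present
    "B": [[0, 0, 1, 0, 0], [0, 0, 0, 1, 0]],  # unattended: no P300
}
print(select_target(epochs, p300_index=2))  # the averaged peak picks "A"
```

Averaging over repeated flashes is what makes the approach workable: the P300 is small relative to background EEG in any single trial, but it survives averaging while the noise cancels out.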
More please, and faster.