In reference to More Smartness:
This morning I happened to see this on Deric Bownds' MindBlog: A non-invasive brain-machine interface?
In it he blogs about a study that used only magneto- and electroencephalographic recordings to decode the direction of hand movements, with 67% accuracy on average. Here is the abstract of Hand Movement Direction Decoded from MEG and EEG by Waldert et al.:
"Brain activity can be used as a control signal for brain–machine interfaces (BMIs). A powerful and widely acknowledged BMI approach, so far only applied in invasive recording techniques, uses neuronal signals related to limb movements for equivalent, multidimensional control of an external effector. Here, we investigated whether this approach is also applicable for noninvasive recording techniques. To this end, we recorded whole-head MEG during center-out movements with the hand and found significant power modulation of MEG activity between rest and movement in three frequency bands: an increase for ≤7 Hz (low-frequency band) and 62–87 Hz (high-γ band) and a decrease for 10–30 Hz (β band) during movement. Movement directions could be inferred on a single-trial basis from the low-pass filtered MEG activity as well as from power modulations in the low-frequency band, but not from the β and high-γ bands. Using sensors above the motor area, we obtained a surprisingly high decoding accuracy of 67% on average across subjects. Decoding accuracy started to rise significantly above chance level before movement onset. Based on simultaneous MEG and EEG recordings, we show that the inference of movement direction works equally well for both recording techniques. In summary, our results show that neuronal activity associated with different movements of the same effector can be distinguished by means of noninvasive recordings and might, thus, be used to drive a noninvasive BMI."
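The general shape of the pipeline the abstract describes — extract band-limited power from each sensor on a single trial, then classify the movement direction — can be sketched in a few lines. What follows is strictly an illustrative toy under invented assumptions (synthetic data, a hypothetical "one sensor per direction" signal, and a nearest-centroid classifier standing in for whatever decoder the authors actually used); none of the names or numbers below come from the paper:

```python
import numpy as np

def band_power(trials, fs, f_lo, f_hi):
    """Summed spectral power per sensor in [f_lo, f_hi) Hz.

    trials: array of shape (n_trials, n_sensors, n_samples), sampled at fs Hz.
    """
    spec = np.abs(np.fft.rfft(trials, axis=-1)) ** 2
    freqs = np.fft.rfftfreq(trials.shape[-1], d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs < f_hi)
    return spec[..., band].sum(axis=-1)          # -> (n_trials, n_sensors)

def fit_centroids(features, labels):
    """Mean feature vector per movement direction (a minimal classifier)."""
    return {d: features[labels == d].mean(axis=0) for d in np.unique(labels)}

def decode(features, centroids):
    """Assign each trial to the direction whose centroid is nearest."""
    dirs = sorted(centroids)
    dists = np.stack([np.linalg.norm(features - centroids[d], axis=-1)
                      for d in dirs])
    return np.array([dirs[i] for i in dists.argmin(axis=0)])

# Synthetic demo: 4 center-out directions, each (hypothetically) boosting
# low-frequency power on a different sensor over the motor area.
rng = np.random.default_rng(0)
fs, n_samples, n_sensors = 250, 250, 4      # 1-second trials at 250 Hz

def make_trials(n_per_dir):
    t = np.arange(n_samples) / fs
    X, y = [], []
    for d in range(4):
        trials = rng.normal(0.0, 1.0, (n_per_dir, n_sensors, n_samples))
        trials[:, d, :] += 2.0 * np.sin(2 * np.pi * 5 * t)  # 5 Hz component
        X.append(trials)
        y.extend([d] * n_per_dir)
    return np.concatenate(X), np.array(y)

X_train, y_train = make_trials(20)
X_test, y_test = make_trials(20)
centroids = fit_centroids(band_power(X_train, fs, 1, 7), y_train)
pred = decode(band_power(X_test, fs, 1, 7), centroids)
accuracy = (pred == y_test).mean()           # chance level would be 0.25
```

With the strong planted signal the toy decoder is nearly perfect; the point is only to make concrete what "inferring direction from low-frequency power modulations on a single-trial basis" means, not to reproduce the 67% figure, which came from real MEG/EEG data and a proper decoding analysis.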
Brain-machine interface research looks to be a red-hot field of endeavor. Anything that doesn't involve plugging actual wires into actual brains has a much better chance of catching on as a widespread innovation. I am truly amazed at this kind of work. I look forward to finding out how much easier life may become for people who have had spinal cord injuries, strokes, and similar conditions, and for their caregivers, including those in my own profession who work in neurorehabilitation.
This paper was from last year; the link does not appear to list any citations. The Nicolelis monkey/treadmill research was conducted this year. Here are more papers on brain-machine interfaces, from the Wikipedia entry on Miguel Nicolelis:
Lebedev, M.A., Carmena, J.M., O’Doherty, J.E., Zacksenhouse, M., Henriquez, C.S., Principe, J.C., Nicolelis, M.A.L. (2005) Cortical ensemble adaptation to represent actuators controlled by a brain-machine interface. J. Neurosci. 25: 4681-4693.
Santucci, D.M., Kralik, J.D., Lebedev, M.A., Nicolelis, M.A.L. (2005) Frontal and parietal cortical ensembles predict single-trial muscle activity during reaching movements. Eur. J. Neurosci. 22: 1529-1540.
Carmena, J.M., Lebedev, M.A., Crist, R.E., O’Doherty, J.E., Santucci, D.M., Dimitrov, D.F., Patil, P.G., Henriquez, C.S., Nicolelis, M.A.L. (2003) Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biology 1: 193-208.
Nicolelis, M.A.L. (2003) Brain-machine interfaces to restore motor function and probe neural circuits. Nat. Rev. Neurosci. 4: 417-422.
Wessberg, J., Stambaugh, C.R., Kralik, J.D., Beck, P.D., Laubach, M., Chapin, J.K., Kim, J., Biggs, S.J., Srinivasan, M.A., Nicolelis, M.A.L. (2000) Real-time prediction of hand trajectory by ensembles of cortical neurons in primates. Nature 408: 361-365.