Brain-machine interface


shot by both sides
Honda Technology Uses Brain Waves to Control Bots

Have tried to find some video footage of the demo online, but to no avail. If you have better luck, be sure to post links here, eh.

There was also a cover story on a similar field of research in New Scientist the other week, namely the ability to use MRI scanning to decipher what a person is perceiving. It's still very early days, but the initial indications are that brain activity is both measurable and consistent when perceiving shapes.

Actually, it's a bit spooky.

polystyle desu

Memories of green
Thinking it - Moves it

Yea, caught the demo clip (maybe from Honda themselves?) on Japanese NHK last week,
and it was a topic at yesterday's picnic in the park.
We have a friend who is slowly becoming paralyzed and so there was some riffing on the possibilities for those who cannot move on their own ...


in the sea
Well, they've actually had EEG-based brain-controlled interfaces for a good while now, which have been used with some success on quadriplegics and the like. The main advance of this new interface is that they are able to actually match mental states with the associated physical action, and so have a direct mapping of mental state to robot motion.

The EEG stuff only works by detecting various brain waves (alpha and beta rhythms, says a quick google), which you can learn to control using biofeedback. That means the mapping between what you have to think with your mind and what the system actually does is completely arbitrary (though it is consistent). Also, it seems like EEG is a lot slower than this new technique.

BUT, the advantage of EEG is that it only requires a couple of electrodes strapped to the head, while this Honda technique requires a huge damn fMRI machine, and I don't think anyone's developed an MRI that's even remotely mobile yet...
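To give a feel for why the EEG mapping is arbitrary: a typical setup just measures how much power the signal has in the alpha band (roughly 8-12 Hz) versus the beta band (roughly 13-30 Hz), then maps "strong alpha" to one command and "strong beta" to another, whatever those commands happen to be. Here's a minimal sketch of that idea using synthetic signals; the sample rate, band edges, and command names are all illustrative assumptions, not anything from Honda's system:

```python
import numpy as np

FS = 256  # assumed sample rate in Hz, typical for EEG hardware

def band_power(signal, fs, lo, hi):
    """Total spectral power of `signal` between lo and hi Hz (plain FFT periodogram)."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum()

def classify(epoch, fs=FS):
    """Arbitrary biofeedback mapping: more alpha than beta -> 'stop', else 'go'.

    The command labels are made up; the point is that the user learns to
    shift band power, and the system attaches whatever meaning it likes.
    """
    alpha = band_power(epoch, fs, 8.0, 12.0)
    beta = band_power(epoch, fs, 13.0, 30.0)
    return "stop" if alpha > beta else "go"

# Synthetic one-second epochs standing in for real recordings:
t = np.arange(FS) / FS
relaxed = np.sin(2 * np.pi * 10 * t)   # dominant 10 Hz rhythm (alpha band)
focused = np.sin(2 * np.pi * 20 * t)   # dominant 20 Hz rhythm (beta band)

print(classify(relaxed))  # stop
print(classify(focused))  # go
```

Real systems are messier (noisy electrodes, per-user calibration, smoothing over many epochs), which is part of why EEG control ends up so slow compared to reading out the intended action directly.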