
"who I would like to call can not be inferred from the electrical activity on my scalp"

If I read the blog post correctly, the claim is not passive mind reading. The claim is that the user has some training to issue the sorts of thought commands that can be reliably picked up. If I think "call mom", it doesn't do anything. But if I think "Up Up Down Down Left Right Left Right", maybe it can interpret that correctly as my preassigned shorthand for "call mom".
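A minimal sketch of what that shorthand layer might look like, assuming (hypothetically) that the hard part is solved and an EEG decoder already emits discrete direction tokens; everything here (the `SHORTHANDS` table, the token names) is invented for illustration:

```python
# Hypothetical mapping from a preassigned "thought shorthand" sequence
# of decoded direction tokens to a command. The decoder that turns raw
# EEG into UP/DOWN/LEFT/RIGHT tokens is the hard, unsolved part and is
# not shown here.
SHORTHANDS = {
    ("UP", "UP", "DOWN", "DOWN", "LEFT", "RIGHT", "LEFT", "RIGHT"): "call mom",
}

def interpret(tokens):
    """Return the command bound to a decoded token sequence, or None."""
    return SHORTHANDS.get(tuple(tokens))

print(interpret(["UP", "UP", "DOWN", "DOWN", "LEFT", "RIGHT", "LEFT", "RIGHT"]))
# -> call mom
```

The point is that the system never needs to read "call mom" out of the signal, only to classify a small vocabulary of trained, deliberately produced patterns.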



The claims are a little fuzzy, but the exact quote from the video is: "IBM scientists are researching how to link your devices, such as a computer or a smart phone. So you just need to think about calling someone, and it happens." From the context it is implied that this will happen in 5 years. In their in-depth blog post about the topic, they say: "...I could wonder what the traffic will be like on the way home and this information would pop up in front of me."

So I think they are actually describing a "call mom" scenario, and I strongly doubt that the information to detect that is at all present in an EEG signal.


I bet it's harder not to compulsively think "Up Up Down Down Left Right Left Right" than it is to dial a phone.


Maybe that would be a problem if it triggered on merely thinking about the sequence, but deliberately focusing on each direction in turn (like pressing a series of buttons in your mind, one after the other) seems too effortful to do accidentally like that.



