I would say rather that the capability may be 5 years away. Whether consumers will want it, I'm skeptical. I knew someone who, for reasons I won't go into, had a computer they controlled with their eyes (basically a webcam tracks the eyes and moves the cursor, then clicks when you wink). It made me realize that further integrating computing control with a human's anatomy/biology can create more problems, because there is no filtering mechanism. When you type on a computer, you choose what your computer does by making deliberate actions, rather than your computer monitoring you and interpreting your actions. The problem with the latter is that there are many things you do that do not involve your computer: pick up the phone, throw a ball for your dog, talk to a coworker, etc. When your computer is monitoring you for input, it never knows when an action is meant for it and when it isn't. So in the case of eye-controlled computers, the experience is very problematic whenever you have to look somewhere else for any reason.
Now taking it a step further, I can't even imagine how out of control a computer driven by someone's mind would be. Our minds fire off random thoughts non-stop; it's actually incredibly hard to concentrate on one deliberate thing for long (if you've ever tried meditation, you realize this very quickly). Filtering the actions meant for the computer from the sheer randomness of the brain seems incredibly difficult, because there really isn't a definitive line there at all.
You seem to be describing a problem that needs to be solved, rather than evidence that there is anything wrong with the technology. I used to have speech control turned on on my computer. You can set it to listen to everything, which is basically the situation you describe, with all its problems. Alternatively, you can have it listen for a keyword, or only listen while you press a key. I imagine similar solutions will present themselves for brain-computer interfaces.
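To make the keyword idea concrete, here is a minimal sketch of that kind of gating, treating input as a stream of events and passing through only what follows a wake word. The names (`gate_commands`, `wake_word`) and the event stream are purely illustrative, not any real speech or BCI API:

```python
def gate_commands(events, wake_word="computer"):
    """Yield only the events that immediately follow the wake word,
    silently dropping all other (ambient) input."""
    armed = False
    for event in events:
        if event == wake_word:
            armed = True           # next event is treated as a command
        elif armed:
            yield event            # deliberate command: pass it through
            armed = False
        # otherwise: ambient input not meant for the computer, ignored

# Mixed stream of deliberate commands and unrelated activity:
stream = ["hello", "computer", "open mail", "throw ball", "computer", "send"]
print(list(gate_commands(stream)))  # ['open mail', 'send']
```

The point is that the filter doesn't need to understand which actions are "for" the computer; the explicit wake signal moves that burden back onto the user, which is exactly the trade-off being argued about here.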
I think there is some truth to that, but I'd say the one difference between speech recognition and brain recognition is that speech is a voluntary action you control, while your thoughts have a largely involuntary component to them. Involuntary meaning: when someone says "an idea just popped into my head," the idea seemingly was not deliberately triggered. Imagine if, while you were "mind typing" an email, the thought "god I hate my boss" suddenly popped into your head. If the computer's filtering mechanism was poor, it might, assuming it was being helpful, shoot off an email to your boss saying "god I hate you." I guess what I'm saying is that filtering which of your own thoughts your computer should interpret seems like an incredibly difficult proposition.
Yes, but I think we can also distinguish, in our own minds, the difference between a thought that is fleeting or passing and a thought that we want to take action on. Similarly, a successful brain-computer interface should be able to make that distinction.