New York Times - 4/28/13 by Nick Bilton
“The current brain technologies are like trying to listen to a conversation in a football stadium from a blimp,” said John Donoghue, a neuroscientist and director of the Brown Institute for Brain Science. “To really be able to understand what is going on with the brain today you need to surgically implant an array of sensors into the brain.”

Last week, engineers sniffing around the programming code for Google Glass found hidden examples of ways that people might interact with the wearable computers without having to say a word. Among them, a user could nod to turn the glasses on or off. A single wink might tell the glasses to take a picture.
But don’t expect these gestures to be necessary for long. Soon, we might interact with our smartphones and computers simply by using our minds. In a couple of years, we could be turning on the lights at home just by thinking about it, or sending an e-mail from our smartphones without even pulling the devices from our pockets. Further into the future, your robot assistant will appear by your side with a glass of lemonade simply because it knows you are thirsty.
Researchers in Samsung’s Emerging Technology Lab are testing tablets that can be controlled by your brain, using a cap that resembles a ski hat studded with monitoring electrodes, the MIT Technology Review, the science and technology journal of the Massachusetts Institute of Technology, reported this month.