CAMBRIDGE, Mass. - In 1960, computer scientist J.C.R. Licklider published "Man-Computer Symbiosis," a paper that outlined his vision for interactive computing and helped pave the way for the creation of the graphical user interface.
This week, at the Massachusetts Institute of Technology's Emerging Technology Conference here, another scientist suggested that there may soon come a time when people no longer have to touch a keyboard or a mouse, or even speak a command, in order to perform a computer function.
Instead, said Gerwin Schalk, a research scientist at the Wadsworth Center, a public health laboratory run by the New York state government, a person will simply think of a command, and the computer will respond.
"What I'm here to tell you is that this is not science fiction. This is an emerging reality," Schalk said.
Schalk said the interface itself is the bottleneck in human-computer interaction. Humans are forced to translate what they are thinking into digital commands that computers can understand, a process that creates input/output bottlenecks from the very start.
Neurotechnology, a $145 billion market that is growing at 9% annually, has already achieved key milestones in man-computer symbiosis.
Researchers are working with the brain's alpha waves -- neural oscillations in the frequency range between 8 and 12 Hz -- to create rich syntactic representations that can be used to communicate directly with computers, Schalk said.
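Schalk's talk did not include code, but the basic signal-processing idea behind alpha-wave work can be sketched in a few lines. The following is a minimal illustration, not any system Schalk described: it estimates how much of a signal's power falls in the 8-12 Hz alpha band using a simple FFT-based measure. The function name `alpha_band_power` and the synthetic "EEG" signal are assumptions for the example.

```python
import numpy as np

def alpha_band_power(signal, fs, low=8.0, high=12.0):
    """Estimate power in a frequency band of a 1-D signal.

    Defaults to the alpha band (8-12 Hz); `fs` is the sampling
    rate in Hz. This is a simple FFT-based band-power estimate,
    for illustration only.
    """
    n = len(signal)
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = (freqs >= low) & (freqs <= high)
    # Sum of squared magnitudes of the FFT bins inside the band
    return float(np.sum(np.abs(spectrum[band]) ** 2) / n)

# Synthetic stand-in for EEG: a 10 Hz alpha rhythm plus a
# weaker 40 Hz component (real EEG is far noisier than this).
fs = 256                      # samples per second
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 40 * t)

alpha = alpha_band_power(eeg, fs)                       # 8-12 Hz
other = alpha_band_power(eeg, fs, low=35.0, high=45.0)  # 35-45 Hz
```

In this synthetic signal the alpha-band power dominates, which is the kind of feature a brain-computer interface can map to a command. Real systems use far more sophisticated filtering and classification, but the underlying step -- reducing raw voltage traces to band-limited features -- is the same.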
Schalk showed attendees a video demonstrating how test subjects can control computer games through electrodes attached to the surface of their brains. The test subjects were already wired for treatment of illnesses such as epilepsy.
In one demonstration in the video, a patient used thoughts to shoot monsters in the video game Doom. The patient used a joystick to move the gun back and forth but used his thoughts to cause the gun to shoot -- accurately.
In another demonstration, Schalk showed how a computer can tell the difference between someone thinking the sounds, "Ah" or "Ooh."
A third demonstration showed a computer detecting the sound level of music a person was listening to and tracking it moment by moment.
"We're about that close," Schalk said, pinching his thumb and index finger together, "to being able to play back the music just by listening to the brain."
Yet another demonstration showed how scientists can track in real time which part of the brain reacts to physical movement, from sticking a tongue out to trying to solve a Rubik's Cube puzzle.
Such technology could allow users to command a computer without touching it.
The two major obstacles to making real-time, thought-controlled computers a reality are mainly engineering problems, Schalk said. Scientists need to create better sensors to detect alpha waves and better ways to identify the brain's signals (its language), he added.
"In the end, it will just take time and money to fix them," he said. "Direct computer interaction with the brain has the potential to become a general purpose technology ... at the same scale as information technology, computing and the telephone."
Lucas Mearian covers storage, disaster recovery and business continuity, financial services infrastructure and health care IT for Computerworld. Follow Lucas on Twitter at @lucasmearian, or subscribe to Lucas's RSS feed. His e-mail address is firstname.lastname@example.org.