Ever wish your computer could read your mind? Two researchers on opposite sides of the United States hope to develop human-computer interfaces that will make more direct connections between human brains and computers possible. At Columbia University in New York, Paul Sajda received a $758,000 grant from the Defense Advanced Research Projects Agency (DARPA) last year to focus on a visual computer interface (VCI) technology, which can be used to analyze vast amounts of images very quickly. Instead of replacing human vision and image processing, Sajda and his colleagues are trying to tap into those vast capabilities.
“No computer vision technology comes close to our ability to analyze and recognize objects in the face of noise, occlusion and changing sizes of objects,” says Sajda, director of the Laboratory for Intelligent Imaging and Neural Computing at Columbia University.
But while the human brain can process and interpret images far more accurately than any computer, registering that information, managing the analysis and recording the results has been difficult and time-consuming. Currently, the easiest ways for humans to indicate interest in an image to a computer are pressing a key, clicking a mouse or speaking, all of which have narrow bandwidth and slow the process down, says Sajda.
Connecting a human directly to the computer and using brainwaves as input is a much faster way of transferring information between brain and machine. Sajda uses an electroencephalogram to read the electrical impulses generated in the brain of a person viewing a succession of images. The trick is to intercept the neural signals related to a decision about an image's value without slowing the display of multiple images. Sajda's group shows subjects images at rates as high as 10 per second and measures changes in brain activity as each image appears. In this triage process, interesting images are automatically tagged for later, more thorough analysis.
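The triage loop described above can be sketched in a few lines. This is a hypothetical illustration, not Sajda's actual system: the function name, score values and threshold are all assumptions standing in for a real per-image EEG interest measurement.

```python
# Hypothetical sketch of the rapid-serial-visual-presentation triage loop:
# images stream past at roughly 10 per second, an EEG-derived interest score
# is read for each one, and high-scoring images are tagged for later,
# more thorough analysis.

def triage(image_ids, eeg_scores, threshold=0.8):
    """Return the images whose EEG interest score crosses the threshold."""
    tagged = []
    for image_id, score in zip(image_ids, eeg_scores):
        if score >= threshold:       # the brain flagged this image as interesting
            tagged.append(image_id)  # queue it for detailed review later
    return tagged

# Example run with made-up scores standing in for per-image EEG measurements.
print(triage(["img1", "img2", "img3", "img4"], [0.2, 0.9, 0.4, 0.85]))
# → ['img2', 'img4']
```

The key property is that tagging happens automatically as the stream flows past, so the display never pauses for a keypress or mouse click.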
An immediate application of this technology is analysis of the huge volumes of still and video images being collected by the intelligence community. “There are video cameras going up all over the world that could capture something important, but there aren’t enough expert eyes to view them,” says Sajda. “A combination of the computer screening certain images and the human brain tagging them as important creates a dramatic increase in efficiency.”
Other applications for VCI include radiology, where a physician must examine hundreds of images a day while quickly scanning for abnormalities, and air traffic control. But DARPA is most interested in having federal agents sift through large video data streams.
Misha Pavel is approaching the human-computer interface from another direction. Pavel and colleagues at Oregon Health & Science University in Beaverton are exploring whether, by examining brain state, a machine can determine if a human is receptive to additional information and how that information should be delivered.
“It’s not just computers; it could be anything. In the future, all machines will contain computers,” says Pavel, a professor of biomedical engineering and director of the Point of Care Laboratory at OHSU. Pavel hopes to connect machines directly to the user’s state. For example, a cell phone would automatically go to voice mail when its owner is fighting heavy traffic but ring when that driver is cruising on a deserted rural interstate.
Imagine that a startup has created a device that is controlled by brainwaves in the cerebral cortex of the player. Would you consider using a product where the interface is controlled directly by your brain, as opposed to a keyboard?
46% Maybe, but only after years of testing
37% Yes, sign me up for that beta test
17% No, sounds dangerous
Source: CDW poll of 289 BizTech readers
A new generation of computer games is on the way — video games controlled by the human brain. While the technology sounds as if it came straight off the pages of a science fiction novel, it is in fact based on recent developments in neurology.
By pairing a biosensor and a signal processing system with a device that translates incoming data from the brain, it is possible to transmit those data feeds to a computer. So say the gaming startups NeuroSky and Emotiv.
How It Works
The brain consists of neurons, which communicate by transmitting electrical (as well as chemical) signals. The system is based on the same principles as the electroencephalogram (EEG) test procedure, in which a computer records the electrical potentials generated by nerve cells in the cerebral cortex. Neurological research has shown that different brainwave patterns indicate different mental states, such as alertness, meditation or drowsiness. The device recognizes and measures these waves and determines which command should be performed. Each distinguishable combination of brainwaves triggers execution of the appropriate algorithm. For example, brainwaves that indicate a high level of concentration while a specific object is selected on the screen cause that object to levitate.
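The pattern-to-command step above amounts to a simple dispatch table. The sketch below is illustrative only, not either vendor's API; the state labels, threshold value and command strings are assumptions.

```python
# Illustrative dispatch sketch: reduce raw brainwave readings to a discrete
# state label, then map each recognized state to a command on the selected
# on-screen object.

CONCENTRATION_THRESHOLD = 0.7  # assumed cutoff; real devices calibrate per user

def classify(concentration, meditation):
    """Reduce raw readings to a discrete brainwave-state label."""
    if concentration >= CONCENTRATION_THRESHOLD:
        return "focused"
    if meditation >= CONCENTRATION_THRESHOLD:
        return "meditative"
    return "neutral"

# Each distinguishable state triggers its own command handler.
COMMANDS = {
    "focused": lambda obj: f"levitate {obj}",
    "meditative": lambda obj: f"lower {obj}",
    "neutral": lambda obj: f"hold {obj}",
}

def dispatch(concentration, meditation, selected_object):
    state = classify(concentration, meditation)
    return COMMANDS[state](selected_object)

print(dispatch(0.9, 0.1, "stone"))  # → levitate stone
```

A real headset would feed the classifier continuously rather than one sample at a time, but the mapping from recognized state to executed command is the same idea.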
NeuroSky. The San Jose, Calif., company adapted a single-electrode medical device into a consumer gadget that extracts the necessary information from brainwaves. The device detects commands originating from facial movements such as smiles or winks, as well as commands derived from emotional states read through brain signals. It also monitors the player’s concentration level: when the player focuses on a single thought and the measured level reaches a specified threshold, the system generates a signal that is transmitted wirelessly to the computer.
The computer, in turn, generates the appropriate response according to a predefined algorithm. In a video game simulation, players have been able to control on-screen objects with the brainwaves they generate. Players could also order objects to move slower or faster and levitate them higher or lower. NeuroSky CEO Stanley Yang hopes to shrink the device to thumbnail size so consumers can wear it easily.
Emotiv. San Francisco-based Emotiv is another startup channeling electrical signals emitted by the brain into actions on a computer. Like NeuroSky, Emotiv developed a system that can distinguish facial expressions, conscious thoughts and emotions; it then converts the collected data into machine instructions. Emotiv’s EEG-based headset contains 16 electrodes that sense brainwave patterns indicating various conditions and wirelessly transmit the information to the computer.
In testing, players could lift and arrange stones in a video game simply by thinking about those actions. Another application lets two players compete in a game in which they push and lift each other’s avatars using brain functions. Emotiv plans to connect wirelessly to game platforms such as consoles and PCs as early as next year.
Both NeuroSky’s and Emotiv’s products include a headset equipped with static electrodes. The headsets, however, require a customized scheme of electrode locations for each player, because details of brain structure differ from person to person. In addition, the system is highly sensitive, which makes gaining full control of the game difficult and demands complete concentration from the player.
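One common way to tame the sensitivity problem noted above is to smooth the raw signal and apply hysteresis: a higher threshold to engage a command than to release it, so a momentary lapse in concentration does not flicker the control on and off. This is a hypothetical sketch of that general technique, not either company's actual method; the window size and thresholds are assumptions.

```python
# Smoothing plus hysteresis for a noisy concentration signal.

def smooth(samples, window=5):
    """Moving average over the last `window` samples."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def engaged_states(samples, on=0.7, off=0.5):
    """Hysteresis: engage above `on`, stay engaged until dropping below `off`."""
    engaged, states = False, []
    for s in samples:
        if not engaged and s >= on:
            engaged = True          # concentration crossed the engage threshold
        elif engaged and s < off:
            engaged = False         # only a clear drop releases the command
        states.append(engaged)
    return states

# A brief dip to 0.55 does not release the command; 0.4 does.
print(engaged_states([0.6, 0.8, 0.6, 0.55, 0.4]))
# → [False, True, True, True, False]
```

With a single fixed threshold, the 0.55 reading would have dropped the command mid-game; the two-threshold scheme keeps control steady through small fluctuations.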
— Iddo Genuth
Transforming the human-computer interface into a direct connection between brain and machine is years away from broad commercial application, but the research promises to achieve many things: