Brain-Computer Interface Systems
Dylan McKeever & Andrew Stevenson
VIDEO: “Berlin Brain Computer Interface”
Recent scientific work in neuroscience has allowed people to directly interface their brains with a number of technologies. These technologies are often referred to as “brain-computer interfaces.” A brain-computer interface is defined as a direct connection between a human (or animal) brain and an external device. These connections range from non-invasive technologies that recognize brain signals externally to invasive technologies that require surgery and direct electrode implantation. While many of these technologies aim to restore function to disabled people, others aim to improve upon or augment existing functions.
Many brain-computer interface technologies are created for the purpose of restoring ability to the disabled. For example, the company Cyberkinetics specializes in developing technology that allows the disabled to interface with ordinary household items. In 2004, the company built a device that allowed tetraplegic Matthew Nagle to control a television and a home computer through electrodes implanted in his motor cortex. The technology, called BrainGate, used a “Utah Array” of electrodes to record Nagle’s motor cortex patterns and translate them into commands for his computer. Through BrainGate, Nagle was able to “think his TV on and off, change channels and alter the volume thanks to the technology and software linked to devices in his home” (BBC).
VIDEO: “KSLTV News: Neural Microelectronics”
Additionally, technology has been developed to aid the blind and deaf. The work of Dr. William Dobelle on artificial vision has led to partially restored sight, allowing blind patients to see through visual cortex electrodes and tiny cameras mounted on glasses. Although the technology is not perfect, patients such as Jens Naumann were able to see a black-and-white matrix of pixels with the Dobelle interface and perform complicated maneuvers such as operating a motor vehicle. In a recent interview, Naumann praised the technology, saying, “With this device, you don't lose anything. You actually have a fifth sense restored, and that is what I just absolutely adore with this device” (Gupta).
For the deaf, cochlear implants are the usual devices used for restoring hearing. However, auditory brainstem implants have also become a viable choice. Similar to the bionic eye, the implant uses a tiny microphone connected to electrodes placed on the brain stem to restore the sensation of hearing. Previous technology often restored only partial hearing; newer procedures directly stimulate the brain stem to help restore the sense (Graham-Rowe).
Although many of these brain-computer interfaces aim to restore function, the technologies also offer the possibility of improved or even advanced ability. For example, Matthew Nagle’s BrainGate interface allows him to control a computer in a way that able-bodied people cannot. Additionally, if the bionic eye and bionic ear are improved enough, they may one day be preferable to natural senses. Thus, many of these technologies offer cyborg abilities that can restore, augment, and improve human function.
Another area of brain-computer interface technology falls under augmentative aims. Oftentimes, these technologies are designed to improve upon existing abilities or alter how they function. One example is interfacing with computers to play video games. Developers at MIT’s Media Lab Europe have designed a game called Mind Balance in which players use their brain waves to guide a virtual character across a tightrope. The technology is non-invasive and uses electroencephalography to detect brain waves through a cap placed over the player’s head. The signals are then processed by a C# program and translated into game commands. One of the developers, Mr. Lalor, describes the work as their “first stab at creating a brain-computer interface controlled environment” (Twist), which could one day lead to video-gamers plugging their heads into mind-controlled consoles. While mind-controlled video games seem aimed at entertaining gamers, they also help the physically disabled learn to interface with devices. Graduate students from Washington University were able to develop an interface that allowed a fourteen-year-old with epilepsy to play Space Invaders on his computer using a combination of electrocorticography and electroencephalography.
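The signal chain described above — scalp signals picked up by an electroencephalography cap, digitized, and translated into game commands — can be illustrated with a toy decoder. The sketch below is in Python rather than the Mind Balance team's actual C# software, and the sampling rate, frequency band, and threshold are all invented for illustration; real systems use far more sophisticated filtering and classification.

```python
import math

def band_power(samples, fs, lo, hi):
    """Naive discrete-Fourier power in the [lo, hi] Hz band.
    (Illustration only; practical BCIs use FFTs and proper filtering.)"""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def decode_command(samples, fs=128, threshold=5.0):
    """Map alpha-band (8-12 Hz) power to a toy left/right game command."""
    return "left" if band_power(samples, fs, 8, 12) > threshold else "right"

# Synthetic one-second window with a strong 10 Hz rhythm
fs = 128
window = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
print(decode_command(window, fs))  # prints "left"
```

The design point the game developers face is the same one this sketch hides: choosing which spectral features of the EEG reliably distinguish the player's mental states.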
Using similar technology, developers at NeuroSky have built a video-game platform that uses gamers’ minds to play a first-person shooter. In the game, players can focus on objects and move them with their minds.
VIDEO: BBC “Controlling Video Games With Your Mind”
Other non-invasive augmentative brain-computer interfaces include an electroencephalography cap that lets people operate a home computer using only their minds. At the CeBIT technology convention, the Austrian company g.tec recently revealed a non-invasive BCI system that allows users to control normal computer functions by thinking about them. Unfortunately, brain signals are harder to read through non-invasive means because the waves must be detected through the skull. Even so, the system can slowly interpret cursor movements, play basic video games, and even spell out words. The possibility of wireless, thought-controlled computers opens a new frontier for how people interact with their machines. The company’s CEO, Christoph Guger, explained that “ultimately, you could have wireless contacts embedded in the brain, and communicate with others just by thinking. But then you really would have to worry about your wife finding out about your girlfriend” (Nicollai).
At the University of Washington, professor Rajesh Rao and his students have developed a thought-controlled robot using the same kind of non-invasive electroencephalography employed in similar work. With a thirty-two-electrode cap, students were able to make the robot move forward and backward and pick up different objects. Rao hoped that they would soon be able to get to “the point of using actual objects that people might want the robot to gather, as well as having the robot move through multiple rooms” (Physorg). Medical and military uses of a remote-controlled robot are the most apparent applications of this new technology.
VIDEO: “Neural Systems Lab: BCI & Humanoid Robot”
Other augmentative technologies aim to enhance human abilities so that people can perform tasks with more efficiency and accuracy. For example, DARPA is working on a brain-computer interface that would give people “cortically coupled computer vision” to read and identify images faster. The technology relies on the idea that the human brain can recognize images before they register in conscious awareness. When the brain recognizes an image, an electroencephalography cap picks up the corresponding brain wave and records the data on a computer. The hope is to allow security guards and data analysts to sift through vast amounts of information more quickly, filtering out only what is important. Steven Gordon of Babson College’s technology department argues that “conceivably, the proposed solution could be applied in quasi-real-time to allow a single human to monitor ten times as many sites as he or she would otherwise monitor” (Lakshmi).
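The filtering step this system performs — keep only the images whose recorded neural response crosses a recognition threshold — reduces to a simple ranking problem once each image has a response score. In the sketch below the scores are invented numbers standing in for an EEG-derived recognition measure, and the function and threshold are illustrative assumptions, not DARPA's actual system.

```python
def triage(scored_images, threshold=0.7):
    """Keep only images whose (hypothetical) neural response score
    exceeds the recognition threshold, strongest responses first."""
    hits = [(name, score) for name, score in scored_images if score > threshold]
    return [name for name, score in sorted(hits, key=lambda pair: -pair[1])]

# Invented scores standing in for per-image EEG recognition responses
scores = [("site_01", 0.91), ("site_02", 0.12), ("site_03", 0.83), ("site_04", 0.40)]
print(triage(scores))  # prints ['site_01', 'site_03']
```

The ten-fold monitoring gain Gordon describes comes from exactly this reduction: the analyst reviews only the shortlist, not the full stream.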
Similarly, an international team of neuroscientists has developed a brain-computer interface system to decipher people’s intentions. Using high-resolution brain scans, researchers were able to correctly predict the outcome of a binary choice 70% of the time in test volunteers. By reading activity patterns in the medial prefrontal cortex, the system could figure out whether a volunteer intended to add or subtract two numbers. Many of the scientists believe this technology could one day be used to detect more complex intentions, such as intentions to commit crime or violence. However, reading minds and interpreting intentions before they are acted on raises serious ethical problems. Professor Colin Blakemore, director of the research, admits that “some of [the technology] is extremely desirable, because it will help with diagnosis, education and so on, but we need to be thinking the ethical issues through. It adds a whole new gloss to personal medical data and how it might be used” (Sample).
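At its core, this kind of intention decoding is a pattern-classification problem: learn an average activity pattern for each intention from training scans, then assign a new scan to whichever pattern it most resembles. Below is a minimal nearest-centroid sketch with invented three-feature “activation patterns” standing in for real cortical data; the original study used far richer scans and more sophisticated classifiers, so this only shows the shape of the idea.

```python
import math
import random

def centroid(vectors):
    """Average activity pattern across a set of training trials."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist(a, b):
    """Euclidean distance between two activity patterns."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def predict(sample, c_add, c_sub):
    """Assign a new scan to the nearer intention centroid."""
    return "add" if dist(sample, c_add) < dist(sample, c_sub) else "subtract"

# Toy training data: noisy patterns around two invented prototypes
random.seed(0)
proto_add, proto_sub = [1.0, 0.2, 0.8], [0.2, 1.0, 0.3]
add_trials = [[x + random.gauss(0, 0.1) for x in proto_add] for _ in range(20)]
sub_trials = [[x + random.gauss(0, 0.1) for x in proto_sub] for _ in range(20)]

c_add, c_sub = centroid(add_trials), centroid(sub_trials)
print(predict([0.9, 0.3, 0.7], c_add, c_sub))  # prints "add"
```

A real decoder is scored the way the study reports — accuracy on held-out trials — which is where the 70% figure comes from.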
Not all BCIs are designed for human use. Most initial experiments are first performed on animal test subjects. In 2003, researchers at Duke University in North Carolina successfully trained a rhesus monkey to control a prosthetic arm. Researcher Professor Miguel Nicolelis said, “our analyses of the brain signals showed that the animal learned to assimilate the robot arm into her brain as if it was her own arm” (BBC).
VIDEO: “BBC Horizon Human 2.0 News: Brain Control Monkey”
In a different experiment, researchers at New York University were able to control a rat’s movements through implanted brain electrodes. By stimulating the rat’s medial forebrain bundle, or “reward center,” scientists could control whether the rat moved left, right, forward, or backward. Unlike many of the other BCI systems mentioned above, which mostly export information from the brain, this system inserts commands into it. The auditory and visual brain interfaces are the only other examples of BCIs that import information to the brain, but those set-ups are controlled by the user, not by external forces. In this case, the rat is actually controlled by humans. The researchers see possibilities for military use, such as clearing minefields (Graham-Rowe).
VIDEO: “BBC Horizon Human 2.0 News: Brain Control Rat”
Similarly, military engineers funded by DARPA have designed brain-electrode systems intended to control the movement of sharks. By adjusting what the shark thinks it smells, researchers hope to steer its movement. Currently, researchers are still mapping which parts of the brain are activated when a shark uses its sense of smell (Brown).
The technology of brain-computer interface systems has a number of viable uses. Whether they are used to help the physically and mentally disabled interface with household objects, to let users play video games and operate computers with their minds, to help people scan images or decipher intentions, or to control the movement of animals for military research, BCIs are an inevitable part of our future.
BBC News, “Brain Chip Reads Man’s Thoughts.” Retrieved June 1, 2007.
Copyright © 2007 Andrew Cichowski, All Rights Reserved.