Seminar
On
BRAIN COMPUTER INTERFACE
Submitted By
Ms. Jyoti Gupta
PhD Scholar (EE)
ABSTRACT
It is hoped that the future will bring a revolution, as brain-computer interfaces
constructed using nanotechnology make getting information into the brain as easy as
getting it out.
TABLE OF CONTENTS
ABSTRACT ................................................................................................................... i
TABLE OF CONTENTS ............................................................................................ii
LIST OF FIGURES ....................................................................................................iii
1. INTRODUCTION.................................................................................................... 1
2. HISTORY ................................................................................................................. 3
2.1 BCI versus Neuroprosthetics ................................................................................ 3
2.2 Earlier Research ................................................................................................... 4
2.2.1 Animal BCI Research ............................................................................... 4
2.2.2 Early Work ................................................................................................ 4
2.3 Prominent Research Successes............................................................................. 5
2.4 Types of BCI ........................................................................................................ 8
2.4.1 Invasive BCIs ............................................................................................ 8
2.4.2 Partially-invasive BCIs ............................................................................. 9
2.4.3 Non-invasive BCIs .................................................................................. 10
3. HOW BRAIN COMPUTER INTERFACE WORKS ........................................ 12
3.1 The Electric Brain .............................................................................................. 13
4. BCI APPLICATIONS ........................................................................................... 14
4.1 Brain Computer Interface for Second Life ......................................................... 15
4.2 BCI in Controlling Computers and Other Devices ............................................ 16
4.3 Restoring Physical Disabilities........................................................................... 17
4.4 Communication .................................................................................................. 18
4.5 Robotics.............................................................................................................. 18
4.6 Virtual Reality .................................................................................................... 19
5. CHALLENGES AND OPPORTUNITIES OF BCI ........................................... 20
5.1 P300 BCI ............................................................................................................ 21
5.2 Volitional Control of Neural Activity: Implications for BCI ............................. 21
5.2.1 Volitional Activation Revealed by BCI and BMI Studies ...................... 22
5.2.2 Limitations on Control for BCI and BMI ............................................... 23
5.3 Implantable Recurrent Brain–Computer Interfaces ........................................... 24
5.4 BCI Input and Output ......................................................................................... 25
5.4.1 Cortical Plasticity .................................................................................... 26
5.5 Sensory Input...................................................................................................... 27
6. BCI INNOVATORS .............................................................................................. 29
7. CELL-CULTURE BCIs ........................................................................ 30
8. ETHICAL CONSIDERATIONS .......................................................................... 31
9. CONCLUSION .................................................................................... 32
REFERENCES..................................................................................................... 33
LIST OF FIGURES
BRAIN COMPUTER INTERFACE
1. INTRODUCTION
Here, the word Brain means the brain or nervous system of an organic life form
rather than the mind. Similarly the word Computer means any processing or
computational device, from simple circuits to silicon chips
2. HISTORY
Research on BCIs began in the 1970s, but it wasn't until the mid-1990s that the first
working experimental implants in humans appeared. Following years of animal
experimentation, early working implants in humans now exist, designed to restore
damaged hearing, sight and movement. The common thread throughout the research
is the remarkable cortical plasticity of the brain, which often adapts to BCIs, treating
prostheses controlled by implants as natural limbs. With recent advances in
technology and knowledge, pioneering researchers could now conceivably attempt to
produce BCIs that augment human functions rather than simply restoring them, a
prospect previously confined to the realm of science fiction.
2.1 BCI versus Neuroprosthetics
The differences between BCIs and neuroprosthetics lie mostly in how the terms are
used: neuroprosthetics typically connect the nervous system to a device, whereas BCIs
usually connect the brain (or nervous system) to a computer system. Practical
neuroprosthetics can be linked to any part of the nervous system, for example the
peripheral nerves, while the term "BCI" usually designates a narrower class of systems
that interface with the central nervous system.
The terms are sometimes used interchangeably and for good reason. Neuroprosthetics
and BCI seek to achieve the same aims, such as restoring sight, hearing, movement,
ability to communicate, and even cognitive function. Both use similar experimental
methods and surgical techniques.
2.2 Earlier Research
2.2.1 Animal BCI Research
Several laboratories have managed to record signals from monkey and rat cerebral
cortexes in order to operate BCIs to carry out movement. Monkeys have navigated
computer cursors on screen and commanded robotic arms to perform simple tasks
simply by thinking about the task and without any motor output. Other research on
cats has decoded visual signals.
to record the firings of neurons in one area at a time because of technical limitations
imposed by his equipment.
There has been rapid development in BCIs since the mid-1990s. Several groups have
been able to capture complex brain motor centre signals using recordings from neural
ensembles (groups of neurons) and use these to control external devices, including
research groups led by Richard Andersen, John Donoghue, Philip Kennedy, Miguel
Nicolelis, and Andrew Schwartz.
Philip Kennedy and colleagues built the first intracortical brain-computer interface by
implanting neurotrophic-cone electrodes into monkeys.
Figure 2.2 Garrett Stanley's recordings of cat vision using a BCI implanted in the lateral geniculate
nucleus (top row: original image; bottom row: recording)
Researchers decoded the signals to generate movies of what the cats saw and were able
to reconstruct recognisable scenes and moving objects.
Miguel Nicolelis has been a prominent proponent of using multiple electrodes spread
over a greater area of the brain to obtain neuronal signals to drive a BCI. Such neural
ensembles are said to reduce the variability in output produced by single electrodes,
which could make it difficult to operate a BCI.
After conducting initial studies in rats during the 1990s, Nicolelis and his colleagues
developed BCIs that decoded brain activity in owl monkeys and used the devices to
reproduce monkey movements in robotic arms. Monkeys have advanced reaching and
grasping abilities and good hand manipulation skills, making them ideal test subjects
for this kind of work.
By 2000, the group succeeded in building a BCI that reproduced owl monkey
movements while the monkey operated a joystick or reached for food. The BCI
operated in real time and could also control a separate robot remotely over Internet
protocol. But the monkeys could not see the arm moving and did not receive any
feedback, a so-called open-loop BCI.
Figure 2.3 Diagram of the BCI developed by Miguel Nicolelis and colleagues for use on
rhesus monkeys
Other labs that develop BCIs and algorithms that decode neuron signals include John
Donoghue from Brown University, Andrew Schwartz from the University of
Pittsburgh and Richard Andersen from Caltech. These researchers were able to
produce working BCIs even though they recorded signals from far fewer neurons than
Nicolelis (15–30 neurons versus 50–200 neurons).
Donoghue's group reported training rhesus monkeys to use a BCI to track visual
targets on a computer screen with or without assistance of a joystick (closed-loop
BCI). Schwartz's group created a BCI for three-dimensional tracking in virtual reality
and also reproduced BCI control in a robotic arm. The group created headlines when
they demonstrated that a monkey could feed itself pieces of zucchini using a robotic
arm powered by the animal's own brain signals.
Andersen's group used recordings of premovement activity from the posterior parietal
cortex in their BCI, including signals created when experimental animals anticipated
receiving a reward.
2.4 Types of BCI
2.4.1 Invasive BCIs
Invasive BCI research has targeted repairing damaged sight and providing new
functionality to paralysed people. Invasive BCIs are implanted directly into the grey
matter of the brain during neurosurgery. As they rest in the grey matter, invasive
devices produce the highest quality signals of BCI devices but are prone to scar-tissue
build-up, causing the signal to become weaker or even lost as the body reacts to a
foreign object in the brain.
(Figure: Jens Naumann, a man with acquired blindness, being interviewed about his
vision BCI on CBS's The Early Show.)
In vision science, direct brain implants have been used to treat non-congenital
(acquired) blindness. One of the first scientists to come up with a working brain
interface to restore sight was private researcher William Dobelle.
Dobelle's first prototype was implanted into "Jerry," a man blinded in adulthood, in
1978. A single-array BCI containing 68 electrodes was implanted onto Jerry’s visual
cortex and succeeded in producing phosphenes, the sensation of seeing light. The
system included cameras mounted on glasses to send signals to the implant. Initially,
the implant allowed Jerry to see shades of grey in a limited field of vision at a low
frame rate. The system also required him to be hooked up to a two-ton mainframe, but
shrinking electronics and faster computers later made his artificial eye more portable
and enabled him to perform simple tasks unassisted.
In 2002, Jens Naumann, also blinded in adulthood, became the first in a series of 16
paying patients to receive Dobelle’s second generation implant, marking one of the
earliest commercial uses of BCIs. The second generation device used a more
sophisticated implant enabling better mapping of phosphenes into coherent vision.
Phosphenes are spread out across the visual field in what researchers call the starry-
night effect. Immediately after his implant, Jens was able to use his imperfectly
restored vision to drive slowly around the parking area of the research institute.
Researchers at Emory University in Atlanta led by Philip Kennedy and Roy Bakay
were first to install a brain implant in a human that produced signals of high enough
quality to simulate movement. Their patient, Johnny Ray, suffered from ‘locked-in
syndrome’ after suffering a brain-stem stroke. Ray’s implant was installed in 1998
and he lived long enough to start working with the implant, eventually learning to
control a computer cursor.
Tetraplegic Matt Nagle became the first person to control an artificial hand using a
BCI in 2005 as part of the first nine-month human trial of Cyberkinetics
Neurotechnology’s BrainGate chip-implant. Implanted in Nagle’s right precentral
gyrus (area of the motor cortex for arm movement), the 96-electrode BrainGate
implant allowed Nagle to control a robotic arm by thinking about moving his hand as
well as a computer cursor, lights and TV.
2.4.2 Partially-invasive BCIs
Partially invasive BCI devices are implanted inside the skull but rest outside the brain
rather than amidst the grey matter. They produce better resolution signals than non-
invasive BCIs where the bone tissue of the cranium deflects and deforms signals and
have a lower risk of forming scar-tissue in the brain than fully-invasive BCIs.
fires, the laser light pattern and wavelengths it reflects would change slightly. This
would allow researchers to monitor single neurons but require less contact with tissue
and reduce the risk of scar-tissue build-up.
2.4.3 Non-invasive BCIs
As well as invasive experiments, there have also been experiments in humans using
non-invasive neuroimaging technologies as interfaces. Signals recorded in this way
have been used to power muscle implants and restore partial movement in an
experimental volunteer. Although they are easy to wear, non-invasive devices produce
poor signal resolution because the skull dampens signals, dispersing and
blurring the electromagnetic waves created by the neurons. Although the waves can
still be detected it is more difficult to determine the area of the brain that created them
or the actions of individual neurons.
Another research parameter is the type of waves measured. Niels Birbaumer's later
research with Jonathan Wolpaw at New York State University has focused on
developing technology that would allow users to choose the brain signals they find
easiest to use for operating a BCI, including mu and beta waves.
A further parameter is the method of feedback used and this is shown in studies of
P300 signals. Patterns of P300 waves are generated involuntarily (stimulus-feedback)
when people see something they recognize and may allow BCIs to decode categories
of thoughts without training patients first. By contrast, the biofeedback methods
described above require learning to control brainwaves so the resulting brain activity
can be detected. In 2000, for example, research by Jessica Bayliss at the University of
Rochester showed that volunteers wearing virtual reality helmets could control
elements in a virtual world using their P300 EEG readings, including turning lights on
and off and bringing a mock-up car to a stop.
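To make the P300 approach concrete, here is a minimal illustrative sketch in Python, run on synthetic data rather than the signals from the Rochester study: EEG epochs time-locked to each candidate stimulus are averaged, and the stimulus whose average shows the largest deflection in the P300 window is taken as the one the user attended to. The function and label names are hypothetical.

# Illustrative sketch (not the study's actual code): decide which stimulus
# elicited a P300 by averaging epochs time-locked to each stimulus and
# comparing the mean deflection roughly 250-450 ms after onset.
import numpy as np

def detect_p300_target(epochs, fs=250, window=(0.25, 0.45)):
    """epochs: dict mapping a stimulus label to an array of shape
    (n_repetitions, n_samples) from one channel sampled at fs Hz."""
    start, stop = int(window[0] * fs), int(window[1] * fs)
    scores = {}
    for label, trials in epochs.items():
        avg = np.mean(trials, axis=0)                     # average over repetitions
        scores[label] = float(np.mean(avg[start:stop]))   # mean amplitude in the window
    return max(scores, key=scores.get)

# Synthetic example: the "lamp_on" stimulus gets an injected P300-like bump.
rng = np.random.default_rng(0)
fs, n_samp = 250, 200
epochs = {lbl: rng.normal(0, 1, (20, n_samp)) for lbl in ("lamp_on", "lamp_off", "stop_car")}
epochs["lamp_on"][:, int(0.3 * fs):int(0.4 * fs)] += 3.0
print(detect_p300_target(epochs, fs))                     # -> "lamp_on"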
3. HOW BRAIN COMPUTER INTERFACE WORKS
As the power of modern computers grows alongside our understanding of the human
brain, we move ever closer to making some pretty spectacular science fiction into
reality. Imagine transmitting signals directly to someone's brain that would allow
them to see, hear or feel specific sensory inputs. Consider the potential to manipulate
computers or machinery with nothing more than a thought. It isn't just about
convenience: for severely disabled people, the development of a brain-computer
interface (BCI) could be the most important technological breakthrough in decades.
The sections that follow describe how BCIs work, their limitations and where they
could be headed in the future.
3.1 The Electric Brain
A BCI works at all because of the way our brains function. Our brains
are filled with neurons, individual nerve cells connected to one another by dendrites
and axons. Every time we think, move, feel or remember something, our neurons are
at work. That work is carried out by small electric signals that zip from neuron to
neuron as fast as 250 mph. The signals are generated by differences in electric
potential carried by ions on the membrane of each neuron. Although the paths the
signals take are insulated by something called myelin, some of the electric signal
escapes. Scientists can detect those signals, interpret what they mean and use them to
direct a device of some kind. It can also work the other way around. For example,
researchers could figure out what signals are sent to the brain by the optic nerve when
someone sees the color red. They could rig a camera that would send those exact
signals into someone's brain whenever the camera saw red, allowing a blind person to
"see" without eyes.
4. BCI APPLICATIONS
One of the most exciting areas of BCI research is the development of devices that can
be controlled by thoughts. Some of the applications of this technology may seem
frivolous, such as the ability to control a video game by thought. If you think remote
control is convenient, imagine changing channels with your mind.
However, there's a bigger picture -- devices that would allow severely disabled people
to function independently. For a quadriplegic, something as basic as controlling a
computer cursor via mental commands would represent a revolutionary improvement
in quality of life. But how do we turn those tiny voltage measurements into the
movement of a robotic arm?
Early research used monkeys with implanted electrodes. The monkeys used a joystick
to control a robotic arm. Scientists measured the signals coming from the electrodes.
Eventually, they changed the controls so that the robotic arm was being controlled
only by the signals coming from the electrodes, not the joystick.
A more difficult task is interpreting the brain signals for movement in someone who
can't physically move their own arm. With a task like that, the subject must "train" to
use the device. With an EEG or implant in place, the subject would visualize
closing his or her right hand. After many trials, the software can learn the signals
associated with the thought of hand-closing. Software connected to a robotic hand is
programmed to receive the "close hand" signal and interpret it to mean that the robotic
hand should close. At that point, when the subject thinks about closing the hand, the
signals are sent and the robotic hand closes.
A similar method is used to manipulate a computer cursor, with the subject thinking
about forward, left, right and back movements of the cursor. With enough practice,
users can gain enough control over a cursor to draw a circle, access computer
programs and control a TV. It could theoretically be expanded to allow users to "type"
with their thoughts.
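A hedged sketch of the training idea just described, assuming the signals arrive as labelled EEG trials: mu-band power per channel is used as a feature, and a simple off-the-shelf classifier (scikit-learn logistic regression here, purely as an example) learns to separate imagined hand-closing from rest. None of this is the actual software used in the studies above; the data below are synthetic.

# Minimal sketch of learning a "close hand" vs "rest" decoder from EEG trials.
# Assumed layout: trials are arrays of shape (n_trials, n_channels, n_samples) at fs Hz.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression

def band_power_features(trials, fs=250, band=(8.0, 12.0)):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=-1)          # band-pass each channel
    return np.log(np.mean(filtered ** 2, axis=-1))      # log band power per channel

# Synthetic stand-in data: 40 trials per class, 8 channels, 2 s at 250 Hz.
rng = np.random.default_rng(1)
fs = 250
rest = rng.normal(0, 1.0, (40, 8, 2 * fs))
imagery = rng.normal(0, 1.0, (40, 8, 2 * fs))
t = np.arange(2 * fs) / fs
imagery[:, :4, :] += 1.5 * np.sin(2 * np.pi * 10 * t)   # an artificial 10 Hz rhythm change

X = np.vstack([band_power_features(rest, fs), band_power_features(imagery, fs)])
y = np.array([0] * len(rest) + [1] * len(imagery))      # 0 = rest, 1 = "close hand"
clf = LogisticRegression(max_iter=1000).fit(X, y)

# At run time a new trial is classified and mapped to a command for the robotic hand.
command = "close_hand" if clf.predict(band_power_features(imagery[:1], fs))[0] == 1 else "idle"
print(command)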
4.1 Brain Computer Interface for Second Life
A research team led by professor Jun’ichi Ushiba of the Keio University Biomedical
Engineering Laboratory has developed a BCI system that lets the user walk an avatar
through the streets of Second Life while relying solely on the power of thought. To
control the avatar on screen, the user simply thinks about moving various body parts
— the avatar walks forward when the user thinks about moving his/her own feet, and
it turns right and left when the user imagines moving his/her right and left arms.
The system consists of a headpiece equipped with electrodes that monitor activity in
three areas of the motor cortex (the region of the brain involved in controlling the
movement of the arms and legs). An EEG machine reads and graphs the data and
relays it to the BCI, where a brain wave analysis algorithm interprets the user’s
imagined movements. A keyboard emulator then converts this data into a signal and
relays it to Second Life, causing the on-screen avatar to move. In this way, the user
can exercise real-time control over the avatar in the 3D virtual world without moving
a muscle.
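The control chain described above can be pictured with a small sketch: once the brain-wave analysis step has labelled the user's imagined movement, a keyboard-emulator stage turns that label into the key the virtual world expects. The label names and the press_key function are hypothetical placeholders, not the Keio system's actual interface.

# Illustrative mapping from decoded motor imagery to emulated keypresses.
# `press_key` stands in for whatever keyboard-emulation library is used.
KEY_MAP = {
    "imagined_feet": "Up",           # walk forward
    "imagined_right_arm": "Right",   # turn right
    "imagined_left_arm": "Left",     # turn left
}

def press_key(key: str) -> None:
    print(f"[keyboard emulator] pressing {key}")   # placeholder side effect

def relay_to_virtual_world(decoded_label: str) -> None:
    key = KEY_MAP.get(decoded_label)
    if key is not None:
        press_key(key)   # the virtual-world client just sees an ordinary keypress

relay_to_virtual_world("imagined_feet")   # avatar walks forward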
Future plans are to improve the BCI so that users can make Second Life avatars
perform more complex movements and gestures. The researchers hope the mind-
controlled avatar, which was created through a joint medical engineering project
involving Keio’s Department of Rehabilitation Medicine and the Tsukigase
Rehabilitation Center, will one day help people with serious physical impairments
communicate and do business in Second Life.
4.2 BCI in Controlling Computers and Other Devices
Feedback plays an important role when learning to use a BCI. In BCI training, the
most commonly used feedback modality is visual feedback. Visual attention,
however, might be needed for application control: to drive a wheelchair, to observe
the environment, and so on. It is therefore important to test other feedback
modalities as well.
4.3 Restoring Physical Disabilities
One of the most critical needs for people with severe physical disabilities is restoring
the ability to communicate. The field of BCI research and development has so far
focused primarily on neuroprosthetic applications that aim at restoring damaged
hearing, sight and movement.
4.4 Communication
BCIs can provide communication systems that do not depend on the brain’s normal
output pathways of peripheral nerves and muscles. In these systems, users explicitly
manipulate their
brain activity instead of using motor movements to produce signals that can be used to
control computers or communication devices.
The impact of this work is extremely high, especially for those who suffer from
devastating neuromuscular injuries and neurodegenerative diseases such as
amyotrophic lateral sclerosis, which eventually strips individuals of voluntary
muscular activity while leaving cognitive function intact.
4.5 Robotics
Controlling robots with thought has long been a popular science fiction concept.
Recent work with BCIs, however, has shown that robotic control is indeed possible
with brain signals. Applications for neurally controlled robots currently center on
assistive technologies ("helper" robots), but BCI control has been proposed for
military and industrial applications as well. One of the earliest experiments with a
BCI-controlled robot explored the effects of real-world feedback (the movement of
the robot) in conjunction with a P300-based BCI, which depends on user attention. The
robot was configured to perform the steps to make coffee, such as getting powdered
coffee, sugar, and cream, and stirring the mixture with a spoon. The results showed
that users can effectively attend to real-world feedback while operating an attention-
based BCI.
4.6 Virtual Reality
Virtual reality has also been explored in BCI research for more practical purposes. The
early work in virtual environments is described in Bayliss and Ballard (2000), which
details a study of a P300 BCI controlling a virtual apartment and a virtual driving
simulator.
5. CHALLENGES AND OPPORTUNITIES OF BCI
There is, however, considerable similarity among different brain signals, which makes
them difficult to decode.
imagined movements, cognitive imagery and shifts of attention. More direct evidence
comes from studies on operant conditioning of neural activity using biofeedback, and
from BCI/BMI studies in which neural activity controls cursors or peripheral devices.
Limits in the degree of accuracy of control in the latter studies can be attributed to
several possible factors. Some of these factors, particularly limited practice time, can
be addressed with long-term implanted BCIs.
5.2.1 Volitional Activation Revealed by BCI and BMI Studies
The volitional control of cortical cell activity has now been dramatically demonstrated
in numerous BCI and BMI studies in which primates controlled the position of cursors
or robotic arms with cortical activity under closed-loop conditions. Under ‘open-loop’
conditions, the activity of neural populations could be linearly transformed to the 3-D
coordinates of the monkeys' hand as they retrieved food from a well and brought it to
their mouth. Interestingly, the conversion parameters obtained for one set of trials
provided increasingly poor predictions of future responses, indicating a source of drift
over tens of minutes in the open-loop condition. This problem was alleviated when the
monkeys observed the consequences of their neural activity in ‘real time’ and could
optimize cell activity to achieve the desired goal under ‘closed-loop’ conditions. For
example, monkeys could successfully acquire targets on a two-dimensional workspace
or in virtual 3-D space with a cursor driven by activity of 10–30 motor cortex neurons.
More recently, the weighted activity of cell ensembles recorded over many cortical
areas has been used to control a robotic arm to reach and grasp objects. Significantly,
several of these studies also demonstrated the ability to extract movement predictions
from neurons in postcentral as well as precentral cortical areas. Precentral motor
cortex cells provided the most accurate predictions of force and displacement, but
neurons from many other areas also provided significant predictions. The prediction
accuracy increased with the number of cells included, albeit with diminishing returns.
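The "linear transformation" from population activity to hand coordinates mentioned above can be sketched as an ordinary least-squares fit from binned firing rates to recorded 3-D hand position. The synthetic data and the plain least-squares decoder below are illustrative assumptions, not the decoders used in those experiments.

# Sketch of open-loop linear decoding: fit a linear map from binned firing
# rates of a neural ensemble to simultaneously recorded 3-D hand position.
import numpy as np

rng = np.random.default_rng(2)
n_bins, n_neurons = 500, 50
rates = rng.poisson(5.0, (n_bins, n_neurons)).astype(float)    # spike counts per bin

true_weights = rng.normal(0, 0.1, (n_neurons, 3))              # hidden linear relation
hand_xyz = rates @ true_weights + rng.normal(0, 0.5, (n_bins, 3))

# Fit weights on the first half of the session (with an intercept column),
X = np.hstack([rates, np.ones((n_bins, 1))])
W, *_ = np.linalg.lstsq(X[:250], hand_xyz[:250], rcond=None)

# ...then predict hand position on the second half ("open loop": no feedback).
pred = X[250:] @ W
err = np.sqrt(np.mean((pred - hand_xyz[250:]) ** 2))
print(f"RMS prediction error: {err:.2f}")
# In the real recordings this kind of fit drifted over tens of minutes, which is
# why closed-loop feedback improved control.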
It remains possible that longer experience with the same neuronal ensembles could
improve the achievable accuracy.
Difficulty in achieving reliable control also comes from employing adaptive decoding
schemes. Although such adaptive algorithms are intended to automatically optimize
control, they create a moving target for volitional modulation; the neural activity
pattern that worked at one time may subsequently become less effective, requiring the
learning of new patterns.
Finally, the ability to learn optimal control may be limited by the short and
intermittent exposure times, dictated by the need to tether the subject to the requisite
instrumentation. For example, a paraplegic subject who could practice neural control
of a cursor only a few hours a week demonstrated remarkable success in controlling
cursor movement, but nevertheless achieved only a limited degree of accuracy.
Intermittent sessions also involve possible changes in the recorded neuronal
population, requiring the subject to relearn the task with a slightly different population
of cells.
5.3 Implantable Recurrent Brain–Computer Interfaces
The implantable ‘Neurochip’ reliably records the activity of the same single neurons
and two related arm muscles for weeks, storing raw and/or compressed data to memory
for daily
downloading via an infrared link. The compact connections and self-contained
circuitry make unit recordings remarkably stable despite the unconstrained
movements of the monkey in the cage. For many neurons the correlations between
neural and muscle activity remained relatively stable, which bodes well for prosthetic
applications.
The Neurochip can also operate in a recurrent loop mode, converting action potentials
of a cortical neuron to stimuli delivered elsewhere in the motor system. Thus the
cortical cell could directly control functional electrical stimulation of muscles, spinal
cord or other brain region. Continuous operation of such a recurrent BCI (R-BCI)
allows the subject to adapt to the artificial pathway and by appropriately modifying
the neural activity, to incorporate its operation into normal behavior. Such an R-BCI has
obvious potential prosthetic applications in bridging lost biological connections,
particularly when multiple parallel channels are implemented.
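A minimal sketch of the recurrent idea, under the assumption of simple threshold-based spike detection: each action potential detected on the recorded cortical channel triggers a stimulus pulse at the target after a fixed delay. The deliver_stimulus call is a hypothetical placeholder for the real stimulator hardware, and the threshold logic stands in for the Neurochip's on-board circuitry.

# Sketch of a recurrent BCI loop: spikes detected on a recorded cortical channel
# trigger stimulation of a target (muscle, spinal cord, or another brain site).
import numpy as np

def deliver_stimulus(t: float) -> None:
    print(f"stimulus pulse at t = {t * 1000:.1f} ms")      # placeholder for the stimulator

def recurrent_loop(signal, fs=20_000, threshold=4.0, delay_s=0.005, refractory_s=0.002):
    """Threshold-crossing spike detection with a fixed spike-to-stimulus delay."""
    sigma = np.median(np.abs(signal)) / 0.6745              # robust noise estimate
    last_spike = -np.inf
    for i, v in enumerate(signal):
        t = i / fs
        if v > threshold * sigma and (t - last_spike) > refractory_s:
            last_spike = t
            deliver_stimulus(t + delay_s)                   # action potential -> stimulus

# Synthetic recording: background noise with a few injected spikes.
rng = np.random.default_rng(3)
sig = rng.normal(0, 1.0, 20_000)
sig[[2_000, 9_500, 15_000]] += 12.0
recurrent_loop(sig)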
5.4 BCI Input and Output
One of the biggest challenges facing brain-computer interface researchers today is the
basic mechanics of the interface itself. The easiest and least invasive method is a set
of electrodes -- a device known as an electroencephalograph (EEG) -- attached to
the scalp. The electrodes can read brain signals. However, the skull blocks a lot of the
electrical signal, and it distorts what does get through.
To get a higher-resolution signal, scientists can implant electrodes directly into the
gray matter of the brain itself, or on the surface of the brain, beneath the skull. This
allows for much more direct reception of electric signals and allows electrode
placement in the specific area of the brain where the appropriate signals are generated.
This approach has many problems, however. It requires invasive surgery to implant
the electrodes, and devices left in the brain long-term tend to cause the formation of
scar tissue in the gray matter. This scar tissue ultimately blocks signals.
Regardless of the location of the electrodes, the basic mechanism is the same: The
electrodes measure minute differences in the voltage between neurons. The signal is
then amplified and filtered. In current BCI systems, it is then interpreted by a
computer program, although you might be familiar with older analogue
encephalographs, which displayed the signals via pens that automatically wrote out
the patterns on a continuous sheet of paper.
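The amplify-and-filter step can be sketched as follows, with arbitrary gain and cutoff values chosen only for illustration rather than any particular system's settings: the microvolt-level electrode signal is scaled, mains interference is notch-filtered out, and the result is band-limited before being handed to whatever program interprets it.

# Sketch of the acquisition chain: amplify, filter, then hand off for interpretation.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def condition_signal(raw_uv, fs=256, gain=1000.0, mains_hz=50.0, band=(1.0, 40.0)):
    x = raw_uv * gain                                      # amplification stage
    b, a = iirnotch(mains_hz, Q=30.0, fs=fs)               # remove mains interference
    x = filtfilt(b, a, x)
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, x)                               # band-limited signal for the decoder

# One second of simulated scalp signal: a 10 Hz rhythm buried in noise plus 50 Hz hum.
fs = 256
t = np.arange(fs) / fs
raw = 5e-3 * np.sin(2 * np.pi * 10 * t) + 2e-2 * np.sin(2 * np.pi * 50 * t) \
      + np.random.default_rng(4).normal(0, 1e-2, fs)
clean = condition_signal(raw, fs)
print(clean.shape)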
In the case of a sensory input BCI, the function happens in reverse. A computer
converts a signal, such as one from a video camera, into the voltages necessary to
trigger neurons. The signals are sent to an implant in the proper area of the brain, and
if everything works correctly, the neurons fire and the subject receives a visual image
corresponding to what the camera sees.
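A correspondingly simple sketch of that reverse direction, with a hypothetical grid size, amplitude range and stimulate call: a camera frame is reduced to a coarse grid and each cell's brightness sets the stimulation amplitude for one electrode.

# Sketch of a sensory-input BCI: reduce a camera frame to a coarse grid and map
# each cell's brightness to a stimulation amplitude for one electrode.
import numpy as np

def frame_to_stimulation(frame, grid=(8, 8), max_current_ua=50.0):
    h, w = frame.shape
    gh, gw = grid
    cells = frame[: h - h % gh, : w - w % gw].reshape(gh, h // gh, gw, w // gw)
    brightness = cells.mean(axis=(1, 3)) / 255.0           # 0..1 per grid cell
    return brightness * max_current_ua                     # microamps per electrode

def stimulate(amplitudes):                                 # placeholder for the implant driver
    print(f"driving {amplitudes.size} electrodes, max {amplitudes.max():.1f} uA")

frame = np.random.default_rng(5).integers(0, 256, (120, 160)).astype(float)
stimulate(frame_to_stimulation(frame))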
Another way to measure brain activity is with magnetic resonance imaging (MRI).
An MRI machine is a massive, complicated device. It produces very high-resolution
images of brain activity, but it can't be used as part of a permanent or semipermanent
BCI. Researchers use it to get benchmarks for certain brain functions or to map where
in the brain electrodes should be placed to measure a specific function. For example,
if researchers are attempting to implant electrodes that will allow someone to control
a robotic arm with their thoughts, they might first put the subject into an MRI and ask
him or her to think about moving their actual arm. The MRI will show which area of
the brain is active during arm movement, giving them a clearer target for electrode
placement.
5.4.1 Cortical Plasticity
For years, the brain of an adult human was viewed as a static organ. When you are a
growing, learning child, your brain shapes itself and adapts to new experiences, but
eventually it settles into an unchanging state -- or so went the prevailing theory.
Beginning in the 1990s, research showed that the brain actually remains flexible even
into old age. This concept, known as cortical plasticity, means that the brain is able
to adapt in amazing ways to new circumstances. Learning something new or partaking
in novel activities forms new connections between neurons and reduces the onset of
age-related neurological problems. If an adult suffers a brain injury, other parts of the
brain are able to take over the functions of the damaged portion. Why is this important
for BCIs? It means that an adult can learn to operate a BCI, their brain forming
new connections and adapting to this new use of neurons. In situations where implants
are used, it means that the brain can accommodate this seemingly foreign intrusion
and develop new connections that will treat the implant as a part of the natural brain.
5.5 Sensory Input
The most common and oldest use of a BCI is the cochlear implant. For the
average person, sound waves enter the ear and pass through several tiny organs that
eventually pass the vibrations on to the auditory nerves in the form of electric signals.
If the mechanism of the ear is severely damaged, that person will be unable to hear
anything. However, the auditory nerves may be functioning perfectly well. They just
aren't receiving any signals.
Figure 5.4 Dr. Peter Brunner demonstrates the brain-computer interface at a conference in Paris.
A cochlear implant bypasses the nonfunctioning part of the ear, processes the sound
waves into electric signals and passes them via electrodes right to the auditory nerves.
The result: A previously deaf person can now hear. He might not hear perfectly, but it
allows him to understand conversations.
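The processing just described can be pictured as a small filterbank, sketched below with an illustrative channel count and band edges rather than any real device's specification: the microphone signal is split into frequency bands, and each band's energy envelope sets the stimulation level of one electrode along the cochlea.

# Sketch of cochlear-implant-style processing: split audio into bands and use
# each band's energy envelope to set one electrode's stimulation level.
import numpy as np
from scipy.signal import butter, filtfilt

def filterbank_envelopes(audio, fs=16_000, n_channels=8, f_lo=200.0, f_hi=7000.0):
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)        # log-spaced band edges
    frame = int(0.008 * fs)                                 # 8 ms envelope frames
    levels = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        band = filtfilt(b, a, audio)
        env = np.sqrt(np.mean(band[: len(band) - len(band) % frame]
                              .reshape(-1, frame) ** 2, axis=1))   # RMS per frame
        levels.append(env)
    return np.array(levels)            # (n_channels, n_frames) -> per-electrode levels

audio = np.random.default_rng(6).normal(0, 0.1, 16_000)    # 1 s of stand-in audio
print(filterbank_envelopes(audio).shape)                    # (8, 125)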
The processing of visual information by the brain is much more complex than that of
audio information, so artificial eye development isn't as advanced. Still, the principle
is the same. Electrodes are implanted in or near the visual cortex, the area of the brain
that processes visual information from the retinas. A pair of glasses holding small
cameras is connected to a computer and, in turn, to the implants. After a training
period similar to the one used for remote thought-controlled movement, the subject
can see. Again, the vision isn't perfect, but refinements in technology have improved
it tremendously since it was first attempted in the 1970s. Jens Naumann was the
recipient of a second-generation implant. He was completely blind, but now he can
navigate New York City's subways by himself and even drive a car around a parking
lot. In terms of science fiction becoming reality, this process gets very close. The
terminals that connect the camera glasses to the electrodes in Naumann's brain are
similar to those used to connect the VISOR (Visual Instrument and Sensory Organ)
worn by blind engineering officer Geordi La Forge in the "Star Trek: The Next
Generation" TV show and films, and they're both essentially the same technology.
However, Naumann isn't able to "see" invisible portions of the electromagnetic
spectrum.
6. BCI INNOVATORS
A few companies are pioneers in the field of BCI. Most of them are still in the
research stages, though a few products are offered commercially.
• Neural Signals is developing technology to restore speech to disabled people.
An implant in an area of the brain associated with speech (Broca's area) would
transmit signals to a computer and then to a speaker. With training, the subject
could learn to think each of the 39 phonemes in the English language and
reconstruct speech through the computer and speaker.
• NASA has researched a similar system, although it reads electric signals from
the nerves in the mouth and throat area, rather than directly from the brain.
They succeeded in performing a Web search by mentally "typing" the term
"NASA" into Google.
• Cyberkinetics Neurotechnology Systems is marketing the BrainGate, a neural
interface system that allows disabled people to control a wheelchair, robotic
prosthesis or computer cursor.
• Japanese researchers have developed a preliminary BCI that allows the user to
control their avatar in the online world Second Life.
7. CELL-CULTURE BCIs
Researchers have also built devices to interface with neural cells and entire neural
networks in cultures outside animals. As well as furthering research on animal
implantable devices, experiments on cultured neural tissue have focused on building
problem-solving networks, constructing basic computers and manipulating robotic
devices. Research into techniques for stimulating and recording from individual
neurons grown on semiconductor chips is sometimes referred to as neuroelectronics
or neurochips.
(Figure: world-first neurochip developed by Caltech researchers Jerome Pine and
Michael Maher.)
Development of the first working neurochip was claimed by a Caltech team led by
Jerome Pine and Michael Maher in 1997. The Caltech chip had room for 16
neurons. In 2003, a team led by Theodore Berger at the University of Southern
California started work on a neurochip designed to function as an artificial or
prosthetic hippocampus. The neurochip was designed to function in rat brains and is
intended as a prototype for the eventual development of higher-brain prostheses. The
hippocampus was chosen because it is thought to be the most ordered and structured
part of the brain and is the most studied area. Its function is to encode experiences for
storage as long-term memories elsewhere in the brain.
8. ETHICAL CONSIDERATIONS
Discussion about the ethical implications of BCIs has been relatively muted. This may
be because the research holds great promise in the fight against disability and BCI
researchers have yet to attract the attention of animal rights groups. It may also be
because BCIs are being used to acquire signals to control devices rather than the other
way round, although vision research is the exception to this.
Some of the ethical considerations that BCIs would raise under these circumstances
are already being debated in relation to brain implants and the broader area of mind
control.
9. CONCLUSION
The future may well bring a revolution as BCIs are constructed using nanotechnology
and getting information into and out of the brain becomes easier. The challenge is to
make such interfaces reliable and durable. Prof. Stephen Hawking could hardly have
expected to communicate again after becoming paralyzed, yet computer-based assistive
technology made it possible, and that in itself is a huge step forward. In the future,
similar systems could be used by individuals whose injuries are less severe. The most
advanced work in designing a brain-computer interface has stemmed from the evolution
of new electrodes; next-generation products may be able to give an individual the
ability to control devices that assist breathing, bladder control, and bowel movements.
The brain-computer interface system is still an investigational device: it is not approved
for sale and is available only in clinical trials. Once this fiction finds a place in the real
world and in humans, it will be a boon for those who suffer from congenital diseases.
REFERENCES