Jyoti Seminar

The document provides information about brain-computer interfaces (BCI). It discusses the history and types of BCI, how they work, applications, and challenges. The seminar report was submitted by Ms. Jyoti Gupta, a PhD scholar in electrical engineering at the National Institute of Technology, under the guidance of Dr. Pankaj Mukhija. It aims to help those who are paralyzed perform routine activities.


A Seminar Report

On

Brain Computer Interface

Submitted By
Ms. Jyoti Gupta
PhD Scholar (EE)

Roll No. 233232103

Under the guidance of


Dr. Pankaj Mukhija
Assistant Professor
Electrical Engineering Department

NATIONAL INSTITUTE OF TECHNOLOGY


GT Karnal Road, Delhi-110036
ABSTRACT

A brain-computer interface (BCI) is a direct communication pathway between a human or animal brain and an external device. In one-way BCIs, computers either accept commands from the brain or send signals to it, but not both. Two-way BCIs would allow brains and external devices to exchange information in both directions, but have yet to be successfully implanted in animals or humans. The common thread throughout the research is the remarkable cortical plasticity of the brain, which often adapts to BCIs, treating prostheses controlled by implants as natural limbs. One such BCI used velocity predictions to control reaching movements while simultaneously predicting hand gripping force. The primary goal of this technology is to help people who are paralyzed perform routine activities. Recent developments in quantum computing could substantially improve the speed, accuracy and efficiency of the underlying computer technology, and could also help transmit the electrical signals more accurately.

It is hoped that the future would bring a revolution as brain-computer interfaces are
constructed using nanotechnology and getting information in becomes as easy as
getting out.

TABLE OF CONTENTS

ABSTRACT ................................................................................................................... i
TABLE OF CONTENTS ............................................................................................ii
LIST OF FIGURES ....................................................................................................iii
1. INTRODUCTION.................................................................................................... 1
2. HISTORY ................................................................................................................. 3
2.1 BCI versus Neuroprosthetics ................................................................................ 3
2.2 Earlier Research ................................................................................................... 4
2.2.1 Animal BCI Research ............................................................................... 4
2.2.2 Early Work ................................................................................................ 4
2.3 Prominent Research Successes............................................................................. 5
2.4 Types of BCI ........................................................................................................ 8
2.4.1 Invasive BCIs ............................................................................................ 8
2.4.2 Partially-invasive BCIs ............................................................................. 9
2.4.3 Non-invasive BCIs .................................................................................. 10
3. HOW BRAIN COMPUTER INTERFACE WORKS ........................................ 12
3.1 The Electric Brain .............................................................................................. 13
4. BCI APPLICATIONS ........................................................................................... 14
4.1 Brain Computer Interface for Second Life ......................................................... 15
4.2 BCI in Controlling Computers and Other Devices ............................................ 16
4.3 Restoring Physical Disabilities........................................................................... 17
4.4 Communication .................................................................................................. 18
4.5 Robotics.............................................................................................................. 18
4.6 Virtual Reality .................................................................................................... 19
5. CHALLENGES AND OPPORTUNITIES OF BCI ........................................... 20
5.1 P300 BCI ............................................................................................................ 21
5.2 Volitional Control of Neural Activity: Implications for BCI ............................. 21
5.2.1 Volitional Activation Revealed by BCI and BMI Studies ...................... 22
5.2.2 Limitations on Control for BCI and BMI ............................................... 23
5.3 Implantable Recurrent Brain–Computer Interfaces ........................................... 24
5.4 BCI Input and Output ......................................................................................... 25
5.4.1 Cortical Plasticity .................................................................................... 26
5.5 Sensory Input...................................................................................................... 27
6. BCI INNOVATORS .............................................................................................. 29
8. CELL-CULTURE BCIs ........................................................................................ 30
9. ETHICAL CONSIDERATIONS .......................................................................... 31
10. CONCLUSION .................................................................................................... 32
REFERENCES..................................................................................................... 33

LIST OF FIGURES

Figure 1.1 Why a Brain Computer Interface .................................................................1


Figure 1.2 General Brain Computer Interface ...............................................................2
Figure 2.1 Rats implanted with BCIs in Theodore Berger's experiments .....................4
Figure 2.2 Garrett Stanley's recordings of cat vision using a BCI implanted in the
lateral geniculate nucleus (top row: original image; bottom row:
recording) .....................................................................................................5
Figure 2.3 Diagram of the BCI developed by Miguel Nicolelis and colleagues for
use on Rhesus monkeys ...............................................................................6
Figure 2.4 Dummy unit illustrating the design of a BrainGate interface.......................8
Figure 3.1 Working of BCI related to Brain ................................................................12
Figure 4.1 Working of BCI related to Body ................................................................14
Figure 4.2 Brain Computer Interface for Second Life .................................................15
Figure 4.3 Basic diagram of a handicapped person controlling a computer with BCI……….17

Figure 4.4 A lady controlling a robotic arm to feed herself………………………….19

Figure 5.1 Challenges and Opportunities of BCI.........................................................20


Figure 5.2 P300 BCI ....................................................................................................21
Figure 5.3 Input and Output of BCI .............................................................................25
Figure 5.4 Dr. Peter Brunner demonstrates the brain-computer interface at a
conference in Paris. ....................................................................................27
Figure 8.1 World first: Neurochip ...............................................................................30

BRAIN COMPUTER INTERFACE

1. INTRODUCTION

A brain-computer interface (BCI), sometimes called a direct neural interface or a


brain-machine interface, is a direct communication pathway between a human or
animal brain (or brain cell culture) and an external device. In one-way BCIs,
computers either accept commands from the brain or send signals to it. Two-way
BCIs would allow brains and external devices to exchange information in both
directions but have yet to be successfully implanted in animals or humans.

Here, the word Brain means the brain or nervous system of an organic life form rather than the mind. Similarly, the word Computer means any processing or computational device, from simple circuits to silicon chips.

Figure 1.1 Why a Brain Computer Interface


Figure 1.2 General Brain Computer Interface


2. HISTORY

Research on BCIs began in the 1970s, but it wasn't until the mid-1990s that the first
working experimental implants in humans appeared. Following years of animal
experimentation, early working implants in humans now exist, designed to restore
damaged hearing, sight and movement. The common thread throughout the research
is the remarkable cortical plasticity of the brain, which often adapts to BCIs, treating
prostheses controlled by implants as natural limbs. With recent advances in
technology and knowledge, pioneering researchers could now conceivably attempt to
produce BCIs that augment human functions rather than simply restoring them, a prospect previously confined to science fiction.

2.1 BCI versus Neuroprosthetics

Neuroprosthetics is an area of neuroscience concerned with neural prostheses — using artificial devices to replace the function of impaired nervous systems or sensory
organs. The most widely used neuroprosthetic device is the cochlear implant, which
was implanted in approximately 100,000 people worldwide as of 2006. There are also
several neuroprosthetic devices that aim to restore vision, including retinal implants,
although this article only discusses implants directly into the brain.

The differences between BCIs and neuroprosthetics lie mostly in how the terms are used: "neuroprosthetics" typically refers to connecting the nervous system to a device, whereas "BCI" usually refers to connecting the brain (or nervous system) to a computer system. Practical neuroprosthetics can be linked to any part of the nervous
system, for example peripheral nerves, while the term "BCI" usually designates a
narrower class of systems which interface with the central nervous system.

The terms are sometimes used interchangeably, and for good reason: neuroprosthetics and BCIs seek to achieve the same aims, such as restoring sight, hearing, movement, the ability to communicate, and even cognitive function. Both use similar experimental
methods and surgical techniques.


2.2 Earlier Research

2.2.1 Animal BCI Research

Figure 2.1 Rats implanted with BCIs in Theodore Berger's experiments

Several laboratories have managed to record signals from monkey and rat cerebral
cortexes in order to operate BCIs to carry out movement. Monkeys have navigated
computer cursors on screen and commanded robotic arms to perform simple tasks
simply by thinking about the task and without any motor output. Other research on
cats has decoded visual signals.

2.2.2 Early Work

Studies that developed algorithms to reconstruct movements from motor cortex neurons, which control movement, date back to the 1970s. Work by groups led by
Schmidt, Fetz and Baker in the 1970s established that monkeys could quickly learn to
voluntarily control the firing rate of individual neurons in the primary motor cortex
via closed-loop operant conditioning, a training method using punishment and
rewards.

In the 1980s, Apostolos Georgopoulos at Johns Hopkins University found a mathematical relationship between the electrical responses of single motor-cortex
neurons in rhesus macaque monkeys and the direction that monkeys moved their arms
(based on a cosine function). He also found that dispersed groups of neurons in
different areas of the brain collectively controlled motor commands but was only able
to record the firings of neurons in one area at a time because of technical limitations
imposed by his equipment.
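
The cosine relationship Georgopoulos described can be sketched as a simple tuning-curve model. The baseline rate, modulation depth and preferred direction below are illustrative values chosen for the sketch, not figures from his experiments.

```python
import math

def firing_rate(theta, baseline=20.0, modulation=15.0, preferred=math.pi / 2):
    """Cosine-tuning model: the neuron fires fastest when the arm moves in
    its preferred direction (all parameter values here are illustrative)."""
    return baseline + modulation * math.cos(theta - preferred)

# The rate peaks in the preferred direction and dips in the opposite one.
peak = firing_rate(math.pi / 2)     # baseline + modulation = 35 spikes/s
trough = firing_rate(-math.pi / 2)  # baseline - modulation = 5 spikes/s
```

Population-vector decoding builds on exactly this idea: given the tuning curves of many such neurons, their observed rates can be combined to estimate the intended movement direction.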
There has been rapid development in BCIs since the mid-1990s. Several groups have
been able to capture complex brain motor centre signals using recordings from neural
ensembles (groups of neurons) and use these to control external devices, including
research groups led by Richard Andersen, John Donoghue, Phillip Kennedy, Miguel
Nicolelis, and Andrew Schwartz.

2.3 Prominent Research Successes

Phillip Kennedy and colleagues built the first intracortical brain-computer interface by
implanting neurotrophic-cone electrodes into monkeys.

Figure 2.2 Garrett Stanley's recordings of cat vision using a BCI implanted in the lateral geniculate
nucleus (top row: original image; bottom row: recording)

In 1999, researchers led by Garrett Stanley at Harvard University decoded neuronal firings to reproduce images seen by cats. The team used an array of electrodes
embedded in the thalamus (which integrates all of the brain’s sensory input) of sharp-
eyed cats. Researchers targeted 177 brain cells in the thalamus lateral geniculate
nucleus area, which decodes signals from the retina. The cats were shown eight short
movies, and their neuron firings were recorded. Using mathematical filters, the

researchers decoded the signals to generate movies of what the cats saw and were able
to reconstruct recognisable scenes and moving objects.

Miguel Nicolelis has been a prominent proponent of using multiple electrodes spread
over a greater area of the brain to obtain neuronal signals to drive a BCI. Such neural
ensembles are said to reduce the variability in output produced by single electrodes,
which could make it difficult to operate a BCI.
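
The statistical intuition behind ensemble recording can be shown with a small simulation: averaging many independently noisy electrode readings shrinks the variance of the decoded estimate. The firing rate and noise level below are made-up values for illustration only.

```python
import random
import statistics

random.seed(0)  # reproducible demo

def noisy_reading(true_rate=30.0, noise_sd=10.0):
    """One electrode's estimate: the true firing rate plus random noise
    (both numbers are hypothetical)."""
    return random.gauss(true_rate, noise_sd)

def ensemble_estimate(n_electrodes):
    """Average over an ensemble; noise variance shrinks roughly as 1/n."""
    return statistics.mean(noisy_reading() for _ in range(n_electrodes))

single = [noisy_reading() for _ in range(1000)]
ensemble = [ensemble_estimate(100) for _ in range(1000)]
assert statistics.stdev(ensemble) < statistics.stdev(single)
```

This is the variance-reduction argument in miniature: a single electrode's output fluctuates too much to drive a device reliably, while the ensemble average is far steadier.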

After conducting initial studies in rats during the 1990s, Nicolelis and his colleagues
developed BCIs that decoded brain activity in owl monkeys and used the devices to
reproduce monkey movements in robotic arms. Monkeys have advanced reaching and
grasping abilities and good hand manipulation skills, making them ideal test subjects
for this kind of work.

By 2000, the group succeeded in building a BCI that reproduced owl monkey
movements while the monkey operated a joystick or reached for food. The BCI
operated in real time and could also control a separate robot remotely over Internet
protocol. But the monkeys could not see the arm moving and did not receive any
feedback, a so-called open-loop BCI.

Figure 2.3 Diagram of the BCI developed by Miguel Nicolelis and colleagues for use on Rhesus
monkeys

Later experiments by Nicolelis using rhesus monkeys succeeded in closing the feedback loop and reproduced monkey reaching and grasping movements in a robot
arm. With their deeply cleft and furrowed brains, rhesus monkeys are considered to be
better models for human neurophysiology than owl monkeys. The monkeys were
trained to reach and grasp objects on a computer screen by manipulating a joystick
while corresponding movements by a robot arm were hidden. The monkeys were later
shown the robot directly and learned to control it by viewing its movements. The BCI
used velocity predictions to control reaching movements and simultaneously
predicted hand gripping force.

Other labs that develop BCIs and algorithms that decode neuron signals include John
Donoghue from Brown University, Andrew Schwartz from the University of
Pittsburgh and Richard Andersen from Caltech. These researchers were able to
produce working BCIs even though they recorded signals from far fewer neurons than
Nicolelis (15–30 neurons versus 50–200 neurons).

Donoghue's group reported training rhesus monkeys to use a BCI to track visual
targets on a computer screen with or without assistance of a joystick (closed-loop
BCI). Schwartz's group created a BCI for three-dimensional tracking in virtual reality
and also reproduced BCI control in a robotic arm. The group created headlines when
they demonstrated that a monkey could feed itself pieces of zucchini using a robotic
arm powered by the animal's own brain signals.

Andersen's group used recordings of premovement activity from the posterior parietal
cortex in their BCI, including signals created when experimental animals anticipated
receiving a reward.

In addition to predicting kinematic and kinetic parameters of limb movements, BCIs that predict electromyographic or electrical activity of muscles are being developed.
Such BCIs could be used to restore mobility in paralyzed limbs by electrically
stimulating muscles.

2.4 Types of BCI

2.4.1 Invasive BCIs

Invasive BCI research has targeted repairing damaged sight and providing new
functionality to paralysed people. Invasive BCIs are implanted directly into the grey
matter of the brain during neurosurgery. As they rest in the grey matter, invasive
devices produce the highest quality signals of BCI devices but are prone to scar-tissue
build-up, causing the signal to become weaker or even lost as the body reacts to a
foreign object in the brain.

(Figure: Jens Naumann, a man with acquired blindness, being interviewed about his vision BCI on CBS's The Early Show)

In vision science, direct brain implants have been used to treat non-congenital
(acquired) blindness. One of the first scientists to come up with a working brain
interface to restore sight was private researcher William Dobelle.
Dobelle's first prototype was implanted into "Jerry," a man blinded in adulthood, in
1978. A single-array BCI containing 68 electrodes was implanted onto Jerry’s visual
cortex and succeeded in producing phosphenes, the sensation of seeing light. The
system included cameras mounted on glasses to send signals to the implant. Initially,
the implant allowed Jerry to see shades of grey in a limited field of vision at a low
frame rate. It also required him to be hooked up to a two-ton mainframe, but shrinking electronics and faster computers have made his artificial eye more portable and now enable him to perform simple tasks unassisted.

In 2002, Jens Naumann, also blinded in adulthood, became the first in a series of 16
paying patients to receive Dobelle’s second generation implant, marking one of the
earliest commercial uses of BCIs. The second generation device used a more
sophisticated implant enabling better mapping of phosphenes into coherent vision.
Phosphenes are spread out across the visual field in what researchers call the starry-
night effect. Immediately after his implant, Jens was able to use his imperfectly
restored vision to drive slowly around the parking area of the research institute.

BCIs focusing on motor neuroprosthetics aim to either restore movement in paralysed individuals or provide devices to assist them, such as interfaces with computers or
robot arms.

Researchers at Emory University in Atlanta led by Philip Kennedy and Roy Bakay were the first to install a brain implant in a human that produced signals of high enough quality to simulate movement. Their patient, Johnny Ray, developed ‘locked-in syndrome’ after a brain-stem stroke. Ray’s implant was installed in 1998 and he lived long enough to start working with it, eventually learning to control a computer cursor.

Tetraplegic Matt Nagle became the first person to control an artificial hand using a
BCI in 2005 as part of the first nine-month human trial of Cyberkinetics
Neurotechnology’s BrainGate chip-implant. Implanted in Nagle’s right precentral
gyrus (area of the motor cortex for arm movement), the 96-electrode BrainGate
implant allowed Nagle to control a robotic arm by thinking about moving his hand as
well as a computer cursor, lights and TV.

2.4.2 Partially-invasive BCIs

Partially invasive BCI devices are implanted inside the skull but rest outside the brain rather than amidst the grey matter. They produce better-resolution signals than non-invasive BCIs, in which the bone tissue of the cranium deflects and deforms the signals, and carry a lower risk of forming scar tissue in the brain than fully invasive BCIs.

Electrocorticography (ECoG) uses the same technology as non-invasive electroencephalography (see below), but the electrodes are embedded in a thin plastic
pad that is placed above the cortex, beneath the dura mater. ECoG technologies were
first trialed in humans in 2004 by Eric Leuthardt and Daniel Moran from Washington
University in St Louis. In a later trial, the researchers enabled a teenage boy to play
Space Invaders using his ECoG implant. This research indicates that it is difficult to
produce kinematic BCI devices with more than one dimension of control using ECoG.

Light Reactive Imaging BCI devices are still in the realm of theory. These would
involve implanting a laser inside the skull. The laser would be trained on a single
neuron and the neuron's reflectance measured by a separate sensor. When the neuron
fires, the laser light pattern and wavelengths it reflects would change slightly. This
would allow researchers to monitor single neurons but require less contact with tissue
and reduce the risk of scar-tissue build-up.

2.4.3 Non-invasive BCIs

As well as invasive experiments, there have also been experiments in humans using
non-invasive neuroimaging technologies as interfaces. Signals recorded in this way
have been used to power muscle implants and restore partial movement in an
experimental volunteer. Although they are easy to wear, non-invasive devices
produce poor signal resolution because the skull dampens signals, dispersing and
blurring the electromagnetic waves created by the neurons. Although the waves can
still be detected it is more difficult to determine the area of the brain that created them
or the actions of individual neurons.

(Figure: Recordings of brainwaves produced by an electroencephalogram)


Electroencephalography (EEG) is the most studied potential non-invasive interface,
mainly due to its fine temporal resolution, ease of use, portability and low set-up cost.
But as well as the technology's susceptibility to noise, another substantial barrier to
using EEG as a brain-computer interface is the extensive training required before
users can work the technology. For example, in experiments beginning in the mid-
1990s, Niels Birbaumer of the University of Tübingen in Germany used EEG
recordings of slow cortical potential to give paralysed patients limited control over a
computer cursor. (Birbaumer had earlier trained epileptics to prevent impending fits
by controlling this low voltage wave.) The experiment saw ten patients trained to
move a computer cursor by controlling their brainwaves. The process was slow,
requiring more than an hour for patients to write 100 characters with the cursor, while
training often took many months.

Another research parameter is the type of waves measured. Birbaumer's later research with Jonathan Wolpaw at New York State University has focused on developing technology that lets users choose the brain signals they find easiest to use to operate a BCI, including mu and beta waves.

A further parameter is the method of feedback used, as shown in studies of P300 signals. Patterns of P300 waves are generated involuntarily (stimulus-feedback)
when people see something they recognize and may allow BCIs to decode categories
of thoughts without training patients first. By contrast, the biofeedback methods
described above require learning to control brainwaves so the resulting brain activity
can be detected. In 2000, for example, research by Jessica Bayliss at the University of
Rochester showed that volunteers wearing virtual reality helmets could control
elements in a virtual world using their P300 EEG readings, including turning lights on
and off and bringing a mock-up car to a stop.
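
A common way to make the involuntary P300 response visible is to average many stimulus-locked EEG epochs, so that random noise cancels while the time-locked deflection survives. The sketch below uses purely synthetic data (an assumed 250 Hz sampling rate and a made-up positive bump at 300 ms), not recordings from the study described.

```python
import math
import random

random.seed(1)
FS = 250  # samples per second (assumed)

def epoch(has_p300):
    """Synthetic one-second EEG epoch: Gaussian noise, plus a positive
    deflection around 300 ms when the target stimulus was shown."""
    out = []
    for i in range(FS):
        t = i / FS
        signal = 5.0 * math.exp(-((t - 0.3) ** 2) / 0.002) if has_p300 else 0.0
        out.append(signal + random.gauss(0.0, 2.0))
    return out

def average(epochs):
    """Average the epochs sample by sample (stimulus-locked averaging)."""
    return [sum(col) / len(epochs) for col in zip(*epochs)]

# Averaging many stimulus-locked epochs cancels the noise; the P300 bump
# near 300 ms survives only for the attended (target) stimulus.
target = average([epoch(True) for _ in range(50)])
nontarget = average([epoch(False) for _ in range(50)])
assert max(target) > max(nontarget)
```

This is why P300 spellers flash each option repeatedly: the averaged response to the attended option stands out without the user having been trained to produce any particular brainwave.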

In 1999, researchers at Case Western Reserve University led by Hunter Peckham used a 64-electrode EEG skullcap to return limited hand movements to quadriplegic
Jim Jatich. As Jatich concentrated on simple but opposite concepts like up and down,
his beta-rhythm EEG output was analysed using software to identify patterns in the
noise. A basic pattern was identified and used to control a switch: Above average
activity was set to on, below average off. As well as enabling Jatich to control a
computer cursor the signals were also used to drive the nerve controllers embedded in
his hands, restoring some movement. Electronic neural networks have been deployed
which shift the learning phase from the user to the computer. Experiments by
scientists at the Fraunhofer Society in 2004 using neural networks led to noticeable
improvements within 30 minutes of training. Experiments by Eduardo Miranda aim
to use EEG recordings of mental activity associated with music to allow the disabled
to express themselves musically through an encephalophone.

Magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) have both been used successfully as non-invasive BCIs. In a widely reported experiment, fMRI allowed two users being scanned to play Pong in real time by altering their haemodynamic response, or brain blood flow, through biofeedback techniques. fMRI measurements of haemodynamic responses in real time have also been used to control robot arms, with a seven-second delay between thought and movement.

3. HOW BRAIN COMPUTER INTERFACE WORKS

As the power of modern computers grows alongside our understanding of the human
brain, we move ever closer to making some pretty spectacular science fiction into
reality. Imagine transmitting signals directly to someone's brain that would allow
them to see, hear or feel specific sensory inputs. Consider the potential to manipulate
computers or machinery with nothing more than a thought. This isn't just about convenience: for severely disabled people, the development of a brain-computer interface (BCI) could be the most important technological breakthrough in decades. In this section, we'll look at how BCIs work, their limitations and where they could be headed in the future.

Figure 3.1 Working of BCI related to Brain

3.1 The Electric Brain

The reason a BCI works at all is because of the way our brains function. Our brains
are filled with neurons, individual nerve cells connected to one another by dendrites
and axons. Every time we think, move, feel or remember something, our neurons are
at work. That work is carried out by small electric signals that zip from neuron to
neuron as fast as 250 mph. The signals are generated by differences in electric
potential carried by ions on the membrane of each neuron. Although the paths the
signals take are insulated by something called myelin, some of the electric signal
escapes. Scientists can detect those signals, interpret what they mean and use them to
direct a device of some kind. It can also work the other way around. For example,
researchers could figure out what signals are sent to the brain by the optic nerve when
someone sees the color red. They could rig a camera that would send those exact
signals into someone's brain whenever the camera saw red, allowing a blind person to
"see" without eyes.

4. BCI APPLICATIONS

One of the most exciting areas of BCI research is the development of devices that can
be controlled by thoughts. Some of the applications of this technology may seem
frivolous, such as the ability to control a video game by thought. If you think remote
control is convenient, imagine changing channels with your mind.

However, there's a bigger picture -- devices that would allow severely disabled people
to function independently. For a quadriplegic, something as basic as controlling a
computer cursor via mental commands would represent a revolutionary improvement
in quality of life. But how do we turn those tiny voltage measurements into the
movement of a robotic arm?

Early research used monkeys with implanted electrodes. The monkeys used a joystick
to control a robotic arm. Scientists measured the signals coming from the electrodes.
Eventually, they changed the controls so that the robotic arm was controlled only by the signals coming from the electrodes, not the joystick.

Figure 4.1 Working of BCI related to Body

A more difficult task is interpreting the brain signals for movement in someone who
can't physically move their own arm. With a task like that, the subject must "train" to
use the device. With an EEG or implant in place, the subject would visualize
closing his or her right hand. After many trials, the software can learn the signals
associated with the thought of hand-closing. Software connected to a robotic hand is
programmed to receive the "close hand" signal and interpret it to mean that the robotic
hand should close. At that point, when the subject thinks about closing the hand, the
signals are sent and the robotic hand closes.
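
The training loop described above amounts to learning a classifier over brain-signal features. A minimal sketch using a nearest-centroid rule over hypothetical two-channel band-power features (real systems use far richer features and models):

```python
import math

def centroid(vectors):
    """Component-wise mean of a list of feature vectors."""
    return [sum(vals) / len(vals) for vals in zip(*vectors)]

def train(labelled_trials):
    """labelled_trials maps a mental state to its recorded feature
    vectors; learns one centroid per state."""
    return {label: centroid(trials) for label, trials in labelled_trials.items()}

def classify(model, features):
    """Assign a new trial to the nearest learned centroid."""
    return min(model, key=lambda label: math.dist(model[label], features))

# Hypothetical 2-D features (e.g. band power on two channels) recorded
# while the subject imagined closing the hand vs. resting.
model = train({
    "close": [[8.0, 2.0], [7.5, 2.5], [8.5, 1.8]],
    "rest":  [[3.0, 6.0], [2.5, 5.5], [3.2, 6.2]],
})
assert classify(model, [7.9, 2.2]) == "close"
assert classify(model, [2.8, 5.9]) == "rest"
```

The "many trials" in the text serve exactly this purpose: each repetition adds a labelled example, sharpening the boundary between the "close hand" pattern and everything else.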

A similar method is used to manipulate a computer cursor, with the subject thinking
about forward, left, right and back movements of the cursor. With enough practice,
users can gain enough control over a cursor to draw a circle, access computer
programs and control a TV. It could theoretically be expanded to allow users to "type"
with their thoughts.

Once the basic mechanism of converting thoughts to computerized or robotic action is perfected, the potential uses for the technology are almost limitless. Instead of a
robotic hand, disabled users could have robotic braces attached to their own limbs,
allowing them to move and directly interact with the environment. This could even be
accomplished without the "robotic" part of the device. Signals could be sent to the
appropriate motor control nerves in the hands, bypassing a damaged section of the
spinal cord and allowing actual movement of the subject's own hands.

4.1 Brain Computer Interface for Second Life

Figure 4.2 Brain Computer Interface for Second Life

While recent developments in brain-computer interface (BCI) technology have given humans the power to mentally control computers, nobody has used the technology in
conjunction with the Second Life online virtual world — until now.

A research team led by professor Jun’ichi Ushiba of the Keio University Biomedical
Engineering Laboratory has developed a BCI system that lets the user walk an avatar
through the streets of Second Life while relying solely on the power of thought. To
control the avatar on screen, the user simply thinks about moving various body parts
— the avatar walks forward when the user thinks about moving his/her own feet, and
it turns right and left when the user imagines moving his/her right and left arms.

The system consists of a headpiece equipped with electrodes that monitor activity in
three areas of the motor cortex (the region of the brain involved in controlling the
movement of the arms and legs). An EEG machine reads and graphs the data and
relays it to the BCI, where a brain wave analysis algorithm interprets the user’s
imagined movements. A keyboard emulator then converts this data into a signal and
relays it to Second Life, causing the on-screen avatar to move. In this way, the user
can exercise real-time control over the avatar in the 3D virtual world without moving
a muscle.

Future plans are to improve the BCI so that users can make Second Life avatars
perform more complex movements and gestures. The researchers hope the mind-
controlled avatar, which was created through a joint medical engineering project
involving Keio’s Department of Rehabilitation Medicine and the Tsukigase
Rehabilitation Center, will one day help people with serious physical impairments
communicate and do business in Second Life.

4.2 BCI in Controlling Computers and Other Devices


Brain Computer Interfaces (BCIs) enable motor-disabled and healthy persons to
operate electrical devices and computers directly with their brain activity. A BCI
recognizes and classifies the different brain activation patterns associated with real
movements and with movement attempts made by tetraplegic persons. One of the aims is
to examine whether subjects with no previous experience of BCIs could achieve
satisfactory performance after a short training period.

It is important to understand the signals used in BCI applications, which are
concentrated on motor cortex activity. Like most other BCI groups, we measure the
electric activity of the brain using electroencephalography (EEG), and have also
examined the feasibility of magnetoencephalography (MEG) for BCI use.

Feedback plays an important role when learning to use a BCI. In BCI training, the
most commonly used feedback modality is visual feedback. Visual attention,
however, might be needed for application control: to drive a wheelchair, to observe
the environment, etc. It is therefore important to test other feedback modalities as well.

4.3 Restoring Physical Disabilities

One of the most critical needs for people with severe physical disabilities is restoring
the ability to communicate. The field of BCI research and development has therefore
focused primarily on neuroprosthetics applications that aim at restoring damaged
hearing, sight and movement.

Figure 4.3 Basic diagram of a handicapped controlling computer with BCI


4.4 Communication

BCIs can provide communication systems that do not depend on the brain's normal
output pathways of peripheral nerves and muscles. In these systems, users explicitly
manipulate their brain activity instead of using motor movements to produce signals
that can be used to control computers or communication devices.

The impact of this work is extremely high, especially to those who suffer from
devastating neuromuscular injuries and neurodegenerative diseases such as
amyotrophic lateral sclerosis, which eventually strips individuals of voluntary
muscular activity while leaving cognitive function intact.

4.5 Robotics

Controlling robots with thought has long been a popular science fiction concept.
Recent work with BCIs, however, has shown that robotic control is indeed possible
with brain signals. Applications for neurally-controlled robots currently center on
assistive technologies ("helper" robots), but BCI control has been proposed for
military and industrial applications as well. One of the earliest experiments with a
BCI-controlled robot explored the effects of real-world feedback (the movement of the
robot) in conjunction with a P300-based BCI, which depends on user attention. The
robot was configured to perform the steps to make coffee, such as getting powdered
coffee, sugar, and cream, and stirring the mixture with a spoon. The results showed
that users can effectively attend to real-world feedback while operating an attention-
based BCI.
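The attention-dependent selection in a P300-based BCI can be illustrated on simulated data: epochs time-locked to each candidate item are averaged, and the item whose average shows the largest positivity around 300 ms is taken as the attended one. The sampling rate, analysis window, and item names below are all invented for the sketch.

```python
import numpy as np

# Toy P300 selection: average the EEG epochs recorded after each candidate
# stimulus and pick the one with the largest positivity around 300 ms.

FS = 250                                        # samples/second (assumed)
WIN = slice(int(0.25 * FS), int(0.45 * FS))     # ~250-450 ms post-stimulus

def pick_target(epochs_by_stim):
    """epochs_by_stim: {stim_name: array of shape (n_trials, n_samples)}."""
    scores = {stim: np.mean(np.asarray(ep), axis=0)[WIN].mean()
              for stim, ep in epochs_by_stim.items()}
    return max(scores, key=scores.get)

rng = np.random.default_rng(0)
n_trials, n_samples = 20, FS                    # 20 one-second epochs each
noise = lambda: rng.normal(0.0, 1.0, (n_trials, n_samples))
target = noise()
target[:, WIN] += 3.0                           # simulated P300 deflection
epochs = {"coffee": noise(), "sugar": target, "cream": noise()}
print(pick_target(epochs))  # "sugar"
```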


Figure 4.4 A lady controlling a robotic arm to feed herself

4.6 Virtual Reality

Virtual reality is one of the BCI research directions with more practical purposes. The
early work in virtual environments is described in Bayliss and Ballard (2000), which
details a study of a P300 BCI controlling a virtual apartment and a virtual driving
simulator.

Subsequent work, as detailed in Pfurtscheller et al. (2006), incorporates the ReaCTor
"cave" environment, an immersive virtual world which the user navigates using a
BCI. The subject can "walk" through the virtual world by imagining foot movement,
and can "touch" things in the virtual world by imagining reaching and hand
movement.


5. CHALLENGES AND OPPORTUNITIES OF BCI

Figure 5.1 Challenges and Opportunities of BCI


5.1 P300 BCI

Figure 5.2 P300 BCI

There is, however, a large similarity among the different signals, which makes them difficult to decode.

5.2 Volitional Control of Neural Activity: Implications for BCI

Successful operation of brain–computer interfaces (BCI) and brain–machine
interfaces (BMI) depends significantly on the degree to which neural activity can be
volitionally controlled. Some evidence comes from conventional experiments that
reveal volitional modulation in neural activity related to behaviors, including real and
imagined movements, cognitive imagery and shifts of attention. More direct evidence
comes from studies on operant conditioning of neural activity using biofeedback, and
from BCI/BMI studies in which neural activity controls cursors or peripheral devices.
Limits in the degree of accuracy of control in the latter studies can be attributed to
several possible factors. Some of these factors, particularly limited practice time, can
be addressed with long-term implanted BCIs.

Brain–computer interfaces (BCI) and brain–machine interfaces (BMI) convert neural
activity at the level of neuronal action potentials, ECoG, or EEG into signals that
control computer cursors or external devices. The BCI paradigm bypasses the normal
biological pathways mediating volitional movements and employs upstream neural
activity that may have a complex relationship to motor or cognitive behavior. The
transform between this neural activity and the required control parameters is
facilitated by sampling relevant activity in appropriate brain regions, such as motor
cortex cells involved in limb movement. Conversion of these signals is further aided
by appropriate transform algorithms to generate the requisite control parameters. But
even with the best matches and the optimal algorithms, accurate device control under
diverse behavioral conditions depends significantly on the degree to which the neural
activity can be volitionally modulated.

5.2.1 Volitional Activation Revealed by BCI and BMI Studies

The volitional control of cortical cell activity has now been dramatically demonstrated
in numerous BCI and BMI studies in which primates controlled the position of cursors
or robotic arms with cortical activity under closed-loop conditions. Under ‘open-loop’
conditions, the activity of neural populations could be linearly transformed to the 3-D
coordinates of the monkeys' hand as they retrieved food from a well and brought it to
their mouth. Interestingly, the conversion parameters obtained for one set of trials
provided increasingly poor predictions of future responses, indicating a source of drift
over tens of minutes in the open-loop condition. This problem was alleviated when the
monkeys observed the consequences of their neural activity in ‘real time’ and could
optimize cell activity to achieve the desired goal under ‘closed-loop’ conditions. For
example, monkeys could successfully acquire targets on a two-dimensional workspace
or in virtual 3-D space with a cursor driven by activity of 10–30 motor cortex neurons.
More recently, the weighted activity of cell ensembles recorded over many cortical
areas has been used to control a robotic arm to reach and grasp objects. Significantly,
several of these studies also demonstrated the ability to extract movement predictions
from neurons in postcentral as well as precentral cortical areas. Precentral motor
cortex cells provided the most accurate predictions of force and displacement, but
neurons from many other areas also provided significant predictions. The prediction
accuracy increased with the number of cells included, albeit with diminishing returns.
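The linear transforms mentioned above, and the diminishing returns from adding cells, can be reproduced on toy data with an ordinary least-squares filter. Nothing here corresponds to a specific study; the firing-rate model and noise levels are arbitrary.

```python
import numpy as np

# Simulated population decode: each "cell" fires in proportion to a 1-D
# hand position plus noise; a least-squares filter maps rates back to
# position, and the fit improves as more cells are included.

rng = np.random.default_rng(1)
t, n_cells = 500, 30
pos = np.sin(np.linspace(0, 8 * np.pi, t))            # true hand position
tuning = rng.normal(0.0, 1.0, n_cells)                # per-cell gain
rates = np.outer(pos, tuning) + rng.normal(0.0, 1.0, (t, n_cells))

def decode_error(k):
    """RMS error of a least-squares decode using the first k cells."""
    X = rates[:, :k]
    w, *_ = np.linalg.lstsq(X, pos, rcond=None)
    return float(np.sqrt(np.mean((X @ w - pos) ** 2)))

errs = [decode_error(k) for k in (1, 5, 30)]
print([round(e, 3) for e in errs])   # error shrinks as cells are added
```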

5.2.2 Limitations on Control for BCI and BMI

The complex transforms of neural activity to output parameters complicate the
degree to which neural control can be learned. In contrast to the relatively simple task
of driving one or two cells in bursts while allowing free performance of any correlated
responses, the requirement to modulate activity of a population to accurately control a
transformed function may be more difficult because the effect of any particular cell is
largely submerged in the population function. Moreover, activity of each cell in the
population has some stochastic component which may conspire against learning
optimal control of any particular cell.

The degree of independent control of cells may be inherently constrained by ensemble
interactions. A special example of such a constraint is the fixed relative recruitment
order of motoneurons according to the size principle, which has foiled attempts to
activate high threshold motor units independently of lower threshold units. Neural
ensembles may have comparable limits on the degree to which individual elements
can be independently activated. To the extent that internal representations depend on
relationships between the activities of neurons in an ensemble, the processing of these
representations involves corresponding constraints on the independence of those
activities. These constraints may explain the diminishing returns obtained from
increasing the number of neurons included in a linear filter. The ‘neuron dropping
curves’ representing the average accuracy as a function of the number of cells have
extrapolated asymptotes below 100% for indefinitely large populations. Yet, it
remains possible that longer experience with the same neuronal ensembles could
improve the achievable accuracy.
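A 'neuron dropping' analysis of the kind described can be imitated on simulated data: decode from random cell subsets of increasing size and average the resulting accuracy, which rises steeply at first and then saturates. All quantities below are made up.

```python
import numpy as np

# Toy neuron-dropping curve: average decoding accuracy (correlation of
# decoded vs. true signal) over random subsets of k simulated cells.

rng = np.random.default_rng(2)
t, n = 400, 40
signal = rng.standard_normal(t)
rates = signal[:, None] * rng.uniform(0.2, 1.0, n) + rng.normal(0.0, 2.0, (t, n))

def accuracy(cells):
    """Correlation between a least-squares decode and the true signal."""
    X = rates[:, cells]
    w, *_ = np.linalg.lstsq(X, signal, rcond=None)
    return float(np.corrcoef(X @ w, signal)[0, 1])

curve = {}
for k in (1, 5, 10, 20, 40):
    subsets = [rng.choice(n, size=k, replace=False) for _ in range(20)]
    curve[k] = float(np.mean([accuracy(s) for s in subsets]))

print({k: round(v, 2) for k, v in curve.items()})  # steep rise, then plateau
```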

Difficulty in achieving reliable control also comes from employing adaptive decoding
schemes. Although such adaptive algorithms are intended to automatically optimize
control, they create a moving target for volitional modulation; the neural activity
pattern that worked at one time may subsequently become less effective, requiring the
learning of new patterns.

Finally, the ability to learn optimal control may be limited by the short and
intermittent exposure times, dictated by the need to tether the subject to the requisite
instrumentation. For example, a paraplegic subject that could practice neural control
of a cursor only several hours a week demonstrated remarkable success in controlling
a cursor movement, but nevertheless achieved a limited degree of accuracy.
Intermittent sessions also involve possible changes in the recorded neuronal
population, requiring the subject to relearn the task with a slightly different population
of cells.

5.3 Implantable Recurrent Brain–Computer Interfaces

The 'Neurochip' reliably records the activity of the same single neurons and two related
arm muscles for weeks, storing raw and/or compressed data to memory for daily
downloading via an infrared link. The compact connections and self-contained
circuitry make unit recordings remarkably stable despite the unconstrained
movements of the monkey in the cage. For many neurons the correlations between
neural and muscle activity remained relatively stable, which bodes well for prosthetic
applications.

The Neurochip can also operate in a recurrent loop mode, converting action potentials
of a cortical neuron to stimuli delivered elsewhere in the motor system. Thus the
cortical cell could directly control functional electrical stimulation of muscles, the
spinal cord or another brain region. Continuous operation of such a recurrent BCI
(R-BCI) allows the subject to adapt to the artificial pathway and, by appropriately modifying
the neural activity, to incorporate its operation into normal behavior. Such an R-BCI has
obvious potential prosthetic applications in bridging lost biological connections,
particularly when multiple parallel channels are implemented.
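The recurrent loop can be caricatured as a threshold spike detector whose every detection triggers a stimulus pulse, with a short refractory period so one action potential yields one pulse. The threshold and timing values below are invented.

```python
# Sketch of a recurrent BCI loop: detect spikes on the recorded cortical
# channel and emit one stimulation event per detected spike.

def recurrent_loop(samples, threshold=4.0, refractory=3):
    """Return sample indices at which a stimulus pulse would be delivered."""
    stim_times, last = [], -refractory
    for i, v in enumerate(samples):
        if v > threshold and i - last >= refractory:
            stim_times.append(i)     # spike detected -> fire stimulator
            last = i
    return stim_times

trace = [0.1, 5.2, 5.0, 0.3, 0.2, 6.1, 0.0, 4.5]
print(recurrent_loop(trace))  # [1, 5]
```

A real Neurochip-style device would run this continuously on dedicated hardware; in the example, the suprathreshold samples at indices 2 and 7 are suppressed by the refractory window.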

5.4 BCI Input and Output

One of the biggest challenges facing brain-computer interface researchers today is the
basic mechanics of the interface itself. The easiest and least invasive method is a set
of electrodes -- a device known as an electroencephalograph (EEG) -- attached to
the scalp. The electrodes can read brain signals. However, the skull blocks a lot of the
electrical signal, and it distorts what does get through.

Figure 5.3 Input and Output of BCI

To get a higher-resolution signal, scientists can implant electrodes directly into the
gray matter of the brain itself, or on the surface of the brain, beneath the skull. This
allows for much more direct reception of electric signals and allows electrode
placement in the specific area of the brain where the appropriate signals are generated.
This approach has many problems, however. It requires invasive surgery to implant
the electrodes, and devices left in the brain long-term tend to cause the formation of
scar tissue in the gray matter. This scar tissue ultimately blocks signals.


Regardless of the location of the electrodes, the basic mechanism is the same: The
electrodes measure minute differences in the voltage between neurons. The signal is
then amplified and filtered. In current BCI systems, it is then interpreted by a
computer program, although you might be familiar with older analogue
encephalographs, which displayed the signals via pens that automatically wrote out
the patterns on a continuous sheet of paper.
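The amplify-and-filter stage can be sketched with a numpy-only frequency-domain band-pass. The 8-30 Hz band (covering the mu/beta rhythms) is an assumption, and real systems use analog amplifiers and causal digital filters rather than an ideal FFT mask.

```python
import numpy as np

# Sketch of the amplify-and-filter stage: scale the raw microvolt-level
# signal, then zero all frequency content outside the band of interest.

def amplify_and_filter(x, fs, gain=1000.0, lo=8.0, hi=30.0):
    """Amplify x, then zero every FFT bin outside [lo, hi] Hz."""
    X = np.fft.rfft(np.asarray(x, dtype=float) * gain)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

fs = 256
t = np.arange(fs) / fs                          # one second of samples
raw = 1e-6 * (np.sin(2 * np.pi * 10 * t)        # 10 Hz brain rhythm
              + np.sin(2 * np.pi * 50 * t))     # 50 Hz mains interference
clean = amplify_and_filter(raw, fs)             # 50 Hz component removed
```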

In the case of a sensory input BCI, the function happens in reverse. A computer
converts a signal, such as one from a video camera, into the voltages necessary to
trigger neurons. The signals are sent to an implant in the proper area of the brain, and
if everything works correctly, the neurons fire and the subject receives a visual image
corresponding to what the camera sees.

Another way to measure brain activity is with magnetic resonance imaging (MRI).
An MRI machine is a massive, complicated device. It produces very high-resolution
images of brain activity, but it can't be used as part of a permanent or semipermanent
BCI. Researchers use it to get benchmarks for certain brain functions or to map where
in the brain electrodes should be placed to measure a specific function. For example,
if researchers are attempting to implant electrodes that will allow someone to control
a robotic arm with their thoughts, they might first put the subject into an MRI and ask
him or her to think about moving their actual arm. The MRI will show which area of
the brain is active during arm movement, giving them a clearer target for electrode
placement.

5.4.1 Cortical Plasticity

For years, the brain of an adult human was viewed as a static organ. When you are a
growing, learning child, your brain shapes itself and adapts to new experiences, but
eventually it settles into an unchanging state -- or so went the prevailing theory.
Beginning in the 1990s, research showed that the brain actually remains flexible even
into old age. This concept, known as cortical plasticity, means that the brain is able
to adapt in amazing ways to new circumstances. Learning something new or partaking
in novel activities forms new connections between neurons and reduces the onset of
age-related neurological problems. If an adult suffers a brain injury, other parts of the
brain are able to take over the functions of the damaged portion. Why is this important
for BCIs? It means that an adult can learn to operate with a BCI, their brain forming
new connections and adapting to this new use of neurons. In situations where implants
are used, it means that the brain can accommodate this seemingly foreign intrusion
and develop new connections that will treat the implant as a part of the natural brain.

5.5 Sensory Input

The oldest and most common application of a BCI is the cochlear implant. For the
average person, sound waves enter the ear and pass through several tiny organs that
eventually pass the vibrations on to the auditory nerves in the form of electric signals.
If the mechanism of the ear is severely damaged, that person will be unable to hear
anything. However, the auditory nerves may be functioning perfectly well. They just
aren't receiving any signals.

Figure 5.4 Dr. Peter Brunner demonstrates the brain-computer interface at a conference in Paris.

A cochlear implant bypasses the nonfunctioning part of the ear, processes the sound
waves into electric signals and passes them via electrodes right to the auditory nerves.
The result: a previously deaf person can now hear. The hearing might not be perfect,
but it allows the user to understand conversations.
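The cochlear-implant principle (sound in, per-channel stimulation levels out) can be sketched as a toy filterbank: split the sound into a few frequency bands and let each band's energy set the level of one electrode. The band edges and the energy measure are invented for illustration.

```python
import numpy as np

# Toy cochlear-implant front end: the energy in each frequency band
# sets the stimulation level of one electrode along the cochlea.

BANDS = [(100, 500), (500, 1500), (1500, 4000)]   # Hz, one electrode each

def electrode_levels(sound, fs):
    """Return per-electrode energy derived from the band-limited signal."""
    spec = np.abs(np.fft.rfft(sound)) ** 2
    freqs = np.fft.rfftfreq(len(sound), d=1.0 / fs)
    return [float(spec[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in BANDS]

fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)               # a 1 kHz test tone
levels = electrode_levels(tone, fs)
print(levels.index(max(levels)))  # 1 -> middle electrode stimulated most
```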

The processing of visual information by the brain is much more complex than that of
audio information, so artificial eye development isn't as advanced. Still, the principle
is the same. Electrodes are implanted in or near the visual cortex, the area of the brain
that processes visual information from the retinas. A pair of glasses holding small
cameras is connected to a computer and, in turn, to the implants. After a training
period similar to the one used for remote thought-controlled movement, the subject
can see. Again, the vision isn't perfect, but refinements in technology have improved
it tremendously since it was first attempted in the 1970s. Jens Naumann was the
recipient of a second-generation implant. He was completely blind, but now he can
navigate New York City's subways by himself and even drive a car around a parking
lot. In terms of science fiction becoming reality, this process gets very close. The
terminals that connect the camera glasses to the electrodes in Naumann's brain are
similar to those used to connect the VISOR (Visual Instrument and Sensory Organ)
worn by blind engineering officer Geordi La Forge in the "Star Trek: The Next
Generation" TV show and films, and they're both essentially the same technology.
However, Naumann isn't able to "see" invisible portions of the electromagnetic
spectrum.


6. BCI INNOVATORS

A few companies are pioneers in the field of BCI. Most of them are still in the
research stages, though a few products are offered commercially.
• Neural Signals is developing technology to restore speech to disabled people.
An implant in an area of the brain associated with speech (Broca's area) would
transmit signals to a computer and then to a speaker. With training, the subject
could learn to think of each of the 39 phonemes in the English language and
reconstruct speech through the computer and speaker.
• NASA has researched a similar system, although it reads electric signals from
the nerves in the mouth and throat area, rather than directly from the brain.
They succeeded in performing a Web search by mentally "typing" the term
"NASA" into Google.
• Cyberkinetics Neurotechnology Systems is marketing the BrainGate, a neural
interface system that allows disabled people to control a wheelchair, robotic
prosthesis or computer cursor.
• Japanese researchers have developed a preliminary BCI that allows the user to
control their avatar in the online world Second Life.


8. CELL-CULTURE BCIs

Researchers have also built devices to interface with neural cells and entire neural
networks in cultures outside animals. As well as furthering research on animal
implantable devices, experiments on cultured neural tissue have focused on building
problem-solving networks, constructing basic computers and manipulating robotic
devices. Research into techniques for stimulating and recording from individual
neurons grown on semiconductor chips is sometimes referred to as neuroelectronics
or neurochips.

Figure 8.1 World first: Neurochip

Development of the first working neurochip was claimed by a Caltech team
led by Jerome Pine and Michael Maher in 1997. The Caltech chip had room for 16
neurons. In 2003, a team led by Theodore Berger at the University of Southern
California started work on a neurochip designed to function as an artificial or
prosthetic hippocampus. The neurochip was designed to function in rat brains and is
intended as a prototype for the eventual development of higher-brain prostheses. The
hippocampus was chosen because it is thought to be the most ordered and structured
part of the brain and is the most studied area. Its function is to encode experiences for
storage as long-term memories elsewhere in the brain.


9. ETHICAL CONSIDERATIONS

Discussion about the ethical implications of BCIs has been relatively muted. This may
be because the research holds great promise in the fight against disability and BCI
researchers have yet to attract the attention of animal rights groups. It may also be
because BCIs are being used to acquire signals to control devices rather than the other
way round, although vision research is the exception to this.

This ethical debate is likely to intensify as BCIs become more technologically
advanced and it becomes apparent that they may not just be used therapeutically but
to enhance human function. Today's brain pacemakers, which are already used to treat
neurological conditions such as depression, could become a type of BCI and be used to
modify other behaviors. Neurochips could also develop further, for example into the
artificial hippocampus, raising issues about what it actually means to be human.

Some of the ethical considerations that BCIs would raise under these circumstances
are already being debated in relation to brain implants and the broader area of mind
control.


10. CONCLUSION

The future will bring a revolution as BCIs are constructed using nanotechnology,
making it easier to get information into and out of the brain. The challenge is to make
them reliable and durable. Prof. Stephen Hawking never dreamt of communicating
after being paralyzed, but brain-computer interfacing made it possible. This is in itself
a huge step forward. In the future, BCIs may also be used by individuals whose
injuries are less severe. The most advanced work in designing a brain-computer
interface has stemmed from the evolution of new electrodes; next-generation products
may be able to provide an individual with the ability to control devices that allow
breathing, bladder and bowel movements. The brain-computer interface system is an
investigational device: it is not approved for sale and is available only in clinical
trials. Once this fiction finds a place in the real world, and in humans, it will be a
boon for those who suffer from congenital diseases.


REFERENCES

[1]. Adams L, Hunt L, Moore M (2003) The aware system: Prototyping an
augmentative communication interface. Paper presented at the Proceedings of
the Rehabilitation Engineering Society of North America (RESNA).
[2]. Archinoetics Inc (2009) Brain Painting, from http://www.archinoetics.com/
[3]. Bayliss J, Ballard D (2000) A virtual reality testbed for brain-computer
interface research. IEEE Trans Rehabil Eng 8(2):188–190
[4]. Bell C, Shenoy P, Chalodhorn R, Rao R (2008) Control of a humanoid robot
by a noninvasive brain-computer interface in humans. J Neural Eng 5:214–220
[5]. Birbaumer N, Cohen L (2007) Brain-computer interfaces: Communication and
restoration of movement in paralysis. J Physiol 579:621–636
[6]. Birbaumer N, Hinterberger T, Kubler A, Neumann N (2003) The thought-
translation device (TTD): Neurobehavioral mechanisms and clinical outcome.
IEEE Trans Neural Syst Rehabil Eng 11(2):120–123
[7]. Blankertz B, Dornhege G, Krauledat M, Müller KR, Kunzmann V, Losch F et
al (2006) The Berlin brain-computer interface: EEG-based communication
without subject training. IEEE Trans Neural Syst Rehabil Eng 14(2):147–152
[8]. Blankertz B, Krauledat M, Dornhege G, Williamson J, Murray-Smith R,
Müller KR (2007) A note on brain-actuated spelling with the Berlin brain-
computer interface. Universal Access in HCI, Part II, 4555:759–768
[9]. BCI-info.org, BCI news and research portal.
[10]. http://computer.howstuffworks.com/brain-computer-interface.
[11]. Mind Control, Wired Magazine, March 2005, article on Matt Nagle.
[12]. 'Brain' in a dish flies flight simulator, CNN, 4 November 2004, article on cell-
cultured BCI.
[13]. Controlling robots with the mind, Scientific American, 16 September 2002,
article on Miguel Nicolelis.
[14]. HUT – LCE Cognitive Technology: Brain Computer Interface

