
Pressing The Screen Into Your Mind: The Ethical Implications of Computer-Brain Interfacing

Though there is evidence of brain surgery occurring as early as 5100 BC (Walker), and these early trepanations (as well as the modern pseudoscience) remind us that we have long been interested in plumbing the depths of the mind, our understanding of the mechanisms by which the brain operates is still relatively limited. Yes, the field of neuroscience has, to an extent, enabled us to map semantic fields in the brain, to the point where we are now able to manipulate specific neurons, but we are still completely unable, for instance, to comprehend how a child, in just a matter of months, is able to distinguish arbitrary signifiers from meaningless noise and then to communicate at a level that no other primate will ever reach. Understanding the gaps in our knowledge is crucial to how we perceive the advancing research into the technology most commonly known as the Computer-Brain Interface (CBI). This technology has certainly shown us much about the brain that we did not know before, but its primary purpose is not research but prosthesis. Invasive experiments using animals as human proxies can only lead us so far, and research on humans is often, if not always, fraught with ethical issues and has in the past led to disastrous consequences. The invention of electroencephalography (EEG) by Hans Berger in 1924 was a milestone that allowed non-invasive interfaces to be set up, spawning the concepts of Neurofeedback and Neurogaming and making interfacing more commercially viable. Partly invasive CBIs that sit between the scalp and the brain are a promising compromise between inexpensive EEGs with poor signal quality and the invasive, expensive CBIs with a high risk of side effects. Although CBIs are often distinguished from neuroprosthetics, which typically convert sensory data for the brain via peripheral nerves (such as cochlear and retinal implants), we are here concerned with any electronic device which directly connects the cognitive apparatus to a digital system that either receives data from or delivers data to an external analogue source, such as text or images on a screen, sound from a speaker or even another brain. These devices are mostly implemented therapeutically, but have more recently turned towards commercialisation through enhancement, so that there is already an iPhone-compatible headset, the PLX XWave, which gives the user a basic form of one-dimensional mind control - though clearly not one "limited only to the power of your imagination" as its advertising claims (Van Hemert). Briefly touching upon these two areas of use, we will consider what is and what is not currently possible, and the ethical implications of both the present and the future.

Therapeutic Uses: Restoring Functionality


From Parkinson's disease to "locked-in" syndrome, surgical procedures have often been implemented in an attempt to bring patients towards a normal level of motor functioning by implanting an electronic device such as a brain pacemaker to correct brainwave patterns or monitor their changes. There are numerous possible side effects to these procedures, such as obesity and depression, and obviously brain surgery itself is never entirely safe, especially since patients may have to undergo adjustment through a second surgery if the implant begins to cause pain. Although treatments such as deep brain stimulation (used to treat Parkinson's disease) seem to be effective, it is still not entirely clear exactly how their underlying mechanisms function (Hammond et al. 2111). Although in many cases patients with Parkinson's disease regained some motor control, it is telling that even patients who were able to go to work before undergoing the procedure were rendered incapable of doing so post-op. The implant often changes patients' psychosocial perceptions of themselves, making them feel like "electronic dolls" (Clausen).
The first CBI spelling device was created in 1996 by Niels Birbaumer and his team, allowing a locked-in patient to communicate ("A Spelling Device" 297). However, as of today there is still no reliable CBI for 'total' locked-in patients, because it is very difficult to establish a feedback loop when there is no residual muscle movement at all; such devices rely on conditioning the movement of a cursor against muscles in the eye or eyebrow. Despite this, in the last 17 years progress has been made towards creating a system that will enable these patients to communicate, and its results are at least partially encouraging (Birbaumer et al., "Brain communication in the locked-in state"). This is important, as the will to live for such patients is often predicated on being able to communicate; there is no other physical way for them to interact.
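By way of illustration only, the selection scheme such spellers build on can be reduced to a very small sketch: the alphabet is repeatedly split in half, and at each step a single thresholded, brain-derived control signal picks one half until a single letter remains. The threshold, the `read_control_signal` stub and the halving strategy below are assumptions made for the example, not details of Birbaumer's actual device.

```python
import random

ALPHABET = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ ")


def read_control_signal():
    """Stand-in for a processed brain signal in the range [-1, 1].

    In a real speller this value would be derived from a conditioned feature
    of the recording; here it is simply simulated.
    """
    return random.uniform(-1.0, 1.0)


def select_letter(threshold=0.0):
    """Narrow the alphabet down to one letter through repeated binary choices."""
    candidates = ALPHABET[:]
    while len(candidates) > 1:
        half = len(candidates) // 2
        first, second = candidates[:half], candidates[half:]
        # A signal above the threshold selects the first half, otherwise the second.
        candidates = first if read_control_signal() > threshold else second
    return candidates[0]


if __name__ == "__main__":
    # Spell five letters (randomly driven in this simulation).
    print("".join(select_letter() for _ in range(5)))
```

Slow as such a scheme is, taking several binary decisions to reach each letter, it is exactly this dependence on a reliable yes/no signal that makes the 'total' locked-in state, where no such signal can be conditioned, so difficult to serve.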
There has been much research into restoring motor functions for tetraplegics. Miguel Nicolelis and his team at Duke University have carried out invasive experiments on monkeys that have successfully given them bimanual control of robotic arms by thought and, more recently, allowed them to control a walking robot on the other side of the Pacific ("A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys"; "Interview by Christine Soares"). Although his eventual aim is to build a body suit that paraplegics can walk in using their thoughts, the most problematic aspect is that the imagined movements have to be mapped to the monkeys' actual movements before the brain-computer device can work - not something a person with paralysis could do, unless their condition was slow in coming on. In 2005, Matt Nagle became the first tetraplegic to move a prosthetic arm, and since then another patient has received an implant offering seven degrees of freedom (Collinger et al. 557). However, patients' movement signals must "persist in cortex after spinal cord injury and be engaged by movement intent when sensory inputs and limb movement are long absent" (Hochberg et al. 164). There has also been a successful attempt to use a non-invasive EEG to bypass spinal cord injuries and restore muscle movements in the hand by imagining movements in the foot (the output device in this case is a functional electrical stimulator, or FES; see Pfurtscheller et al. 33). There seems to be great promise for partially paralysed people, because we can use computers to map parts of their brain as they think about moving their muscles, and even the slightest residual movement can be enough to move a cursor; without the ability to form these maps there is little hope for communication. There is also the consideration of how difficult it can be to gain consent from such closed-off patients, and the margin of error always means that communication might easily be mistranslated. There is very interesting work on decoding patterns of pre-speech which has rendered such communication plausible, if not yet possible. Morse code was first transmitted through an EEG in the 1960s, but little progress had been made on speech until last year, when a crucial study carried out by Gerwin Schalk's team revealed that "it is possible to use signals recorded from the surface of the brain", using electrocorticography (ECoG), a partly invasive CBI, to "discriminate the vowels and consonants embedded in spoken and in imagined words" (1088).
Commercial Prospects: Enhancing Humanity
The first documented CBI prototype used an EEG to warn pilots during the Second World War when they were in danger of altitude sickness (Clausen). Although there is potential for using these devices therapeutically to treat anything from psychopathy to ADHD and sleeping problems in children, they have quickly bled over into commercialisation, and there are already several consoles to which programmers can attach themselves, such as Emotiv and NeuroSky. Based on Pavlovian conditioning, these technologies map out brain functions, and after a certain amount of training gamers are able to think about movement and have it recreated on the screen. Though at the moment this is limited to only a few dimensions of movement, some companies have built in other features that claim to adjust music and colour tone to the player's current mood (Marshall et al. 82). EEGs that have been adapted for Neurogaming use a variety of strategies, each with different drawbacks, but all require a certain level of silence to function, and a massive difficulty is that as a game is played players' cognitive states change or adapt, complicating sustained mapping for gameplay (ibid. 83). Schalk has tested gameplay with an implanted ECoG, using a patient fitted for epilepsy treatment, who was able to play Space Invaders with minimal learning time (Kennedy). If such implants become more commercially viable, and socially accepted, in the future, this may be more promising for gamers.


Neurofeedback is quickly taking off as a different tool for enhancement in the fields of sports and music performance. Like gaming, it is based on conditioning towards recorded brain activities and also (when coupled with biofeedback) records supplementary measurements such as heart rate and blood pressure. Marketed to everyone from Olympic gymnasts and golfers to orchestras, these products are aimed at helping users "develop the skills to reach the optimal mental and physical state to perform at their best" (Starkman). Though most users seem to feel that it is working, for health purposes such as ADHD there is still, in the scientific community, "insufficient evidence to support a conclusion concerning the health outcomes or benefits associated with this procedure" (Capital Blue Cross 3), and, as Clausen points out, it has been that way for more than a decade. There are many ethical implications around these technologies, which, used in a particular way, might constitute doping in sporting terms. If we consider stage fright (as a part of presence) to be an integral component of musical performance, then we could also discuss the ethical implications of using technology to rid ourselves of such a trivial problem. A similar system has been used to allow a partly paralysed patient to play a very rudimentary musical instrument (Miranda et al.), and there are marketed instruments that operate on EEGs, even software such as MindMIDI which plugs your mind straight into digital audio workstations.
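The conditioning loop that these neurofeedback products are built around can be stated very simply: estimate a band-limited measure of the ongoing signal, compare it with a baseline, and feed the comparison back to the user as a reward. The sketch below is a minimal, simulated illustration of that loop; the 8-12 Hz 'alpha' band, the sampling rate and the reward rule are generic assumptions rather than the parameters of any marketed product.

```python
import numpy as np

FS = 250             # sampling rate in Hz (assumed)
BAND = (8.0, 12.0)   # 'alpha' band in Hz, a common neurofeedback target


def band_power(window, fs=FS, band=BAND):
    """Mean spectral power of `window` inside `band`, via a simple FFT estimate."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window)))) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()


def feedback_session(signal, baseline, window_sec=1.0):
    """Yield a reward flag for each window whose band power exceeds the baseline."""
    step = int(window_sec * FS)
    for start in range(0, len(signal) - step, step):
        # In a real system this comparison would drive a tone, bar or game element.
        yield band_power(signal[start:start + step]) > baseline


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(0, 20 * FS) / FS
    # Simulated EEG: noise plus a 10 Hz rhythm that strengthens over the session.
    eeg = rng.normal(0, 1, t.size) + (t / t.max()) * np.sin(2 * np.pi * 10 * t)
    baseline = band_power(eeg[:2 * FS])   # first two seconds taken as baseline
    rewards = list(feedback_session(eeg, baseline))
    print(f"rewarded {sum(rewards)} of {len(rewards)} windows")
```

Coupling this with heart rate or blood pressure, as the biofeedback variants do, simply adds further channels to the same compare-and-reward loop; nothing in the loop itself tells us whether the rewarded state has any of the benefits the marketing promises, which is precisely the evidential gap the Capital Blue Cross policy describes.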
In 2008 the first voiceless telephone call was demonstrated on stage using Ambient's non-invasive Audeo technology, which senses activity in the laryngeal muscles that control speech (Simonite). At the time it was only possible to use 150 words, but the system can now be trained to recognise different phonemes. Despite the therapeutic implications of this technology for those who cannot speak or communicate, it seems to be applied mainly commercially. The effort required is apparently well beyond the level of simply thinking about speaking, and so the ethical questions around other people being able to hear what you are thinking seem, for now, to be avoided.
Further Research and Ethical Conundrums
In 1999 the first replication of vision from a cat's brain to a computer screen was achieved through invasive experimentation (Dan et al. 8036), and in 2008, using a non-invasive functional magnetic resonance imaging (fMRI) device, this process was demonstrated to be possible in humans (Miyawaki 195). This research certainly has limits, as "it is impractical to specify brain activity for all possible images", making constraint-free visual image reconstruction much more challenging, but it is the furthest we have come towards reconstructing human perception directly from the brain. The opposite process, giving sight to the blind, was tried invasively as long ago as 1978, using cameras to capture images, but ended tragically with the death of William Dobelle before his work was documented and the painful recession of his patients' partly restored vision (Balogh). It is plausible that this Frankensteinian catastrophe has made this particular area of research taboo in scientific communities, as there has been little further work done towards transferring the image directly to the brain; but the limited success of the project has shown that such a system has potential, and doubtless its results, negative and positive, won't be forgotten. In 2011 the first cortical implant was approved for clinical trials, but such implants do not treat blindness so much as stimulate muscles which are degenerating from the diseases that cause blindness.
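The reconstruction work cited above rests, broadly, on learning a mapping from measured brain activity back to small image elements and then recombining those elements. As a loose, simplified illustration of that idea (and emphatically not the multiscale decoder of Miyawaki et al.), the sketch below fits a ridge-regression decoder from simulated 'voxel' activity to the pixels of tiny binary images; every number in it is invented.

```python
import numpy as np


def fit_ridge_decoder(activity, images, alpha=1.0):
    """Learn weights W so that activity @ W approximates the flattened images."""
    n_features = activity.shape[1]
    gram = activity.T @ activity + alpha * np.eye(n_features)
    return np.linalg.solve(gram, activity.T @ images)


def reconstruct(weights, activity, shape):
    """Decode activity back into images and binarise the result."""
    decoded = activity @ weights
    return (decoded > 0.5).astype(int).reshape(-1, *shape)


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    shape = (4, 4)                                     # toy 4x4 binary "images"
    images = rng.integers(0, 2, size=(200, shape[0] * shape[1]))
    # Simulated voxel responses: a noisy linear mixture of the pixels.
    mixing = rng.normal(size=(shape[0] * shape[1], 50))
    activity = images @ mixing + rng.normal(0, 0.1, size=(200, 50))

    weights = fit_ridge_decoder(activity[:150], images[:150])
    recon = reconstruct(weights, activity[150:], shape)
    accuracy = (recon.reshape(50, -1) == images[150:]).mean()
    print(f"pixel accuracy on held-out images: {accuracy:.2f}")
```

The limitation quoted above is visible even in this toy: the decoder only works for the kind of images its training set spans, which is why constraint-free reconstruction of arbitrary perception remains far off.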
Therapies which seek to rectify disabilities such as blindness, deafness and tetraplegia have massive practical value for patients, despite the possibility of side effects in invasive circumstances. However, with such practices of correction come the homogenising discourses that disseminate the idea that the abnormal should seek technology in order to become normal, rather than society becoming more adaptable to the multiplicity of different bodies and lifestyles that exist within it. This particular hegemony is what Anita Silvers termed the 'tyranny of the normal'. These arguments are similar to those levelled against genetic screening for conditions at birth. The possibility of completely restoring sight, for example, like testing children for genetically inherited blindness, might sound like a rallying call for genocide to the ears of the blind community. Strangely, at present, the degrees of freedom given by even the most invasive techniques are so limited that instead of recreating the normal we are turning the abnormal into the uncanny.
Time will improve techniques and also change perceptions of these devices, but at present CBIs are anything but normal. The simple fact that commercialisation of devices has already begun, despite many of them being more or less undocumented scientifically, is proof that they will develop, diversify and integrate themselves with our species. If we see technology from the point of view of Kevin Kelly, as the seventh kingdom of life following evolutionary trends towards integration and specialisation, then by having the mind ontically integrated into the circuit of information we might consider the computer-brain complex to be evolving in a manner analogous to the step towards multicellular life. At present we are able to achieve only limited ontic integration with the screen, and there is still the possibility, if anyone gives more study to the limited successes Dobelle achieved, of bringing the computer screen visually into the mind. In an experiment on rats, Nicolelis' team managed to stimulate the visual cortex of one rat by sending signals through the internet from the brain of another rat that was given a visual stimulus (Nicolelis, Pais-Vieira, et al.). By doing this the rats learn to solve tasks together and have essentially formed the technological equivalent of a hive mind, or, as Nicolelis describes it, an "organic computer" (Sample). Despite the fact that morphic resonance is almost certainly a pseudoscience, the very idea, like the idea of enhancement through trepanation, has leaked into the human consciousness, and now we are on the brink of being capable of creating a collective resonance through machines. This is not what this technology is intended to produce, but Nicolelis admits he believes there will be "human brain nets" in the future, if probably not in his own lifetime (Costandi).
Unlike Kelly, I do not see the development of technology as being, for the most part, unambiguously beneficial. Until artificial general intelligence is achieved, assuming that it can be, technology has no choice in how it is used and is therefore neutral. Since 2008 the US has been working on synthetic telepathy for combat situations (Bland), and according to Weiss a report released earlier that year by the NRC highlighted some dangerous thought processes going into some of the funding for these innovations: "how can we disrupt the enemy's motivation to fight... how can we make people trust us more... what if we could help the brain remove pain and fear... is there a way to make the enemy obey our commands?". Conditioning against fear and pain is already plausible, if not tested, although the other questions, at least where CBIs are concerned, are in the realm of science fiction. Unless, of course, these 'enemies' were 'terrorists', or anyone whom the government deems to be a subversive force against national interests, in a future where these machines have been integrated into our culture, connected by some 'matrix' of CBIs.
This leads us to the nightmare dystopia of one of our possible futures, which I would like to christen the trepanopticon. In this future, further research builds on Dobelle's and Schalk's findings, making it possible to communicate sound and images in and out of the brain - essentially the direct neural connection to the matrix of Neuromancer. Whilst the screen closes in, the US and other countries continue to invest heavily in dominating the military applications of CBIs. With the revelations which broke last summer over the extent to which the NSA already collects and stores ambient data, it should be clear that agencies may attempt to use this technological shift for their own ends, whether to monitor, police or control. If the panopticon is the ideal architectural structure for the modern distribution of power, then the trepanopticon is the ideal technological enhancement for the post-modern, post-human distribution of power. Mike D'Zmura, one of the researchers employed by the US army, insists that people will "learn to think in a way the computer couldn't interpret" or that "they can just switch it off" (Bland), but what if the opposite is true? What if, like the introduction of cell phones little more than a decade ago, the CBI becomes a portable device that we leave on everywhere, or even have voluntarily implanted into the skull? What if this changes the way we think, so that we inhibit our own dissenting or creative ideas? Anders Sandberg speaks closer to the mark when he points out that what we have in our head is just a draft and that most of the time "we'd be very thankful not to be in someone else's head" (Sample).
As with many medical-technological issues, it is very probably the pace at which different areas of the field progress that will ultimately decide whether organic computer networks are formed for the benefit or the confinement of humanity. If more study is not done into how this shift in interfacing actually affects brain function, then we may be changing our brains in dangerous ways of which we are unaware. That said, depending on what actually does become possible, the formation of a trepanopticon can be prevented if tight international regulation to keep these technologies neutral is put in place and adhered to - essentially, if we learn from the ethical problems of leaving government agencies with the power to oversee our actions.
Don't talk politics and don't throw stones
Your Royal Highness says
Well of course I'd like to sit around and chat
Well of course I'd like to stay and chew the fat
...
But someone's listening in.

Life in a Glasshouse - Radiohead

Works Cited:
Balogh, Meghan. "Man's High Tech Paradise Lost." The Whig, The Whig Mag., 28 November 2012. Web. 11 April 2014.
Birbaumer, Niels, D. De Massari, et al. "Brain communication in the locked-in state." Oxford Journals: Brain 136.6 (2013): 1989-2000. Web. 11 April 2014.
Birbaumer, Niels, N. Ghanayim, et al. "A spelling device for the paralysed." Nature: Scientific Correspondence 398.1 (1999): 297-98. Web. 11 April 2014.
Bland, Eric. "Army Developing 'Synthetic Telepathy'." Science, NBC, 13 October 2008. Web. 11 April 2014.
Clausen, Jens. "Lecture on Brain Computer Interfaces for Enhancement Purposes." University of Tübingen. Gartenstr., Tübingen. 10 April 2010. Vimeo. Web. 11 April 2014.
Capital Blue Cross. "Biofeedback and Neurofeedback Policy." Capital Blue Cross Medical Policy MP-2.064. Capital Blue Cross Ltd., 28 January 2014: 1-65. Web. 11 April 2014.
Collinger, J. L., et al. "High-performance neuroprosthetic control by an individual with tetraplegia." The Lancet (2013): 557-564. Web. 11 April 2014.
Costandi, Mo. "Brain-to-brain interface transmits information from one rat to another." Neurophilosophy. Guardian, 28 February 2013. n. pag. Web. 11 April 2014.
Dan, Yang, G. B. Stanley, F. F. Li. "Reconstruction of Natural Scenes from Ensemble Responses in the Lateral Geniculate Nucleus." The Journal of Neuroscience 19.18 (1999): 8036-8042. Web. 11 April 2014.
Hammond, C., R. Ammari, B. Bioulac, and L. Garcia. "Latest view on the mechanism of action of deep brain stimulation." Movement Disorders 23.15 (2008): 2111-21. Web. 11 April 2014.
Hochberg, Leigh R., et al. "Neuronal ensemble control of prosthetic devices by a human with tetraplegia." Nature 442 (2006): 164-171. Web. 11 April 2014.
Kelly, Kevin. What Technology Wants. New York: Penguin, 2010. Print.
Kennedy, Pagan. "The Cyborg in Us All." New York Times, New York Times Mag., n. pag. 14 September 2011. Web. 11 April 2014.
Marshall, D., S. Wilson, et al. "Games, Gameplay, and BCI: The State of the Art." IEEE Transactions on Computational Intelligence and AI in Games 5.2 (2013): 82-99. Web. 11 April 2014.
Miyawaki, Yoichi, Hajime Uchida. "Visual Image Reconstruction from Human Brain Activity using a Combination of Multiscale Local Image Decoders." Neuron 60.5 (2008): 915-929. Web. 11 April 2014.
Miranda, Eduardo, W. L. Magee, et al. "Computer Brain Music Interface." Music and Medicine 3.3 (2011): 134-140. Web. 11 April 2014.
Nicolelis, Miguel. "Lecture: A monkey that controls a robot with its
thoughts. No, really." TED talks. TED, April 2012. Web. 11 April 2014.
Nicolelis, Miguel, Miguel Pais-Vieira, et al. "A Brain-to-Brain Interface for Real-Time Sharing of Sensorimotor Information." Scientific Reports 3 (2013): n. pag. Web. 11 April 2014.
Nicolelis, Miguel, P. J. Ifft, et al. "A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys." Science Translational Medicine 5 (2013): 1-13. Web. 11 April 2014.
Nicolelis, Miguel. "Interview by Christine Soares." Scientific American, Scientific American Mag., n. pag. 16 January 2008. Web. 11 April 2014.
Pfurtscheller, Gert, G. R. Müller, et al. "Thought control of functional electrical stimulation to restore hand grasp in a patient with tetraplegia." Neuroscience Letters 351.1 (2003): 33-36. Web. 11 April 2014.
Sample, Ian. "Brain-to-brain interface lets rats share information via internet." Guardian, 1 March 2013. n. pag. Web. 11 April 2014.
Schalk, Gerwin, X. Pei, et al. "Decoding vowels and consonants in
spoken and imagined words using electrocorticographic signals in humans."
Journal of Neural Engineering 8.4 (2011): 1088-1741. Web. 11 April 2014.
Silvers, Anita. "Defective Agents: Equality, Difference and the Tyranny of the Normal." Journal of Social Philosophy 25.1 (1994): 154-175. Web. 11 April 2014.
Simonite, Tom. "Nerve-tapping neckband used in 'telepathic' chat." New Scientist, New Scientist Mag., n. pag. 12 March 2008. Web. 11 April 2014.
Starkman, Randy. "Wired for Success." The Star, The Star mag., n. pag.
8 October 2009. Web. 11 April 2014.
Walker, Amélie. "Neolithic Surgery." Archaeology 50.5 (1997): n. pag. Web. 11 April 2014.
Weiss, Rick. "Mining Mental Minefields: How to Stockpile the Neuropharmaceutical Arsenal." Science Progress, Science Progress Mag., n. pag. 15 August 2008. Web. 11 April 2014.

Van Hemert, Kyle. "XWave Headset Lets You Control iPhone Apps With
Your BRAIN." Gizmodo, Gizmodo Mag., n. pag. 7 September 2010. Web. 11
April 2014.
