Abstract-Brain-computer interfaces are attracting much attention in the field of pervasive computing, as they make it possible to directly transfer one subject's decision to another. In this paper, we develop a motor brain-computer-brain system which allows a human mind to drive a rat robot in a navigation environment. Neural signals from an EEG headset are transmitted wirelessly to a computer, and the decoded neural signals are converted to steer a rat robot navigating in a sand table. Our work breaks new ground as a brain-computer-brain interface for pervasive computing, and demonstrates the possibility of exchanging messages in dyads or networks of brains.

Index Terms-ratbot; brain-computer interface; brain to brain; brain network

I. INTRODUCTION

Pervasive computing has been an emerging field since Mark Weiser's pioneering article [1], in which computing is made to appear everywhere. Various technologies have been developed to support pervasive computing, mainly based on traditional mechanical sensors and actuators. However, bio-robots are superior to traditional mechanical robots in many aspects, such as mobility, perceptivity and adaptability [2]. Take the rat robot (ratbot) for example: rats adapt flexibly to different environments, and their acute senses, including smell and hearing, can be exploited by recording and interpreting brain signals through BCI techniques [3]. Kuznetsov et al. have expanded the current landscape of sensing in pervasive computing to include living organisms, such as plants and animals, alongside traditional tools and digital devices [4], cyborg intelligence [5] and bio-robots like the Roboroach [6].

A brain-computer interface provides a direct communication pathway between a brain and an external device, and promising research results have shown its great potential for pervasive computing. DiGiovanna et al. have developed a co-adaptive brain-machine interface in which rats and a neural prosthetic adapt to each other [7], and several groups have tried to build a direct network between two brains to complete simple tasks [8], [9], [10]. Neural signals are everywhere, just like mobile phones, and brain-computer interfaces can complement pervasive computing by directly interpreting human intention, emotion and physiological status. Recent years have seen more and more applications of BCI technology in pervasive computing [11], [12], [13], [14], [15].

In this paper, we develop a motor brain-computer-brain system. A ratbot with a micro camera mounted on its back is placed in a sand table to execute search missions; a human wearing a wireless EEG headset watches video of the experiment from the camera and thinks about where to go. Brain signals are transmitted wirelessly to a computer, and the decoded brain signals are converted to stimulate the brain of the ratbot to turn left, turn right or go forward.

II. SYSTEM DESIGN

The framework of our brain-computer-brain system is illustrated in Fig. 1. A ratbot executes search missions in a sand table. Video of what the ratbot is looking at, captured by the camera mounted on its back, is sent to a computer through a 1.2 GHz wireless network. A human wearing a wireless Emotiv EEG neuroheadset watches the video playing on the screen of the computer and thinks about where to go. Electrical brain activity from Brain 1 (the Sender) is recorded by the Emotiv EEG neuroheadset and sent to the computer via Bluetooth. The computer evaluates real-time brainwave activity to discern the intent of the user (Left, Right, or Forward), and sends an order to stimulate Brain 2 (the Receiver) via Bluetooth. The well-trained ratbot then moves according to the given order. Thus a closed-loop brain network runs until the search missions are completed.

Fig. 1. System framework

A. Computer to Brain

The first ratbot was developed in 2002 [16]. A ratbot system refers to a rat equipped with a wireless backpack stimulator, through which external electric stimuli are delivered directly into the rat's brain to direct its behaviors [17]. Researchers have realized three behavioral control commands, Forward, turning Left and turning Right, to steer a ratbot along a complicated route. Stimulation of the medial forebrain bundle (MFB) has been used as a reward to the rat, and stimulation of the somatosensory cortices of the left and right half brains has been used as steering cues for controlling rat navigation. After navigation training, a ratbot learns to walk according to the stimulations [18], [19]. A full review of ratbots is beyond the scope of this paper; interested readers can refer to the articles cited above. A well-trained ratbot named F05 is used in our experiment.
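As a concrete illustration, the decoded intent could be forwarded to the backpack stimulator as a small command packet over the Bluetooth serial link. The packet layout, pulse count, serial port and channel codes below are assumptions for illustration only, not details from the paper.

```python
# Hypothetical sketch of forwarding a decoded intent to the ratbot's
# backpack stimulator: Left/Right map to left/right somatosensory-cortex
# cue channels and Forward to an MFB reward pulse, as described above.
# The byte layout and channel codes are invented for this example.
CHANNEL = {"LEFT": 0x01, "RIGHT": 0x02, "FORWARD": 0x03}

def encode_order(intent, pulses=8):
    """Pack one stimulation order as [header, channel, pulse count, checksum]."""
    ch = CHANNEL[intent]
    payload = [0xAA, ch, pulses]          # 0xAA: assumed start-of-frame byte
    return bytes(payload + [sum(payload) & 0xFF])  # simple additive checksum

# The packet might then be sent over the Bluetooth serial link, e.g. with
# pyserial (the port name is also an assumption):
#   import serial
#   link = serial.Serial("/dev/rfcomm0", 9600)
#   link.write(encode_order("LEFT"))
```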
B. Brain to Computer

In order to navigate a ratbot with three different kinds of stimulus (Left, Right, and Forward), there should be at least three relatively steady brain states a human can generate. The wireless neuroheadset we use is cheap and convenient to wear, but the EEG signals it records carry a significant amount of noise and require more sophisticated signal processing and machine learning techniques to classify neural events [11]. As a pilot study, and taking the difficulty of controlling a live ratbot into account, our EEG-controlled navigation paradigm uses the blink potential to give the Forward order and motor-imagery-related sensorimotor rhythms to give the Left and Right orders (see Fig. 2). In the following subsections, we discuss the realization of our real-time online motor brain-to-computer interface.

Motor imagery EEG is measured from channels near the SMA: F3, F4, F7, F8, FC5 and FC6 [21]. The raw EEG data is filtered by an 8-12 Hz band-pass elliptic filter. To improve the SNR, we use the common spatial pattern (CSP) method, which transforms multichannel filtered EEG data into a lower-dimensional subspace by calculating a transform matrix called a spatial filter [22]. We then use the spatial filter to reduce the data dimension. We also take 20 neutral tests to calculate a baseline condition for the dimension-reduced data. Task-related motor imagery is detected by applying a threshold of five times the standard deviation of the baseline condition.

3) Online Brain to Computer Interface: To evaluate our online brain-to-computer interface, subjects are told to wink or to imagine left or right movement after screen cues. The output accuracy of three subjects over 100 tests is shown in Table I. The subject with the best online classification result proceeds to control the ratbot (F05) in our experiment.
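The motor-imagery decoding steps above (8-12 Hz elliptic band-pass, CSP spatial filtering, and thresholding at five times the baseline standard deviation) can be sketched as follows. This is a minimal illustration, not the authors' code: the sampling rate, filter order, number of CSP components and log-variance feature are assumptions, since the paper does not specify them.

```python
# Sketch of the paper's motor-imagery pipeline: band-pass filter,
# CSP spatial filter, and a 5-sigma threshold against a neutral baseline.
import numpy as np
from scipy.signal import ellip, filtfilt
from scipy.linalg import eigh

FS = 128  # sampling rate in Hz (an assumption)

def bandpass_8_12(eeg):
    """Band-pass each channel to the 8-12 Hz band.
    eeg: (n_channels, n_samples) array."""
    b, a = ellip(4, 0.5, 40, [8 / (FS / 2), 12 / (FS / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)

def csp_filter(trials_left, trials_right, n_components=2):
    """Compute a CSP spatial filter from two classes of filtered trials.
    trials_*: lists of (n_channels, n_samples) arrays."""
    def mean_cov(trials):
        covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
        return np.mean(covs, axis=0)
    c1, c2 = mean_cov(trials_left), mean_cov(trials_right)
    # Generalized eigendecomposition of (C1, C1 + C2); the extreme
    # eigenvectors maximize variance for one class against the other.
    vals, vecs = eigh(c1, c1 + c2)
    order = np.argsort(vals)
    picks = np.r_[order[:n_components // 2], order[-(n_components // 2):]]
    return vecs[:, picks].T  # spatial filter, (n_components, n_channels)

def detect(trial, W, baseline_mean, baseline_std, k=5.0):
    """Flag task-related motor imagery when the log-variance CSP feature
    deviates more than k baseline standard deviations."""
    feat = np.log(np.var(W @ bandpass_8_12(trial), axis=1))
    return bool(np.any(np.abs(feat - baseline_mean) > k * baseline_std))
```

In use, `baseline_mean` and `baseline_std` would be estimated from the 20 neutral tests mentioned above, and `detect` would run on sliding windows of the six-channel recording.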
Fig. 3. Unfiltered AF4 signal over 15 s, where we set the wink threshold larger than four times the standard deviation of the non-wink condition (about 150 μV).

Fig. 4. The black line represents screen cues, and the blue line represents the outputs of the control signals. The green and red lines represent the processed EEG signals of imagining left and right movement.
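The wink detector implied by Fig. 3 amounts to a simple amplitude threshold on the unfiltered AF4 channel. A minimal sketch follows; the windowing and mean-removal details are assumptions, only the four-times-baseline-deviation rule comes from the figure caption.

```python
# Sketch of the blink-potential (Forward) detector from Fig. 3: a wink is
# flagged when the AF4 amplitude exceeds four times the standard deviation
# of a non-wink baseline recording (about 150 uV in our data).
import numpy as np

def wink_threshold(baseline_af4, k=4.0):
    """Threshold from a non-wink baseline segment of the AF4 channel."""
    return k * np.std(baseline_af4)

def is_wink(af4_window, threshold):
    """True if the peak deviation in this window crosses the threshold."""
    return bool(np.max(np.abs(af4_window - np.mean(af4_window))) > threshold)
```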
2014 IEEE International Conference on Pervasive Computing and Communications Work in Progress
III. EXPERIMENTS AND RESULTS

Fig. 5. A landmark in the video from the ratbot's camera.

TABLE II. Success rate (SR) and time consumption of the two missions. Max, Min and Mean are the maximal, minimal and average time consumption.

                 Mission 1                        Mission 2
         SR    Max (s)  Min (s)  Mean (s)   SR    Max (s)  Min (s)  Mean (s)
  Hand   80%   233      145      179        80%   198      121      172
  Brain  70%   312      147      203        60%   376      271      305