
2014 IEEE International Conference on Pervasive Computing and Communications Work in Progress

Mind-controlled Ratbot: A Brain-to-brain System

Yipeng Yu, Cunle Qian, Zhaohui Wu, Gang Pan*


Department of Computer Science
Zhejiang University, China
Email: {yuyipeng, gpan}@zju.edu.cn

Abstract-Brain-computer interfaces are attracting much attention in the field of pervasive computing, because they make it possible to transfer one subject's decision directly to another. In this paper, we develop a motor brain-computer-brain system that allows a human mind to drive a rat robot in a navigation environment. Neural signals from an EEG headset are transmitted wirelessly to a computer, and the decoded neural signals are converted into commands that steer a rat robot through a sand table. Our work breaks new ground as a brain-computer-brain interface for pervasive computing, and demonstrates the possibility of exchanging messages in dyads or networks of brains.

Index Terms-ratbot; brain-computer interface; brain to brain; brain network

I. INTRODUCTION

Pervasive computing has become an emerging field since Mark Weiser's pioneering article [1], in which computing is made to appear everywhere. Various technologies have been developed to support pervasive computing, mainly based on traditional mechanical sensors and actuators.
However, bio-robots are superior to traditional mechanical robots in many aspects, such as mobility, perceptivity, and adaptability [2]. Take the rat robot (ratbot) as an example: rats show great adaptation and flexibility in different environments, and their excellent sensory capabilities, including smell and hearing, can be exploited by recording and interpreting brain signals through BCI techniques [3]. Kuznetsov et al. have expanded the landscape of sensing in pervasive computing to include living organisms, such as plants and animals, alongside traditional tools and digital devices [4]; related directions include cyborg intelligence [5] and bio-robots such as the RoboRoach [6].
Fig. 1. System framework
Brain-computer interfaces provide a direct communication pathway between a brain and an external device, and promising research results have shown their great potential for pervasive computing. DiGiovanna et al. have developed a co-adaptive brain-machine interface in which rats and a neural prosthesis adapt to each other [7], and several groups have tried to build a direct network between two brains to complete simple tasks [8], [9], [10]. Neural signals are as ubiquitous as mobile phones, and brain-computer interfaces can be a valuable complement to pervasive computing by directly interpreting a human's intention, emotion, and physiological status. Recent years have seen more and more applications of BCI technology in pervasive computing [11], [12], [13], [14], [15].

In this paper, we develop a motor brain-computer-brain system. A ratbot with a micro camera mounted on its back is placed in a sand table to execute search missions; a human wearing a wireless EEG headset watches video of the experiment from the camera and thinks about where to go. The brain signals are transmitted wirelessly to a computer, and the decoded brain signals are converted into stimulation that makes the ratbot turn left, turn right, or go forward.

II. SYSTEM DESIGN

The framework of our brain-computer-brain system is illustrated in Fig. 1. A ratbot executes search missions in a sand table. Video of what the ratbot sees, captured by a camera mounted on its back, is sent to a computer over a 1.2 GHz wireless link. A human wearing a wireless Emotiv EEG neuroheadset watches the video on the computer screen and thinks about where the ratbot should go. Electrical brain activity from Brain 1 (the Sender) is recorded by the Emotiv headset and sent to the computer via Bluetooth. The computer evaluates the real-time brainwave activity to discern the user's intent (Left, Right, or Forward) and sends the corresponding order to stimulate Brain 2 (the Receiver) via Bluetooth. The well-trained ratbot then moves according to the given order. This closed loop, a small brain network, runs until the search missions are completed.
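This closed loop is straightforward to express as code. The C++ sketch below only illustrates the control flow; VideoLink, EEGDecoder, and Stimulator are hypothetical stand-ins for the camera link, the EEG decoder, and the backpack stimulator, whose actual interfaces are not specified in this paper.

```cpp
#include <optional>

// Hypothetical stand-ins for the real camera link, EEG decoder,
// and backpack stimulator of Fig. 1 (stubbed so the sketch compiles).
enum class Intent { Left, Right, Forward };

struct VideoLink {
    bool missionCompleted() { return false; }  // end-of-mission check
    void showFrame() {}                        // display the ratbot's view
};
struct EEGDecoder {
    std::optional<Intent> decode() { return std::nullopt; }  // Brain 1 -> intent
};
struct Stimulator {
    void stimulate(Intent) {}                  // intent -> cue for Brain 2
};

// One pass of the brain network: show the ratbot's view to the human,
// decode the sender's intent, and forward it as a stimulation order.
void runMission(VideoLink& video, EEGDecoder& eeg, Stimulator& stim) {
    while (!video.missionCompleted()) {
        video.showFrame();
        if (auto order = eeg.decode())
            stim.stimulate(*order);
    }
}
```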
A. Computer to Brain

The first ratbot was developed in 2002 [16]. A ratbot system refers to a rat equipped with a wireless backpack stimulator, through which external electric stimuli are delivered directly into the rat's brain to direct its behavior [17]. Researchers have realized three behavioral control commands, Forward, turn Left, and turn Right, to steer a ratbot along a complicated route. Stimulation of the medial forebrain bundle (MFB) has been used as a reward for the rat, and stimulation of the somatosensory cortices of the left and right hemispheres has been used as steering cues for controlling rat navigation. A ratbot learns to walk according to these stimulations after navigation training [18], [19]. A full review of ratbots is beyond the scope of this paper; interested readers can refer to the articles mentioned above. A well-trained ratbot named F05 is used in our experiment.


B. Brain to Computer

To navigate the ratbot with three different kinds of stimuli (Left, Right, and Forward), there must be at least three relatively steady brain states that a human can generate. The wireless neuroheadset we use is cheap and convenient to wear, but the EEG signals it records contain a significant amount of noise and require sophisticated signal processing and machine learning techniques to classify neural events [11]. As a pilot study, and taking the difficulty of controlling a live ratbot into account, our EEG-controlled navigation paradigm uses the blink potential to give the Forward order and motor-imagery-related sensorimotor rhythms to give the Left and Right orders (see Fig. 2). In the following subsections, we discuss the realization of our real-time online motor brain-to-computer interface.

Fig. 2. Experimental paradigm

1) Blink Detection: EMG is generated by muscle cells when they are activated during muscular movements. Facial EMG accounts for a large part of EEG artifacts [20], and it is easily detected at the AF electrodes of the Emotiv EPOC neuroheadset (see Fig. 3). The wink EMG has a high signal-to-noise ratio (SNR), so it can be detected without filtering. It also has a response time close to that of a human (0.313 s on average), which makes it a robust means of real-time control.

Fig. 3. Unfiltered AF4 signal, with the wink threshold set larger than four times the standard deviation of the non-wink condition (about 150 µV).
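A minimal C++ sketch of this threshold rule, assuming the raw AF4 samples are available in microvolts (the types and helper names are illustrative, not taken from our implementation):

```cpp
#include <cmath>
#include <vector>

// Baseline statistics estimated from a non-wink recording of AF4.
struct Baseline { double mean; double stdev; };

Baseline fitBaseline(const std::vector<double>& calmUv) {
    double mean = 0.0;
    for (double v : calmUv) mean += v;
    mean /= calmUv.size();
    double var = 0.0;
    for (double v : calmUv) var += (v - mean) * (v - mean);
    return {mean, std::sqrt(var / calmUv.size())};
}

// A sample is flagged as a wink when it deviates from the baseline mean
// by more than four standard deviations (about 150 uV in our setting).
bool isWink(double sampleUv, const Baseline& b) {
    return std::fabs(sampleUv - b.mean) > 4.0 * b.stdev;
}
```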

2) Motor Imagery Classification: It is well known that the mu rhythm over one side of the motor cortex decreases during movement, or imagined movement, of the contralateral limbs; this phenomenon is referred to as event-related desynchronization/synchronization (ERD/ERS). Essentially, ERD/ERS is a transient response of the primary motor cortex (M1) and the supplementary motor area (SMA). We chose three electrode pairs near M1 and SMA to measure motor imagery EEG: F3, F4, F7, F8, FC5, and FC6 [21]. The raw EEG data were filtered by an 8-12 Hz band-pass elliptic filter. To improve the SNR, we used the common spatial pattern (CSP) method, which transforms the multichannel filtered EEG data into a lower-dimensional subspace by calculating a transform matrix called a spatial filter [22]; we apply this spatial filter to reduce the data dimension. We also recorded 20 neutral trials to estimate a baseline for the dimension-reduced data. Task-related motor imagery is detected by applying a threshold of five times the standard deviation of this baseline.
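The sketch below illustrates the detection step, assuming the CSP spatial filter W has already been learned offline from training trials as in [22]; the generalized eigendecomposition that produces W is omitted, and the log-variance feature shown here is a common choice with CSP rather than a detail fixed by our pipeline.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

using Matrix = std::vector<std::vector<double>>;  // rows x columns

// Project band-passed EEG (channels x samples) through a precomputed CSP
// filter W (components x channels) and take each component's log-variance.
std::vector<double> cspFeatures(const Matrix& W, const Matrix& eeg) {
    std::vector<double> feats;
    const std::size_t T = eeg[0].size();
    for (const auto& w : W) {
        double var = 0.0;
        for (std::size_t t = 0; t < T; ++t) {
            double y = 0.0;  // y(t) = w . x(t), one spatially filtered sample
            for (std::size_t c = 0; c < w.size(); ++c) y += w[c] * eeg[c][t];
            var += y * y;
        }
        feats.push_back(std::log(var / T));
    }
    return feats;
}

// Imagery is flagged when any feature leaves the neutral baseline by more
// than five standard deviations, matching the detection rule above.
bool imageryDetected(const std::vector<double>& f,
                     const std::vector<double>& baseMean,
                     const std::vector<double>& baseStd) {
    for (std::size_t i = 0; i < f.size(); ++i)
        if (std::fabs(f[i] - baseMean[i]) > 5.0 * baseStd[i]) return true;
    return false;
}
```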

3) Online Brain-to-Computer Interface: To evaluate our online brain-to-computer interface, subjects were told to wink or to imagine a left or right movement after screen cues. The accuracy of three subjects over 100 tests is shown in Table I. The subject with the best online classification results proceeded to control the ratbot (F05) in our experiment.

TABLE I. Online Classification Results

Subject    Wink (%)    Right (%)    Left (%)
Tsien      95          89           90
Yu         92          87           91
Jang       94          86           88

The response time is the duration between a cue and the corresponding control signal. As shown in Fig. 4, positive values of the cue and control signals refer to the left-imagery task and the Left command, while negative values refer to the opposite. The average response times of the wink and of motor imagery are 0.313 s and 0.439 s, respectively.

Fig. 4. The black line represents screen cues and the blue line the output control signals; the green and red lines represent the processed EEG signals of imagining left and right.

C. Brain to Brain Collaboration

A brain-computer-brain interface should enable a real-time and accurate transfer of control signals. Humans can quickly learn to use brain signals to control mechanical devices (such as a wheelchair or a toy vehicle) after training, but it is not easy to control a live ratbot.
Moreover, ratbots have previously been controlled to navigate complex environments whose maps were known in advance, whereas the mission environment here is unknown. The only information we can obtain from the environment is a set of landmark arrows (see Fig. 5). When an arrow appears ahead in the video, the human blinks to give the Forward order that drives the ratbot onward, and then imagines left or right to make the ratbot turn in the direction the arrow indicates. What is more, the video shakes during the entire mission and sometimes even goes black where the video signal is blocked, such as in the tunnel of the sand table. Navigating a ratbot in an unknown environment is therefore a difficult task; it requires good brain collaboration between the human and the ratbot.

Fig. 6. (a) Mission 1 is to find a Minnie Mouse with two left turns. (b) Mission 2 is to find a bomb with two left turns and two right turns; it is more difficult than Mission 1.

Fig. 5. A landmark in the video from the ratbot's camera.

TABLE II. Success rate (SR) and time consumption of the two missions. Max, Min, and Mean represent the maximal, minimal, and average time consumption.

           Mission 1                          Mission 2
       SR     Max (s)  Min (s)  Mean (s)   SR     Max (s)  Min (s)  Mean (s)
Hand   80%    233      145      179        80%    198      121      172
Brain  70%    312      147      203        60%    376      271      305
III. EXPERIMENTS AND RESULTS

All procedures used in this study were carried out in accordance with the Guide for the Care and Use of Laboratory Animals. The brain-computer-brain interface is implemented in C++ and Matlab, with a user interface written in Qt to adjust the stimulus parameters. Fig. 6 shows the two missions, in which the ratbot is driven from the start point to the goal following the instruction of the arrows. We carried out the experiment carefully in three steps. In step 1, the ratbot was hand-controlled by the experimenter, who navigated it through the camera video to finish the two missions; each mission had 10 trials, and the time consumption and the success rate of task completion were recorded. In step 2, the ratbot was navigated in the sand table by an assistant while the experimenter thought the control orders in front of the videos, but the orders were not actually sent to the ratbot; this step trains the experimenter to give correct orders under the challenging task environment. After taking these two steps to validate our brain-computer-brain interface, we proceeded to the formal experiment in step 3, in which the experimenter thinks and blinks to navigate the ratbot. Missions 1 and 2 again had 10 trials each in this step.
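For concreteness, the per-mission statistics reported in Table II can be computed from the recorded trials as sketched below; this version aggregates the timing statistics over all trials, since the text does not state whether failed trials are excluded, and the Trial record is our own illustration.

```cpp
#include <algorithm>
#include <vector>

// One recorded trial: whether the mission was completed and its duration.
struct Trial { bool success; double seconds; };

struct MissionStats { double sr, maxS, minS, meanS; };

// Success rate plus Max/Min/Mean time consumption, as listed in Table II.
MissionStats summarize(const std::vector<Trial>& trials) {
    MissionStats s{0.0, 0.0, 1e30, 0.0};
    int succeeded = 0;
    for (const Trial& t : trials) {
        if (t.success) ++succeeded;
        s.maxS = std::max(s.maxS, t.seconds);
        s.minS = std::min(s.minS, t.seconds);
        s.meanS += t.seconds;
    }
    s.sr = static_cast<double>(succeeded) / trials.size();
    s.meanS /= trials.size();
    return s;
}
```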
half brains are just cues to make the ratbot feel like touch
The experimental results are shown in Table II. First, the success rate of brain control is close to that of hand control, which is a good validation of the effectiveness of our motor brain-computer-brain interface. Second, mission 1 has a higher success rate and a smaller time consumption than mission 2 under brain control, which coincides with the intuition that mission 2 is more difficult than mission 1; hand control does not show this effect. Third, brain control consumes more time than hand control, especially in mission 2, which indicates that the key to navigating the ratbot under brain control is making the ratbot turn in the right direction at each binary crossing. Finally, we found in the experiment that a mission is completed easily when both the ratbot and the experimenter are in a good physiological state.

IV. CONCLUSIONS

We have developed a brain-computer-brain interface that allows a human to mind-control a ratbot in an unknown environment with acceptable success rates and time consumption. Our work incorporates a traditional mechanical sensor (the camera) and an intelligent living actuator (the ratbot) into ubiquitous applications. Based on advanced BCI technology, the brain network established here enables two subjects to communicate and collaborate with each other in a direct and high-level manner. Our future work is to combine motor-imagery-related sensorimotor rhythms with visually evoked signals (SSVEP or P300) to build a pure brain-to-brain interface, and to test our brain-computer-brain interface on other people and other ratbots.

One may argue that the invasive stimulation causes pain in the animals and is therefore inhumane. In fact, the mild stimulations in the somatosensory cortices of the left and right hemispheres are just cues that make the ratbot feel as if it has touched something, and a mild stimulation in the MFB makes the ratbot feel excited. Transcranial focused ultrasound, a non-invasive region-specific brain stimulation technique [9], will also be considered in our future work.

ACKNOWLEDGEMENTS

This work was partly supported by the National Key Basic Research Program of China (2013CB329504) and the Program for New Century Excellent Talents in University (NCET-13-0521). The authors thank Kedi Xu, Chaonan Yu, and Liqiang Gao for the rat surgery. Gang Pan is the corresponding author.


REFERENCES

[1] M. Weiser, "The computer for the 21st century," Scientific American, vol. 272, no. 3, pp. 78-89, 1995.
[2] C. Sun, N. Zheng, X. Zhang, W. Chen, and X. Zheng, "Automatic navigation for rat-robots with modeling of the human guidance," Journal of Bionic Engineering, vol. 10, no. 1, pp. 46-56, 2013.
[3] R. Shusterman, M. C. Smear, A. A. Koulakov, and D. Rinberg, "Precise olfactory responses tile the sniff cycle," Nature Neuroscience, vol. 14, no. 8, pp. 1039-1044, 2011.
[4] S. Kuznetsov, W. Odom, J. Pierce, and E. Paulos, "Nurturing natural sensors," in Proceedings of the 13th International Conference on Ubiquitous Computing. ACM, 2011, pp. 227-236.
[5] Z. Wu, G. Pan, and N. Zheng, "Cyborg intelligence," IEEE Intelligent Systems, vol. 28, no. 5, pp. 31-33, 2013.
[6] Backyard Brains. The RoboRoach. [Online]. Available: https://backyardbrains.com/products/roboroach
[7] J. DiGiovanna, B. Mahmoudi, J. Fortes, J. C. Principe, and J. C. Sanchez, "Coadaptive brain-machine interface via reinforcement learning," IEEE Transactions on Biomedical Engineering, vol. 56, no. 1, pp. 54-64, 2009.
[8] R. P. N. Rao. (2013, Aug.) Direct brain-to-brain communication in humans: A pilot study. [Online]. Available: http://homes.cs.washington.edu/~rao/brain2brain/experiment.html
[9] S. Yoo, H. Kim, E. Filandrianos, S. J. Taghados, and S. Park, "Non-invasive brain-to-brain interface (BBI): Establishing functional links between two brains," PLoS ONE, vol. 8, no. 4, p. e60410, 2013.
[10] M. Pais-Vieira, M. Lebedev, C. Kunicki, J. Wang, and M. A. Nicolelis, "A brain-to-brain interface for real-time sharing of sensorimotor information," Scientific Reports, vol. 3, 2013.
[11] A. Campbell, T. Choudhury, S. Hu, H. Lu, M. K. Mukerjee, M. Rabbi, and R. D. Raizada, "NeuroPhone: brain-mobile phone interface using a wireless EEG headset," in Proceedings of the Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds. ACM, 2010, pp. 3-8.
[12] Y. Yu, D. He, W. Hua, S. Li, Y. Qi, Y. Wang, and G. Pan, "FlyingBuddy2: a brain-controlled assistant for the handicapped," in Proceedings of the 2012 ACM Conference on Ubiquitous Computing. ACM, 2012, pp. 669-670.
[13] Y. Wang, M. Lu, Z. Wu, L. Tian, W. Hua, X. Zheng, K. Xu, and G. Pan, "Ratbot: A rat 'understanding' what humans see," in IJCAI 2013 Workshop on Intelligence Science, 2013.
[14] S. W. Gilroy, J. Porteous, F. Charles, M. Cavazza, E. Soreq, G. Raz, L. Ikar, A. Or-Borichov, U. Ben-Arie, I. Klovatch et al., "A brain-computer interface to a plan-based narrative," in Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence. AAAI Press, 2013, pp. 1997-2005.
[15] E. Haapalainen, S. Kim, J. F. Forlizzi, and A. K. Dey, "Psycho-physiological measures for assessing cognitive load," in Proceedings of the 12th ACM International Conference on Ubiquitous Computing. ACM, 2010, pp. 301-310.
[16] S. K. Talwar, S. Xu, E. S. Hawley, S. A. Weiss, K. A. Moxon, and J. K. Chapin, "Behavioural neuroscience: Rat navigation guided by remote control," Nature, vol. 417, no. 6884, pp. 37-38, 2002.
[17] Z. Feng, W. Chen, X. Ye, S. Zhang, X. Zheng, P. Wang, J. Jiang, L. Jin, Z. Xu, C. Liu et al., "A remote control training system for rat navigation in complicated environment," Journal of Zhejiang University Science A, vol. 8, no. 2, pp. 323-330, 2007.
[18] Y. Yu, N. Zheng, Z. Wu, X. Zheng, W. Hua, C. Zhang, and G. Pan, "Automatic training of ratbot for navigation," in IJCAI 2013 Workshop on Intelligence Science, 2013.
[19] M. Lee, G. Jun, H. Choi, H. S. Jang, Y. C. Bae, K. Suk, I. Jang, and B. Choi, "Operant conditioning of rat navigation using electrical stimulation for directional cues and rewards," Behavioural Processes, vol. 84, no. 3, pp. 715-720, 2010.
[20] I. Goncharova, D. J. McFarland, T. M. Vaughan, and J. R. Wolpaw, "EMG contamination of EEG: spectral and topographical characteristics," Clinical Neurophysiology, vol. 114, no. 9, pp. 1580-1593, 2003.
[21] Y. Wang, B. Hong, X. Gao, and S. Gao, "Design of electrode layout for motor imagery based brain-computer interface," Electronics Letters, vol. 43, no. 10, pp. 557-558, 2007.
[22] Y. Wang, P. Berg, and M. Scherg, "Common spatial subspace decomposition applied to analysis of brain responses under multiple task conditions: a simulation study," Clinical Neurophysiology, vol. 110, no. 4, pp. 604-614, 1999.

