
How about taking a Low-cost Multi-modal Bio-sensing and Eye-gaze Tracking System into the "Wild"?


Siddharth, Tzyy-Ping Jung, and Terrence J. Sejnowski

Abstract— We present a low-cost wearable multi-modal bio-sensing system capable of recording bio-markers and eye-gaze overlaid on the world view in real-world settings. The system incorporates a wearable computer (e.g. Raspberry Pi 3) to collect data from bio-sensors (e.g. 1-channel ECG and multi-channel EEG) and two cameras (a world camera and an IR-based eye camera). The wearable computer transmits data from all the sensors to a main computer over Wi-Fi, while time-stamping each packet in the data stream. The system is easy to wear, comfortable, and scalable to incorporate bio-sensors as needed.

I. INTRODUCTION

Presently, bio-sensors are largely confined to restricted laboratory environments for experimental data collection. There is no robust yet comfortable system that can be used for data collection in mobile settings or in real-world environments. Traditional bio-sensing systems are costly, bulky, and not designed for comfort or ease of use, making them impractical for real-world studies [1]. Additionally, the bio-sensors usually have to be assembled together, which requires extra effort for time synchronization and calibration between them. Another issue is the inability to pinpoint exactly which real-world events the subject is responding to. The usual workaround is to ask the subject to mark each event (for example, by pressing a button), but this introduces further unreliability and compromises the experiment's validity. Using more than one type of bio-sensor can boost performance, as each modality compensates for the weaknesses of the others. For example, the bio-markers in neurophysiological data (e.g. the electroencephalogram, EEG) can complement and supplement those available in physiological data (e.g. ECG, heart rate, and heart-rate variability) for assessing human emotion [2].

II. RESEARCH

We present a low-cost, wireless, multi-modal bio-sensing system (Figure 1a) that is comfortable to wear and can reliably monitor bio-signals such as EEG, ECG, EMG, and GSR in real time. Furthermore, it overlays the subject's eye-gaze on his/her world view so that events in the real world can easily be tagged with the corresponding bio-signals. The system is easy to wear (it takes less than 5 minutes to set up by oneself), scalable to incorporate bio-sensors as needed, can be worn comfortably for long periods of time, and uses a wireless connection to make it truly mobile. It uses a wearable computer (e.g. Raspberry Pi 3) to collect data from bio-sensors (such as a 1-channel ECG and a 14-channel EEG by Emotiv) and two cameras (a world camera and an IR-based eye camera). It transmits the sensors' data to a main computer over Wi-Fi, while time-stamping each packet in the data stream. The main computer visualizes the data in real time while synchronizing and recording them. After a fast and easy calibration, the eye camera's IR image is used to find the center of the pupil and map it onto the world camera's view as the gaze position. The system is scalable in that it can be used with any desired configuration of bio-sensors or sampling rates, as per the target application of the study. Our research currently focuses on applying this multi-modal system to emotion recognition in real-life settings. Real-time processing of the EEG uses ICA to extract true brain activity and measure valence and arousal [3]. These EEG-derived bio-markers can be used directly in an augmented-reality (AR) headset to augment the visual experience. For instance, events such as a person perceiving another person's face, and the subsequent emotional response, can be automatically tagged using eye-gaze. The emotional response to real-world situations can also be displayed visually to the subject using AR.

Figure 1. (a) Multi-modal Bio-Sensing System. (b) ECG Sensor Module. (c) World Camera. (d) IR-based Eye Camera.
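To make the time-stamping scheme concrete, the sketch below shows one way the wearable computer could wrap each block of sensor samples with a timestamp before sending it over Wi-Fi. The JSON wire format and field names here are illustrative assumptions, not the system's actual protocol.

```python
import json
import time

def make_packet(sensor_id, samples, t=None):
    """Wrap one block of sensor samples with a timestamp so the
    receiver can align streams sampled at different rates.
    (Hypothetical wire format -- the system's real protocol may differ.)"""
    if t is None:
        t = time.monotonic()
    return json.dumps({"sensor": sensor_id,
                       "t": t,
                       "samples": samples}).encode("utf-8")

def parse_packet(raw):
    """Decode a packet back into (sensor_id, timestamp, samples)."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["sensor"], msg["t"], msg["samples"]

# On the Raspberry Pi, each packet would then be sent over Wi-Fi, e.g.:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(make_packet("ecg", block), (HOST, PORT))
```

Timestamping at the source (rather than on arrival) lets the main computer synchronize streams even when Wi-Fi delivery is jittery.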
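The pupil-to-gaze calibration can be sketched as a least-squares fit from pupil-center coordinates in the eye camera to target positions in the world camera's image. The second-order polynomial feature set below is a common choice in eye-tracking work and is an assumption here, not the paper's stated mapping.

```python
import numpy as np

def poly_features(px, py):
    # Second-order polynomial terms of the pupil-center coordinates.
    return np.column_stack([np.ones_like(px), px, py,
                            px * py, px**2, py**2])

def fit_gaze_map(pupil_xy, world_xy):
    """Least-squares fit from pupil-center pixels (eye camera) to gaze
    positions in the world camera's image, one polynomial per axis.
    pupil_xy, world_xy: (N, 2) arrays of calibration correspondences."""
    A = poly_features(pupil_xy[:, 0], pupil_xy[:, 1])
    coeffs, *_ = np.linalg.lstsq(A, world_xy, rcond=None)
    return coeffs

def map_gaze(coeffs, px, py):
    """Map a pupil center to a gaze point in world-camera pixels."""
    return poly_features(np.atleast_1d(px), np.atleast_1d(py)) @ coeffs
```

During the calibration step, the subject fixates a handful of known targets to collect the (pupil, world) correspondences; after that, `map_gaze` runs per frame to overlay the gaze position on the world view.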
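The valence/arousal estimation in this work is ICA-based [3]. As a rough, self-contained illustration of an EEG-derived emotion marker, the sketch below computes frontal alpha-power asymmetry, a widely used valence proxy; the band limits, sampling rate, and use of raw (non-ICA) channels are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean power of `signal` in the [lo, hi] Hz band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def alpha_asymmetry(left_frontal, right_frontal, fs=128.0):
    """log(right) - log(left) alpha (8-13 Hz) power: a common valence
    proxy (more positive values ~ relatively greater left-hemisphere
    activation, associated with positive valence)."""
    return (np.log(band_power(right_frontal, fs, 8.0, 13.0))
            - np.log(band_power(left_frontal, fs, 8.0, 13.0)))
```

In practice, such a marker would be computed over short sliding windows of the streamed EEG, so the AR headset can react to valence changes in near real time.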
*This work was supported in part by the Army Research Laboratory under Cooperative Agreement Number W911NF-10-2-0022.

Siddharth is with the Department of Electrical and Computer Engineering, University of California San Diego, La Jolla, CA 92093, USA (phone: 1-650-772-8624; fax: 1-858-822-7556; email: sid.sapien@gmail.com).
Tzyy-Ping Jung is with the Institute for Neural Computation, University of California San Diego, La Jolla, CA 92122 (email: jung@sccn.ucsd.edu).
Terrence J. Sejnowski is with the Computational Neurobiology Laboratory, The Salk Institute, La Jolla, CA 92037 (email: terry@salk.edu).

REFERENCES

[1] K. McDowell, C.-T. Lin, K. S. Oie, T.-P. Jung, S. Gordon, K. W. Whitaker, S.-Y. Li, S.-W. Lu and W. D. Hairston, "Real-World Neuroimaging Technologies," IEEE Access, 1:131-149, 2013.
[2] O. Pollatos, W. Kirsch and R. Schandry, "On the relationship between interoceptive awareness, emotional experience, and brain processes," Cognitive Brain Research, 25:948-962, 2005.
[3] Y.-P. Lin, J.-R. Duann, J.-H. Chen and T.-P. Jung, "EEG dynamics of musical emotion perception revealed by independent spectral components," NeuroReport, 21:410-415, 2010.
