
IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, VOL. 59, NO. 8, AUGUST 2012
Online Removal of Eye Movement and Blink EEG Artifacts Using a High-Speed Eye Tracker

Borna Noureddin*, Peter D. Lawrence, Senior Member, IEEE, and Gary E. Birch, Senior Member, IEEE
Abstract—A novel approach is presented for using an eye tracker-based reference instead of EOG for methods that require an EOG reference to remove ocular artifacts (OA) from EEG. It uses a high-speed eye tracker and a new online algorithm for extracting the time course of a blink from eye tracker images to remove both eye movement and blink artifacts. It eliminates the need for EOG electrodes attached to the face, which is critical for practical daily applications. The ability of two adaptive filters (RLS and H∞) to remove OA is measured using: 1) EOG; 2) frontal EEG only (fEEG); and 3) the eye tracker with frontal EEG (ET + fEEG) as reference inputs. The results are compared for different eye movements and blinks of varying amplitudes at electrodes across the scalp. Both the RLS and H∞ methods were shown to benefit from using the proposed eye tracker-based reference (ET + fEEG) instead of either an EOG reference or a reference based on frontal EEG alone.

Index Terms—Artifact removal, blinks, electroencephalography, eye movements, eye tracking.
I. INTRODUCTION

An ongoing research problem [1]–[4] is the online removal of the effects of eye movements and blinks from the EEG. Some ocular artifact (OA) removal techniques require only EEG signals. However, they need manual selection of a threshold [5] or of components to reject [6], cannot handle all sources of OA (e.g., only blinks [5]), or are not suitable for real-time use (e.g., they require large amounts of data [1]). Other approaches to automated, online OA removal [4], [7]–[11] use electro-oculogram (EOG) signals as a reference. Eliminating EOG is critical for practical daily applications, and would render such methods more suitable for real-time applications (e.g., a brain–computer interface), especially where only scalp EEG measurements are available (e.g., no EOG electrodes are attached to the face).
Manuscript received April 27, 2010; revised October 5, 2010; accepted November 26, 2010. Date of publication January 28, 2011; date of current version July 18, 2012. This work was supported in part by NSERC Discovery Grant 4924-05 and in part by NSERC Grant 90278-06. Asterisk indicates corresponding author.

*B. Noureddin is with the Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC V6T 1Z4, Canada (e-mail: bornan@ece.ubc.ca).

P. D. Lawrence is with the Department of Electrical and Computer Engineering and Institute for Computing, Information and Cognitive Systems, University of British Columbia, Vancouver, BC V6T 1Z4, Canada (e-mail: PeterL@ece.ubc.ca).

G. E. Birch is with the Neil Squire Society, Burnaby, BC V5M 3Z3, Canada, and also with the Department of Electrical and Computer Engineering and Institute for Computing, Information and Cognitive Systems, University of British Columbia, Vancouver, BC V6T 1Z4, Canada (e-mail: garyb@neilsquire.ca).

Digital Object Identifier 10.1109/TBME.2011.2108295
A novel approach is presented for replacing EOG with a high-speed eye tracker for OA removal methods requiring an EOG reference. Using an eye tracker can reduce unintended EEG distortion introduced by using an EOG reference (although investigating such reductions is beyond the scope of this paper). The combination of the eye tracker and related novel image processing methods with EEG also provides a powerful human interface. Simple, calibration-free image processing is used to generate suitable reference signals for removing both eye movements and blinks.

Eye trackers may need a frame rate of at least 362 Hz [18] to avoid missing important eye movement and blink dynamics. Unlike EOG, analog temporal filters cannot be applied to video; instead, images must be sampled at a high enough rate to avoid aliasing. This is the first study that uses a high-speed (>400 Hz) eye tracker for OA removal. Normally, eye trackers require calibration for determining point of gaze, but in this study only eye movements are tracked. Thus, unlike previous work, neither the overall approach nor the specific image processing algorithms require calibration or user intervention. Furthermore, as shown in [12], the eye tracker used in this study is robust to small, natural head movements.

To handle both eye movements and blinks, a new blink signal generation (BSG) algorithm was also developed. Unlike previous work [13]–[17], it can operate in low-lighting conditions without color information, training, calibration, or user-specific parameters, and avoids complicated corner detection and feature extraction algorithms. Instead, it tracks intensity changes around the pupil in the eye tracker images.
Kierkels et al. [2] also proposed using an eye tracker for OA removal. They used an eye tracker to generate pupil positions, which were used as inputs to a Kalman filter to remove eye movement effects. Although they achieved superior results over other OA removal methods at the time, this paper describes an approach that can potentially be used by any method, including those that use optimal filters other than the Kalman filter. Although the Kierkels et al. method was shown to work well for eye movements, it was not able to handle blinks, and required a 30-s starting period for stabilizing the Kalman filter (which required manual tuning), during which the OA removal algorithm could not be used reliably. The approach described here, including the new BSG algorithm, can handle both eye movements and blinks, and does not require manual tuning, calibration, or a stabilization period. Finally, the eye tracker used by Kierkels et al. operated at 50 Hz, while this study uses a high-speed eye tracker.
The proposed approach to eliminating EOG in OA removal is evaluated using two adaptive filters [7], [9]. Each filter's performance is measured using: 1) EOG; 2) frontal EEG; and 3) the eye tracker as a reference. To measure performance on real EEG, the R metric [19] is used to measure how much artifact is removed during different types of eye movements and blinks at 52 different electrodes spread across the scalp. For the OA removal algorithms investigated, during some eye movements, using an eye tracker is more effective at removing OA at frontal electrodes than using either EOG or frontal EEG.
In summary, to the best of the authors' knowledge, this is the first study that replaces the EOG with a high-speed eye tracker as a reference for any online OA removal method. It can successfully remove the effects of both eye movements and blinks without any calibration or manual tuning. The removal of blinks is made possible by a novel, fully automated online algorithm for extracting the time course of a blink from eye tracker images, whose output is shown to be as effective as EOG for removing blink artifacts. It is also the first study to fully evaluate the performance of online OA removal using an eye tracker over electrodes spread across the entire scalp for both eye movements and blinks of different amplitudes.
II. METHODS

The general approach to OA removal is shown in Fig. 1. Brain electric signals are assumed to conduct to the scalp (V_B), combining with ocular signals from eye movements and blinks (V_O), other artifact signals (V_A), and measurement and ambient noise (V_N). Generally, V_A is assumed to be zero, and V_N is modeled as white Gaussian noise. These EEG sources are assumed to combine linearly to form the measured scalp potentials E. The OA removal method T_OAR is a transformation that aims to remove V_O from E, resulting in Y (ideally Y = V_B).

Fig. 1. General approach to OA removal: V_B is the signal component caused by brain sources, V_O by ocular sources, V_A by other artifact sources (e.g., EMG, EKG, etc.), and V_N by noise sources (e.g., measurement noise). E is the measured scalp potential, and Y is the output of the OA removal algorithm, representing the EEG with OA removed. T_OAR is a transformation (usually linear) that removes the OA.

Several methods [4], [7]–[11] shown to be effective at OA removal use EOG as a reference to calculate components to be subtracted from E:

Y = E − WX    (1)

where X is a reference input or vector of measured reference signals (always EOG until the present study), and W is an array of filter coefficients. Two such methods are based on: 1) the recursive least-squares (RLS) algorithm [7], which has fast convergence; and 2) the time-varying H∞ algorithm [9], which converges more slowly but has been shown to perform better than RLS for OA removal. Both algorithms adapt W at each sample such that the total power of Y is minimized. The effect of using a reference based on an eye tracker and frontal EEG instead of EOG was evaluated using each of the RLS and H∞ algorithms as implemented in [20]. Adaptive filters such as RLS and H∞ are particularly suitable for online OA removal because they operate on a sample-by-sample basis.

Each algorithm was investigated independently. Both methods filter EOG and subtract it from EEG:

e(n) = s(n) − r(n)^T H(n)    (2)

where e(n) is the estimated EEG without OA at sample n, s(n) is the measured scalp potential, r(n) is the EOG reference, and H(n) is the vector of adaptive filter coefficients (filter length of M samples). Each method uses a different procedure for adaptively
calculating H(n). For RLS, M = 3 was used as in [7], noting that the method is not sensitive to the choice of M, and incremental increases in performance are noticeably reduced for M > 3. At each sample, the gain factor K(n) and filter coefficients H(n) were calculated and used to filter s(n) as in (2):

K(n) = [R(n−1)]^{−1} r(n) / ( λ + r(n)^T [R(n−1)]^{−1} r(n) )    (3)

[R(n)]^{−1} = λ^{−1} ( [R(n−1)]^{−1} − K(n) r(n)^T [R(n−1)]^{−1} )    (4)

[R(0)]^{−1} = 100 I    (5)

H(n) = H(n−1) + K(n) ( s(n) − r(n)^T H(n−1) )    (6)

H(0) = 0    (7)

where I is the identity matrix and λ is a forgetting factor. As in [7], λ = 0.9999, noting that for low values the algorithm would not converge, and for 0.995 < λ < 1 the actual value did not make a noticeable difference to the final performance.
For H∞, M = 3 was used again, as the results in [9] indicated that higher filter orders did not produce noticeably better performance. The gain factor K(n) and filter coefficients H(n) were calculated (with ε_g = 1.5 as in [9]):

K(n) = P̃(n) r(n) / ( 1 + r(n)^T P̃(n) r(n) )    (8)

r̃(n) = r(n) r(n)^T    (9)

P̃(n) = [ P^{−1}(n) − ε_g^{−2} r̃(n) ]^{−1}    (10)

P(n) = [ P^{−1}(n−1) + (1 − ε_g^{−2}) r̃(n−1) ]^{−1} + 10^{−5} I    (11)

P(0) = 0.005 I    (12)

H(n) = H(n−1) + K(n−1) ( s(n−1) − r(n−1)^T H(n−1) )    (13)

H(0) = 0.    (14)
In this way, the proposed method of replacing EOG with an eye
tracker-based reference signal was evaluated using two existing
OA removal methods that require an EOG reference.
TABLE I
DATA COLLECTION TASKS
A. Data Collection

The proposed method was evaluated using real recorded biosignals. Simultaneous EEG, EOG, and eye tracking data were collected from four subjects (one female, three males) aged 19–63, sitting approximately 50 cm in front of a computer monitor. The subject protocol was approved by the University of British Columbia Behavioural Research Ethics Board (certificate no. B03-0500). Each subject was instructed to perform four tasks (see Table I), providing data during nine OA types: small left (SL), small right (SR), small upward (SU), large left (LL), large right (LR), large upward (LU), and large downward (LD) saccades, and small (SB) and large (LB) blinks.

For each subject, 55 EEG channels, 2 linked mastoids, and 6 channels of EOG (see later) were collected during a single 3-h session. All signals were sampled by the same amplifier at 200 Hz, with a 70-Hz low-pass filter and a 0.5–30 Hz digital band-pass filter (BPF). Data were processed using a 2.67-GHz Intel Core 2 PC with 2 GB of RAM. Horizontal (H_EOG) and vertical (V_EOG) EOG were calculated as follows:
H_EOG = H_R − H_L    (15)

V_EOG = [ (V_BR − V_TR) + (V_BL − V_TL) ] / 2    (16)

where H_R and H_L were measured at the right and left outer canthi, respectively, and V_TR, V_BR, V_TL, and V_BL were measured above (denoted by T) and below (denoted by B) the right eye and above and below the left eye, respectively.
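The derivations (15)–(16) amount to simple channel arithmetic; the following minimal sketch (function and argument names are illustrative, not from the paper) accepts either scalars or per-sample arrays:

```python
def eog_channels(HR, HL, VTR, VBR, VTL, VBL):
    """Horizontal and vertical EOG from the six electrodes, per (15)-(16).

    HR, HL: outer canthi of the right and left eye;
    V{T,B}{R,L}: above (T = top) and below (B = bottom) each eye.
    """
    h_eog = HR - HL                              # (15)
    v_eog = ((VBR - VTR) + (VBL - VTL)) / 2.0    # (16)
    return h_eog, v_eog
```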
In addition, an eye tracker [12] was used to collect eye movement and blink data at approximately 400 Hz simultaneously with EEG and EOG. Before the start of each task, it was ensured that the subject's eye was in the eye tracker camera's field of view. For each frame, the eye tracker provided the following.
1) A grayscale image of the subject's eye.
2) A timestamp of when the image was captured. This was combined with hardware logic signals provided by the eye tracker to accurately synchronize the eye images with the EEG samples.
3) A flag indicating whether a pupil was found.
4) The parameters of the ellipse that best fit the pupil.
5) A flag indicating whether the subject was performing a fixation at the time.
These outputs were used to extract three signals (see Fig. 2). The pupil center's x- and y-coordinates were filtered using a 0.5–30 Hz BPF to produce H_ET and V_ET, corresponding to horizontal and vertical eye movements, respectively. The third signal, B_ET, corresponds to predicted changes in EEG during blinks. The first time the pupil was found and was stable, base
Fig. 2. Overall operation of the eye tracker for every image captured. The signals H_ET, V_ET, and B_ET are extracted as a reference for OA removal.

Fig. 3. Areas of eye tracker images used for extracting the blink signal (not drawn to scale). R is the pupil radius (average of the lengths of the ellipse axes).
intensities X_1, X_2, and X_3 were calculated:

X_k = I_k − μ_0,  k = 1, 2, 3    (17)

where I_1, I_2, and I_3 are image intensities corresponding to S_1, S_2, and S_3 in the eye image (see Fig. 3), and μ_0 is the average intensity of the entire eye image. For each subsequent image, intensities Y_1, Y_2, and Y_3 were calculated:

Y_k = I_k − μ,  k = 1, 2, 3    (18)

where I_1, I_2, and I_3 are as aforementioned, and μ is the average intensity of the entire eye image. If the image was from a fixation, the current pupil position was used as P_0 in Fig. 3. Otherwise, the pupil position from the last fixation was used, since during a blink the pupil position is likely inaccurate or not
Fig. 4. Eye tracking versus EOG during (a) a simultaneous horizontal (left) and vertical (up) saccade, and (b) a blink. Sample eye images are shown above, and EOG and eye tracking signals below, with arrows pointing to the time instant each image was captured. In (a), the "x" marks the current pupil position, and the dot marks the pupil position at the start of the saccade. In (b), the ellipse shows the area where the intensity value is tracked during the blink.
available. For each image, Z was computed from X and Y:

Z = [ Σ_{i=1}^{N_1} (Y_1 − X_1)^2 + Σ_{i=1}^{N_2} (Y_2 − X_2)^2 + Σ_{i=1}^{N_3} (Y_3 − X_3)^2 ]^{1/2}    (19)

where N_1, N_2, and N_3 are the number of pixels in S_1, S_2, and S_3, respectively. The resulting Z was filtered using a 0.5–30 Hz BPF to produce the blink signal B_ET. Finally, the eye tracker's timestamps were used to synchronize H_ET, V_ET, and B_ET with the EEG. Fig. 4 shows examples of EOG and eye tracker signals corresponding to eye movements and blinks.
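The per-frame BSG computation (17)–(19) can be sketched as follows. This assumes (19) is the square root of the summed squared per-pixel differences, and omits the region geometry of Fig. 3 and the 0.5–30 Hz band-pass filtering; function and parameter names are illustrative.

```python
import numpy as np

def blink_signal(frame, base, regions, mu0):
    """One raw sample of the blink signal, per (17)-(19).

    frame: current grayscale eye image (2-D array)
    base: baseline image captured at the first stable fixation
    regions: boolean masks for the areas S1, S2, S3 around the pupil
    mu0: whole-image mean intensity of the baseline image
    """
    mu = frame.mean()                  # whole-image mean of current frame
    z2 = 0.0
    for S in regions:
        X = base[S] - mu0              # baseline region intensities, (17)
        Y = frame[S] - mu              # current region intensities, (18)
        z2 += np.sum((Y - X) ** 2)     # per-region squared difference
    return np.sqrt(z2)                 # Z, (19)
```

Subtracting the whole-image mean in (17) and (18) makes the comparison insensitive to global illumination changes, so only local intensity changes around the pupil (eyelid covering it during a blink) drive Z.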
B. Reference Inputs

From the data collected, three different reference inputs [X in (1)] were constructed for use by each of the algorithms (AF7, AF8, and Fpz are signals from those frontal EEG channels, and H_EOG, V_EOG, H_ET, V_ET, and B_ET are as aforementioned):
1) X = [H_EOG, V_EOG]^T (EOG reference);
2) X = [AF7, AF8, Fpz]^T (frontal EEG or fEEG reference, as in [21]);
3) X = [H_ET, V_ET, B_ET, AF7, AF8, Fpz]^T (eye-tracker-based reference or ET + fEEG). See later for the reason for combining the eye tracker signals with EEG signals.
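The three reference inputs are simply column stacks of the named channels, one row per sample; a minimal sketch with illustrative names:

```python
import numpy as np

def make_references(h_eog, v_eog, h_et, v_et, b_et, af7, af8, fpz):
    """Assemble the three reference inputs X of Section II-B as
    (n_samples, M) arrays, ordered as in items 1)-3)."""
    eog = np.column_stack([h_eog, v_eog])                         # 1) EOG
    feeg = np.column_stack([af7, af8, fpz])                       # 2) fEEG
    et_feeg = np.column_stack([h_et, v_et, b_et, af7, af8, fpz])  # 3) ET + fEEG
    return eog, feeg, et_feeg
```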
C. OA Removal Evaluation

EEG, EOG, and eye tracker measurements contain common information about eye and eyelid movement, as well as information not in common [22], [23], as shown in Table II. While an eye tracker signal only measures eye movements, EOG contains other biosignals. Therefore, it would be misleading to compare, for example, EOG with an eye tracker signal: using EOG might remove more of the signal, but it may also result in removing parts of the EEG that are unrelated to the OA (background EEG or ERP).

TABLE II
CONTRIBUTION BY SPECIFIC SOURCES TO VARIOUS MEASURED SIGNALS

To address this problem, the comparison was carried out on the average of several trials of each OA type (see earlier). The average signal was calculated and analyzed separately for each subject. Thus, any components that were unrelated to the OA were minimized. Since the average signal is mostly OA related, removing more signal could consistently be considered a good measure of the performance of the OA removal algorithm, and by extension the choice of reference input.

The eye tracker does not contain information unrelated to OA, thus minimizing EEG distortion after OA removal. However,
since there are OA-related components (e.g., OA-related EMG or EEG) that are picked up by EOG and EEG electrodes but not by the eye tracker, the eye tracker signals were combined with frontal EEG. Including EEG channels helps remove sources time-locked with OA, while including eye tracking signals minimizes the removal of non-OA-related components. Initial attempts to use just the eye tracker signals H_ET, V_ET, and B_ET resulted in poor OA removal by both algorithms, thus confirming the previous discussion.
OA removal methods are normally evaluated using simulated data. Artifact-free EEG and an artifact signal are combined artificially and processed using the algorithm (see Fig. 1). Metrics such as the SNR of the output Y can then be computed against the artifact-free EEG V_B. For real EEG, however, V_B, V_N, and V_A are unknown, so performance on real data is reported subjectively [3], [24], [25], often based on visual inspection of the resulting waveforms. That is, it is not possible to measure the SNR. As shown previously [26], the R metric [19] can be used instead to measure the amount of OA removed:

R = Σ_{k=1}^{N} (E(k) − Y(k))^2 / Σ_{k=1}^{N} Y(k)^2    (20)

where N is the number of samples. For each subject, each algorithm was applied to every trial of each type of OA at each EEG channel using each of the three reference inputs (see Section II-B). The ensemble averages of the original and OA-removed trials were used for E and Y, respectively, in (20) to calculate R in each case. While the actual value of R does not hold meaning, its relative value can be used to compare results using different algorithms and reference signals at different electrodes, as shown later.
III. RESULTS

Statistical tests were performed for each combination of: 1) 52 electrodes; 2) 2 algorithms (RLS, H∞); and 3) 9 OA types, to compare R using the ET + fEEG reference versus the EOG reference in each case. That is, for each set of variables (electrode location, OA removal algorithm, and OA type), the significance of the difference in R between the reference signals used by the given OA removal algorithm was determined using a pairwise t-test on the subject data. The difference between using ET + fEEG versus fEEG was similarly tested. Out of the resulting set of pairwise t-tests, the combinations that showed a significant difference between ET + fEEG and EOG and/or fEEG are shown in Table III (all other cases were found not to have significant differences (p < 0.05) between using ET + fEEG and either EOG or fEEG). In each row, the last two columns show the significance of the comparison between R values for ET + fEEG versus (a) EOG; and (b) fEEG.
Fig. 5 shows an example of the above graphically for each algorithm. Bar graphs showing mean R at each electrode are shown, with a box indicating that the difference is significant. Overall, more OA is removed at frontal electrodes, and H∞ removes more OA than RLS, both as expected. In addition, Fig. 6 shows sample plots of EEG before and after OA removal using both algorithms with each of the EOG and ET + fEEG references. In each case, both sample single trials and the average of several trials are shown. For RLS at Fp1 during SR saccades, using ET + fEEG produces significantly (p < 0.05) better results (R = 27.9 in the first line of the top of Table III) than EOG (R = 1.6). This is confirmed in Fig. 6(a), which shows that both in single trials and the ensemble average, ET + fEEG removes more signal during OA. Similarly, for H∞ at Fp2 during SU saccades, using ET + fEEG is significantly (p < 0.05) better (R = 648 in the first line of the bottom of Table III) than EOG (R = 46.8), as shown in Fig. 6(b).

TABLE III
MEAN R VALUES FOR EACH ELECTRODE, ALGORITHM, OA TYPE, AND REFERENCE INPUT COMBINATION WHERE THE R VALUE FOR ET + FEEG WAS SIGNIFICANTLY DIFFERENT FROM EOG AND/OR FEEG
IV. DISCUSSION

For H∞, at CP3, P7, and O1, for certain OA types (LL, LR, SU), using the eye tracker was shown to remove significantly (p < 0.05) more OA than using frontal EEG as a reference (indicated on the right-hand side of the bottom of Table III by "S: Higher R"). For all other cases, using the eye tracker resulted, on average, in as much OA being removed as using frontal EEG.

Although at a few posterior locations, for SR saccades, using EOG removed more OA than using the eye tracker, for many applications removing more OA from those electrodes may not be of practical importance. Specifically, there is very little power during OAs at those electrodes (see Fig. 5 and six rows of Table III) compared to frontal electrodes, where using the eye tracker resulted in significantly (p < 0.05) more OA removal than using EOG, and where the power during the OA is much higher (e.g., R = 648 for ET + fEEG and 46.8 for EOG at Fp2: the eye tracker removes 13.8 times as much OA) than at the posterior electrodes (e.g., R = 1.8 for ET + fEEG and 5 for EOG at P2: EOG removes 2.8 times as much OA).

For RLS, at some electrodes, for certain OA types (see top of Table III), the eye tracker resulted in significantly more OA
Fig. 5. Logarithm of mean R values for (a) RLS and (b) H∞ algorithms during small upward saccades. Electrodes where the difference between means is significant are highlighted with boxes. Amplitudes are scaled the same in both (a) and (b) to show the relative performance of the algorithms at each electrode.

Fig. 6. OA removal using (a) RLS at Fp1 during a small right saccade and (b) H∞ at Fp2 during a small upward saccade. In each case, three sample trials and the average of several trials are shown. The original EEG and the corresponding artifact-removed signal using each of the EOG and eye tracker combined with frontal EEG (ET + fEEG) references are plotted.
removal than EOG. In all cases considered, using the eye tracker resulted, on average, in as much OA being removed as using frontal EEG.

Thus, for both algorithms, using the eye tracker was found overall to be as good as or better than using EOG or frontal EEG at removing OA over the range of OA types and electrodes investigated. Compared to EOG, the eye tracker removed significantly (p < 0.05) more OA (almost 14× as much) at Fp2 during SU saccades, and less OA at P2, PO3, PO4, PO8, Pz, and POz during SR saccades (EOG removed <3× as much).
Looking at RLS only, it may be argued that since the eye tracker does not significantly improve performance over frontal EEG, there is no need for an eye tracker. However, H∞ has been previously shown [26] to perform better than RLS for OA removal. This finding was consistent with the results in this study: H∞ significantly (p < 0.05) benefits from using the eye tracker instead of frontal EEG. Furthermore, the only electrodes where the EOG reference resulted in more OA removal than the eye tracker (and then only during SR saccades) are at posterior locations (see Table III) with very low power during OA. Hence, the superiority of using EOG at those electrodes for one OA type may not be of practical importance for many applications, while the superiority of using the eye tracker at Fp2 may be of greater practical benefit.
Finally, the BSG algorithm was evaluated as follows. Its output was found [see Fig. 4(b)] to match the shape of V_EOG, a widely accepted measure of a blink's time course. Also, replacing V_EOG with the BSG output showed that using the ET + fEEG reference (which relies on the BSG output for blink information) resulted, on average, in as much OA removal as using EOG during blinks of different amplitudes for both RLS and H∞. Therefore, it can be concluded that the BSG algorithm generated as effective a reference as EOG for both algorithms during blinks.
V. CONCLUSION

A novel approach was presented for using an eye tracker to replace EOG for any OA removal method that requires an EOG reference. The performance on real EEG of the RLS and H∞ algorithms using: 1) EOG; 2) frontal EEG; and 3) an eye tracker with frontal EEG as reference inputs was compared for eye movements and blinks of different amplitudes at 52 electrodes across the entire scalp. For both algorithms, the eye tracker resulted in as much or more OA being removed than EOG or frontal EEG. For the extensive analysis provided in this study, the data collection procedure was necessarily long, but for clinical use, short sessions involving far fewer electrodes can be used. Also, additional subjects could be used to investigate the effects of eye conditions such as cataracts and glaucoma on the eye tracker. Future work could also apply the approach to improving the SNR of evoked potentials.

In addition, this was the first use of a high-speed eye tracker for online OA removal that was able to remove both eye movement and blink effects. This was possible because of a new algorithm (BSG) for extracting a blink's time course from eye tracker images. The results confirmed that the BSG algorithm can be used as part of an effective, calibration-free, eye tracker-based OA removal method.

Furthermore, eliminating EOG electrodes attached to the face is critical for practical daily applications, while the eye tracker can be used in applications where EEG and eye tracking are combined for other purposes. The eye tracker and BSG algorithm allow tracking of point-of-gaze and blink dynamics simultaneously with EEG processing, which is often desired or required in clinical studies and in a variety of human–computer interface applications such as neuromarketing, brain–computer interfaces, and pilot and driver drowsiness monitoring. The last two applications could particularly benefit from the BSG algorithm, as it can be used to measure blink frequency and speed as an indicator of alertness.
REFERENCES

[1] G. Gomez-Herrero, W. D. Clercq, H. Anwar, O. Kara, K. Egiazarian, S. V. Huffel, and W. V. Paesschen, "Automatic removal of ocular artifacts in the EEG without an EOG reference channel," in Proc. 7th Nordic Signal Process. Symp. (NORSIG), 2006, pp. 130–133.
[2] J. J. M. Kierkels, J. Riani, J. W. M. Bergmans, and G. J. M. van Boxtel, "Using an eye tracker for accurate eye movement artifact correction," IEEE Trans. Biomed. Eng., vol. 54, no. 7, pp. 1256–1267, Jul. 2007.
[3] K. H. Ting, P. C. W. Fung, C. Q. Chang, and F. H. Y. Chan, "Automatic correction of artifact from single-trial event-related potentials by blind source separation using second order statistics only," Med. Eng. Phys., vol. 28, no. 8, pp. 780–794, 2006.
[4] G. L. Wallstrom, R. E. Kass, A. Miller, J. F. Cohn, and N. A. Fox, "Automatic correction of ocular artifacts in the EEG: A comparison of regression-based and component-based methods," Int. J. Psychophysiol., vol. 53, no. 2, pp. 105–119, 2004.
[5] Y. Li, Z. Ma, W. Lu, and Y. Li, "Automatic removal of the eye blink artifact from EEG using an ICA-based template matching approach," Physiol. Meas., vol. 27, no. 4, pp. 425–436, 2006.
[6] C. A. Joyce, I. F. Gorodnitsky, and M. Kutas, "Automatic removal of eye movement and blink artifacts from EEG data using blind component separation," Psychophysiology, vol. 41, no. 2, pp. 313–325, 2004.
[7] P. He, G. Wilson, and C. Russell, "Removal of ocular artifacts from electro-encephalogram by adaptive filtering," Med. Biol. Eng. Comput., vol. 42, no. 3, pp. 407–412, 2004.
[8] T. Liu and D. Yao, "Removal of the ocular artifacts from EEG data using a cascaded spatio-temporal processing," Comput. Methods Programs Biomed., vol. 83, no. 2, pp. 95–103, 2006.
[9] S. Puthusserypady and T. Ratnarajah, "H∞ adaptive filters for eye blink artifact minimization from electroencephalogram," IEEE Signal Process. Lett., vol. 12, no. 12, pp. 816–819, Dec. 2005.
[10] A. Schlogl, C. Keinrath, D. Zimmermann, R. Scherer, R. Leeb, and G. Pfurtscheller, "A fully automated correction method of EOG artifacts in EEG recordings," Clin. Neurophysiol., vol. 118, no. 1, pp. 98–104, 2007.
[11] D. Talsma, "Auto-adaptive averaging: Detecting artifacts in event-related potential data using a fully automated procedure," Psychophysiology, vol. 45, no. 2, pp. 216–228, 2008.
[12] C. Hennessey, B. Noureddin, and P. Lawrence, "Fixation precision in high-speed noncontact eye-gaze tracking," IEEE Trans. Syst., Man, Cybern. B: Cybern., vol. 38, no. 2, pp. 289–298, Apr. 2008.
[13] C. T. Lovelace, R. Derakhshani, S. P. K. Tankasala, and D. L. Filion, "Classification of startle eyeblink metrics using neural networks," in Proc. Int. Joint Conf. Neural Netw. (IJCNN), 2009, pp. 1908–1914.
[14] S. Sirohey, A. Rosenfeld, and Z. Duric, "A method of detecting and tracking irises and eyelids in video," Pattern Recognit., vol. 35, no. 6, pp. 1389–1401, 2002.
[15] Y.-L. Tian, T. Kanade, and J. F. Cohn, "Dual-state parametric eye tracking," in Proc. 4th IEEE Int. Conf. Automat. Face Gesture Recognit., 2000, pp. 110–115.
[16] M. Pardas, "Extraction and tracking of the eyelids," in Proc. IEEE Int. Conf. Acoust., Speech, Signal Process. (ICASSP 2000), vol. 4, pp. 2357–2360.
[17] H. Tan and Y.-J. Zhang, "Detecting eye blink states by tracking iris and eyelids," Pattern Recognit. Lett., vol. 27, no. 6, pp. 667–675, 2006.
[18] B. Noureddin, P. D. Lawrence, and G. E. Birch, "Time-frequency analysis of eye blinks and saccades in EOG for EEG artifact removal," in Proc. 3rd Int. IEEE/EMBS Conf. Neural Eng. (CNE 2007), pp. 564–567.
[19] S. Puthusserypady and T. Ratnarajah, "Robust adaptive techniques for minimization of EOG artefacts from EEG signals," Signal Process., vol. 86, no. 9, pp. 2351–2363, 2006.
[20] A. Delorme and S. Makeig, "EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis," J. Neurosci. Methods, vol. 134, no. 1, pp. 9–21, 2004.
[21] B. Noureddin, P. D. Lawrence, and G. E. Birch, "Effects of task and EEG-based reference signal on performance of on-line ocular artifact removal from real EEG," in Proc. 4th Int. IEEE/EMBS Conf. Neural Eng. (NER 2009), pp. 614–617.
[22] J. Malmivuo and R. Plonsey, Bioelectromagnetism: Principles and Applications of Bioelectric and Biomagnetic Fields. London, U.K.: Oxford Univ. Press, 1995.
[23] E. Niedermeyer and F. L. da Silva, Electroencephalography: Basic Principles, Clinical Applications, and Related Fields, 4th ed. Baltimore, MD: Lippincott Williams and Wilkins, 1998.
[24] J. J. M. Kierkels, G. J. M. van Boxtel, and L. L. M. Vogten, "A model-based objective evaluation of eye movement correction in EEG recordings," IEEE Trans. Biomed. Eng., vol. 53, no. 2, pp. 246–253, Feb. 2006.
[25] N. Ille, P. Berg, and M. Scherg, "Artifact correction of the ongoing EEG using spatial filters based on artifact and brain signal topographies," J. Clin. Neurophysiol., vol. 19, no. 2, pp. 113–124, 2002.
[26] B. Noureddin, P. D. Lawrence, and G. E. Birch, "Quantitative evaluation of ocular artifact removal methods based on real and estimated EOG signals," in Proc. IEEE 30th Annu. Int. Conf. Eng. Med. Biol. Soc. (EMBS 2008), pp. 5041–5044.
Borna Noureddin received the B.Eng. degree in computer engineering from the University of Victoria, Victoria, BC, Canada, and the M.A.Sc. and Ph.D. degrees in electrical engineering from the University of British Columbia, Vancouver, BC.

His research interests include computer vision, biomedical signal and image processing, biophysical modeling, and human–computer interaction.
Peter D. Lawrence (S'64–M'73–SM'06) received the B.A.Sc. degree in electrical engineering from the University of Toronto, Toronto, ON, Canada, in 1965, the M.S. degree in biomedical engineering from the University of Saskatchewan, Saskatoon, SK, Canada, in 1967, and the Ph.D. degree in computing and information science from Case Western Reserve University, Cleveland, OH, in 1970.

Between 1970 and 1972, he was a Guest Researcher with the Applied Electronics Department, Chalmers University, Goteborg, Sweden. Between 1972 and 1974, he was a Research Staff Member and a Lecturer with the Mechanical Engineering Department, Massachusetts Institute of Technology, Cambridge. Since 1974, he has been with the University of British Columbia, Vancouver, BC, Canada, where he is currently a Professor in the Department of Electrical and Computer Engineering. His main research interests include the application of real-time computing in the control interface between humans and machines, image processing, and mobile machine modeling and control.
Gary E. Birch (SM'02) received the B.A.Sc. degree in electrical engineering in 1983 and the Doctorate degree in electrical engineering (biomedical signal processing) in 1988, both from the University of British Columbia, Vancouver, BC, Canada.

He was appointed Director of Research and Development at the Neil Squire Society in 1988 and then, in 1994, Executive Director. He is also an Adjunct Professor with the Department of Electrical and Computer Engineering, University of British Columbia. His specific areas of expertise are assistive technologies, EEG signal processing, direct brain–computer interfaces, digital signal processing, human–machine interface systems, biological systems, robotic control systems, environmental control systems, and service delivery programs for persons with disabilities.
