
IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 60, NO. 5, MAY 2012

Adaptive Emotional Information Retrieval From EEG Signals in the Time-Frequency Domain
Panagiotis C. Petrantonakis, Student Member, IEEE, and Leontios J. Hadjileontiadis, Senior Member, IEEE
Abstract–This paper aims at developing adaptive methods for electroencephalogram (EEG) signal segmentation in the time-frequency domain, in order to effectively retrieve the emotion-related information within the EEG recordings. Using the multidimensional directed information analysis, supported by the frontal brain asymmetry in the case of emotional reaction, a criterion, namely the asymmetry index (AsI), is used to realize the proposed segmentation processes, which take into account both the time and frequency (in the empirical mode decomposition domain) characteristics of the emotionally related EEG components, forming AsI-based emotionally filtered EEG signals. The efficiency of the filters was justified through an extensive classification process, using higher-order crossings and cross-correlation as feature-vector extraction techniques and a support vector machine classifier for six different classification scenarios in the valence/arousal space. This resulted in mean classification rates from 64.17% up to 82.91% on a user-independent basis, revealing the potential of establishing such filtering for reliable EEG-based emotion recognition systems.

Index Terms–Electroencephalogram (EEG), emotion recognition (ER), empirical mode decomposition, frontal brain asymmetry, multidimensional directed information.

I. INTRODUCTION

ELECTROENCEPHALOGRAM (EEG)-based emotion recognition (EEG-ER) systems are gaining considerable attention, since they provide a convenient and nonintrusive way of capturing signals related to emotional expression in the brain, with efficient time resolution. Moreover, many studies [1]–[7] have revealed the potential of such systems to differentiate discrete emotions and affective states, paving the way for new approaches in the so-called Affective Computing area [8]. The latter deals with the design of systems and devices that can detect, recognize, and process human emotion. A typical realization of an EEG-ER system consists of three major steps: a) the emotion elicitation step, i.e., specifically designed experiments where emotions are artificially evoked in subjects by pictures [4], videos [2], and/or sounds [3] with predefined emotional reference; b) the captured-data preprocessing step, where the recorded signals are subjected to frequency band selection, noise cancelling, and artifact removal procedures; and c) the classification step, where feature extraction techniques and robust classification methods are utilized to classify the recorded signals into different groups that

Manuscript received October 19, 2011; revised January 17, 2012; accepted February 02, 2012. Date of publication February 13, 2012; date of current version April 13, 2012. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Z. Jane Wang. The authors are with the Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, GR-54124 Greece (e-mail: ppetrant@auth.gr; leontios@auth.gr). Digital Object Identifier 10.1109/TSP.2012.2187647

refer to different affective states evoked during the elicitation step. Among the three aforementioned steps, the emotion elicitation step is the most crucial one for an efficient EEG-ER system. For instance, if the media used as stimulation do not effectively evoke the appropriate emotional reaction and arousal in the subjects (different subjects may be emotionally affected by different stimulation due to personality and/or personal experiences), the respective EEG recordings will not incorporate the corresponding emotional information, resulting in an incompetent emotion classification process. Based on that, and taking into account the difficulty of evoking emotional states during an artificially designed affective situation in a lab, it is of major importance to discard the useless, nonemotion-related information from the captured EEG dataset. The first attempt in this direction was made in [9], where a novel index, namely the asymmetry index (AsI), utilizing multidimensional directed information (MDI) [10], was introduced in the trial-based domain. There, it was shown that EEG signals referring to distinct emotion elicitation trials with relatively big (above a specified threshold) AsI values were more effectively discriminated during the classification step, compared to the opposite case. Besides the trial-based dependence of the emotion elicitation efficiency in a relative experiment, the exposure of a subject to an emotional stimulation of specific duration, e.g., a picture projection, would evoke a relative, time-varying emotional reaction. As a result, the extraction of features for classification from the signals that refer to the whole duration of the picture projection would lead to problematic extraction of the emotion-related information during the feature extraction process and, thus, to a poor classification performance. Furthermore, the frequency-based characteristics of the emotion-related EEG activity bring out another means of emotion-related information retrieval that would lead to an even better isolation of the valuable information from EEGs. In this paper, in order to meet the time-varying appearance of the emotional reaction of each subject and the frequency-based dependency of emotional expression in EEG recordings, EEG signals are subjected to AsI-based algorithms to be emotionally purified, i.e., to exclude as much as possible the nonemotion-related information, in the time-frequency domain. Regarding the time-based perspective, the AsI index was applied in a sliding time-window oriented approach. In this way, each EEG trial was segmented to targeted durations in the time domain, leading to a more effective representation of emotion-related information in the features extracted from the segmented signal, in contrast with the initial one.



Moreover, considering the frequency-based characteristics of the emotional expression in the brain (see Section II-A), this work also aims at expanding the above time-based approach of EEG database segmentation to a frequency- and time-frequency-based one. In order to accomplish this, empirical mode decomposition (EMD) analysis [11] is utilized as a means of time-frequency representation of the EEG signal. In the EMD domain, the local characteristic time scale of the EEG signal is used for decomposing it into a number of intrinsic mode functions (IMFs), each representing a generally simple oscillatory mode. Having this, the AsI index is employed either globally (on the whole IMF) or locally (in a sliding window), leading to the aforementioned frequency- and time-frequency-based segmentation of the signal, respectively. Each time, a reconstruction process takes place to form the final trial-referring EEG signal, from which the feature vector for classification is to be extracted. All the above described approaches aim at retrieving the emotional information from EEG signals in an adaptive way, which is demonstrated by both the fundamental algorithmic tools used (e.g., EMD) and the fact that each emotion elicitation trial is treated separately and new, filtered signals are constructed.

For the evaluation of the previously described methodologies for time-frequency-based segmentation, a thorough classification procedure was followed, incorporating two different feature extraction methods, namely higher order crossings (HOC) [4] and cross-correlation (CC) analysis [12], and six different classification scenarios, in accordance with the classification setup implemented in [9], for comparison reasons. The EEG database used (same as in [9]) comprised EEG signals from 16 subjects, captured during a specifically designed experiment to evoke certain affective states. The results provided by the aforementioned classification setup reveal the significance of such emotional information retrieval, which leads to a more reliable and pragmatic EEG-ER system.

The rest of the paper is structured as follows. Section II provides some background material regarding the emotional expression in the brain and the fundamental methodological tools used in the proposed scheme. Section III explicitly describes the proposed approaches, whereas Section IV presents the feature extraction and classification approaches adopted in this paper. Section V describes the experiments conducted for the EEG database construction and some implementation issues, along with the description of the classification setup. Sections VI and VII present the results and provide some discussion on the overall evaluation of the proposed methodologies, respectively. Finally, Section VIII concludes the paper.

II. BACKGROUND

A. Emotions and Frontal Brain Asymmetry

In psychology, emotions are usually analyzed in a 2D space, i.e., the valence/arousal space (VAS), instead of being characterized as distinct emotional states, such as happiness, fear, or sadness [13]. In the VAS, valence stands for one's judgment about a situation as positive or negative (including the whole range between these two extreme cases), and arousal spans from calmness to excitement, expressing the degree of one's excitation.

The most prominent expression of emotions in brain signals was described by Davidson et al. [14], who developed a model that relates the asymmetry between the left and right frontal and prefrontal lobes of the brain (expressed in the alpha frequency band, i.e., 8–12 Hz) with emotions, with the latter being analyzed in the VAS. According to that model, emotions are either organized around approach-withdrawal tendencies or differentially lateralized in the frontal brain region. The left frontal area is involved in the experience of positive emotions, whereas the right frontal region is involved in the experience of negative emotions. Other studies [15]–[17] have also confirmed the aforementioned asymmetry concept and have examined frequency bands other than alpha, including theta (4–7 Hz), beta (13–30 Hz), and gamma (30–100 Hz), where asymmetrical effects were also found. Finally, Bos [3] examined the efficacy of the alpha and beta bands, and of their combination, to discriminate emotions within the VAS, and concluded that both bands include important information for the aforementioned discrimination. Despite the fact that there is still an ongoing discussion on Davidson's model, a vast amount of neuroscience bibliography has contributed to the validity of that model; thus, it is adopted in this study (see Section V-A).

B. Multidimensional Directed Information (MDI)

For the aforementioned asymmetry, a robust mathematical tool is needed to express the information shared between the two brain sides and, finally, to define a measure for its quantification. This shared information is frequently defined through correlations among multiple EEG recording channels (multiple time series) simultaneously observed from a subject. If a relation of temporal ordering is noted in the correlation among these time series (EEG channels), some are interpreted as causes and others as results, suggesting a cause-effect relation among the time series (causality analysis). When causality in such a sense is noted in multiple time series, the relation is defined as directed information [18]. Several methods have been developed to perform causality analysis, such as directed-coherence analysis [19], directed-information analysis [18], MDI analysis [10], Kaminski's directed transfer function (DTF) method [20], partial directed coherence [21], and Granger causality [22]. Here, MDI analysis was employed as a means to identify the causality between any two series (EEG channels) while considering all acquired series. The main advantage of MDI, compared to the other aforementioned methods, is that the amount of information propagation is presented as an absolute value in bits and not as a relative value, i.e., a correlation; a brief description of the MDI [10] follows.

Consider the simple case of three stationary time series $X$, $Y$, and $Z$ of length $N$, divided into epochs of length $L$; each epoch is written as a sequence of two sections of lengths $M$ and $L-M$ before and after time $t$, with $x(t)$, $y(t)$, and $z(t)$ denoting the sampled values of the time series $X$, $Y$, and $Z$ at time $t$, respectively [see (1)–(3) and Fig. 1].
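To make this epoch bookkeeping concrete, the following minimal Python sketch splits three simultaneously recorded channels into epochs and into the before/after sections used by the MDI analysis. The function name and the symbols `L` and `M` are illustrative assumptions rather than the paper's exact notation.

```python
import numpy as np

def make_epochs(x, y, z, L, M):
    """Split three simultaneously recorded channels into epochs of length L
    and return each epoch as a (before, after) pair of lengths M and L-M,
    in the spirit of (1)-(3)."""
    n_epochs = len(x) // L
    epochs = []
    for k in range(n_epochs):
        seg = slice(k * L, (k + 1) * L)
        xe, ye, ze = x[seg], y[seg], z[seg]
        epochs.append({
            "x": (xe[:M], xe[M:]),   # section before / after the split point
            "y": (ye[:M], ye[M:]),
            "z": (ze[:M], ze[M:]),
        })
    return epochs

# toy usage with random data standing in for the three EEG channels
rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 1280))        # e.g., 5 s at 256 Hz
epochs = make_epochs(x, y, z, L=128, M=64)
print(len(epochs), len(epochs[0]["x"][0]), len(epochs[0]["x"][1]))
```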


Fig. 1. An illustration of what the $X$, $Y$, and $Z$ sets represent [see (1)] and how they are created from the EEG time series.

An example of the formulation of (1), for the case of an EEG signal, is shown in Fig. 1. According to the MDI analysis [10], the information that is first generated in $X$ at time $t$ and propagated with a time delay to $Y$, taking into consideration the information that is propagated to both of them from $Z$, can be calculated from (4) [10], where the corresponding matrix term denotes the covariance matrix of the stochastic variables involved. Using (4), the total amount of information that is first generated in $X$ and propagated to $Y$, taking into account the existence of $Z$, across the considered time-delay range, is given by (5).

It must be stressed that if the time series $X$ and $Y$ contain a common component from $Z$, i.e., there is information flow from $Z$ to both $X$ and $Y$ but not between $X$ and $Y$, then in conventional directed information analysis, i.e., when $Z$ is excluded from (4), an information flow would wrongly be identified, as if there existed a flow between $X$ and $Y$. To circumvent this ambiguity, the MDI method is employed. In the subsequent paragraphs, (5) is used to consolidate the AsI measure by estimating the mutual information shared between the left and right brain hemispheres, exploiting, in that way, the frontal brain asymmetry concept.

C. The Asymmetry Index (AsI)

As described in Section II-A, the experience of negative emotions is related to an increased right frontal and prefrontal hemisphere activity, whereas positive emotions evoke an enhanced left-hemisphere activity. The AsI measure is based on that asymmetry concept, and its definition is briefly described below. Assume that EEG channel 1, recorded from the left hemisphere, EEG channel 2, recorded from the right hemisphere, and EEG channel 3, recorded from both hemispheres as a dipole channel, represent the signals $X$, $Y$, and $Z$ of the MDI analysis, respectively. In order to evaluate the asymmetry-related information between signals $X$ and $Y$, taking into account the information propagated to both of them by the signal $Z$, the MDI approach was applied by estimating, with (5), the total amount of information shared between the left and right hemispheres (signals $X$ and $Y$, respectively). In accordance with the frontal brain asymmetry concept, this quantity would become maximum when the subject is calm (information symmetry), whereas it would become minimum when the subject is emotionally aroused (information asymmetry), corresponding to the relax-related and emotion-related values, respectively. Consequently, the latter can be formed as in (6) and (7), where the involved information quantities are estimated by (5). Using (6) and (7), the AsI is defined as the distance of the point, corresponding to a specific emotion elicitation cause (e.g., a picture), from the line of information symmetry, i.e., as in (8).

In this paper, the AsI concept is further expanded to be applicable in the time-frequency domain, with the frequency component expressed through the EMD approach [11], in order to accomplish an adaptive EEG time-, frequency-, and time/frequency-based segmentation.
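The exact expressions of (4)–(8) are not reproduced above, but the overall logic can be sketched: estimate an information quantity between the left and right channels given the dipole channel, once for the relax phase and once for the emotion phase, and turn the two values into a single asymmetry score. The sketch below uses a Gaussian conditional mutual information estimate as a stand-in for the MDI of [10], and one plausible reading of (6)–(8) (distance of the normalized point from the symmetry line); it is an illustration under these assumptions, not a verified reproduction of the AsI definition in [9].

```python
import numpy as np

def gaussian_cond_mi(x, y, z):
    """Rough Gaussian estimate of the information shared between x and y
    given z, via log-determinants of covariance matrices (a stand-in for the
    MDI quantity of (5), not its exact formulation)."""
    def logdet(*cols):
        c = np.atleast_2d(np.cov(np.vstack(cols)))
        return np.linalg.slogdet(c)[1]
    # I(X;Y|Z) = 0.5 * [logdet(X,Z) + logdet(Y,Z) - logdet(Z) - logdet(X,Y,Z)]
    return 0.5 * (logdet(x, z) + logdet(y, z) - logdet(z) - logdet(x, y, z))

def asymmetry_index(i_relax, i_emotion):
    """AsI-like value: normalize the two information values and return the
    distance of the point (i_emotion, i_relax) from the diagonal, i.e., the
    'information symmetry' line.  One plausible reading of (6)-(8)."""
    m = max(i_relax, i_emotion, 1e-12)
    r, e = i_relax / m, i_emotion / m
    return abs(r - e) / np.sqrt(2.0)

rng = np.random.default_rng(1)
relax = rng.standard_normal((3, 1280))   # left, right, dipole during countdown
emo = rng.standard_normal((3, 1280))     # same channels during picture projection
asi = asymmetry_index(gaussian_cond_mi(*relax), gaussian_cond_mi(*emo))
print(round(asi, 3))
```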


D. EMD

The EMD method considers the signal at its local oscillation scale, subtracts the faster oscillation, and iterates on the residual. The produced IMFs satisfy two important conditions, i.e.: i) the number of extrema and the number of zero crossings differ by at most one, and ii) the local mean is zero. In particular, the EMD procedure for a given signal $x(t)$ is realized through the sifting process, summarized as follows [11]:

1. Identification of the successive extrema of $x(t)$.
2. Extraction of the upper and lower envelopes by interpolation.
3. Computation of the average $m(t)$ of the upper and lower envelopes.
4. Calculation of the first component $h(t) = x(t) - m(t)$.
5. Treatment of $h(t)$ as a new set of data, and repetition of steps 1–4 up to $k$ times, until the resulting component becomes a true IMF; then set $c_1(t)$ equal to it. Overall, $c_1(t)$ should contain the finest scale or the shortest-period component of the signal.
6. Obtainment of the residue $r_1(t) = x(t) - c_1(t)$.
7. Treatment of $r_1(t)$ as a new set of data and repetition of steps 1–6 up to $n$ times, until the residue becomes a constant, a monotonic function, or a function with only one cycle from which no more IMFs can be extracted.
8. Finally, $x(t) = \sum_{i=1}^{n} c_i(t) + r_n(t)$, where $c_i(t)$ is the $i$th IMF and $r_n(t)$ the final residue.

The above procedure results in a decomposition of the data into $n$ IMFs and a residue $r_n(t)$.
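A compact Python sketch of the sifting process described above is given below. The spline-based envelope interpolation and the simple stopping rules (a fixed number of sifting iterations and a near-monotonic residue test) are common simplifications and are not necessarily the exact criteria used in [11].

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_one_imf(x, max_sift=10):
    """One round of sifting (steps 1-5): repeatedly subtract the mean of the
    upper/lower envelopes until the residual is accepted as an IMF."""
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(max_sift):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 4 or len(minima) < 4:
            break
        upper = CubicSpline(maxima, h[maxima])(t)   # upper envelope
        lower = CubicSpline(minima, h[minima])(t)   # lower envelope
        h = h - (upper + lower) / 2.0               # subtract the local mean
    return h

def emd(x, max_imfs=5):
    """Steps 6-8: peel off IMFs until the residue is (nearly) monotonic."""
    imfs, residue = [], x.astype(float).copy()
    for _ in range(max_imfs):
        imf = sift_one_imf(residue)
        imfs.append(imf)
        residue = residue - imf
        if len(argrelextrema(residue, np.greater)[0]) < 2:
            break
    return imfs, residue

rng = np.random.default_rng(2)
sig = np.sin(2 * np.pi * 10 * np.arange(1280) / 256) + 0.3 * rng.standard_normal(1280)
imfs, res = emd(sig)
print(len(imfs), np.allclose(sig, np.sum(imfs, axis=0) + res))  # exact by construction
```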

Fig. 2. The schematic representation of the wAsI approach.

III. THE PROPOSED APPROACH

The proposed emotion information retrieval scheme is realized through three segmentation types. In particular, the first one, namely wAsI, incorporates the application of the AsI estimation within a window of certain length sliding across time over the signal. The second one, namely EMD-AsI, involves the application of the AsI in the EMD domain, as a means for selecting the most appropriate IMFs that carry the emotional information. The last one, i.e., the EMD-wAsI approach, focuses on each IMF in a local manner, extracting the segments that contribute the most to the expression of the emotional information; it is actually the implementation of the wAsI algorithm on the IMFs of a signal. The aforementioned segmentation types are elaborately described below.

A. Time-Based Segmentation (wAsI)

Fig. 2 provides a schematic representation of the windowed approach. The case study depicted refers to a single emotion elicitation trial. The signals from the three EEG channels, i.e., $X$, $Y$, and $Z$, simultaneously recorded during the emotion elicitation trial, are used for the implementation of the algorithm. First, a window of a certain length slides in a parallel way through the three EEG signals, both for the signals that refer to the phase where the subject is relaxed and for those where the subject is emotionally aroused. The window moves across the signals with a 1-sample step, resulting in a corresponding total number of windows. For each triplet of the parts of $X$, $Y$, and $Z$ that lie within the window borders, the MDI analysis is applied for the relax and the emotionally triggered (e.g., picture projection) phase. Subsequently, the respective relax- and emotion-related values are extracted and, according to the estimation formula of the AsI, the windowed wAsI version is defined as in (9).

For each window, a wAsI value is extracted and is assigned to the middle of the window. In order to extract the segment of the signal that is most likely to express emotional EEG activity, the EEG segments that correspond to wAsI values exhibiting a concentrated multitude of peaks higher than 0.5 (the wAsI values are normalized in the range [0, 1]) were selected, by multiplying the initial signal with a unit-amplitude pulse (see Fig. 2, gray pulse), namely the wAsI-Mask. The selected segments constitute a new signal (in Fig. 2 the segmented signal corresponds to the initial signal of the projection phase), which is supposed to correspond to an emotionally filtered signal. It must be stressed that all signals, $X$, $Y$, and $Z$, from both the relax and the projection phase of a single emotion elicitation trial, are segmented with the use of the wAsI-Mask (in Fig. 2 only the segmentation of one signal of the projection phase is depicted as an example). Afterwards, the feature extraction and the classification stages take place, and the segmented signals are further investigated regarding their ability to better discriminate emotions in relation to the initial signals.
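The following sketch outlines the wAsI masking idea: slide a window jointly over the relax- and projection-phase channels, compute a wAsI-like value per window position, normalize to [0, 1], and keep the regions around positions exceeding 0.5. The `index_fn` argument is a placeholder for the MDI/AsI estimation of (9), and the mask construction here uses a simple dilation around the retained centers instead of the paper's exact rule for concentrated multitudes of peaks.

```python
import numpy as np

def sliding_wasi(x, y, z, x_r, y_r, z_r, win, index_fn):
    """Slide a window of length `win` (1-sample step) jointly over the
    projection-phase channels (x, y, z) and the relax-phase channels
    (x_r, y_r, z_r), evaluating an AsI-like value for each position."""
    n = len(x) - win + 1
    vals = np.zeros(n)
    for i in range(n):
        s = slice(i, i + win)
        vals[i] = index_fn((x[s], y[s], z[s]), (x_r[s], y_r[s], z_r[s]))
    return (vals - vals.min()) / (np.ptp(vals) + 1e-12)   # normalize to [0, 1]

def wasi_mask(vals, win, n_samples, thr=0.5):
    """Unit-amplitude pulse (wAsI-Mask) around window centers whose
    normalized wAsI value exceeds the threshold."""
    mask = np.zeros(n_samples)
    centers = np.where(vals > thr)[0] + win // 2
    mask[centers] = 1.0
    for i in np.where(mask > 0)[0]:                 # simple dilation
        mask[max(0, i - win // 2): i + win // 2] = 1.0
    return mask

# toy usage with a dummy index function standing in for (9)
rng = np.random.default_rng(8)
proj, relax = rng.standard_normal((2, 3, 1280))
dummy = lambda p, r: float(np.var(p[0]) - np.var(r[0]))
v = sliding_wasi(*proj, *relax, win=128, index_fn=dummy)
mask = wasi_mask(v, win=128, n_samples=1280)
segmented = proj[0] * mask   # keep only the emotionally 'active' parts
```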


B. Frequency-Based Segmentation (EMD-AsI)

A schematic diagram of the proposed EMD-AsI segmentation approach is depicted in Fig. 3. According to Fig. 3, the three EEG signals (channels) $X$, $Y$, and $Z$ are initially subjected to the EMD process and the IMFs are extracted for each one of them. Afterwards, each triplet of same-order IMFs of $X$, $Y$, and $Z$ passes through the MDI analysis, and the respective relax- and emotion-related values are estimated. Subsequently, a thresholding procedure takes place and removes the IMFs that correspond to normalized AsI values (in the range [0, 1]) lower than a threshold of 0.5. Finally, by summing up the IMFs that remain after the thresholding process, three new $X$, $Y$, and $Z$ signals are reconstructed and used as inputs to the subsequent feature extraction methods (see Section IV).
Fig. 3. The schematic representation of the EMD-AsI approach.
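A minimal sketch of the EMD-AsI selection step is shown below; `asi_fn` is a placeholder for the MDI-based AsI estimation applied to each triplet of same-order IMFs, and the normalization and 0.5 threshold follow the description above.

```python
import numpy as np

def emd_asi_select(imfs_x, imfs_y, imfs_z, asi_fn, thr=0.5):
    """For each triplet of same-order IMFs from the three channels, estimate
    an AsI value, normalize the values to [0, 1], keep only the orders above
    the threshold, and sum the kept IMFs per channel to reconstruct the
    emotionally filtered signals."""
    vals = np.array([asi_fn(cx, cy, cz)
                     for cx, cy, cz in zip(imfs_x, imfs_y, imfs_z)])
    vals = (vals - vals.min()) / (np.ptp(vals) + 1e-12)
    keep = vals >= thr
    if not keep.any():                      # guard: keep at least the best IMF
        keep[int(np.argmax(vals))] = True
    rec = lambda imfs: np.sum([c for c, k in zip(imfs, keep) if k], axis=0)
    return rec(imfs_x), rec(imfs_y), rec(imfs_z), keep
```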

C. Time- and Frequency-Based Segmentation (EMD-wAsI)

The schematic representation of the EMD-wAsI segmentation approach is depicted in Fig. 4. As in the EMD-AsI approach, the IMFs of the three EEG signals $X$, $Y$, and $Z$ are obtained, and each triplet of same-order IMFs is subjected to the wAsI algorithm, which is schematically presented in Fig. 2. After the wAsI algorithm is applied and all segmented IMFs are obtained, the EMD-wAsI approach is concluded through a reconstruction phase that takes place per channel, and the reconstructed $X$, $Y$, and $Z$ signals, either for the projection or the relax phase, are then subjected to the feature extraction and classification processes, described in the subsequent section.

Fig. 4. The block-diagram of the realization of the EMD-wAsI approach.

IV. FEATURE EXTRACTION AND CLASSIFICATION METHODS

A. Feature Vector Construction

Similarly to [9], the feature vector set was constructed based on two methods, i.e., HOC [4] and CC [12]. The HOC technique has proved beneficial for feature vector extraction in the case of emotion recognition, as shown in [4]. A brief description of the calculation of HOC sequences is presented in the Appendix. The HOC-based feature vector was formed as in (10), i.e., from the numbers of zero crossings of the respective orders, up to a selected HOC order that is smaller than or equal to the maximum estimated order (see the Appendix). Due to the three-channel EEG recording setup (see Section V-A), the final combined feature vector was structured as in (11), i.e., as the concatenation of the HOC-based feature vectors of the three channels participating in the combination.
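Assembling the combined feature vector of (11) then amounts to concatenating the first J HOC values of each channel, as in the sketch below; the helper name, the symbol `J`, and the toy HOC sequences are illustrative assumptions (the HOC computation itself is summarized in the Appendix).

```python
import numpy as np

def combined_hoc_fv(hoc_ch1, hoc_ch2, hoc_ch3, J):
    """Form the combined feature vector of (11) by concatenating the first J
    simple-HOC values of each of the three channels; the per-channel HOC
    sequences are assumed to be computed as in the Appendix."""
    return np.concatenate([np.asarray(h)[:J] for h in (hoc_ch1, hoc_ch2, hoc_ch3)])

# toy usage with made-up HOC sequences of maximum order 10 and J = 8
rng = np.random.default_rng(3)
hocs = np.sort(rng.integers(1, 200, size=(3, 10)), axis=1)  # HOC are nondecreasing
fv_c = combined_hoc_fv(*hocs, J=8)
print(fv_c.shape)   # (24,)
```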

The CC method was chosen here as a baseline approach. The CC method introduced in [12] estimates the CC coefficient between the potentials of two EEG electrodes for a given frequency band, as in (12), where the Fourier transform of the EEG at each electrode site and frequency bin is used, and the summation is taken over the frequency bins in that band. In this paper, the CC coefficient was estimated for both the alpha and beta bands, resulting in a six-feature vector (three features for each band, as a three-channel EEG recording set is used).
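A sketch of a CC-style band feature is given below; it follows the general idea of (12) (summing products of Fourier coefficients over the band bins and normalizing by the band energies), but the exact normalization used in [12] may differ.

```python
import numpy as np

def band_cross_correlation(e_i, e_j, fs, band):
    """Cross-correlation-style coefficient between two electrode signals over
    a frequency band, computed from their Fourier coefficients."""
    Xi, Xj = np.fft.rfft(e_i), np.fft.rfft(e_j)
    freqs = np.fft.rfftfreq(len(e_i), d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    num = np.abs(np.sum(Xi[sel] * np.conj(Xj[sel])))
    den = np.sqrt(np.sum(np.abs(Xi[sel]) ** 2) * np.sum(np.abs(Xj[sel]) ** 2))
    return num / (den + 1e-12)

def cc_feature_vector(ch1, ch2, ch3, fs=256):
    """Six CC features: one per channel pair and per band (alpha, beta)."""
    pairs = [(ch1, ch2), (ch1, ch3), (ch2, ch3)]
    bands = [(8, 12), (13, 30)]
    return np.array([band_cross_correlation(a, b, fs, bd)
                     for bd in bands for (a, b) in pairs])
```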

B. Classification Techniques

As there is no single best, one-size-fits-all classification algorithm, the choice of the most efficient classifier is strongly dependent on the examined problem and the relevant dataset to be classified [23]. After having tested some classifiers, such as


quadratic discriminant analysis (QDA) [24], Mahalanobis distance (MD) [25], $k$-nearest neighbor ($k$-NN) [26], and the support vector machine (SVM) [27], the latter was chosen, as it outperformed the others with higher recognition accuracy. In the SVM classifier, a polynomial function was used as the kernel function to project the data into a higher-dimensional feature space, as in (13), where the kernel is evaluated between a support vector and the query feature vector. The SVM kernel (13) was chosen after a thorough experimentation with different kernels and various parameters, as it provided the most robust results. Among the several available approaches to realize a multiclass SVM classification process, the one-versus-all method was adopted here.
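The classification stage can be sketched with a standard one-versus-all SVM with a polynomial kernel, as below; the degree and regularization values are placeholders, since the exact parameters of (13) are not restated here, and the random features merely stand in for the HOC/CC vectors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# three-class toy problem standing in for one scenario (class1/class2/Relax);
# the feature matrix is random, so the reported accuracy is only illustrative
rng = np.random.default_rng(4)
X = rng.standard_normal((90, 24))          # 90 feature vectors (e.g., combined HOC)
y = np.repeat([0, 1, 2], 30)

clf = OneVsRestClassifier(
    make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, C=1.0)))
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf.fit(Xtr, ytr)
print("toy accuracy:", clf.score(Xte, yte))
```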
Fig. 5. The Fp1, Fp2, F3, and F4 EEG sites according to the 10/20 system.

V. EXPERIMENTS

A. Datasets

In this paper, the projection of pictures with predefined emotional content was used as a means of emotional stimulation. The pictures were drawn from the International Affective Picture System (IAPS) [28] database, a widely used dataset that contains pictures accompanied by individual norm values of valence and arousal, in the range from 1 to 9 for each metric. The selected pictures (40 pictures in total) were chosen to have arousal and valence norms higher (H) than 6 or lower (L) than 4 in the above-described range, with a standard deviation lower than 2.2. Thus, 10 pictures were chosen for each one of the cases low arousal/low valence (LALV), low arousal/high valence (LAHV), high arousal/high valence (HAHV), and high arousal/low valence (HALV). It should be stressed that valence is usually associated with the terms negative or positive. Due to the range of the IAPS norms, i.e., 1 to 9, for valence and arousal, and for the sake of simplicity, the terms low and high are used in this paper instead of negative and positive, respectively.

For the construction of the EEG dataset, a specifically designed experiment was conducted using the abovementioned elicitation process. In this experiment, 16 healthy, right-handed volunteers (9 males and 7 females, aged 19–32 years) participated. The experimental protocol included a series of discrete steps. In particular, sequentially for each one of the 40 pictures, the following procedure took place: i) projection of a 5-s black screen; ii) projection of a 5-s period in which countdown frames were demonstrated; iii) a 1-s projection of a cross shape in the middle of the screen to attract the sight of the subject; and iv) projection of the corresponding picture for 5 s. The 5-s countdown phase was employed to accomplish a relaxation phase and an emotion reset [3] before the projection of the new picture, due to its naught emotional content. During the experiment, the selected pictures were projected in sequence: 10 for LALV, 10 for LAHV, 10 for HAHV, and 10 for HALV; thus, more intense emotions were chosen to be projected at the end of the whole process. This was adopted as a means to avoid order effects as, in a random picture projection, a possible case of an intense-mild sequential emotional content would evoke an emotional cover-up effect, that is, an intense emotion would dominate over a milder one.

In our previous work [9], the self-assessment of the IAPS pictures by the subjects who participated in the experiment was also presented. There, it was shown that the categorization of the EEG signals using the self-assessment ratings did not provide better results than the categorization based on the IAPS norms. In a step further, it was observed that, although the self-assessment of the valence dimension exhibited high consistency with the IAPS norms, the one that refers to the arousal dimension appeared to match the IAPS norms in only approximately 50% of the cases. The latter reveals an almost random behavior in the evaluation of the emotional arousal by the subjects. In order to avoid any propagation of such randomness to the subsequent analysis, and taking into account the findings in [9], the IAPS norms were preferred for the categorization of the signals in this work.

The EEG signals from each subject were recorded during the whole experiment. The EEG signals were acquired from the Fp1, Fp2, F3, and F4 positions, according to the 10–20 system [29], which are related to the emotion expression in the brain, based on the asymmetry concept. The Fp1 and Fp2 positions were recorded as monopole channels, whereas the F3 and F4 positions as a dipole, resulting in a three-channel EEG set (see Fig. 5). Thus, the shared information between the Fp1 (time series $X$ in the MDI analysis) and Fp2 (time series $Y$ in the MDI analysis) channels could be measured, as effectively as possible, by taking into account the respective information propagated by the dipole F3/F4 (time series $Z$ in the MDI analysis) to both of them. The ground and reference electrodes were placed at the right (A2) and left (A1) earlobes, respectively (see Fig. 5).


EEG recordings were conducted using the g.MOBIlab system (g.tec medical and electrical engineering, Guger Technologies, www.gtec.at), with a sampling frequency of 256 Hz. After the acquisition, the EEG signals were subjected to a 10th-order Butterworth bandpass filtering, to retain only the frequencies within the alpha (8–12 Hz) and beta (13–30 Hz) bands, as these are the frequency bands that convey the emotion-related information expressed by the frontal brain asymmetry concept [3], [15]. Although there are sophisticated approaches to remove superimposed artifacts from various sources (e.g., independent component analysis), it has been reported [30]–[32] that, in many cases, these artifacts are effectively eliminated by appropriate bandpass filtering. Specifically, the influence of eye movement/blinking is most dominant below 4 Hz, heart functioning causes artifacts around 1.2 Hz, whereas muscle artifacts affect the EEG spectrum above 30 Hz. Nonphysiological artifacts caused by power lines are also clearly above 30 Hz (50–60 Hz). Consequently, by extracting only the alpha and beta frequency bands from the acquired EEG recordings, most of the noise influence is circumvented. This is further supported by the fact that the experimental procedure took place under a clear experimental protocol, where the subjects minimized their movements and eye blinking during the image projection. Afterwards, the EEG signals were cut into 5-s segments corresponding to the duration of each picture projection. Finally, the EEG signals referring to the countdown phase were also cut and filtered, as they were also used for the analysis described in the previous section. It must be stressed that the emotion-related and relax-related values [see (7) and (6), respectively] were calculated for the signals that correspond to the projection of the IAPS picture and to the countdown phase that took place immediately before the projection of the corresponding picture, respectively.
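The preprocessing described above can be sketched as follows; whether a single 8–30 Hz passband or two separate alpha/beta filters were applied is not restated here, so the single-passband choice (and the zero-phase filtering) is an assumption for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def alpha_beta_filter(eeg, fs=256.0, order=10):
    """Retain roughly the alpha+beta range (8-30 Hz) with a Butterworth
    bandpass; butter(order//2) yields an overall bandpass order of `order`,
    and sosfiltfilt gives zero-phase filtering."""
    sos = butter(order // 2, [8.0, 30.0], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

rng = np.random.default_rng(5)
filtered = alpha_beta_filter(rng.standard_normal((3, 5 * 256)))  # 3 channels, 5 s
print(filtered.shape)
```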

B. Implementation Issues

After experimentation, the window length of the wAsI approach was set to a value that resulted in an efficient number of epochs for the MDI implementation. Moreover, the interval between two succeeding wAsI peaks (see Fig. 2) that belong to the same wAsI-Mask was constrained to a maximum number of samples. Furthermore, the borders of each wAsI-Mask were set a fixed number of samples to the left of the far most left wAsI peak of the observed concentrated multitude of peaks and the same number of samples to the right of the far most right one, due to the assignment of each wAsI value to the center of the corresponding window. Regarding the HOC-based feature vector, the maximum HOC order was set after experimentation, as was the SVM kernel function parameter in (13). Finally, the number of IMFs was found to be five.

C. Classification Setup

The same classification setup used in [9] was also employed here, to facilitate the direct comparison between the corresponding approaches. In particular, six different three-class classification scenarios were employed: i) S1: class 1: LA, class 2: HA, and class 3: the respective Relax signals (countdown phase); ii) S2: class 1: LV, class 2: HV, and class 3: the respective Relax signals; iii) S3: class 1: LALV, class 2: HALV, and class 3: the respective Relax signals; iv) S4: class 1: LAHV, class 2: HAHV, and class 3: the respective Relax signals; v) S5: class 1: LALV, class 2: LAHV, and class 3: the respective Relax signals; vi) S6: class 1: HALV, class 2: HAHV, and class 3: the respective Relax signals. These classification scenarios were chosen in order to: a) emphasize the discrimination of the adverse expressions of the valence and arousal coordinates in the VAS and the relax state, and b) create different classification setups for testing the consistency of the examined approaches in performing efficiently in every one of them.

It should be pointed out that some classification scenarios, such as S1, demonstrate the discrimination of states across the arousal dimension of the VAS, despite the fact that the AsI is based on Davidson's asymmetry theory, which refers only to the valence dimension. First, it is clear that the classification procedure exclusively depends on the feature vector extracted by the HOC or CC algorithms and not on the AsI value estimated for each one of the different emotion elicitation trials. The AsI is used only for the indication of an effective emotion elicitation trial via the detection of a frontal brain asymmetry, which demonstrates a probable deviation from the neutral affective state. This deviation may be towards a positive or a negative affective state and, taking this into account, it is expected that a respective arousal component of that affective state would be brought out, as a direct consequence of the engagement of defensive or appetitive motivational systems [28], [33]. Thus, by assuring that a deviation from the neutral state occurs, a related low, moderate, or high emotional excitation (arousal) will be present. This remark is also the reason for using pictures from the IAPS database with extreme low or high values of valence or arousal in the nine-grade scale, i.e., to foster a probable deviation from the neutral state. As soon as this deviation is tracked by the AsI, the corresponding EEG signals are subjected to the classification procedure, where states with different valence and/or arousal can be recognized by analyzing the alpha and beta bands of the EEG signals, acquired from the frontal and prefrontal lobes [1], [3].

According to the AsI concept [9], emotion elicitation trials, i.e., corresponding relax/projection signal pairs, with big AsI values have been shown to elicit the respective emotional state more effectively, whereas others with smaller AsI values exhibit the opposite result. According to this observation, the trials that correspond to an AsI larger than 0.5 form the Big AsI group [9]. In particular, the Big AsI group consisted of 31, 29, 21, and 15 signals (gathered from approximately all subjects) for the LALV, LAHV, HAHV, and HALV cases, respectively. In accordance with [9], in this paper, the classification process was conducted both for the signals that belong to the Big AsI group and for all the signals of the emotion dataset that were found to provide a new signal after a specified segmentation (see Section III). For each one of the classification scenarios, 70% of the signals were used as a training set, whereas the remaining 30% as a testing set. A repeated random-split (leave-out) cross-validation technique was adopted for a better evaluation of the classification performance, resulting in a 20-iteration classification process. Thus, for each one of the 20 iterations, 70% of the dataset was randomly extracted and used for training and the remaining 30% for testing. The mean classification rate over the 20 iterations was finally extracted. All the above processes were conducted for both the HOC- and CC-based feature vectors.
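The evaluation loop (20 random 70/30 splits, averaged test accuracy) can be sketched as below; the stratified splitter and the SVM settings are illustrative choices, not a reproduction of the exact setup.

```python
import numpy as np
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.svm import SVC

def mean_classification_rate(X, y, n_iter=20, test_size=0.3, seed=0):
    """Repeat a random 70/30 split n_iter times and average the test
    accuracy, mirroring the 20-iteration evaluation described above."""
    splitter = StratifiedShuffleSplit(n_splits=n_iter, test_size=test_size,
                                      random_state=seed)
    rates = []
    for tr, te in splitter.split(X, y):
        clf = SVC(kernel="poly", degree=3).fit(X[tr], y[tr])
        rates.append(clf.score(X[te], y[te]))
    return float(np.mean(rates))

rng = np.random.default_rng(6)
X, y = rng.standard_normal((90, 24)), np.tile([0, 1, 2], 30)   # toy data
print(mean_classification_rate(X, y))
```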


Fig. 6. Mean classification rates of the AsI, wAsI, EMD-AsI, EMD-wAsI, and EMD-wAsI BigIMFs approaches for the HOC- and CC-based feature vector extraction methods: (a) for the Big AsI group; (b) for all signals or signals with wAsI-Masks.

VI. RESULTS

A. User-Independent Case

The results of the aforementioned classification setup in a user-independent framework, i.e., where signals were classified regardless of the subject from whom they were recorded, are presented in Fig. 6(a) and (b) for the Big AsI group and the whole dataset, respectively. In Fig. 6(a), the solid black line corresponds to the initial Big AsI group, i.e., the initial EEG signals without any preprocessing except for the filtering procedure described in Section V-A. The dashed black line represents the mean classification rates of the signals of the Big AsI group after the wAsI algorithm was applied and, thus, a time-based segmentation of them was accomplished. The black dotted and dash-dotted lines correspond to the Big AsI group signals after the EMD-AsI and EMD-wAsI algorithms were applied to them, respectively. Finally, the gray solid line corresponds to the mean classification rates of the signals of the Big AsI group after a combination of the EMD-AsI and EMD-wAsI algorithms was applied; in particular, the EMD-wAsI algorithm was applied only to those IMFs found, via the EMD-AsI thresholding, to have AsI values above the specified threshold (this combination is denoted as EMD-wAsI BigIMFs). All the above described lines are marked with a circle (o) for the HOC-based feature vector and with a triangle for the CC-based one.

From a simple visual inspection of Fig. 6(a), it is obvious that the EMD-wAsI and the EMD-wAsI BigIMFs approaches surpass all the others for almost all classification scenarios (see also Table I). On the other hand, the CC feature vector provides significantly poorer classification results, but still exhibits improvement when the initial signal is subjected to time- or time/frequency-based segmentation (see Table I). To statistically justify this difference and accommodate for the limited number of cases involved, the Wilcoxon signed-rank nonparametric test was used. This analysis revealed a strong statistically significant difference between the classification performance of the HOC and CC feature vectors.
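For reference, a paired Wilcoxon signed-rank comparison of the two feature-extraction methods can be run as in the sketch below; the rate values shown are made up for illustration and are not taken from Table I.

```python
import numpy as np
from scipy.stats import wilcoxon

# paired mean classification rates of the two methods across the six scenarios
hoc_rates = np.array([0.82, 0.79, 0.81, 0.80, 0.78, 0.83])   # illustrative values
cc_rates = np.array([0.61, 0.58, 0.60, 0.62, 0.59, 0.63])    # illustrative values

stat, p = wilcoxon(hoc_rates, cc_rates)
print(f"Wilcoxon statistic = {stat}, p-value = {p:.4f}")
```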

TABLE I. MEAN CLASSIFICATION RATES FOR THE BIG ASI GROUP

For the whole EEG dataset, the corresponding classification results are depicted in Fig. 6(b). It must be stressed that for wAsI, EMD-AsI, and EMD-wAsI, not all of the initial signals of the dataset were used, i.e., 160 signals per affective state (LALV, LAHV, HAHV, HALV). This is due to the nonexistence of at least one wAsI-Mask for a signal in the wAsI approach, or for all IMFs of a signal in the EMD-wAsI approach, and, finally, to the nonexistence of an IMF with an AsI value higher than the pre-specified threshold for a certain signal in the EMD-AsI case. Thus, for the {LALV, LAHV, HAHV, HALV} affective cases, {85, 95, 93, 93}, {148, 151, 152, 153}, and {131, 135, 134, 134} signals were used for the wAsI, EMD-AsI, and EMD-wAsI approaches, respectively.


TABLE II. MEAN CLASSIFICATION RATES FOR THE WHOLE EEG DATASET

Fig. 7. Mean AsI values of the subjects in descending order, along with the corresponding mean wAsI and mean EMD-wAsI values, presented in the ranking provided by the descending order of the mean AsI.

In Fig. 6(b), the annotation is exactly the same as in Fig. 6(a). Again, the EMD-wAsI and EMD-wAsI BigIMFs approaches with the HOC-based feature vector surpass all the others (see Table II). All the remaining approaches exhibit considerably lower classification results for all classification scenarios and for both feature extraction techniques. Tables I and II provide a complete presentation of the mean classification rates for the Big AsI group and the whole EEG dataset, respectively. From Tables I and II, the ability of the proposed approaches at each category level, i.e., classification scenarios S1 (solely the arousal domain) and S2 (solely the valence domain), can be deduced. In particular, although the AsI is mainly related to the valence domain, the corresponding mean classification rates (S2 column) are quite similar to those from the arousal domain (S1 column). This justifies the role of the AsI as an efficient index that allows the corresponding segmentation approaches to capture both the emotional valence- and arousal-related activity.

B. User-Dependent Case

So far, the presented results refer to a user-independent perspective, i.e., the classification process took place without considering the signals of each subject separately. In an attempt to evaluate the efficiency of the proposed approaches in a user-dependent setting, the classification setup previously discussed was also implemented for each one of the 16 subjects, using the 10-EEG-signal set for each one of the affective states per subject (as described in Section V-A, for each affective state 10 pictures from the IAPS database were used, resulting in 10 respective EEG signals). In order to relate the classification performance of each subject with her/his mean AsI value (in accordance with [9]), all subjects were ranked according to their mean AsI (in descending order), as depicted in Fig. 7. Moreover, in the same figure, the dashed line represents the mean AsI values of the time-based segmented signals of each user (i.e., the mean wAsI values), presented according to the subjects' rank derived from the mean AsI, along with the corresponding values of the time-frequency-based segmented signals obtained using the EMD-wAsI approach (i.e., the mean EMD-wAsI values), also presented according to the subjects' rank derived from the mean AsI. In this way, it can be shown that the same inclination trend holds for the mean wAsI and mean EMD-wAsI values in relation to the mean AsI ones.

Fig. 8. Mean classification rates (for all subjects the standard deviation, not shown here, was around 6.5%) for the HOC (circles) and CC (triangles) methods across all subjects, when using the initial EEG signals (solid line), the wAsI-Masked ones (dashed line), and the EMD-wAsI-Masked ones (gray solid line).

In Fig. 8, the black solid, black dashed, and gray solid lines present the mean classification rate derived from the six classification scenarios, i.e., the mean of the classification rates across the six scenarios S1–S6, for the whole (initial) signals, for the time-based segmentation of the signals (wAsI), and for the time-frequency-based one (EMD-wAsI), respectively. The circle (o) and triangle marks correspond to the HOC and CC methods, respectively. The important conclusion from this figure is that the HOC and CC lines incline in the same way as the mean AsI line, for both the initial signals and the signals segmented with wAsI, showing that a subject with a small mean AsI tends to exhibit a poorer classification performance, in contrast to another with higher mean values. On the contrary, the signals segmented with the EMD-wAsI approach exhibit an equalized classification performance for all subjects, concentrated around 80% (with the values spanning approximately from 70% to 100% for the six classification scenarios S1–S6) and 60% (spanning from 50% to 75% for S1–S6) for the HOC and CC methods, respectively.


VII. DISCUSSION

From the results presented in the previous section, the time-frequency segmentation, i.e., the EMD-wAsI approach, led, in most cases, to a more efficient isolation of the emotional information in the EEG signals and resulted in a better classification performance during an EEG-ER task (Fig. 6). It should also be stressed that this classification rate improvement was detected for all classification scenarios (S1–S6), both for all signals that appeared to have a wAsI-Mask and for the signals that belong to the Big AsI group, and for both feature extraction methods. This confirms the effectiveness of the EMD-wAsI method in isolating the emotion-related information, which, consequently, leads to a more reliable EEG-ER system.

The descending trend seen in the mean AsI and mean wAsI values (see Fig. 7) becomes more horizontal in the case of the mean EMD-wAsI values. Moreover, the mean EMD-wAsI values are larger, something that should be expected, as they were extracted from reconstructed EEG segments after the emotion-based filtering of the initial EEG signals with the EMD-wAsI method, which led to the isolation of those segments where the asymmetry concept was more dominant. All the above observations explain the mean classification rates derived from the EMD-wAsI approach for the subject-dependent classification (see Fig. 8, gray solid line). There, a balanced classification performance for all subjects is exhibited, in contrast with the wAsI case and the case where the initial signal (without any segmentation) is used, where the classification performance is highly dependent on the mean AsI value of each subject. This fact leads to the conclusion that the EMD-wAsI approach has effectively isolated a great amount of the emotion-related information distributed through the time and frequency characteristics of the EEG recordings that refer to a specific emotion elicitation trial. The improvement in the mean classification rates of the subjects with relatively low mean AsI values is more dominant, whereas the subjects with higher mean AsI values exhibit a slight decrease in some cases, which may happen due to the filtering procedure that might also discard valuable information. Nevertheless, the overall performance for both the HOC and CC feature vector approaches shows that the EMD-wAsI method provides an efficient way to isolate the desired information.

As can be seen in Fig. 8 (where the subject-dependent classification results are depicted), the classification rates for subject 11 for the wAsI approach are missing, as not enough EEG signals were extracted in order to have a reasonable number of signals to perform the classification step effectively. For instance, subject 11 had only two trials with a wAsI-Mask in the HALV case, whereas the prerequisite for a reliable classification performance was set to at least three trials for each affective state. On the other hand, for the EMD-wAsI approach, the same subject was found to have valid values, as wAsI-Masks were regularly found in the IMFs of his EEG signals. Moreover, as noted in the previous section, the signals that appeared to have a wAsI-Mask were {85, 95, 93, 93} and {131, 135, 134, 134} for the {LALV, LAHV, HAHV, HALV} cases for the wAsI and the EMD-wAsI approaches, respectively. Thus, it is obvious that the EMD-wAsI method overcomes, to some extent, the difficulty of the wAsI approach

Fig. 9. Number of IMFs selected from the EMD-AsI approach.

Fig. 10. The normalized mean of all the wAsI-Masks extracted with the wAsI approach, for each second of the 5-s picture projection period, in the form of a normalized activation level.

to extract a wAsI-Mask for a large number of signals. This provides a larger dataset for performing the classification step and, moreover, for better evaluating the subject-dependent performance, where few signals are available and become even fewer if the signals without a wAsI-Mask are not taken into consideration.

In Fig. 9, the total number of IMFs chosen by the EMD-AsI method for the four affective cases is shown. It is obvious that the first two IMFs (out of five IMFs in total) are predominantly chosen by the algorithm. A small number of third IMFs is chosen for each case. The aforementioned observations show that, with the EMD-AsI algorithm, a significant amount of the frequency content of each signal is excluded, and this leads to a poor classification performance (see Fig. 6).

In an attempt to generally identify the localization of the emotion-related information for the four affective states, i.e., HALV, HAHV, LALV, and LAHV, the mean of all the wAsI-Masks extracted with the wAsI approach was estimated for each second of the 5-s period of the picture projection phase, and the results, in the form of a normalized activation level, are depicted in Fig. 10. From this figure, two major conclusions can be drawn. First, it is obvious that the 1st and the 5th seconds do not exhibit emotional information, as far as the asymmetry concept is concerned, revealing a probable transitional period


Fig. 11. The normalized mean of all the wAsI-Masks extracted with the EMD-wAsI approach, for each one of the first four IMFs and for each second of the 5-s picture projection period, in the form of a normalized activation level.

from calmness to emotional excitement during the 1st second, and a possible loss of emotional interest during the last second. Second, it should be stressed that for the HA affective states the duration of the emotional activation is relatively longer in comparison with the LA affective states: for the HALV and HAHV cases, the 2nd, 3rd, and 4th seconds appear to be activated almost equally, whereas the LALV case exhibits a significant activation only during the 2nd second, and the LAHV case during the 3rd and, perhaps, the 4th one. This observation is probably related to the intensity of the emotional content reflected by the corresponding pictures in the HA case; thus, a more intense emotion is maintained for a longer period of time than a milder one.

In order to identify the localization of the emotion-related information for the four affective states in each one of the frequency components, as presented by the IMFs of the signals, the mean of all the wAsI-Masks extracted with the EMD-wAsI approach was estimated for each IMF and for each second of the 5-s period of the picture projection phase, and the results, in the form of a normalized activation level, are depicted in Fig. 11. It must be stressed that none of the 5th IMFs exhibited a wAsI-Mask and, as a result, this IMF is absent from Fig. 11. In this figure, the time-localized frequency component that is excluded by the EMD-AsI approach is revealed. A slight but evident activity is observed in the 4th IMF in all affective cases, proving that a countable amount of emotion-related information is stored in the 4th IMF, which is totally excluded by the EMD-AsI approach. This fact, along with the enhanced participation of the 3rd IMF in the EMD-wAsI approach, in contrast with the EMD-AsI one, vindicates the superiority of the EMD-wAsI over the EMD-AsI.

The mean classification rates of the EMD-wAsI and EMD-wAsI BigIMFs approaches show a relative equivalence between them. For example, the maximum classification rate is obtained with the EMD-wAsI approach, whereas the EMD-wAsI BigIMFs approach, due to its realization only on some selected IMFs,

exhibits a relatively faster implementation in contrast with the EMD-wAsI method. Thus, the selection of one of the two approaches depends entirely on the personal assessment of the researcher.

The HOC order finally selected for the HOC-based feature extraction method when applied to the EMD-wAsI case was the one that provided the maximum classification performance, and it can be compared with the corresponding values for the best cases of the EMD-AsI, wAsI, and AsI approaches. The exclusion of IMFs in the EMD-AsI case, and the exclusion of IMFs and/or time segments of the signals in the EMD-wAsI case, resulted in a significant decrease of the optimum HOC order, in contrast with the case where only a time segment was extracted or where the whole signal with a big AsI value was used (AsI approach [9]). This leads to an even faster implementation of the EEG-ER system, as the estimation of HOC sequences of high orders results in a more time-consuming process. For instance, for a 5-s EEG signal, the times needed to estimate the HOC sequence for the higher and the lower of the aforementioned HOC orders are approximately 0.8 and 0.2 s, respectively (see also [5]).

VIII. CONCLUSION

In this paper, novel AsI-based algorithms are introduced in order to effectively retrieve the emotion-related information from EEG recordings. The frontal EEG asymmetry, combined with the multidimensional directed information approach, is exploited in order to define an efficient way for time-frequency-based segmentation (the wAsI, EMD-AsI, and EMD-wAsI approaches). In this way, the signal segments with less emotion-related information are discarded, keeping only the valuable ones and contributing to a more reliable EEG-ER process. Based on the presented results, the EMD-wAsI method resulted in the EEG segments that demonstrate the best classification behavior compared to the other approaches, evaluated using two feature vector extraction techniques and six classification scenarios via an SVM classifier on an experimental EEG dataset derived from 16 subjects. The promising classification rates derived from the proposed emotional filtering of the EEG signals set a new perspective in the emotional information retrieval approach, enhancing the potential of EEG to reflect emotional arousal. Despite the advantageous potential of the presented EMD-wAsI approach, further justification of its reliability is needed through its implementation on datasets of large-scale experiments.

APPENDIX

Consider a finite zero-mean series $\{Z_t\}$ oscillating about level zero. Let $\nabla$ be the backward difference operator defined by

$\nabla Z_t = Z_t - Z_{t-1}.$  (A1)

The difference operator is a high-pass filter. If we define the following sequence of high-pass filters:

$\Im_k \equiv \nabla^{k-1}, \quad k = 1, 2, 3, \ldots$  (A2)


with $\Im_1 = \nabla^0$ being the identity filter, we can estimate the corresponding HOC, namely the simple HOC [34], by

$D_k = \mathrm{NZC}\{\Im_k(Z_t)\}, \quad k = 1, 2, 3, \ldots$  (A3)

where $D_k$ is the number of zero crossings of the respective order and $\mathrm{NZC}\{\cdot\}$ denotes the estimation of the number of zero crossings [cf. (A4)]. In practice, we only have finite time series and we lose an observation with each difference. Hence, to avoid this effect, we must index the data by moving to the right, i.e., for the evaluation of the simple HOC, the index should be given to the $k$th or a later observation. For the estimation of the number of zero crossings in (A3), a binary time series is initially constructed, given by

$X_t(k) = \begin{cases} 1, & \Im_k(Z_t) \ge 0 \\ 0, & \Im_k(Z_t) < 0 \end{cases}$  (A5)

and the desired simple HOC are then estimated by counting symbol changes in $\{X_t(k)\}$, i.e.,

$D_k = \sum_{t=2}^{N} \left[X_t(k) - X_{t-1}(k)\right]^2.$  (A6)

In finite data records, it holds that $D_1 \le D_2 \le D_3 \le \cdots$ [34].
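A direct transcription of (A1)–(A6) into Python is straightforward, as sketched below: the (k-1)-fold difference plays the role of the filters in (A2), and symbol changes of the binary series give the zero-crossing counts.

```python
import numpy as np

def simple_hoc(z, k_max):
    """Simple HOC per (A1)-(A6): for each order k, apply the (k-1)-fold
    backward difference, build the binary series of (A5), and count symbol
    changes as in (A6)."""
    z = np.asarray(z, dtype=float) - np.mean(z)
    d = []
    for k in range(1, k_max + 1):
        filtered = np.diff(z, n=k - 1) if k > 1 else z   # (A2) applied to z
        b = (filtered >= 0).astype(int)                  # (A5)
        d.append(int(np.sum(np.diff(b) ** 2)))           # (A6)
    return np.array(d)

rng = np.random.default_rng(7)
print(simple_hoc(rng.standard_normal(1280), k_max=10))   # typically nondecreasing
```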

REFERENCES
[1] G. O. A. Heraz and C. Frasson, "Predicting the three major dimensions of the learner's emotions from brainwaves," Int. J. Comput. Sci., vol. 2, no. 3, pp. 187–193, 2008. [2] K. Takahashi, "Remarks on emotion recognition from bio-potential signals," in Proc. 2nd Int. Conf. Autonomous Robots Agents, 2004, pp. 186–191. [3] D. O. Bos, EEG-Based Emotion Recognition: The Influence of Visual and Auditory Stimuli, Jan. 14, 2012 [Online]. Available: http://hmi.ewi.utwente.nl/verslagen/capita-selecta/CS-Oude_bos-danny.pdf [4] P. C. Petrantonakis and L. J. Hadjileontiadis, "Emotion recognition from EEG using higher order crossings," IEEE Trans. Inf. Technol. Biomed., vol. 14, no. 2, pp. 186–197, 2010. [5] P. C. Petrantonakis and L. J. Hadjileontiadis, "Emotion recognition from brain signals using hybrid adaptive filtering and higher-order crossings analysis," IEEE Trans. Affect. Comput., vol. 1, no. 2, pp. 81–97, 2010. [6] K. Schaaff and T. Schultz, "Towards emotion recognition from electroencephalographic signals," in Proc. 3rd Int. Conf. Affect. Comput. Intell. Interaction (ACII), Amsterdam, The Netherlands, 2009, pp. 1–6. [7] Y. Liu, O. Sourina, and M. K. Nguyen, "Real-time EEG-based human emotion recognition and visualization," in Proc. 2010 Int. Conf. Cyberworlds, Singapore, 2010, pp. 262–269. [8] R. W. Picard, Affective Computing. Boston, MA: MIT Press, 1997. [9] P. C. Petrantonakis and L. J. Hadjileontiadis, "A novel emotion elicitation index using frontal brain asymmetry for enhanced EEG-based emotion recognition," IEEE Trans. Inf. Technol. Biomed. [10] O. Sakata, T. Shiina, and Y. Saito, "Multidimensional directed information and its application," Elect. Commun. in Japan, vol. 4, pp. 385, 2002. [11] N. Huang, Z. Shen, S. Long, M. Wu, H. H. Shih, N. C. Zheng, N. C. Yen, C. Tung, and H. Liu, "The empirical mode decomposition and Hilbert spectrum for nonlinear and nonstationary time series analysis," Proc. Roy. Soc. London A, vol. 454, pp. 903–995, 1998.


[12] T. Musha, Y. Terasaki, H. A. Haque, and G. A. Ivamitsky, "Feature extraction from EEGs associated with emotions," Artif. Life Robot., vol. 1, no. 1, pp. 15–19, 1997. [13] J. Russell, "Core affect and the psychological construction of emotion," Psychol. Rev., vol. 110, no. 1, pp. 145–172, 2003. [14] R. J. Davidson, G. E. Schwartz, C. Saron, J. Bennett, and D. J. Goleman, "Frontal versus parietal EEG asymmetry during positive and negative affect," Psychophysiol., vol. 16, pp. 202–203, 1979. [15] R. J. Davidson, "What does the prefrontal cortex 'do' in affect: Perspectives on frontal EEG asymmetry research," Biolog. Psychol., vol. 67, pp. 219–233, 2004. [16] D. Hagemann, E. Naumann, A. Lurken, G. Becker, S. Maier, and D. Bartussek, "EEG asymmetry, dispositional mood and personality," Pers. Individ. Differences, vol. 27, pp. 541–568, 1999. [17] J. A. Coan and J. J. B. Allen, "Frontal EEG asymmetry as a moderator and mediator of emotion," Biolog. Psychol., vol. 67, pp. 7–49, 2004. [18] T. Kamitake, H. Harashima, and H. Miyakawa, "Time series analysis based on directed information," Trans. IEICE, pp. 103–110, 1984. [19] G. Wang and M. Takigawa, "Directed coherence as a measure of interhemispheric correlation of EEG," Int. J. Psychophysiol., vol. 13, pp. 119–128, 1992. [20] M. Kaminski and K. J. Blinowska, "A new method of the description of the information flow in the brain structures," Biolog. Cybern., vol. 65, pp. 203–210, 1991. [21] L. A. Baccala and K. Sameshima, "Partial directed coherence: A new concept in neural structure determination," Biolog. Cybern., vol. 84, pp. 463–474, 2001. [22] W. Hesse, E. Moller, M. Arnold, and B. Schack, "The use of time-variant EEG Granger causality for inspecting directed interdependencies of neural assemblies," J. Neurosci. Methods, vol. 124, pp. 27–44, 2003. [23] R. D. King, C. Feng, and A. Sutherland, "StatLog: Comparison of classification algorithms on large real-world problems," Appl. Artif. Intell., vol. 9, pp. 259–287, 1995. [24] W. J. Krzanowski, Principles of Multivariate Analysis. Oxford, U.K.: Oxford Univ. Press, 1988. [25] P. C. Mahalanobis, "On the generalized distance in statistics," Proc. Nat. Inst. Sci. India, vol. 2, pp. 49–55, 1936. [26] T. Mitchell, Machine Learning. New York: McGraw-Hill, 1997. [27] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge, U.K.: Cambridge Univ. Press, 2000. [28] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual, Univ. Florida, Gainesville, 2008, Tech. Rep. A-8. [29] H. Jasper, "The ten-twenty electrode system of the international federation," Electroencephalogr. Clin. Neurophysiol., vol. 39, pp. 371–375, 1958. [30] K. Coburn and M. Moreno, "Facts and artifacts in brain electrical activity mapping," Brain Topogr., vol. 1, pp. 37–45, 1988. [31] D. O. Olguin, "Adaptive digital filtering algorithms for the elimination of power line interference in electroencephalographic signals," M.S. thesis, Instituto Tecnologico y de Estudios Superiores de Monterrey, Monterrey, Mexico, 2005. [32] M. Fatourechi, A. Bashashati, R. K. Ward, and G. E. Birch, "EMG and EOG artifacts in brain computer interface systems: A survey," Clin. Neurophysiol., vol. 118, pp. 480–494, 2007. [33] M. M. Bradley and P. J. Lang, "Affective reactions to acoustic stimuli," Psychophysiol., vol. 37, pp. 204–215, 2000. [34] B. Kedem, Time Series Analysis by Higher Order Crossings. Piscataway, NJ: IEEE, 1994.

Panagiotis C. Petrantonakis (S'08) was born in Ierapetra, Crete, Greece, in 1984. He received the Diploma degree in electrical and computer engineering in 2007 from the Aristotle University of Thessaloniki (AUTH), Thessaloniki, Greece. Currently, he is a Ph.D. researcher at AUTH, affiliated with the Signal Processing and Biomedical Technology Unit of the Telecommunications Laboratory. His current research interests lie in the area of advanced signal processing techniques, nonlinear transforms, and affective computing. Dr. Petrantonakis is a member of the Technical Chamber of Greece.


Leontios J. Hadjileontiadis (S'87–M'98–SM'11) was born in Kastoria, Greece, in 1966. He received the Diploma degree in electrical engineering in 1989 and the Ph.D. degree in electrical and computer engineering in 1997, both from the Aristotle University of Thessaloniki, Thessaloniki, Greece. He also received the Diploma in Musicology from the Aristotle University of Thessaloniki, in 2011, and the Ph.D. degree in music composition from the University of York, U.K., in 2004. Since December 1999, he has been with the Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, as a faculty member, where he is currently an Associate Professor, working on lung sounds, heart sounds, bowel sounds, ECG data compression, seismic data analysis, and crack detection in the Signal Processing and Biomedical Technology Unit of the Telecommunications Laboratory. His research interests are in higher-order statistics, alpha-stable distributions, higher-order zero crossings, wavelets, polyspectra, fractals, and neuro-fuzzy modeling for medical,

mobile, and digital signal processing applications. He is currently a Professor in composition at the State Conservatory of Thessaloniki, Greece. Dr. Hadjileontiadis is a member of the Technical Chamber of Greece, the Higher-Order Statistics Society, the International Lung Sounds Association, and the American College of Chest Physicians. He was the recipient of the second award at the Best Paper Competition of the 9th Panhellenic Medical Conference on Thorax Diseases '97, Thessaloniki. He was also an open finalist at the Student Paper Competition (Whitaker Foundation) of the IEEE EMBS '97, Chicago, IL, a finalist at the Student Paper Competition (in memory of Dick Poortvliet) of MEDICON '98, Lemesos, Cyprus, and the recipient of the Young Scientist Award of the 24th International Lung Sounds Conference '99, Marburg, Germany. In 2004, 2005, and 2007, he organized and served as a mentor to three five-student teams that ranked third, second, and seventh worldwide, respectively, at the Imagine Cup Competition (Microsoft), Sao Paulo, Brazil (2004)/Yokohama, Japan (2005)/Seoul, Korea (2007), New York (2011), with projects involving technology-based solutions for people with disabilities and pain management.
