
Human Behavior Research

Combined methods for measuring human behavior

Human Behavior Research
Multimodal biosensor platform
Biosensors are routinely - and increasingly - used in human behavior research to understand the relationship between physiology and psychology (and vice versa). Applying such measurements allows for a detailed quantification of human behavior and of responses to experimental stimuli. iMotions provides a software platform for integrating and synchronizing the entire research process, in any experimental environment.

Collect detailed and nuanced data quickly

Use multiple data sources to increase validity

Cut through the general noise to get your results





Some of the universities that are currently using iMotions in their human behavior research.
iMotions software can be found at 41 of the top 100 universities in the world (according to QS, 2019).

Watch the video below to see how Professor Roger Azevedo from North Carolina State University uses a multimodal approach in his research.

Multimodal Research Examples
An overview of recent research

Perceptual mechanisms create feelings of trust

Smiling faces are often judged to be more trustworthy than non-smiling faces. Researchers investigated how this judgement is formed by using eye tracking with participants who viewed film clips of smiling faces, with only the eye or mouth region shown. It was found that the smile contributed more to feelings of trustworthiness than the eyes.
(Calvo et al., 2018)

New insights for autism

Researchers recently used a combination of eye tracking, facial expression analysis, EEG, ECG, and EDA as part of an effort to collect large-scale, naturalistic data from autistic patients. The research is intended to create a deeper and more detailed understanding of the psychophysiological components of autism.
(Ness et al., 2018)

VR changes the experience of learning

VR technologies are being increasingly explored (and touted) as a new method for learning and instruction. Researchers have explored this claim by collecting EEG data from participants exposed to a virtual lab environment in VR or with a regular screen. It was found that the participants were more immersed, but ultimately learnt less, when in VR.
(Makransky et al., 2018)

Facial expressions related to disgust

Facial expressions are clearly central to how we communicate. Researchers set out to investigate the formation of facial expressions when copying others. Using EEG and facial expression analysis, they found that facial expressions related to disgust, as compared to smiles and jaw drops, were formed faster, suggesting a need to be quick when showing feelings of disgust.
(Recio et al., 2018)

Personality affects taste perception

Researchers explored the interaction between personality, emotions, and the experience of taste. By recording participants' facial expressions, EDA, and ECG while they tasted different solutions, and following completion of a personality test, the researchers built a profile of emotional responses to tastes. They found that personality traits - particularly extraversion and neuroticism - modulated how the emotional responses were related to the preference ranking and overall liking of different taste solutions.
(Samant et al., 2018)

Subtle behavioral differences in bipolar disorder

By using eye tracking and facial expression analysis, researchers explored how patients with bipolar disorder differed in their responses to valenced images, compared to individuals without bipolar disorder. It was found that bipolar patients looked less, and showed stronger facial expressions, in response to unpleasant images. These measures were found to be subtle but significant indicators of bipolar disorder.
(Broch-Due et al., 2018)

Application Areas
Questions that can be answered with biosensor-based research

1. Attention and Perception Research


Understanding attentional and perceptual processes requires measurement of the systems that guide how we
attend to and perceive stimuli. Methods such as eye tracking - be it in a lab, a real-world setting, or even VR - provide
a clear link to our visual system, while other biosensors can complement this approach with measurements of other
psychophysiological responses.

Examples include:

How do participants respond to emotionally valenced images?
Track visual responses to still images, videos, or VR experiences to measure implicit reactions. Combine with measures of EDA to assess sympathetic nervous system activity (a minimal analysis sketch follows after this list).

How is the process of reading initiated and maintained?
Use high-resolution eye tracking to follow the precise movement of the eyes during different reading tasks. See how physiological arousal is related to, or impacts, reading abilities with EDA and / or ECG recordings.

How does visual attention develop?
Carry out experiments with any visual stimuli, and record attentional processes across age groups or longitudinally. Relate this data to EEG activity to understand cortical involvement.
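As an illustration of the kind of EDA analysis mentioned above, the sketch below counts skin conductance responses in a phasic EDA trace. It assumes an already-filtered phasic signal as a NumPy array; the sampling rate and thresholds are illustrative, not built-in platform settings.

```python
# Minimal sketch: counting skin conductance responses (SCRs) as a proxy for
# sympathetic arousal. Assumes a pre-filtered phasic EDA signal in microsiemens.
import numpy as np
from scipy.signal import find_peaks

FS = 32  # EDA sampling rate in Hz (assumed)

def count_scrs(phasic_eda, fs=FS, min_amplitude=0.01, min_separation_s=1.0):
    """Count peaks with at least `min_amplitude` prominence, spaced at least
    `min_separation_s` seconds apart."""
    peaks, _ = find_peaks(phasic_eda,
                          prominence=min_amplitude,
                          distance=int(min_separation_s * fs))
    return len(peaks)

# Example with 30 s of simulated low-amplitude noise:
rng = np.random.default_rng(1)
print(count_scrs(rng.normal(0.0, 0.005, FS * 30)))
```

Comparing SCR counts (or amplitudes) between stimulus conditions is one common way to relate EDA to emotionally valenced images.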

2. Cognition Research
Cognitive processes have a substantial impact on human behavior, through conscious and direct, or nonconscious
and indirect influence. Methods such as EEG allow detection of the brain activity that is involved in cognitive processes,
while other measures, such as electrodermal activity (EDA, also known as GSR), eye tracking, and electrocardiography,
can provide a multifaceted approach to understanding responses.

Examples include:

How do brain processes relate to specific task completion?
Track EEG activity in response to a range of stimuli (a minimal stimulus-locked analysis sketch follows after this list). Take full control of the experimental setup, and use additional methods to gain a more complete and holistic view of psychophysiological responses.

Can EEG biomarkers determine the behavioral responses of participants?
Use either pre-existing EEG metrics that are automatically calculated in the software, or discover new stereotyped responses to specific stimulus exposure. Combine with other biosensor measures to develop findings further.

What is the relationship between brain activity and other physiological processes?
Follow a combination of methods to understand how brain activity influences, and is influenced by, other aspects of physiology in response to experimental manipulation.
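To make the idea of stimulus-locked EEG analysis concrete, the sketch below cuts epochs around stimulus onsets and averages them, the basic step behind many event-related analyses. It assumes a single continuous EEG channel and a list of onset sample indices, and is a generic illustration rather than a specific metric built into the software.

```python
# Minimal sketch: epoch a continuous EEG channel around stimulus onsets,
# baseline-correct each epoch, and average across trials.
import numpy as np

FS = 256  # EEG sampling rate in Hz (assumed)

def average_epochs(eeg, onsets, fs=FS, pre_s=0.25, post_s=0.75):
    """Cut epochs from -pre_s to +post_s around each onset (in samples),
    subtract the pre-stimulus mean, and average across trials."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    epochs = []
    for onset in onsets:
        if onset - pre < 0 or onset + post > len(eeg):
            continue  # skip events too close to the recording edges
        epoch = eeg[onset - pre:onset + post]
        epochs.append(epoch - epoch[:pre].mean())  # baseline correction
    return np.mean(epochs, axis=0)

# Example with simulated data: two minutes of noise, one event every 3 s.
rng = np.random.default_rng(2)
eeg = rng.standard_normal(FS * 120)
onsets = np.arange(FS * 2, FS * 110, FS * 3)
print(average_epochs(eeg, onsets).shape)  # (256,) samples, i.e. -0.25 s to +0.75 s
```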

3. Emotion Research
Emotional responses guide much of human behavior, and shape responses to stimuli and the environment.
Additionally, the regulation of emotions as well as the development of appropriate emotional reactions is deeply linked
to a range of psychiatric disorders. Methods such as facial expression analysis, coupled with technologies such as EDA,
eye tracking, and EEG, can therefore guide a more nuanced view of emotions that can help in the understanding of
human psychology in both health and disease.

Examples include:

How do specific stimuli impact emotional responses?
Understand how new settings, environments, or scenes can impact the emotional expressions of participants with facial expression analysis. Follow visual attention with eye tracking to determine which specific aspects are related to physiological changes.

Do differences in emotional responses exist between patient groups?
Test a range of stimuli or settings and measure emotional facial expressions with the widely validated Affectiva system. Record sympathetic nervous system responses through EDA, and follow attentional biases with eye tracking.

How is communication shaped by emotional displays?
Post-process any prior recording with the Affectiva system to detect facial expression changes, and relate this activity to recordings of participants watching such communication (a minimal post-processing sketch follows after this list).
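As a rough illustration of post-processing exported facial expression data, the sketch below averages expression scores per group and stimulus with pandas. The file name and column names (Participant, Group, Stimulus, Joy, Disgust, Surprise) are hypothetical placeholders; a real export will have its own naming, so adjust accordingly.

```python
# Minimal sketch: summarize exported facial-expression scores per group and
# stimulus. File and column names below are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("face_export.csv")  # hypothetical export file

summary = (
    df.groupby(["Group", "Stimulus"])[["Joy", "Disgust", "Surprise"]]
      .mean()
      .reset_index()
)
print(summary.head())
```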

Human Behavior Research with iMotions
The complete human behavior research platform

Seamlessly integrate multiple biosensors

Effortlessly integrate and synchronize 50+ different sensors from 20+ independent vendors, across 10+ modalities (including a wide range of medical-grade devices). Add even more sensors through the Lab Streaming Layer. It's also possible to forward data in real time and import external sensor / software data and loop it back into the platform via the API.
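To show how a custom device can be exposed over the Lab Streaming Layer, the sketch below publishes a simulated one-channel EDA stream with pylsl so that any LSL-capable software can discover it. The stream name, channel layout, and sampling rate are illustrative, and the simulated values stand in for a real sensor driver.

```python
# Minimal sketch: publish a simulated EDA stream over Lab Streaming Layer (pylsl).
import random
import time

from pylsl import StreamInfo, StreamOutlet

# Describe the stream: name, type, channel count, nominal rate (Hz), sample format.
info = StreamInfo(name="MyCustomEDA", type="EDA", channel_count=1,
                  nominal_srate=32, channel_format="float32",
                  source_id="custom-eda-001")
outlet = StreamOutlet(info)

while True:
    # Replace the simulated value with a reading from your actual sensor driver.
    sample = [random.uniform(0.1, 5.0)]  # skin conductance in microsiemens (simulated)
    outlet.push_sample(sample)
    time.sleep(1 / 32)                   # pace samples at the nominal rate
```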

Portable data-collection methods for naturalistic studies

iMotions enables the synchronized collection of data from multiple wearable biosensors, including eye tracking glasses, electrodermal activity, ECG, EMG, and EEG. Use one or multiple biosensors to learn about human behavior in realistic settings.

Go further with the data

Use cloud-based processing to easily obtain relevant metrics. Access automatically calculated frontal asymmetry and power spectral density for EEG recordings, electrodermal activity peak data, and heart rate variability for ECG recordings. A suite of fully transparent processing methods is available for you to explore and use.
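For readers unfamiliar with these metrics, the sketch below shows one common way such values are defined: a Welch power spectral density per channel, and a frontal alpha asymmetry score as the log-ratio of alpha-band power between right and left frontal channels (for example F4 and F3). It assumes raw EEG arrays and a sampling rate, and is a generic illustration rather than the platform's own processing pipeline.

```python
# Minimal sketch: Welch PSD and frontal alpha asymmetry from two frontal channels.
import numpy as np
from scipy.signal import welch

FS = 256  # EEG sampling rate in Hz (assumed)

def alpha_power(signal, fs=FS, band=(8.0, 13.0)):
    """Integrate the Welch power spectral density over the alpha band (8-13 Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def frontal_alpha_asymmetry(left, right, fs=FS):
    """ln(right alpha power) - ln(left alpha power), e.g. for channels F3/F4."""
    return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))

# Example with two 60 s channels of simulated noise:
rng = np.random.default_rng(0)
f3, f4 = rng.standard_normal(FS * 60), rng.standard_normal(FS * 60)
print(frontal_alpha_asymmetry(f3, f4))
```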

Study design control & flexibility

Design studies with complete freedom and set up the full experimental process within iMotions. Build advanced study designs by point and click. Easily set up participant groups, randomizations, and block designs as needed.
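To clarify what block randomization means in practice, the sketch below builds a per-participant trial order in which every block presents each condition once, in a fresh random order. The condition names are illustrative, and this is a conceptual example rather than the software's built-in randomizer.

```python
# Minimal sketch: block randomization of a condition list for one participant.
import random

def block_randomize(conditions, n_blocks, seed=None):
    """Return a trial order of n_blocks blocks, each a shuffled copy of `conditions`."""
    rng = random.Random(seed)
    order = []
    for _ in range(n_blocks):
        block = list(conditions)
        rng.shuffle(block)
        order.extend(block)
    return order

print(block_randomize(["neutral", "pleasant", "unpleasant"], n_blocks=4, seed=42))
```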

VR eye tracking integration

Single or multisensor studies can be readily created in virtual environments with our VR integration (with or without eye tracking). Study physiological responses to stimuli in fully immersive settings, allowing you to control the environment while also ensuring a high degree of ecological validity.

Ongoing support

Each customer receives full and continuous support from a dedicated customer success manager with years of on-the-ground experience. A help center with hundreds of articles is also available for online guidance, and technical help is provided through our support staff.

iMotions Software Solution
Multimodal research in any environment
iMotions reduces the complexity of carrying out multimodal research, enabling a wide
array of sensors to be seamlessly connected. By combining these different physiological
measurements, it’s possible to get a better understanding of human behavior in any
environment.

[Platform overview diagram]
Data sources: Eye Tracking, GSR, Facial Expressions, EEG, ECG, EMG, Respiration, Surveys, API.
Platform capabilities: any type of stimuli, real-time / live view, analysis tools, post import / annotation, raw data exports, API data forwarding, 3rd party data processing via API.

iMotions enables multimodal research to be carried out in an array of research scenarios: lab-based studies, natural environments, and VR environments.

iMotions supports leading 3rd party sensor products.
Additional sensors can be integrated via our API.

Biosensors

iMotions has a suite of partners that provide biosensor hardware such as EEG, electrodermal activity (EDA), ECG, and EMG, suitable for human behavior research.

• Eye tracking: Tobii, SMI, Smart Eye, EyeTech, Gazepoint
• EEG: Advanced Brain Monitoring (ABM), NeuroElectrics, Brain Products, Emotiv, OpenBCI
• EDA: BIOPAC, Shimmer, Empatica
• ECG: BIOPAC, Shimmer
• EMG: BIOPAC, Shimmer
• Respiration: BIOPAC

Additional sensor integration

iMotions has a powerful API equipping researchers with the tools to:

• Integrate new sensors with real-time data capture in iMotions
• Live synchronize all data streams
• Live forward all synchronized data streams
• Control the iMotions application remotely
• Support standard sensor protocols like LSL and TTL
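As an example of what the data-forwarding bullet can look like on the receiving end (assuming, as an illustration, that the synchronized data is forwarded as an LSL stream), the sketch below opens an inlet with pylsl and reads timestamped samples. The stream type "GSR" is an assumption, not a guaranteed stream name.

```python
# Minimal sketch: consume a forwarded LSL stream with pylsl. The stream type
# "GSR" is an illustrative assumption, not a guaranteed stream name.
from pylsl import StreamInlet, resolve_byprop

streams = resolve_byprop("type", "GSR", timeout=10)
if not streams:
    raise RuntimeError("No stream of type 'GSR' found on the network")

inlet = StreamInlet(streams[0])
for _ in range(100):
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)
```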

Selected Publications
Research made possible with iMotions

JAKE® Multimodal Data Capture System: Insights from an Observational Study of Autism Spectrum Disorder
Authors: Ness, S. L., Manyakov, N. V., Bangerter, A., et al.
Institutes: Janssen Research and Development, Duke University School of Medicine, Northeastern University, University of California, University of Washington

Towards Automated Pain Detection in Children using Facial and Electrodermal Activity
Authors: Xu, X., Susam, B. T., Nezamfar, H., Diaz, D., Craig, K. D., Goodwin, M. S., Akcakaya, M., Huang, J. S., de Sa, V. R.
Universities: University of California San Diego, University of British Columbia Vancouver, Northeastern University, University of Pittsburgh

Subtle behavioural responses during negative emotion reactivity and down-regulation in bipolar disorder: A facial expression and eye-tracking study
Authors: Broch-Due, I., Kjærstad, H. L., Kessing, L. V., Miskowiak, K.
Institutes: Copenhagen University Hospital, University of Copenhagen

Transdermal neuromodulation of noradrenergic activity suppresses psychophysiological and biochemical stress responses in humans
Authors: Tyler, W. J., Boasso, A. M., Mortimore, H. M., Silva, R. S., Charlesworth, J. D., Marlin, M. A., et al.
Institute: Thync Inc.

Adding immersive virtual reality to a science lab simulation causes more presence but less learning
Authors: Makransky, G., Terkildsen, T. S., Mayer, R. E.
Universities: University of Copenhagen, University of California Santa Barbara

Integrating metacognitive judgments and eye movements using sequential pattern mining to understand processes underlying multimedia learning
Authors: Mudrick, N. V., Azevedo, R., Taub, M.
Universities: North Carolina State University, University of Central Florida

Automatic Recognition of Posed Facial Expression of Emotion in Individuals with Autism Spectrum Disorder
Authors: Manfredonia, J., Bangerter, A., Manyakov, N. V., Ness, S., Lewin, D., Skalkin, A., Boice, M., et al.
Institutes: Janssen Research & Development, Northeastern University, Duke University School of Medicine, University of California San Francisco, University of Washington

How Does Food Taste in Anorexia and Bulimia Nervosa? A Protocol for a Quasi-Experimental, Cross-Sectional Design to Investigate Taste Aversion or Increased Hedonic Valence of Food in Eating Disorders
Authors: Garcia-Burgos, D., Maglieri, S., Vögele, C., Munsch, S.
Universities: University of Fribourg, Bern University of Applied Sciences, University of Luxembourg, Université Libre de Bruxelles, Shaanxi Normal University

Copycat of dynamic facial expressions: Superior volitional motor control for expressions of disgust
Authors: Recio, G., Sommer, W.
Universities: Hamburg University, Humboldt University of Berlin

Personality traits affect the influences of intensity perception and emotional responses on hedonic rating and preference rank toward basic taste solutions
Authors: Samant, S. S., Seo, H-S.
University: University of Arkansas

Want to know more?

GET IN TOUCH

Synchronize, Visualize and Analyze your research in Eye Tracking, Facial Expression Analysis, Galvanic Skin Response, Surveys, EEG and much more in one software platform.
www.imotions.com/

United States: 141 Tremont Street, 7th Floor, Boston, MA 02211. Tel +1 617-520-4958
Denmark: Frederiksberg Allé 1-3, Fl. 7, Copenhagen V, 1621. Tel +45 71 998 098
Germany: Landsberger Str 247, 12623 Berlin. Tel +49 (0)30-9203 0820
Singapore: Level 4, 21 Merchant Road, 058267. Tel +65 9121-2843

