
Accessible and Assistive ICT

VERITAS
Virtual and Augmented Environments and Realistic User
Interactions To achieve Embedded Accessibility DesignS
Grant Agreement No. 247765

Innovative VR models, tools and simulation environments

Deliverable No.      D2.7.1

SubProject No.       SP2        SubProject Title      Innovative VR models, tools
                                                      and simulation environments

Workpackage No.      WP2.7      Workpackage Title     New Interaction Tools for
                                                      Designer Experience

Activity No.         A2.7.1     Activity Title        UI concepts, elements and
                                                      architecture

Authors              Marco Fontana & Marcello Carrozzino (PERCRO),
                     Thomas Grill (UOS), Anja Thieme (UNEW),
                     Fabrizio Nunnari (VRMMP), Hans Joachim Wirsching (HS),
                     Vitaly Kolodyazhniy (COAT), Nicola Cofelice (LMS).

Dissemination Level PU (Public)

Status F (Final)

File Name: VERITAS_D2.7.1_final.doc

Project start date and duration: 01 January 2010, 48 Months

Version History table

Version no.   Date          Comments
1             02 Nov 2010   First Draft
2             26 Nov 2010   Integration of contents ID2.7.1
3             09 Dec 2010   Contributions of partners involved
4             15 Dec 2010   Final Version


Table of contents
Version History table .......................................................................................... 3
Table of contents ................................................................................................ 4
List of Figures ..................................................................................................... 6
List of Tables ...................................................................................................... 7
Abbreviations List ............................................................................................... 8
Executive summary ............................................................................................ 1
1 Introduction .................................................................................................. 2
1.1. Physical Impairments ............................................................................ 3
1.1.1 Motor Impairments .......................................................................... 3
1.1.2 Visual Impairments ......................................................................... 3
1.1.3 Speech Impairments ....................................................................... 3
1.1.4 Hearing Impairments ...................................................................... 4
1.2. Cognitive Impairments........................................................................... 4
1.3. Behavioural and Psychological Impairments ......................................... 5
1.4. Overview of Virtual Reality Tools .......................................................... 6
1.4.1. Introduction to VR and multi-sensorial systems .............................. 6
1.4.2. Visualization technologies .............................................................. 6
1.4.3. Sound in Virtual Environments........................................................ 8
1.4.4. Haptic Sensorial Channel ............................................................. 10
1.4.5. Devices for Interaction .................................................................. 11
2 Specification of Interaction Tools ............................................................... 14
2.1 Physical Interaction Tools ................................................................... 14
2.1.1 Visual Impairment IT ..................................................................... 14
2.1.2 Kinematic Functional Limitation IT Impl.1 ..................................... 18
2.2 Physical IT through innovative VR tools .............................................. 21
2.2.1 Kinematic Functional Limitation: Wearable Vibrotactile ................ 22
2.2.2 Dynamic Functional Limitation IT: Haptics .................................... 24
2.3 Cognitive interaction tool ..................................................................... 27
2.3.1 Conceptual specification of the simulation tool: The n-back task .. 29
2.3.2 Technical specification and integration of the cognitive simulation tool .. 30
2.4 Behavioural & Psychological interaction tools ..................................... 32
2.5 Specification of the tool for Stress Induction ....................................... 33
2.5.1 Prior to performing the VERITAS tasks: using GUI....................... 33
2.5.2 Specification of the tool for Emotion Elicitation ............................. 36
3 Architecture of the Interaction Tools .......................................................... 39
3.1 Overall considerations ......................................................................... 39
3.2 Visual Impairment IT ........................................................................... 40
3.3 Interaction Manager ............................................................................ 41


3.4 Kinematic Functional Limitation IT Impl.1 ............................................ 43


3.4.1 Interaction Manager ...................................................................... 43
3.5 Kinematic Functional Limitation IT Impl.2 ............................................ 46
3.5.1 Interaction Manager ...................................................................... 46
3.6 Dynamic Functional Limitation IT ........................................................ 48
3.6.1 Interaction Manager ...................................................................... 48
3.7 Control Functional Limitation IT ........................................................... 50
3.7.1 Interaction Manager ...................................................................... 50
3.8 Cognitive IT ......................................................................................... 52
3.8.1 Interaction Manager ...................................................................... 52
3.9 Behavioural Stress Induction IT .......................................................... 54
3.9.1 Interaction Manager ...................................................................... 54
3.10 Behavioural Emotion Elicitation IT ................................................... 57
3.10.1 Interaction Manager................................................................... 57
3.11 Integrated Interaction Tool ............................................................... 60
4 Conclusions ............................................................................................... 65
Annex A Overview of the Interaction Tools.................................................... 66
References ....................................................................................................... 82
List of Cited VERITAS Deliverables ................................................................. 87


List of Figures
Figure 1 Stereo image with polarization filters (left) and 4 wall CAVE
visualization system (right) ................................................................................. 8
Figure 2 Examples of haptic interfaces: Arm Exoskeleton and Hand
Exoskeletons (left) Desktop Haptic interface (right) .......................................... 10
Figure 3 Hands data glove (left) and markers for wrist optical sensor (right) ... 12
Figure 4 Vision cone ......................................................................................... 15
Figure 5 Eye point locations (left, right, middle)................................................ 16
Figure 6 Head orientation, vision line and fixation point ................................... 16
Figure 7 Rotation varying opening angles ........................................................ 17
Figure 8 Scaled human model.......................................................................... 19
Figure 9 Virtual markers ................................................................................... 19
Figure 10 Marker measurement stream (motion capture) ................................ 20
Figure 11 Joint angle limits ............................................................................... 20
Figure 12 First prototype of the vibrotactile bracelet equipped with 4 vibrating
motors without motion tracking functionalities .................................................. 24
Figure 13 GRAB Haptic user interface is able to deliver a force along any
wanted orientation in the 3d space ................................................................... 25
Figure 14 Custom Force-Feedback steering wheel developed by PERCRO
laboratory within the European project VIRTUAL
(http://vrlab.epfl.ch/Projects/virtual.html) .......................................................... 26
Figure 15 Logitech ClearChat wireless USB headset with microphone. ........... 31
Figure 16 Graphical user interface of the Montreal Imaging Stress Task (MIST,
Dedovic et al., 2005). From top to bottom, the figure shows the performance
indicators (top arrow = average performance, bottom arrow = individual
subject's performance), the mental arithmetic task, the progress bar reflecting
the imposed time limit, the text field for feedback, and the rotary dial for the
response submission. ....................................................................................... 34
Figure 17 Visual Impairment Tool Architectural Scheme .................................. 42
Figure 18 Kinematic Functional Limitation IT (1) Architectural Scheme .......... 45
Figure 19 Motor Impairment Kinematics Tool Architectural Scheme ............... 47
Figure 20 Dynamic Functional Limitation IT Architectural Scheme .................. 49
Figure 21 Control Functional Limitation IT Architectural Scheme ..................... 51
Figure 22 Cognitive IT Architectural Scheme ................................................... 53
Figure 23 Behavioural Stress Induction IT Architectural Scheme ..................... 55
Figure 24 Behavioural Stress Induction Offline IT Architectural Scheme ........ 56
Figure 25 Behavioural Emotion Elicitation IT Architectural Scheme ................. 58
Figure 26 Behavioural Emotion Elicitation Offline IT Architectural Scheme ..... 59

Figure 27 Integrated Interaction Tool Architectural Scheme ............................ 63

List of Tables
Table 1 Visual Impairment Tool Data Specification .......................................... 40
Table 2 Interaction Manager Data Specification ............................................... 41
Table 3 Kinematic Functional Limitation IT (Impl.1) Data Specification ............ 43
Table 4 Interaction Manager Data Specification ............................................... 44
Table 5 Kinematic Functional Limitation IT (Impl.2) Data Specification ............ 46
Table 6 Interaction Manager Data Specification ............................................... 47
Table 7 Dynamic Functional Limitation IT Data Specification ........................... 48
Table 8 Interaction Manager Data Specification ............................................... 49
Table 9 Control Functional Limitation IT Tool Data Specification ..................... 50
Table 10 Interaction Manager Data Specification ............................................. 50
Table 11 Cognitive IT Data Specification .......................................................... 52
Table 12 Interaction Manager Data Specification ............................................. 53
Table 13 Behavioural Stress Induction IT Data Specification ........................... 54
Table 14 Interaction Manager Data Specification ............................................. 55
Table 15 Behavioural Emotion Elicitation IT Data Specification ....................... 57
Table 16 Interaction Manager Data Specification ............................................. 58
Table 17 Interaction Manager Data Specification ............................................. 62


Abbreviations List
Abbreviation Explanation

AD Alzheimer's Disease
CAVE Cave Automatic Virtual Environment
CPU Central Processing Unit
CRT Choice Reaction Time
CSV Comma-Separated Value
DOF Degree of Freedom
DoW Description of Work
FMRI Functional Magnetic Resonance Imaging
FFSW Force Feedback Steering Wheel
GUI Graphical User Interface
HI Haptic Interface
HMD Head Mounted Display
HRTF Head Related Transfer Function
ISP Immersive Simulation Platform
IADS International Affective Digital Sounds
IAPS International Affective Picture System
IM Interaction Manager
IT Interaction Tools
LCD Liquid Crystal Display
MIST Montreal Imaging Stress Task
PBM Physical Based Modelling
PD Parkinson's Disease
PE Parameter Encoder
PET Positron emission tomography
SAPI Speech Application Programming Interface
VE Virtual Environment
VR Virtual Reality
WM Working Memory
WVB Wireless Vibrotactile Bracelet


Executive summary
This document provides an overview of the software architecture and the
specifications of the VERITAS Interaction Tools for Designer Experience that
will be developed within WP2.7.

VERITAS Interaction Tools are interfaces developed to enhance the designer's
experience, to the point that she/he will be able to feel or intuitively understand
the user's disabilities. The main underlying concept is that the novel interaction
tools shall provide either sensorial stimulations altered according to the
disability to be simulated, or other kinds of information able to reproduce a
stress equivalent to that experienced by disabled users. The aim is to let
designers directly experience a specific disability, and therefore gain a deeper
awareness and understanding of disabled users' needs, so as to help them
conceive products with better accessibility and usability. Thus, the designer is
driven through the assessment of the accessibility of designed products and
services in an intuitive way, becoming aware of the impairments of the user.

The first section of the document is dedicated to the description of the general
approach assumed for choosing which disabilities to simulate and for defining
the basic functionalities of the different tools. A brief overview of the different
types of disabilities, categorized into Physical Impairments, Cognitive
Impairments, and Behavioural and Psychological Impairments, is also
presented in this section.

The second section of this document provides guidelines on how the novel
interaction tools, developed in VERITAS to enhance the designer's experience
by transmitting information able to simulate or convey a certain disability, relate
to each other and follow a common view and common design guidelines.
Detailed specifications are then provided.

The third section of this document is dedicated to the definition of the software
architecture of each interaction tool, the type of data required by the different
tools for interacting with the Immersive Simulation Platform, and the
relationship with the Interaction Manager.

For each of the Interaction Tools the scheme of its integration within the
platform is provided.


1 Introduction
VERITAS SP2 foresees the development of an Immersive Simulation Platform
(ISP) in which the designer is involved in first person in testing the designed
system or service. She/he will be immersed in a Virtual Environment system
and be able to select the type of user. The designer will then embody the
end-user and will be able to feel the functional limitations associated with the
end-user's characteristics.

The aim of the work developed in WP2.7 is thus to provide innovative tools
(Interaction Tools, IT) that allow the designer to become aware of the
end-user's impairments through direct experience.

Since not all the impairments can be experienced in a direct way by the
designer, we consider two different approaches for the impairment simulation:

Direct Experience:

A subset of the functional limitations correlated with the impairments included
in the VERITAS platform can be directly replicated within the ISP. In this case
the designer can directly experience the functional limitation that affects the
end-user. An example is the replication of the visual field functional limitation,
in which the designer is immersed in a virtual scene that is purposely modified
in order to replicate the user's visual impairment as closely as possible.

Augmented Experience:

Not all the functional limitations can be directly replicated on the designer. For
example, it would not be acceptable to replicate pain due to arthritis or similar
motor impairments. However, in many of these cases we can use purposely
developed advanced interaction tools to augment the simulation experience.
The purpose of the interaction tool is to help the designer become aware of the
user's impairment. Following this approach, for example, the limitation in
motion of an articular joint can be simulated by visualizing warnings when a
joint limit is reached.

To this end we provide a brief classification of the different impairments in
order to clarify which of them will be included in the Interaction Tools for
designer experience, and which strategy is assumed for the simulation of the
different kinds of impairments.

A detailed taxonomy of VERITAS beneficiaries' impairments is analysed within
the framework of SP1 and is reported in the Deliverables D1.3.1 Abstract
physical models definition, D1.4.1 Abstract Cognitive User models definition
and D1.5.1 Abstract Behavioural and Psychological User models definition. A
short presentation of the outcomes of the aforementioned Deliverables follows
in order to introduce the reader to the disabilities under research and
simulation in VERITAS.

1.1. Physical Impairments


There are many different diseases that can cause some kind of physical
impairment, ranging from diseases of the nervous system to diseases of the
musculoskeletal system and connective tissues.

Functionally, physical impairments can be categorized as follows:

1.1.1 Motor Impairments


Motor Impairments consist of a loss or limitation of function in muscle control or
movement, or a limitation in mobility. This may include shakiness, arthritis,
paralysis, and limb loss, among other difficulties. Such functional limitations can
be classified into:

Kinematic functional limitations: reduction in the mobility and velocity of joints,
and ultimately in reach and dexterity abilities.

Dynamic functional limitations: reduction in muscle strength and ultimately in
the ability to produce useful forces.

Control functional limitations: neuromuscular deficiencies, which ultimately
result in difficulty in controlling movements.

1.1.2 Visual Impairments


Visual Impairment consists of a functional loss of vision resulting from either
disease, trauma, or congenital or degenerative conditions that cannot be
corrected by conventional means, such as refractive correction, medication, or
surgery.

This impairment of the vision system can be divided into two main functional
categories:

Visual Acuity Functional Limitations: loss of the ability to perceive details
presented with good contrast;

Visual Field Functional Limitations: reduction of the ability to simultaneously
perceive visual information from various directions in the environment.

1.1.3 Speech Impairments


Speech Impairment consists of functional limitations in speaking. It is a type of
communication disorder in which 'normal' speech is impeded. Stuttering and
lisps are the most common examples. Some rare cases can lead to people
who are totally unable to speak (mutism).

1.1.4 Hearing Impairments


Hearing impairment refers to conditions in which subjects are partially or
completely unable to detect or perceive sound or a certain range of sound
frequencies that are commonly heard by normal subjects.

1.2. Cognitive Impairments


The cognitive disabilities that will be addressed in the scope of the cognitive
impairment simulation tools are described in detail through WP1.4. Basically,
cognitive impairments cause alterations of cognitive functions, which can be
distinguished into Basic Cognitive Functions and Higher-Level Cognitive
Functions.

New Cognitive ITs for designer experience will have the aim of altering some of
these cognitive functions in order to reduce the designer's cognitive
performance to the level of the impaired user.

Basic Cognitive Functions and Parameters can be summarized as follows:

Reaction time is a measure of the overall cognitive performance speed and
may be affected in disabled and elderly users. Simple and choice reaction
times are considered, which correspond to one and several possible
responses, respectively. Slowing of reaction time is observed in the elderly and
in Parkinson's and Alzheimer's diseases.

Attention is involved in virtually all cognitive tasks. Age- or disability-related
decline in attention affects many aspects of a person's everyday life. We
consider three types of attention (selective, divided, and sustained) that are
relevant for realistic user modelling for the disability types and application
domains addressed within VERITAS.

Memory is the ability to store, retain, and recall information, and thus is also a
crucial aspect of cognitive performance. User modelling for the application
domains and disability types within VERITAS requires the following types of
memory to be modelled: semantic, episodic, procedural, and working memory.
Deficits in different memory types are observed in the elderly and in
Parkinson's and Alzheimer's patients.

Perception (visual, auditory, and haptic) may be impaired in the elderly. On the
other hand, haptic and auditory perception play a more important role for
vision-impaired people as a substitute for visual perception. Patients with
cognitive impairments, such as Parkinson's disease, have a face perception
impairment.

Higher-level Cognitive Functions are:

Decision making is important in many activities, especially for the automotive
application domain, where the ability to make timely and correct safety
decisions while driving a car is critical. Developing games for the elderly and
cognitively impaired also requires adequate modelling of their decision-making
capabilities.

Orientation in space is deteriorated in elderly and cognitively impaired people.
At the same time, blind individuals possess superior navigational skills which
do not rely on vision.

Speech and language may be deteriorated in cognitive impairments; therefore
these aspects should be taken into account when designing user interfaces.
Besides that, special attention is needed for improving speech recognition-
based devices and interfaces in the case of elderly speech or speech
impairments, for presenting information to people with dyslexia, and for
developing sign language-based interfaces for the hearing- and speech-
impaired.

Cognitive flexibility is the ability to switch attention from one aspect of an object
to another, or to shift the attentional set, as investigated in set-shifting tasks like
the Wisconsin Card Sorting Test. Patients with Parkinson's disease are
characterized by a deficit in set shifting, expressed by a difficulty in suppressing
a prepotent response.

1.3. Behavioural and Psychological Impairments


The behavioural and psychological impairments that will be addressed in the
scope of the simulation tools are analyzed in the framework of WP1.5.
Basically, behavioural and psychological impairments cause alterations of
behavioural and psychological facets.

New Behavioural and Psychological ITs for designer experience will have the
aim of altering some of these facets in order to impose on the designer a
behavioural and psychological state that is comparable to the state of the
end-user.

Behavioural and psychological facets can be summarized as follows:

Stress: disability induces stress or increases the stress level of a disabled
person in many real-life situations. In order to minimize the stress level of
disabled users, designers of improved products for disabled people should be
able to estimate the user's stress and take this information into account for
optimal design.

Fatigue: disabled and elderly people may get tired more rapidly. For
maintaining safety and optimal usability, the user's fatigue should be quantified
and included as a design parameter that should be kept below a certain level.

Motivation: performance at the workplace or the success of therapy or
rehabilitation is determined to a great extent by the person's motivation, so this
parameter should be assessed and taken into account for optimal results.

Emotional state: the user's satisfaction is reflected in his or her emotions,
which can be quantified using the techniques of affective computing and
entered as design parameters in order to ensure a positive emotional response
to using the improved products.

1.4. Overview of Virtual Reality Tools


In this section we provide a brief overview of the different Virtual Reality (VR)
tools that are considered for the implementation of the VERITAS ITs for
designer experience. Their functionality and system capabilities are reported in
order to introduce the tools that are going to be developed, which are described
in detail in the second section of this document.

1.4.1. Introduction to VR and multi-sensorial systems

Virtual Reality is a multi-sensorial immersive experience. The basic idea is to
build environments which do not actually exist, but can provide sensations and
stimuli able to let the user believe to be part of that environment.

For this to happen, a VR environment must obey the laws of physics which we
are commonly used to in daily life. A VR system, having to accomplish
functionalities like the perception of the user's movements, their control and the
sensorial response, needs a set of technologies among which we can find:

3D stereoscopic display

3D sound

force and tactile feedback

position tracking and motion capture

vocal synthesis and analysis

1.4.2. Visualization technologies

Images produced by the 3D rendering process must be presented to the user's
eyes; this can be achieved in several ways: by using a simple desktop monitor,
projection screens, or stereoscopic devices like the Powerwall, the CAVE or a
Head Mounted Display (HMD), a wearable helmet provided with two small
displays placed just in front of the user's eyes.

The principle of stereoscopy is based on the fact that humans have two eyes,
each one giving a different perspective view of the surrounding world. By
combining these two bi-dimensional perspective images, it is possible to
recover information on the missing third dimension (depth).

Basically two different hardware technologies are available for the perception of
stereo images: active stereo and passive stereo. In active stereo, for each
frame two images are projected sequentially; therefore, there is a continuous
high-frequency (about 120 Hz) switching between images for the right eye and
images for the left eye. Users wear a special active device, shutter glasses,
which is synchronized with the image switcher and able to make the lenses
opaque or transparent. When the image for the right eye is present, the left lens
is completely opaque, otherwise it is transparent. The same happens for the
right lens. The human brain receives a sequence of images, but they are
presented and switched so quickly that it believes it perceives them at the same
time. In other words, the brain merges the images and can reconstruct depth
from them.

In passive stereo both images are projected at the same time but, thanks to a
filtering system (the most widely known use colour filters or polarization filters,
which exploit the phenomenon of light polarization), only the correct image
(Figure 1) reaches each eye. There are advantages and disadvantages to both
technologies: active stereo is more expensive and requires dedicated
hardware, while passive stereo presents the problem of ghosting (or stereo
crosstalk), which means that one eye also perceives a small fraction of the
image presented to the other eye.

The HMD does not suffer from this problem because each eye has an LCD
panel directly in front of it; however, it is expensive and has a limited field of
view. Autostereoscopic displays are 3D displays that do not require wearing
special glasses or helmets, allowing the viewer to see a different perspective of
a scene with each eye. Typically, these displays use lens arrays or special
polarization to direct the left and right images to the appropriate eye. The
drawback of this type of display is that it requires the head to be positioned
within a narrowly controlled space, although recent developments in technology
are expanding the space in which the head may be positioned.


Figure 1 Stereo image with polarization filters (left) and 4 wall CAVE visualization system
(right)

In the field of portable devices, retinal displays may represent the near future of
HMD see-through displays. They use scanned-beam technology to optically
guide the computer-generated image directly to the user's eye. The specially
coated ocular piece is optimized to allow the image to be reflected into the
user's eye, while simultaneously allowing the user to continue to see the
outside world unhindered. Users can adjust the optical focus of the image,
placing it precisely at their working distance. The result is good clarity of the
combined image data and the real world.

In order to produce the right feedback to the sensorial system and allow a
natural interaction with the Virtual Environment (VE), stereoscopy must be
combined with a tracking system that, at every instant, reads the position of the
user's head.

It is necessary to know the position of the observer in the VE in order to
calculate the correct perspective and the direction of the eye separation. A
tracking system able to perform a complete motion capture is also desirable, to
provide the VR system with information about the location of every relevant
part of the user's body, particularly the ones most involved in interaction, like
limbs or hands.
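
As an illustrative sketch (in Python with NumPy; the function name and the
64 mm interpupillary distance are assumptions, not part of the VERITAS
specification), the tracked head pose can be turned into the two per-eye
viewpoints needed for stereo rendering as follows:

    import numpy as np

    def eye_positions(head_pos, right_vec, ipd=0.064):
        # Offset the tracked head position by half the interpupillary
        # distance (IPD) along the head's transversal (right) vector.
        right_vec = right_vec / np.linalg.norm(right_vec)
        half = 0.5 * ipd * right_vec
        return head_pos - half, head_pos + half  # left eye, right eye

    # Example: head at 1.7 m height, transversal vector along +X.
    left, right = eye_positions(np.array([0.0, 1.7, 0.0]),
                                np.array([1.0, 0.0, 0.0]))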

1.4.3. Sound in Virtual Environments

The multi-modal presentation of information can stimulate different senses,
improve the sense of immersion perceived by the user, and increase the
amount of information accepted and processed by the user. This increase of
information may reduce the errors and the time taken to complete a task.
Existing studies show that the addition of 3D sound feedback introduces an
improvement in task performance. These studies also indicate that integrated
feedback offers better task performance than any feedback used in isolation. In
general, the sound feedback complements and enriches the interaction
towards a full multi-modal interface that offers a coherent perception of the
environment.

A complete and accurate recording and reproduction of a 3D sound field
usually adds a new dimension to information presentation and interaction.
However, current surround sound systems provide a relatively flat plane of
perception, therefore it is necessary to adopt more complex sound processing
methodologies.

The typical way to recreate a sampled audio field is to convert sound waves
into electrical signals. Depending on the desired level of directionality it is
possible to have mono recordings (involving only one channel from one
direction), stereo recordings (involving two channels from different
locations/directions) or surround capture (involving more channels).

Ordinary two-channel stereo recordings fail to take into account the
complexities of human hearing, especially the transfer function characteristics
of the pinnae, i.e. the frequency-response and arrival time differences caused
by the outer ears. As a matter of fact, the recording is head-dependent. A
possible solution is to perform binaural recordings, which take into account and
combine information from both ears. This capture method is aimed at the
subsequent synthesis of the HRTF (Head Related Transfer Function), a sort of
ear-print specific to each person, which can be used to process audio signals
to recreate the directionality of the original sound field. Unless it is necessary to
have an HRTF characteristic of a well-specified user, it is usually sufficient to
synthesize an average HRTF. For this purpose, binaural recordings are carried
out using a mannequin with two microphones placed at its ears. The HRTF is
obtained as the response to a sound impulse whose location is moved around
a sphere surrounding the mannequin. This technique can also be used to
sample, on a discrete space, the sound field generated by a specified sound
source in a head-centred way.
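
As a minimal sketch of how a measured HRTF can be applied (Python with
NumPy/SciPy; the function and its inputs are illustrative assumptions, with the
impulse responses presumed to come from a measured average HRTF set):

    import numpy as np
    from scipy.signal import fftconvolve

    def binaural_render(mono, hrir_left, hrir_right):
        # Convolve the dry mono signal with the left/right head-related
        # impulse responses (HRIRs) measured for the desired direction.
        left = fftconvolve(mono, hrir_left)
        right = fftconvolve(mono, hrir_right)
        return np.stack([left, right], axis=-1)  # (N, 2) stereo buffer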

The existing methodologies of audio rendering attempt to address the following
issues:
sound modelling (the sound can either be captured and the related
waveform rendered, or it can be synthesized based on related physical
properties)
sound rendering, including directionality (the location of the sound field,
usually synthesized by means of filters) and environmental issues (the
complete sound field is re-created taking into account the properties of
the surrounding environment and how it modifies the original sound).


1.4.4. Haptic Sensorial Channel

Without doubt, visual and acoustic rendering are the most important and,
consequently, the most commonly implemented feedbacks in a VR system.
Among the various motivations, it can be considered that a consolidated base
of hardware devices already existed, even if used in other sectors: projectors,
loudspeakers, monitors and headphones were already largely available
independently of VR purposes. An increasing importance, however, is being
assumed by haptic feedback, as haptic interfaces become more affordable and
widespread. Haptic Interfaces (HI), originally created for tele-operation
purposes, are robotic devices able to interactively exert stimuli which induce
tactile perception. The main functionalities of a HI are to exert a well-known
force or torque (achieved by means of actuators) and to acquire the position of
the part of the user's body involved in the interaction (achieved by means of
sensors).

Figure 2 Examples of haptic interfaces: Arm Exoskeleton and Hand Exoskeletons (left)
Desktop Haptic interface (right)

Several categorizations exist for Haptic Interfaces (HIs), depending on the
number of degrees of freedom (DOF) actuated and/or acquired, on the type of
kinematics (parallel or serial), on the number of contact points, and on the
morphology (anthropomorphic, portable or desktop). HIs must be controlled by
a software layer commonly referred to as Haptic Rendering, which is in charge
of providing forces to the user according to his/her interaction with the virtual
environment. Several concepts are very similar to visual rendering; therefore
some of the examined structures and algorithms may also apply to haptic
rendering software.
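
A minimal sketch of the core idea of haptic rendering, assuming a hypothetical
device API and illustrative stiffness and damping values, is the classic
spring-damper reaction to penetration of a virtual wall:

    K = 1500.0   # N/m, virtual wall stiffness (illustrative)
    B = 2.0      # N*s/m, damping (illustrative)

    def wall_force(x, v, wall_x=0.0):
        # Spring-damper reaction force when the probe penetrates the wall.
        penetration = wall_x - x
        if penetration <= 0.0:
            return 0.0                  # no contact, no force
        return K * penetration - B * v  # push the probe back out

    # This loop would run at ~1 kHz on the haptic controller:
    # while True:
    #     x, v = device.read_position_velocity()  # hypothetical API
    #     device.apply_force(wall_force(x, v))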

In order to achieve a realistic simulation of the behaviour of a VE and its
components, an accurate model of their physical properties is increasingly
required. Physical Based Modelling (PBM) makes it possible to simulate the
physical processes involved in the interaction with a VE; it is therefore in
charge of handling almost all of the VE behavioural data management. The
physics engine may be of variable complexity, depending on whether it is
involved in simple collision detection tasks (which are crucial for the correct
interaction among virtual objects and between the user and the VE) or in
deeper calculations on the dynamics of VE objects.

As PBM usually needs huge amounts of memory and CPU time to achieve an
interactive real-time simulation, the PBM module often runs on a separate
computing node. Generally speaking, all the rendering modules for the various
sensorial channels are considered to be distinct (even if they run on the same
computing node) and exchange information by means of network
communication (messages) or shared memory. This often leads to redundancy
in the database, because the same object may be modelled in different
representations or resolutions according to the particular rendering module.
Moreover, when several of these modules co-exist, the problem of co-locating
the stimuli generation arises: as an example, when a virtual hand visually
touches a virtual object, the corresponding haptic feedback should be
generated at the same time and with consistent properties, so that the user
does not perceive an unnatural or, even worse, disturbing feeling.

It is evident that an integrated approach to VR application management is
highly desirable in order to minimize redundancies and, therefore, optimize the
efficiency of the system.

1.4.5. Devices for Interaction

The architecture of a VR system, which already appears sufficiently complex,
is further complicated by the presence of additional software modules in
charge of acquiring efferent data, i.e. information going from the user to the
VE. This information is needed to handle the interaction, whether it consists of
a simple dynamic update of the perspective (which requires the exact position
and orientation of the user's head) or of complex manipulations of virtual
objects.

The techniques to acquire the user's relevant data are basically:

Motion and posture capture
Audio capture and speech recognition

Motion capture is the process of acquiring spatial data related to the user's
body. The position and the posture of the body, fingers and limbs, or the
sequence of these postures combined to form an animation, are acquired by
tracking devices. These devices may be able to acquire single-point 6-DOF
data (position and orientation) or more structured data (like the joint angles of
body postures). Several technologies exist, based on different physical
principles, like magnetic fields, ultrasonic waves and kinematic chains, or on
different data acquisition methodologies, like range measurements, marker
recognition and motion-related data. Each one has its pros and cons and,
depending on the specific task, can be opportunely used.
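
As an illustrative data layout (not a VERITAS specification), a single-point
6-DOF sample from such a tracking device could be represented as:

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class TrackerSample:
        # One single-point 6-DOF sample from a tracking device.
        timestamp: float                                 # seconds
        position: Tuple[float, float, float]             # x, y, z in metres
        orientation: Tuple[float, float, float, float]   # unit quaternion (w, x, y, z)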

The features required from a tracking system are mainly:

precision (to ensure the correctness of the data processing)

low latency (to avoid erroneous behaviours that may lead even to sickness)

low encumbrance (to ensure wearability and to avoid hampering the user's
motion)

Speech recognition is the process of converting an acoustic signal, captured by
a microphone or a telephone, into a set of words [COLE 1997]. The recognized
words can be the final result, as for applications such as command & control
and data entry. They can also serve as the input to further linguistic processing
in order to achieve speech understanding.

Figure 3 Hands data glove (left) and markers for wrist optical sensor (right)

The acquired efferent data are subsequently processed to allow the actual
interaction, with the user modifying the virtual environment according to his/her
actions. In the simpler cases (for instance, the updating of the point of view) the
process is straightforward. In other cases the most suitable interaction
metaphors must be accurately chosen to achieve the expected result.

Metaphors reproduce concepts, known by the user in a certain context, in order
to transfer this knowledge into a new context related to the execution of a task.
Therefore, they have to be sufficiently representative of the task and compatible
with the user's knowledge. They also have to be compatible with the physical
constraints of the used interfaces. The typical tasks involved in a VR application
are:

Environment navigation and motion
Object selection
Object manipulation

A number of different metaphors have been designed and implemented, which
are categorized depending on the interaction device or on the type of
interaction.

In the latter case they are divided into:

egocentric metaphors (first-person)
exocentric metaphors (third-person)

The interaction may be direct (the user performs actions in first person, seeing
his/her real body interacting with the VE) or mediated (either in first or third
person, seeing an avatar).

An avatar is an interactive representation of a user in a virtual environment,
which is needed when the user cannot see his/her body (for instance when
wearing a Head Mounted Display). Even if the first avatars provided only a
rough visual representation, they were well accepted by users as long as they
carefully followed their movements. Nowadays the technological improvements
allow visualizing highly realistic avatars which make use of advanced
techniques of real-time deformation (skinning), photo-realistic rendering and
physical based modelling. Avatars are used not only for the visual self-
perception of users, but also as a graphical representation of other users in a
shared virtual environment.


2 Specification of Interaction Tools


In order to make the designer experience specific impairments, the VERITAS
Interaction Tools will be developed and integrated in the VERITAS Immersive
Platform. This platform provides a virtual environment in which the designer
can perform different physical actions like walking, reaching, driving etc.
During the performance, specific devices measure the designer's behaviour
and give her/him feedback on the virtual interaction. On the basis of the data
acquired for specific designer actions, the ITs will make it possible to simulate
the specific impairments by giving correspondingly altered feedback to the
designer.

The Physical, Cognitive and Behavioural & Psychological Interaction Tools that
are going to be developed in WP2.7 are detailed in the following sections.

A brief overview of the ITs is also provided in Annex A Overview of the
Interaction Tools.

2.1 Physical Interaction Tools

2.1.1 Visual Impairment IT


Visual impairment (or vision impairment) is a significant limitation of visual
capability resulting from either disease, trauma, or congenital or degenerative
conditions that cannot be corrected by conventional means, such as refractive
correction, medication, or surgery. This functional loss of vision is typically
defined as manifesting with a best-corrected visual acuity of less than 20/60, or
a significant central field defect, a significant peripheral field defect including
homonymous or heteronymous bilateral visual field defect or generalized
contraction or constriction of field, or reduced peak contrast sensitivity in
combination with either of the above conditions.1

The tool described in this section focuses on the simulation of visual
impairment and provides the designer with a means to perceive how a
beneficiary suffering from a visual impairment may perceive the environment.

2.1.1.1 Tool Overview


According to D1.3.1 the important property visual acuity can be expressed as
the reciprocal value of the size of the gap (measured in arc minutes) of the
smallest "Landolt C"2 that can be reliably identified. This can be geometrically
modelled through a cone with the corresponding opening angle at the eye
position, aligned to the vision line (Figure 4). In order to cover additional
aspects such as gaze and visual perception ranges, more general cones have
to be provided, which are based on varying opening angles along the cone
rotation around the vision line and on the head orientation.

1
See http://en.wikipedia.org/wiki/Visual_impairment
2
Standard symbol used for testing vision (see http://en.wikipedia.org/wiki/Landolt_C)

Figure 4 Vision cone

This geometrical construct is the basis for the visual interaction tool. The cone
is generated from the input parameters eye position, vision line and opening
angles. The opening angles are set by the selected virtual user model, while
the other position parameters have to be provided according to the designer's
behaviour.

The integration of that tool in the immersive platform will support the following
online interaction process:

The designer's eye position and vision line are continuously tracked by a
specific head-eye tracking device.

Based on the tracked eye and vision data the corresponding vision cones are
visualized in the virtual environment through a specific head-video device.

In addition the space outside the vision cone is disturbed or hidden completely.

When this process is applied to the designer acting in the virtual environment,
he can only see and recognize the environment within the vision range
restricted by the selected visual impairment. This allows a direct experience of
these impairments by the designer.


2.1.1.2 Technical specification


The interaction tool requires the following input parameters:

Eye position list

The coordinates of the left and right eyes, or of the middle point between the
eyes, are given (Figure 5). At least one eye position has to be provided.


Figure 5 Eye point locations (left, right, middle)

Vision line or fixation point

The line of vision has to be provided as a geometrical line (including direction
and distance) or alternatively as the position of the fixation point (the vision line
is then defined from the eye and fixation positions) (Figure 6).

Head orientation (optional)

This input parameter is optional and only necessary for specific vision cone IDs
(see below). The orientation is given by the head upward and transversal
vectors (Figure 6).


Figure 6 Head orientation, vision line and fixation point


Vision cone ID

This input parameter defines which kind of vision cone should be generated.
Possible values are acuity, gaze, and visual perception.

Vision cone control parameter

Each vision cone requires a specific set of parameters, which control the
geometrical representation of the cone. These parameters are taken from the
given virtual user model and can be absolute (e.g. opening angle in degrees)
or relative (reduction of normal/average opening angles in percentage).

Based on the input parameters, the interaction tool performs the following
process:

When the selected vision cone (ID) requires the head orientation, the current
vision line is rotated to be perpendicular to the head upward and transversal
vectors.

The vision cone is generated as a rotation object around the vision line. For
each rotation angle between 0° and 360° the corresponding opening angle is
extracted from a database according to the selected vision cone ID. The
opening angles are modified according to the given control parameters. The
cone origin is the given eye position and the cone height is equal to the given
vision line length.

Figure 7 Rotation varying opening angles

The generated vision cone is represented by a geometrical surface object (e.g.
a B-Spline surface) (Figure 7). Optionally a triangle mesh is generated from the
vision cone surface.

This process will provide the following output objects:


One vision cone for each given eye position (left, right, middle) as a graphical
surface representation (e.g. a B-Spline surface).
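
A minimal sketch of the cone generation step described above (in Python with
NumPy; names and the rim-sampling strategy are illustrative assumptions, and
the actual tool outputs a B-Spline surface rather than sampled rim points):

    import numpy as np

    def vision_cone_rim(eye, vision_line, opening_deg, samples=72):
        # Sample the rim of the vision cone: origin at the eye, axis along
        # the vision line, cone height equal to the vision line length.
        # opening_deg is the opening angle profile from the user model,
        # either a scalar or one value per sampled rotation angle.
        h = np.linalg.norm(vision_line)
        axis = vision_line / h
        # orthonormal basis (u, v) perpendicular to the cone axis
        u = np.cross(axis, [0.0, 0.0, 1.0])
        if np.linalg.norm(u) < 1e-6:       # axis parallel to z: use another helper
            u = np.cross(axis, [0.0, 1.0, 0.0])
        u = u / np.linalg.norm(u)
        v = np.cross(axis, u)
        phi = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
        alpha = np.broadcast_to(np.radians(opening_deg), phi.shape)
        return np.array([eye + h * (axis + np.tan(a) * (np.cos(p) * u + np.sin(p) * v))
                         for p, a in zip(phi, alpha)])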

2.1.2 Kinematic Functional Limitation IT Impl.1


2.1.2.1 Overview
According to D1.3.1, mobility can be expressed by reduced anatomical joint
angle ranges. This can be kinematically modelled through an articulated digital
human model equipped with the corresponding joint angle ranges.

This articulated digital human model is the basis for the mobility interaction
tool. Based on a set of body point positions the human model is moved
accordingly by inverse kinematics methods. These methods calculate proper
joint angles to match the given body point positions (motion tracking). The
calculation takes the specific joint angle ranges into account: when a motion
forces a joint to move beyond its limits, the joint angle is set to the limit, which
stops the joint movement. The joint angle ranges are set by the selected virtual
user model, while the body point position parameters have to be provided
according to the designer's behaviour. This is done using a motion capturing
system to measure marker positions during the motion.

The integration of that tool in the immersive platform will support the following
online interaction process:

An individual human model is scaled to the designer's anthropometrics in order
to allow proper motion tracking.

The designer's body point positions are continuously tracked by a specific
marker tracking device.

Based on the tracked body point positions the human model is moved
correspondingly and visualized in the virtual environment through a specific
head-video device. The motion calculation takes the given joint angle limits into
account.

In addition the human model posture is checked for joint angles blocked at their
limits. In this case the joint is marked on the human model visualization.
Alternatively the blocked joints can be displayed as a list through the video
device.

When this process is applied to the designer acting in the virtual environment,
he sees his virtual body representation moving and receives virtual feedback
when a movement is not possible due to the limited mobility. Through
corresponding modifications of his motion behaviour the designer can imitate
and experience the restricted motion caused by the mobility impairment.


2.1.2.2 Tool specification


The interaction tool requires the following input parameters:

Digital human model scaled to the designer's body dimensions

The digital human model is scaled to represent the designer's kinematics and
body dimensions (Figure 8) in order to allow proper motion tracking.

Figure 8 Scaled human model

Virtual marker positions

According to the designer's marker configuration in the immersive system, the
digital human model is equipped with corresponding virtual markers. The
virtual positions on the manikin have to be in line with the real marker positions
on the designer's body (Figure 9).

Figure 9 Virtual markers

Continuous marker position measurement stream (motion capture)

The motion of the designer in the immersive system is continuously tracked
(Figure 10) and the corresponding marker positions are provided to the
interaction tool.


Figure 10 Marker measurement stream (motion capture)

Joint specific range of motion

Each joint of the human model is equipped with joint angle limits restricting the
joint kinematics to produce anatomically reasonable motions (Figure 11). The
limits can be set to standard values (average people) or to specific values.
These specific values can be provided absolutely or calculated by factors from
the standard values. The absolute values or the factors are extracted from the
selected virtual user model.

Figure 11 Joint angle limits

Based on the input parameters, the interaction tool performs the following
process:

The current marker position set is provided from the continuous measurement
stream (motion capture).

The digital human model, which is scaled to the designer's body dimensions
and equipped with proper virtual markers, is moved using inverse kinematics.
In the final calculated posture the virtual marker positions match the measured
marker positions.


When the inverse kinematics method leads to joint angles beyond the
corresponding limits, the joint motions are blocked at these limits. These
blocked joints are marked internally.

The digital human model appearance is updated according to the newly
calculated posture. The blocked joints are highlighted on the manikin or
displayed in the corresponding list.

This process will provide the following output objects:


Updated human model appearance according to current marker positions
Highlighted or listed joints which are blocked due to joint angle limits
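
A minimal sketch of the joint-limit step described above, with illustrative names
and data layout (the actual tool performs this inside the inverse kinematics
solver):

    def apply_joint_limits(proposed, limits):
        # proposed -- {joint name: angle in degrees} from the IK solver
        # limits   -- {joint name: (min, max)} from the virtual user model
        # Returns the clamped angles and the list of blocked joints.
        clamped, blocked = {}, []
        for joint, angle in proposed.items():
            lo, hi = limits[joint]
            if angle < lo or angle > hi:
                blocked.append(joint)          # to be highlighted on the manikin
            clamped[joint] = min(max(angle, lo), hi)
        return clamped, blocked

    angles, blocked = apply_joint_limits(
        {"elbow_flexion": 150.0}, {"elbow_flexion": (0.0, 120.0)})
    # angles == {"elbow_flexion": 120.0}, blocked == ["elbow_flexion"]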

2.2 Physical IT through innovative VR tools

The VR devices that are foreseen to be integrated and developed consist
mainly of motion tracking devices and haptic/vibrotactile interfaces. Some of
them will be off-the-shelf devices that will require integration efforts, and others
will be completely custom-developed for the VERITAS system.

Haptic/vibrotactile user interfaces are able to stimulate the human haptic
channel in relation with the actions that are performed within the virtual
environment. Such stimulation will be properly designed in order to allow the
designer immersed in the simulation environment to reinforce his/her
awareness of the end-user's disability.

In particular, the physical impairments that will be simulated through new
advanced VR tools (developed in A2.7.6) are the so-called motor impairments.
These are disabilities that can cause symptomatic limitations in the free
movement of the user's limbs, reduced capability of exerting forces on the
surrounding environment and reduced control of movements.

In the VERITAS ISP the following types of functional limitations will be
addressed:

Joint limitations: many common diseases associated with aging and several
specific pathologies can cause limitations in the movements of the arm, leg,
neck, backbone etc. Among them, the VERITAS VR tools mainly address arm
and leg movements.

Force exertion limitations: common diseases associated with aging and other
specific pathologies can cause a reduced ability in force exertion. The
VERITAS VR tools will be able to simulate these symptoms, focusing on the
user's arm forces.


Reduced motor control: common diseases associated with aging and other
specific pathologies (for example tremor) can cause a reduced ability in
controlling the movements of the limbs. The VERITAS VR tools will be able to
simulate these symptoms, focusing on the user's arm movements.

In order to simulate these kinds of functional limitations it has been chosen to
employ and develop three different types of VR devices:
A wearable vibrotactile device;
A desktop kinaesthetic haptic user interface;
A haptic car steering wheel.

The next sections describe, for each VR device, the details of its principles and
functionality and its integration in the VERITAS ISP.

2.2.1 Kinematic Functional Limitation: Wearable Vibrotactile


2.2.1.1 Vibrotactile Devices
Vibrotactile devices are simple tools able to provide vibration sensations at
certain locations on the human skin. Such devices are widespread in mobile
devices such as mobile telephones, where they are usually employed to
provide warnings and flag messages.

In several applications such devices are arranged in the form of bracelets or
suits and used to communicate directional information, e.g. while driving a car
the device informs the driver about the direction of a possible imminent
collision.

An interesting example of their application is when several vibrating elements
are disposed around the wrist to form a bracelet. In this configuration the
resulting device shows a very good capability of indicating directional
information, as reported in [Serg08] and in [Died09].

2.2.1.2 VERITAS application


The foreseen application of vibrotactile devices in VERITAS is an example of
the above-mentioned strategy, which consists in transmitting coded information
to make the designer aware of the functional limitations of the end-user. In
particular, the vibrotactile device will be employed to communicate messages
representing warnings related to spatial information.

The device that is going to be developed is a bracelet called the Wireless Vibrotactile Bracelet (WVB), worn around the designer's wrist or ankle. The bracelet will have two basic functionalities:

To exert controlled vibrations at different locations around the wrist, in order to indicate the direction in which an encountered constraint is being approached;

To track the 3D position of the user's wrist, in order to provide the OSP with the information needed to animate the virtual human model.

2.2.1.3 Functionalities:
The Wireless Vibrotactile Bracelet, represented in Figure 12 and already in development at the PERCRO Laboratory [Died09], is a wireless device with rechargeable batteries and on-board Bluetooth communication, able to provide the user with programmable pulsating vibrotactile feedback at multiple points around the wrist. Such a device has been applied to support navigation in real and virtual environments.

In VERITAS the WVB will be employed to communicate, to the designer immersed in the simulation environment, information on possible mobility constraints associated with functional limitations of the articular joints.

Typical physical functional limitations of elderly and disabled users involve the limitation of joint movements due to arthritis and calcification, or as a consequence of injuries. The movements of the joints are limited by the stiffening of ligaments or by inflammation, and the user feels pain when reaching certain configurations.

These VERITAS VR devices will focus on the movements of the human arm and leg. Other body parts, such as the neck and hip, are also critical, but their inclusion is outside the application fields considered in the VERITAS project.

The new WVB is going to be designed so that the designer will receive vibrations around the wrist indicating that the arm is progressively approaching critical positions while performing the tasks required by the activity in the simulation environment.

The commanded vibration will intuitively communicate the direction of approach along which the motor restriction affects the end-user's mobility.

The WVB will be equipped with a motion tracker that measures in real time the position of the wrist; from this measurement, an inverse kinematics algorithm computes the configuration of the shoulder and forearm joints of the virtual user.
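As a purely illustrative C# sketch of how this feedback could be driven, the fragment below maps the wrist's tracked proximity to a joint-limit boundary onto per-motor vibration intensities for a four-motor bracelet such as the prototype in Figure 12. The motor layout, the 10 cm warning distance and all identifiers are assumptions for illustration, not the actual WVB control code.

```csharp
using System;
using System.Numerics;

class WvbFeedback
{
    // Unit vectors (in the wrist reference frame) pointing towards the 4 motors.
    static readonly Vector3[] MotorDirections =
    {
        new Vector3( 1, 0, 0), new Vector3(-1, 0, 0),
        new Vector3( 0, 1, 0), new Vector3( 0, -1, 0)
    };

    // constraintDirection: unit vector from the wrist towards the blocked region;
    // distanceToLimit: metres remaining before the joint limit is reached.
    public static float[] ComputeIntensities(Vector3 constraintDirection, float distanceToLimit)
    {
        const float warningDistance = 0.10f;   // assumed: start vibrating 10 cm before the limit
        float proximity = Math.Min(1f, Math.Max(0f, 1f - distanceToLimit / warningDistance));

        var intensities = new float[MotorDirections.Length];
        for (int i = 0; i < MotorDirections.Length; i++)
        {
            // Motors facing the constraint pulse the strongest; opposite ones stay off.
            float alignment = Math.Max(0f, Vector3.Dot(MotorDirections[i], constraintDirection));
            intensities[i] = proximity * alignment;   // 0..1, to be sent to the motor driver
        }
        return intensities;
    }
}
```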


Figure 12 First prototype of the vibrotactile bracelet, equipped with 4 vibrating motors and without motion tracking functionality

2.2.1.4 Device design input

The design of the WVB and of the software application for its control will require knowledge and quantification of the functional limitations caused by typical motor impairments of the arm and forearm of disabled users.

The VR tool will require the exchange of user impairment information provided by the Integrated Interaction Tool.

2.2.1.5 Data exchange with other components

The data exchanged with the ISP will be the position of the designer's wrist, used to update the pose of the virtual human model in the virtual environment.

2.2.2 Dynamic Functional Limitation IT: Haptics

2.2.2.1 Haptics:
Kinaesthetic haptic user interfaces are robotic devices able to exert controlled forces on the human body according to interactive control laws. In many applications such forces are exerted in order to simulate a virtual force (e.g. a weight) or the contact that occurs with virtual surfaces. In some particular applications haptics can fulfil the function of simulating virtual constraints, as in [Const2007].

2.2.2.2 VERITAS application:

Haptic kinaesthetic devices, unlike vibrotactile ones, can be employed to simulate functional impairments directly. In this case the device does not transmit coded information; instead, it is commanded so as to physically simulate the real end-user's impairments.


In particular, in the VERITAS ISP it is foreseen to use a haptic desktop device called GRAB, with the aim of introducing disturbing forces to simulate functional limitations in the control of movements.

The GRAB device (Figure 13) is a haptic interface able to exert controlled forces on the user's hand and fingers. Its functionality is basically equivalent to that of the commercial Phantom device from Sensable, but its workspace is much larger and covers almost completely the workspace of the human arm.

In the VERITAS ISP it is planned to integrate this device to fulfil the function of simulating functional limitations associated with motor control disabilities such as essential tremor, Parkinson's disease and, more generally, neurodegeneration due to aging.

2.2.2.3 Functionalities:
The GRAB haptic user interface is a 3-DoF (Degrees of Freedom) haptic device able to exert forces of up to 5 N along any 3D direction. A detailed description of the device is provided in [Aviz03].

In order to simulate functional limitations associated with motor control, it is foreseen to use the GRAB haptic user interface to introduce disturbing forces. Such disturbing forces will be commanded so as to modify the trajectories of the designer's hand. The modification caused by these disturbances will be equivalent to the distortion of movements that affects the disabled end-user.
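One plausible way to command such disturbances, shown in the hedged C# sketch below, is to superimpose a low-frequency sinusoidal force with a small random jitter on the nominal haptic rendering, called once per cycle of the control loop. The frequency band, the amplitude ceiling and all names are illustrative assumptions, not the actual GRAB controller.

```csharp
using System;
using System.Numerics;

class TremorDisturbance
{
    readonly Random rng = new Random();
    double phase;

    // frequencyHz: e.g. in the 4-8 Hz band often reported for tremor;
    // amplitudeN: peak disturbance force, kept well below the 5 N device limit;
    // dt: control-loop period in seconds (e.g. 0.001 for a 1 kHz loop).
    public Vector3 NextForce(double frequencyHz, float amplitudeN, float dt)
    {
        phase += 2.0 * Math.PI * frequencyHz * dt;
        // A small random jitter keeps the disturbance from feeling purely periodic.
        float jitter = (float)(0.2 * (rng.NextDouble() - 0.5));
        float magnitude = amplitudeN * ((float)Math.Sin(phase) + jitter);
        return new Vector3(magnitude, 0f, 0f);   // disturbance along one axis, for simplicity
    }
}
```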

Figure 13 The GRAB haptic user interface is able to deliver a force along any desired direction in 3D space

2.2.2.4 Device design input

The design of the applications running on the GRAB haptic device will require knowledge and quantification of the functional limitations caused by typical motor impairments of the arm and forearm of a disabled user. The VR tool will require the exchange of user impairment information provided by the Integrated Interaction Tool.

2.2.2.5 Data exchange with other components

The data exchanged with the ISP will be the position of the designer's hand, used to update the pose of the virtual human model in the virtual environment.

Figure 14 Custom force-feedback steering wheel developed by the PERCRO laboratory within the European project VIRTUAL (http://vrlab.epfl.ch/Projects/virtual.html)

2.2.2.6 Force feedback steering wheel

The Force Feedback Steering Wheel (FFSW) is an example of the so-called task-replica haptic user interfaces. Such devices are haptic interfaces characterized by reproducing the real tool that the end-user handles in the real world.

This kind of device shows very good performance in terms of realism and quality of feedback, but it obviously lacks versatility. In the case of the FFSW, the application is restricted to a car simulation environment.

Examples of the application of this kind of device within virtual environments can be found in [Frisoli03], where a complete VR cockpit was developed for virtual prototyping and ergonomic studies. Another example is reported in [Sungjae2010], where haptic and tactile feedback are integrated.

2.2.2.7 VERITAS application

In VERITAS it is foreseen to develop application scenarios related to the automotive sector. The interaction tool supporting this kind of immersive simulation environment is the force-feedback steering wheel.


Concerning disabilities related to functional limitations on the magnitude of force that the end-user can apply, it is possible to use the direct-experience approach. This means that the device will put the designer in a condition to perceive forces equivalent to those perceived by the disabled user.

In the case of driving a car, the vehicle direction is controlled by applying forces to the steering wheel and controlling its rotation angle.

In particular, the functional limitations that will be considered are those related to disabilities that cause loss of muscle tone and reduced muscular mass, i.e. typical disabilities associated with aging and with other specific pathologies such as ulnar neuropathy, or occurring as a consequence of lordosis.

2.2.2.8 Functionalities
The FFSW device will allow programming the force response of the steering wheel according to the dynamic model of the simulated car. The force feedback is programmed by default to simulate the behaviour of a real steering wheel; when the designer asks for the simulation of one of the above-mentioned impairments, the force response of the steering wheel is altered. In particular, the steering force will be increased in order to make the designer feel the equivalent effort of the end-user.
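A minimal sketch of this equivalent-effort scaling, assuming the nominal steering torque has already been computed from the car dynamic model, could look as follows; the strength ratio, its lower bound and all names are illustrative assumptions that, in the real tool, would be derived from the D1.3 data.

```csharp
using System;

static class SteeringEffort
{
    // userStrengthRatio: end-user arm strength relative to an unimpaired
    // reference, in (0, 1], e.g. 0.6 for a 40% strength reduction.
    public static float ImpairedTorque(float nominalTorqueNm, float userStrengthRatio)
    {
        // Scaling the resisting torque by the inverse ratio makes the unimpaired
        // designer work as hard, relative to full strength, as the impaired
        // end-user would on a standard wheel. The floor avoids extreme torques.
        return nominalTorqueNm / Math.Max(0.1f, userStrengthRatio);
    }
}
```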

2.2.2.9 Device design input

The design of the haptic steering wheel and of the software controlling the device will require quantitative knowledge of the force impairments affecting the arms of elderly users, as well as of the most common force limitations resulting from motor impairments. These data will be extracted from the data reported in D1.3 and from further analysis.

2.2.2.10 Data exchange with other components

The tool will provide the rotation angle of the steering wheel to the ISP, in order to update and control the VR model of the car.

The tool will require:

Data related to the end-user and to the simulated car, for initializing the model;

Real-time speed and trajectory data, in order to compute the force that has to be provided to the user.

2.3 Cognitive interaction tool

The idea of the cognitive simulation tools is to develop a system or application that allows designers to experience a range of cognitive disabilities. A simulation of cognitive impairment through the system should support designers in better understanding people with certain limitations in their cognitive abilities. Even a basic awareness of the disability will help them to intuitively evaluate the accessibility of designed products and services.

People with cognitive impairments are a specific impairment sub-group of the


VERITAS target users with disabilities. The tools encompass first and foremost
limitations in information processing. This sub-group and its specific abilities are
described in detail in the context of A1.1.1: End-user grouping.

The Cognitive Interaction Tool will address working memory limitations (e.g., in older people and in patients with Parkinson's disease (PD) or Alzheimer's disease (AD)). Depending on the disability to be experienced, the tool will artificially impose cognitive load on the designer to simulate a cognitive restriction. The tool will further allow pre-selecting and adjusting the degree of cognitive load imposed on the designer, depending on the extent of the respective target disability.

The cognitive disabilities and users that will be addressed in the scope of the
cognitive impairment interaction tool are described in more detail in D1.4.1.

In the scope of the VERITAS use cases, as described so far in the DoW, cognitive impairments are mentioned in particular within the automotive application area (here related to reaction time) and concerning the design of user interfaces in the areas of infotainment, office workspace and personal healthcare systems. VERITAS allows the simulation of end-user groups and their disabilities. This in turn allows designers to discover errors early and automatically, and to incorporate the gained knowledge about problems and room for improvement in the respective application areas.

The usage of VERITAS for later tests of developments through the VERITAS simulation platform with a set of virtual users (here in particular through the application of the cognitive user models) will ideally cover a wide range of the target end-user group. Thus, low- and high-fidelity prototypes of accessible, high-quality applications, and all the issues addressed and identified by VERITAS with regard to the specific cognitively impaired user group, can be tested by developers in a time- and cost-efficient manner.

Considering all the information provided through the VERITAS cognitive user models, use cases and application scenarios, the cognitive simulation tool developed here should focus on the simulation and experience of memory limitations, as these explain most of the characteristics common to many cognitive disabilities. Memory limitations encompass a wide range of the attributes pointed out above, such as slow reaction times, difficulties in sustaining attention and orientation issues, as well as (at a higher level) problems in decision-making processes. This is particularly relevant for the application areas automotive (car interior design), domotics, collaborative work and personal healthcare.

Simulations of working memory limitations can be realised, for instance, through an increase of cognitive load on the working memory (WM) of the designer (Baddeley & Hitch, 1974). One very prominent approach in this regard is the n-back memory task (Jaeggi et al., 2008; Paas, Renkl & Sweller, 2003; Perrig, Hollenstein & Oelhafen, 2009), where people have to fulfil a specific task while evaluating, for instance, a certain ICT or non-ICT product, system or physical environment. The n-back memory task will therefore be the functional foundation for a first development of a VERITAS cognitive simulation tool prototype.

2.3.1 Conceptual specification of the simulation tool: The n-back task

The n-back task is widely used as a way of placing a continuous demand on working memory in neuroimaging and behavioural dual-task experiments (for reviews, see Owen et al., 2005; Kane et al., 2007; Jaeggi et al., 2010). This task requires online monitoring, updating of WM and rule-governed as opposed to familiarity-based decisions, and therefore places a considerable load on the executive component of WM (e.g., Owen et al., 2005). Participants monitor a series of visually presented digits or letters and decide if each item repeats the one presented n back in the sequence. Participants may be asked to press a response button when the current stimulus is an n-back repeat or, alternatively, may respond to each stimulus in turn by pressing one of two keys for yes/no. A key advantage of this method is that the demands on executive WM can be easily adjusted by changing the size of n (i.e., by comparing 0-, 1-, 2- or 3-back). In addition, researchers often contrast n-back tasks involving different materials and modalities, for example verbal information (i.e., visually presented digits or letters) and spatial locations (e.g., Smith et al., 1996; Nystrom et al., 2000).

Dual-task studies frequently use n-back tasks to examine the influence of divided attention on a primary task (e.g., Baddeley, Hitch & Allen, 2009; McKinnon & Moscovitch, 2007). Researchers have also compared verbal and spatial n-back to determine the involvement of the verbal and visuo-spatial components of WM (e.g., Baddeley et al., 2009). However, n-back tasks typically involve visual rather than auditory presentation (even for verbal materials) and, to facilitate data analysis, almost invariably involve button presses as opposed to spoken responses. This prevents the n-back paradigm from being used in combination with primary tasks that involve vision and action (in VERITAS, for instance, the evaluation of user interfaces or simulations of physical environments). This is a challenge that needs to be addressed through the cognitive interaction tool.


This dual-task method, as it imposes cognitive load on the designer, is cognitively demanding and may simulate the attentional deficits of patients with cognitive impairment (e.g., patients with dementia or stroke; cf. Corbett et al., 2010; Giovannetti et al., 2006; 2008; Schwartz et al., 1998; Duncan et al., 1996; Giovannetti et al., 2007; Morady & Humphreys, 2009), providing designers with insights into the difficulties that people with cognitive limitations face in tasks of daily living. As this is indeed a novel approach, it needs testing with actual users (here designers and end beneficiaries) to see to what extent the simulation enables the experience of cognitive impairment, and for which of the VERITAS user groups (e.g. patients with PD or AD, or older people). In addition, this might provide useful parameters not only to adjust and fine-tune the cognitive simulation tools, but also to be considered for, and included in, the OSP as well.

The cognitive simulation tool developed here will be an automated version of n-back, involving auditory presentation and spoken responses, allowing designers to explore the impact of divided attention through imposed cognitive load for several minutes. The experiencing designer listens to a series of digits and attempts to repeat them with a lag of 1, 2 or 3 back. For example, on hearing the sequence 5, 2, 8, 9, 1 a designer performing 2-back would stay silent for the first two digits, then say "5" after hearing "8", "2" after hearing "9", and so on. The software system will be trained to recognise designers' spoken responses and will determine if each spoken digit is correct or incorrect, avoiding the need for time-consuming manual coding (which is currently the case with existing software).
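The scoring logic just described can be condensed into the following C# sketch, which compares each recognised spoken digit with the digit presented n positions earlier; timing tolerances and result logging are omitted, and the class is an illustration rather than the actual implementation.

```csharp
using System;
using System.Collections.Generic;

class NBackScorer
{
    readonly List<int> presented = new List<int>();   // digits played so far
    readonly int n;                                   // lag: 1, 2 or 3

    public NBackScorer(int n) { this.n = n; }

    public void OnDigitPresented(int digit) => presented.Add(digit);

    // Called when the recognizer reports a spoken digit; returns its correctness.
    public bool OnDigitSpoken(int spokenDigit)
    {
        int target = presented.Count - 1 - n;         // index of the digit to repeat
        bool correct = target >= 0 && presented[target] == spokenDigit;
        Console.WriteLine($"{DateTime.Now:HH:mm:ss.fff} spoken={spokenDigit} correct={correct}");
        return correct;
    }
}
```

With the 2-back example above, after the third digit (8) has been presented, the expected spoken response is the first digit (5), which is exactly what the index arithmetic yields.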

2.3.2 Technical specification and integration of the cognitive simulation tool
The cognitive simulation tool will utilise SAPI-compliant speech recognizers, such as the one bundled with Microsoft Windows XP, Vista and Windows 7. The application will be written in the C# programming language on the Microsoft .NET Framework.

The experiencing designers have to wear a wireless headset with headphones and a boom microphone to facilitate and allow free movement (e.g., the Logitech ClearChat model shown in Figure 15). They hear a sequence of synthesised digits through the headphones. This sequence may be of fixed length and read from a file, or randomly generated and continuous until terminated, depending on the degree of cognitive load to be simulated. The designer's microphone picks up what they say, and speech recognition software records each digit spoken with a time stamp.


Figure 15 Logitech ClearChat wireless USB headset with microphone.

The software used for recognizing designers' spoken numbers will use a fixed set of single-token Speech Recognition Grammars³, one for each of the numerals "one" to "nine". The speech recognizer will be configured to perform recognition only against these grammars, preventing it from trying to match any other possible speech. This should make the software application more robust against recognition errors. Time boundaries are set to determine whether the spoken digit occurred within the parameters of the n-back task being performed.
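A minimal sketch of this set-up with the SAPI-compliant System.Speech API is given below; the word list, the console time stamping and the overall structure are illustrative simplifications of the planned application, not its final form.

```csharp
using System;
using System.Speech.Recognition;

class DigitRecognizer
{
    static void Main()
    {
        var recognizer = new SpeechRecognitionEngine();
        // One small closed grammar keeps the search space tiny and makes the
        // recognizer reject out-of-vocabulary speech instead of mismatching it.
        var digits = new Choices("one", "two", "three", "four", "five",
                                 "six", "seven", "eight", "nine");
        recognizer.LoadGrammar(new Grammar(new GrammarBuilder(digits)));
        recognizer.SetInputToDefaultAudioDevice();
        recognizer.SpeechRecognized += (s, e) =>
            Console.WriteLine($"{DateTime.Now:O}\t{e.Result.Text}");   // time-stamped log line
        recognizer.RecognizeAsync(RecognizeMode.Multiple);             // keep listening
        Console.ReadLine();                                            // until Enter is pressed
    }
}
```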

The user interface of the software will offer designers a menu and toolbar with controls, allowing them to start and stop the n-back process and to configure the degree of cognitive load to be imposed. The results of the simulation session can be stored for later analysis as a Comma-Separated Values (CSV) text file, a format widely supported by analysis software tools.

The cognitive interaction tool will be a software application that can be installed
and run on the common operating systems Windows XP, Vista and Windows 7,
and as such can be easily integrated with the other VERITAS interaction tools
(IIT) of the VERITAS platform.

Specifications are summarized in Table A6 in Annex A.

³ see http://en.wikipedia.org/wiki/Speech_Recognition_Grammar_Specification


2.4 Behavioural & Psychological interaction tools

Behavioural & Psychological ITs will be devoted to letting the designer experience the expected psychological reactions of the VERITAS end-user while performing, via the VERITAS ISP, the tasks considered in the VERITAS scenarios.

Stress, fatigue, motivation and emotions are closely related to cognition and influence cognitive performance. Therefore, the behavioural and psychological interaction tools should be closely related to the cognitive interaction tools developed within A2.7.3.

Cognitive load in a dual-tasking setting (see section 2.3 on cognitive interaction tools above; also see Ritter et al., 2007) can induce mental stress. Mental arithmetic is a proven method of stress induction (Pruessner et al., 1999; Dedovic et al., 2005; Seraganian, 1997). Prior to performing the VERITAS tasks in the immersive environment, stress induction (pre-stressing the user) can be realized with a GUI, e.g. based on the Montreal Imaging Stress Task (MIST, Dedovic et al., 2005; see below). During the VERITAS tasks in the immersive environment, the presentation of the arithmetic task can be either auditory, using headphones, or visual, via superimposing the text of the arithmetic task over the graphic output generated by the immersive platform. For receiving the answer from the user, speech input via microphone can be used (verbal mental arithmetic, Seraganian, 1997).

The emotional state of the user can be manipulated via audiovisual stimuli: videos (Gross & Levenson, 1995; Kreibig et al., 2007), pictures (e.g. IAPS images, http://csea.phhp.ufl.edu/media.html), music (Bishop et al., 2009; Schmidt & Trainor, 2001; Krumhansl, 1997) or affective digitized sounds (IADS, http://csea.phhp.ufl.edu/media.html). Visual stimuli (videos, pictures) can be used to elicit emotions in the user prior to performing the VERITAS tasks in the immersive environment, while auditory stimulation (music) can also be used in the immersive environment while performing the VERITAS tasks.

However, directly causing mental fatigue in the users of the cognitive interaction tools would require a long exposure to cognitive tasks and might not be a practical solution. Instead, the mood changes associated with fatigue can be simulated by inducing emotions with audiovisual stimuli.

The same applies to motivation: it does not seem feasible to directly manipulate the user's motivation. However, motivation is influenced by the emotional state (Raghunathan & Pham, 1999), which, in turn, can be induced in the way mentioned above.


In view of the above, the behavioural and psychological interaction tools should mainly provide means for inducing mental stress and for manipulating the emotional state of the user (emotion elicitation).

Additionally, the behavioural and psychological interaction tools could provide means to monitor the level of stress, emotions or fatigue via physiological sensors (Kreibig, 2010; Healey & Picard, 2005; Picard et al., 2001; Choi et al., 2006; Liying & Haoxiang, 2008; Liu et al., 2002; Eriksson & Papanikolopoulos, 1997; Ruiz et al., 2009; Yang et al., 2010; Kreibig et al., 2007; Jones & Jonsson, 2007; Kostoulas et al., 2008) and computational models based on the affective computing approach (Picard, 1995; see also http://affect.media.mit.edu/index.php), which were considered in detail in ID1.5.1 "Report on the study of the state of the art of Behavioural and psychological models".

The following cognitive aspects of stress could be of importance (Ritter et al., 2007) for designers of user interfaces:

- Perceptual tunnelling (Wickens et al., 1998; Wickens & Hollands, 2000), which manifests itself in a constriction of the effective visual perceptual field, in which items in the periphery are less attended. The focus is usually on the perceived stressor, and other items (e.g. at the periphery of the visual field) are less available to cognitive processing.

- Cognitive tunnelling (Wickens et al., 1998; Wickens & Hollands, 2000), in which only a limited number of possibilities is considered by central cognition.

The capacity of working memory is impaired under stress (Wickens et al., 1998), as it seems to be less available for storing and rehearsing information, and less useful for attention-demanding tasks. Wickens et al. (1998) also state that long-term memory seems to be little affected. The effect of stress on cognition also manifests itself as decreased attention (Hancock, 1986; Wickens et al., 1998).

2.5 Specification of the tool for Stress Induction

2.5.1 Prior to performing the VERITAS tasks: using GUI

The Trier Mental Challenge Test (Pruessner et al., 1999) and its modification, the Montreal Imaging Stress Task (MIST, Dedovic et al., 2005), can be used as a basis for creating a tool for stress induction prior to performing the VERITAS tasks. The description of the stress induction tool below is adapted from (Dedovic et al., 2005).

The principal component of the MIST is a computer program that displays a mental arithmetic task, a rotary dial for submission of a response, a text field that provides feedback on the submitted response ("correct", "incorrect" or "timeout") and 2 performance indicators, one for the individual subject's performance and one for the average performance of all users (Figure 16).

Figure 16 Graphical user interface of the Montreal Imaging Stress Task (MIST, Dedovic et al., 2005). From top to bottom, the figure shows the performance indicators (top arrow = average performance, bottom arrow = individual subject's performance), the mental arithmetic task, the progress bar reflecting the imposed time limit, the text field for feedback, and the rotary dial for the response submission.

In the stress induction condition, a time limit is enforced for each task; the elapsed time is displayed by a progress bar moving from left to right on the computer screen, with the exact time allowed for each task depending on the user's previous performance. The program can be written in the C# programming language on the Microsoft .NET Framework for Microsoft Windows.

The basic algorithm of the program should create mental arithmetic tasks using up to 4 numbers ranging from 0 to 99 and up to 4 operands (+ for addition, − for subtraction, * for multiplication and / for division). The algorithm should create tasks for which the solution is an integer between 0 and 9, such that a single keystroke is needed for the response.

The difficulty level can be adjusted over 5 different categories. In the 2 easiest categories, only tasks with 2 or 3 one-digit integers are created, and the operands are limited to + or − (example: 2 + 9 − 7). In the medium-difficulty categories, tasks with up to 4 integers are created, with up to 2 of these integers in the 2-digit range, and the * operand is also allowed (example: 3 * 12 − 29). Finally, in the fifth and most difficult category, tasks with 4 integers are created, the * and / operands are used, and all numbers may be in the 2-digit range (example: 12 * 12 / 8 − 9).
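As an illustration of the generation logic for the easiest categories, the C# sketch below builds tasks from 2 or 3 one-digit integers with + and − and uses rejection sampling to enforce a single-digit, non-negative integer solution; extending it to the harder categories would add the * and / operators and two-digit operands. This is a sketch of the described algorithm, not the MIST source code.

```csharp
using System;

class MistTaskGenerator
{
    static readonly Random Rng = new Random();

    // termCount: 2 or 3 for the easiest categories.
    public static (string Text, int Solution) NextTask(int termCount)
    {
        while (true)                          // retry until the result lands in 0..9
        {
            int result = Rng.Next(1, 10);     // first one-digit term
            string text = result.ToString();
            for (int i = 1; i < termCount; i++)
            {
                int operand = Rng.Next(1, 10);
                bool add = Rng.Next(2) == 0;  // choose + or - at random
                result = add ? result + operand : result - operand;
                text += (add ? " + " : " - ") + operand;
            }
            if (result >= 0 && result <= 9)
                return (text, result);        // e.g. ("2 + 9 - 7", 4)
        }
    }
}
```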

The user selects a number on the rotary dial either by pressing the left or right arrow keys on the keyboard or by pressing the left or right mouse buttons. Pressing the left arrow key or the left mouse button moves the highlighted number on the rotary dial of the program's user interface counter-clockwise, whereas pressing the right arrow key or the right mouse button moves it clockwise (see Figure 16). Pressing the down arrow key or the middle mouse button submits the highlighted number on the rotary dial as the subject's response to the arithmetic task. This response is then compared with the correct answer for the task, and the appropriate feedback ("correct" or "incorrect") is presented in the feedback field on the computer screen. If no response is recorded within the time limit, the response "timeout" is displayed.

In the training session, the user's ability to perform mental arithmetic is assessed by recording the average time needed to solve problems at the various difficulty levels. For this purpose, no time limit is enforced, and no time progress bar is shown on the screen. In addition, no performance indicators (for the subject's own performance or the average performance of all subjects) are displayed. However, the recorded time is used to set a default time limit for the experimental condition. The training session must be at least 2 minutes long for the program to determine the average time the subject takes to perform mental arithmetic in the different categories; usually, the subject is given 5 minutes to practise with the program before the actual session.

During each session of stress induction, the program is set to a time limit that is 10% less than the user's average response time; this approach induces a high failure rate. In addition, the program continuously records the subject's average response time and the number of correct responses. If the user answers a series of 3 consecutive mental arithmetic tasks correctly, the program reduces the time limit to 10% less than the average time for the 3 correctly solved tasks. Conversely, if the user answers a series of 3 consecutive tasks incorrectly, the program increases the time limit for the following tasks by 10%. In this way, a range of about 20% to 45% correct answers is enforced.
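The adaptive rule can be captured in a few lines; the C# sketch below tracks the two streaks and the last three correct response times exactly as described above. Field names and the streak bookkeeping are illustrative.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class AdaptiveTimeLimit
{
    public double TimeLimitSeconds { get; private set; }
    readonly Queue<double> lastCorrectTimes = new Queue<double>();
    int correctStreak, errorStreak;

    public AdaptiveTimeLimit(double trainingAverageSeconds) =>
        TimeLimitSeconds = trainingAverageSeconds * 0.9;     // 10% below the training average

    public void OnAnswer(bool correct, double responseTimeSeconds)
    {
        if (correct)
        {
            errorStreak = 0;
            lastCorrectTimes.Enqueue(responseTimeSeconds);
            if (lastCorrectTimes.Count > 3) lastCorrectTimes.Dequeue();
            if (++correctStreak == 3)
            {
                // Tighten: 10% below the average of the 3 correctly solved tasks.
                TimeLimitSeconds = lastCorrectTimes.Average() * 0.9;
                correctStreak = 0;
            }
        }
        else
        {
            correctStreak = 0;
            if (++errorStreak == 3)
            {
                TimeLimitSeconds *= 1.1;   // relax the limit by 10% after 3 errors
                errorStreak = 0;
            }
        }
    }
}
```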

Individual runs can last between 2 and 6 minutes, as in the original approach for fMRI or PET imaging (Dedovic et al., 2005). During the runs, the colour bar at the top of the screen shows 2 performance indicators, one for the user's own performance and one for the average performance of all users. The average-performance arrow appears in the green area, at the right of the screen, while the user's individual performance usually appears in the red area, to the left of the screen. Between the runs, the user can be informed about his or her performance, reminding him or her that the average performance is about 80%-90% correct answers. The user is then reminded that there is a required minimum performance and that his or her individual performance must be close or equal to the average performance of all users so that the simulation can be realistic.


2.5.1.1 While performing the VERITAS tasks: using voice input

While performing the VERITAS task in the immersive environment, stress can also be induced via the mental arithmetic described above, but with a different method of presentation of the arithmetic task. The arithmetic task should be given either visually, as text superimposed over the graphic output of the immersive environment, or aurally, via headphones.

In the latter case, the designers hear through the headphones a sequence of synthesised digits combined with the required arithmetic operations. The sequence can be generated by the same algorithm as in section 2.5.1. Difficulty levels and time limits should be adjusted (increased) to match the user's performance in the dual-tasking setting of the VERITAS interaction tool (especially for auditory presentation of the arithmetic task via headphones), as compared to the original single-tasking approach of (Dedovic et al., 2005). For this purpose, the software should provide dialog boxes for manipulating the respective parameters.

To reduce interference with the VERITAS task, which requires using the keyboard and/or mouse or other input devices of the VR environment, the answer to the arithmetic task should be received via microphone. The experiencing designers have to wear a wireless headset to facilitate and allow free movement. The designer's microphone picks up what they say, and speech recognition software records each digit spoken with a time stamp, as in subsection 2.3.2.

It is important that the answer should always be a digit between 0 and 9, so that the same speech recognition functions can be used as in subsection 2.3.2.

Information about the user's performance and timeouts should be presented as voice messages (e.g. "Error!", "Timeout!").

2.5.2 Specification of the tool for Emotion Elicitation


2.5.2.1 Prior to performing the VERITAS tasks: using visual stimuli
Gross & Levenson (1995) elicited amusement, anger, contentment, disgust,
fear, sadness, and surprise using film clips of varying lengths between 9 sec
and 8 min.

Further examples of how video can be used for the elicitation of negative emotions can be found, e.g., in (Kreibig et al., 2007), where fear and sadness were induced using two different film clips for each of the two emotions. The differences in emotions were reliably detected using multichannel physiological recordings and discriminant analysis. The clips were about 10-12 minutes long. Exact details on the films, and on which scenes from these films were included in the clips for emotion elicitation, can be found in (Kreibig, 2004).

The tool will include a repository of multiple film clips. Each time the interaction
tool is started, a random clip should be presented.

Alternatively, static images from the International Affective Picture System (IAPS) can be used for emotion elicitation. The IAPS is being developed to provide a set of normative emotional stimuli for experimental investigations of emotion and attention. The goal is to develop a large set of standardized, emotionally evocative, internationally accessible colour photographs that includes contents across a wide range of semantic categories. The IAPS (pronounced "eye-aps") is being developed and distributed by the NIMH Center for Emotion and Attention (CSEA) at the University of Florida.

2.5.2.2 While performing the VERITAS tasks: using auditory stimuli

Music has been demonstrated to elicit emotions that can be detected via physiological measurements (Bishop et al., 2009; Schmidt & Trainor, 2001; Krumhansl, 1997) and choice reaction time (CRT) (Bishop et al., 2009).

In (Krumhansl, 1997), the stimulus materials consisted of six excerpts of approximately three minutes in duration from the beginning of the following pieces: 1) Gustav Holst, "Mars, the Bringer of War" from The Planets, Orchestre National de France/Lorin Maazel; 2) Antonio Vivaldi, "La Primavera" (Spring) from The Four Seasons, Catherine Mackintosh/King's Consort/Robert King; 3) Tomaso Albinoni, Adagio in G minor for Strings and Orchestra, Berlin Philharmonic/Herbert von Karajan; 4) Modest Mussorgsky, Night on Bare Mountain, Philadelphia Orchestra/Eugene Ormandy; 5) Samuel Barber, Adagio for Strings, Op. 11, Los Angeles Philharmonic Orchestra/Leonard Bernstein; 6) Hugo Alfvén, Midsommarvaka, Swedish Radio Symphony Orchestra/Esa-Pekka Salonen. The excerpts by Albinoni and Barber were chosen to represent sadness, those by Holst and Mussorgsky were chosen to represent fear, and those by Vivaldi and Alfvén were chosen to represent happiness.

In (Schmidt & Trainor, 2001), the musical stimuli comprised four orchestral excerpts reflecting different affective valence (i.e., pleasant vs. unpleasant) and intensity (i.e., intense vs. calm): intense-unpleasant emotion (e.g., fear), Peter and the Wolf by Prokofiev, wolf excerpt; intense-pleasant (e.g., joy), Brandenburg Concerto No. 5 by Bach, first movement; calm-pleasant emotion (e.g., happiness), "Spring" by Vivaldi, second movement; and calm-unpleasant emotion (e.g., sadness), Adagio by Barber.

As with videos, there should ideally be a repository of multiple music fragments. Each time the interaction tool is started, a random fragment should be presented.

Alternatively, the International Affective Digitized Sounds (IADS, http://csea.phhp.ufl.edu/media.html) can be used. The IADS system provides a set of acoustic emotional stimuli for experimental investigations of emotion and attention. This set of standardized, emotionally evocative, internationally accessible sound stimuli includes contents across a wide range of semantic categories. The IADS (pronounced "eye-ads") is being developed and distributed by the NIMH Center for Emotion and Attention (CSEA) at the University of Florida.


3 Architecture of the Interaction Tools

3.1 Overall considerations

The interaction tools (IT) developed in WP2.7 aim at simulating, or intuitively communicating, a specified disability or impairment to the designer during an immersive experience. The IT architecture must be carefully designed, since integration with the Immersive Simulation Platform is required. To this purpose a software module, the Interaction Manager (IM), will be developed in order to act as the only interface with the ISP.

The interaction tools are widely heterogeneous, as they address deeply different issues using peculiar modalities. However, there are two main modules which address general issues and are therefore shared among the different tools:

The Parameter Encoder (PE) module, which is in charge of receiving the selected User Model data and translating them into parameters able to tune the interaction tools so that they assume a specific behaviour;

The Logger module, which is in charge of creating text logs reporting relevant performance/result data produced during the immersive experience (some of these data, like timestamps, will be generated for every tool; other data will be related only to a specific tool). A minimal logging sketch is given below.
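As an indication of how lightweight this shared module can be, the following C# sketch logs time-stamped CSV rows to a per-session file; the file naming and field layout are illustrative assumptions, not the specified VERITAS log format.

```csharp
using System;
using System.IO;

class InteractionLogger : IDisposable
{
    readonly StreamWriter writer;

    public InteractionLogger(string toolName)
    {
        // One CSV file per immersive session, named after the tool and start time.
        writer = new StreamWriter($"veritas_{toolName}_{DateTime.Now:yyyyMMdd_HHmmss}.csv");
        writer.WriteLine("timestamp,tool,event,value");   // CSV header row
    }

    public void Log(string toolName, string eventName, string value) =>
        writer.WriteLine($"{DateTime.Now:O},{toolName},{eventName},{value}");

    public void Dispose() => writer.Dispose();
}
```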

In the following, a brief description summarizing the main features of each tool is provided, together with tables listing the main input/output data exchanged and a scheme of the related architecture.


3.2 Visual Impairment IT

This tool (see Figure 17) implements a simulation of disabilities related to visual acuity impairment; the simulation is based on the generation of a vision cone, limiting the designer's field of view, whose parameters (see section 2.1.1) are determined as follows:

the opening angle of the vision cone is computed by the Parameter Encoder after receiving the User Model data;

the designer's eye position and gaze direction are continuously tracked by a specific head-eye tracking device.

Based on the tracked eye and vision data, the designer will perceive the space outside the vision cone as disturbed or completely hidden; alternatively, the corresponding vision cone can be visualized in the virtual environment as a geometrical shape (placed a short distance ahead of the designer in order to give visual feedback of the limited field of view). This tool assumes the existence of a specific, purposely developed module in the ISP (Visual Cone Reshaper), in charge of altering in real time the displayed portions of the scene according to the computed vision cone and/or of displaying the vision cone as a 3D geometrical shape.
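A minimal C# sketch of the per-point test that such a module could apply is shown below: a scene point counts as visible when the angle between the gaze direction and the eye-to-point vector stays within half of the opening angle. The signature and conventions are illustrative assumptions, not the Visual Cone Reshaper itself.

```csharp
using System.Numerics;

static class VisionCone
{
    // openingAngleDeg: full opening angle of the cone, as produced by the PE.
    public static bool IsInsideCone(Vector3 eyePosition, Vector3 gazeDirection,
                                    float openingAngleDeg, Vector3 scenePoint)
    {
        Vector3 toPoint = Vector3.Normalize(scenePoint - eyePosition);
        float cosToPoint = Vector3.Dot(Vector3.Normalize(gazeDirection), toPoint);
        float halfAngleRad = openingAngleDeg * 0.5f * MathF.PI / 180f;
        // Outside the cone the scene would be rendered disturbed or hidden.
        return cosToPoint >= MathF.Cos(halfAngleRad);
    }
}
```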

Data | I/O Attributes | Class | Type | Linked Entity
Visual cone opening angle | Input | Static | float | IM (PE)
Eye Position | Input | Dynamic | vector | IM
Gaze Direction | Input | Dynamic | vector | IM
Head orientation | Input | Dynamic | vector | IM
Visual Cone Parameters | Output | Dynamic | array | IM

Table 1 Visual Impairment Tool Data Specification


3.3 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data (according to age and disability) into the opening angle of the visual cone, and to transmit to the tool the data received from the Immersive Simulation Platform related to the eyes (position and orientation) and to the head (orientation), which are used to completely define the visual cone parameters. The IM will also transmit to the ISP the visual cone parameters dynamically computed by the IT.

Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Visual cone opening angle | Output | Static | float | IT
Eye Position | Input / Output | Dynamic | vector | ISP / IT
Gaze Direction | Input / Output | Dynamic | vector | ISP / IT
Head orientation | Input / Output | Dynamic | vector | ISP / IT
Visual Cone Parameters | Input / Output | Dynamic | array | IT / ISP

Table 2 Interaction Manager Data Specification


Figure 17 Visual Impairment Tool Architectural Scheme


3.4 Kinematic Functional Limitation IT Impl.1

This tool (see Figure 18) implements a simulation of disabilities related to limited-mobility impairment; the simulation is based on providing virtual feedback when the designer's movements are not possible due to the constraints posed on the kinematic joint angles of the body. The tool, based on the joint angle values retrieved by a motion capture system, will visually signal a warning when the measured joint angles exceed the allowed ranges. If the ISP operates a Head Mounted Display, these joint angles are used to update the pose of the avatar in accordance with these limits.

The tool assumes the existence of a specific, purposely developed module (Visual Warning Manager) in the ISP, in charge of producing an appropriate visual warning based on the received ID.

The tool produces as relevant data for logging, via the IM Logger module, the constrained joint angles and an ID of the generated warning.
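The core check can be sketched in a few lines of C#: on each motion-capture frame the joint angles are clamped to their allowed ranges, and a warning ID is emitted when a limit is hit. The array layout and the convention of using the blocked joint index as the warning ID are illustrative assumptions.

```csharp
using System;

static class JointLimitChecker
{
    // Returns the constrained angles; warningId is -1 when no limit is exceeded.
    public static double[] Constrain(double[] jointAngles,
                                     double[] minLimits, double[] maxLimits,
                                     out int warningId)
    {
        warningId = -1;
        var constrained = new double[jointAngles.Length];
        for (int j = 0; j < jointAngles.Length; j++)
        {
            constrained[j] = Math.Min(maxLimits[j], Math.Max(minLimits[j], jointAngles[j]));
            if (constrained[j] != jointAngles[j] && warningId < 0)
                warningId = j;   // here the blocked joint index doubles as the warning ID
        }
        return constrained;
    }
}
```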

Data | I/O Attributes | Class | Type | Linked Entity
Avatar body parameters | Input | Static | array | IM (PE)
Joint angles limits | Input | Static | array | IM (PE)
Joint Angles | Input | Dynamic | array | IM
Constrained joint angles | Output | Dynamic | array | IM
Warning ID | Output | Dynamic | int | IM

Table 3 Kinematic Functional Limitation IT (Impl.1) Data Specification

3.4.1 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data (according to age and disability) into the digital body parameters and the joint angle limits needed by the IT to impose movement constraints and to compose the avatar's corrected posture; the IM receives this posture from the IT and communicates it to the ISP together with the generated warning ID, which will trigger appropriate signalling in the ISP.

The Logger module logs both the warning ID and the constrained joint angles computed by the IT.

Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Avatar body parameters | Output | Static | array | IT
Joint angles limits | Output | Static | array | IT
Joint Angles | Input / Output | Dynamic | array | ISP / IT
Constrained joint angles | Input / Output | Dynamic | array | IT / LOG, ISP
Warning ID | Input / Output | Dynamic | int | IT / ISP, LOG

Table 4 Interaction Manager Data Specification


Figure 18 Kinematic Functional Limitation IT (1) Architectural Scheme


3.5 Kinematic Functional Limitation IT Impl.2

This tool (see Figure 19) implements a simulation of disabilities related to limited-mobility impairment; the simulation is based on providing vibration feedback when the designer's movements are not possible due to the constraints posed on the kinematic joint angles of the body. The tool, based on the hand position retrieved by a tracking system, generates a vibration on a bracelet worn by the designer whenever the hand position exceeds the allowed range. The intensity and the direction of the vibration can provide additional information on the amount and direction by which the movement exceeds the allowed range.

Data | I/O Attributes | Class | Type | Linked Entity
Vibration intensity | Input | Dynamic | float | IM
Vibration direction | Input | Dynamic | vector | IM

Table 5 Kinematic Functional Limitation IT (Impl.2) Data Specification

3.5.1 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data into a range of positions allowed for the hand; the Bracelet Manager module uses these data to check, in real time, whether the tracked position of the hand exceeds this range. As the admissible range is expressed in a reference system relative to the user, while the tracked hand positions are acquired in a global reference system, the position of the head is tracked as well in order to allow the translation between the two reference systems. Depending on the type of tracker made available in the ISP, the direction of the movement is either directly received from the tracker or internally computed from the positional data. This information is used to generate an appropriate intensity and direction for the bracelet vibration. The Logger module stores the retrieved positions of the bracelet and the generated warnings, if any.
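The translation between the two reference systems amounts to re-expressing the tracked hand position relative to the head pose, as in the illustrative C# sketch below; note that it assumes the head orientation is also available as a quaternion, whereas the text above only guarantees the head position, so this is an explicit assumption.

```csharp
using System.Numerics;

static class FrameTransform
{
    // handGlobal: tracked hand/bracelet position in the global frame;
    // headPosition/headOrientation: tracked head pose in the same frame.
    public static Vector3 GlobalToUserFrame(Vector3 handGlobal,
                                            Vector3 headPosition,
                                            Quaternion headOrientation)
    {
        Vector3 offset = handGlobal - headPosition;
        // Rotating by the inverse head orientation undoes the head rotation,
        // yielding coordinates in the user-relative frame.
        return Vector3.Transform(offset, Quaternion.Inverse(headOrientation));
    }
}
```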

Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Head position | Input | Dynamic | vector | ISP
Bracelet position | Input / Output | Dynamic | vector | ISP / LOG
Hand movement direction | Input | Dynamic | vector | ISP
Vibration intensity | Output | Dynamic | float | IT
Vibration direction | Output | Dynamic | vector | IT
Warning ID | Output | Dynamic | int | LOG

Table 6 Interaction Manager Data Specification

Figure 19 Motor Impairment Kinematics Tool Architectural Scheme


3.6 Dynamic Functional Limitation IT

This tool (see Figure 20) implements a simulation of disabilities related to functional limitations on the magnitude of the force needed to manipulate an object; the simulation is based on a specific tool, a force-feedback steering wheel, and therefore applies only to the automotive simulation environment. Depending on the User Model data and on specific scenario data, extra forces will be applied to the steering wheel so as to let the designer perceive an artificial "weakness" simulating the functional disability. The resulting angle of the steering wheel is acquired and communicated in order to allow a correct simulation update.

Data | I/O Attributes | Class | Type | Linked Entity
Force feedback | Input | Dynamic | float | IM
Steering angle | Output | Dynamic | float | IM

Table 7 Dynamic Functional Limitation IT Data Specification

3.6.1 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data into the corresponding amount of extra force needed by the specified user to operate the steering wheel. In order to compute the exact force feedback to be applied to the steering wheel, the IM also needs some static data (the dynamic model of the car) and the simulated car speed. The IM receives from the IT the value of the steering angle, which is communicated to the main application in order to update the scenario accordingly.

The Logger module stores the resulting steering angle and the computed force estimation.

Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Car dynamic model params | Input | Static | array | ISP (app)
Car speed | Input | Dynamic | float | ISP (app)
Force feedback | Output | Dynamic | float | IT, LOG
Steering angle | Input / Output | Dynamic | float | IT / LOG, ISP (app)

Table 8 Interaction Manager Data Specification

Figure 20 Dynamic Functional Limitation IT Architectural Scheme


3.7 Control Functional Limitation IT

This tool (see Figure 21) implements a simulation of disabilities related to functional limitations on the control of movements; the simulation is based on a haptic device which interferes with the designer's movement by introducing disturbing forces, in a selected frequency band, simulating the functional limitation.

Data | I/O Attributes | Class | Type | Linked Entity
Controlled position | Input | Dynamic | vector | IM

Table 9 Control Functional Limitation IT Data Specification

3.7.1 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data into a set of parameters related to the operational frequency band, needed to generate the disturbing forces on the haptic device. These data are used to control the position of the finger via the haptic device; the same data are sent to the ISP Avatar Manager in order to update (if needed) the avatar's hand posture.

The Logger module stores the resulting controlled finger position.

Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Finger Controlled Position | Output | Dynamic | vector | ISP, LOG, IT

Table 10 Interaction Manager Data Specification


Figure 21 Control Functional Limitation IT Architectural Scheme


3.8 Cognitive IT

This tool (see Figure 22) implements a simulation of disabilities related to cognition by imposing an additional cognitive load through the dual-task methodology. This is achieved with the implementation of an audio version of the n-back task, which interferes with the scenario task that the designer is accomplishing. Depending on the extra cognitive load to be generated (based on parameters computed from the User Model data and from the Designer Model data*), a sequence of numbers is generated and transformed into an audio message via a Text-To-Speech module. The vocal responses of the designer are acquired by a microphone and analyzed with a Speech Recognition module in order to verify the correctness of the answer. The overall n-back task performance is eventually logged at the end of the task.

* The extra cognitive load is a function not only of the model of the simulated user but also of the cognitive model of the designer; therefore such a characterization of the specific designer should be made available before using the platform. To this purpose, specific activities need to be carried out in order to provide means to perform this characterization.

Data | I/O Attributes | Class | Type | Linked Entity
Cognitive load | Input | Static | int | IM (PE)
Warning ID | Output | Dynamic | int | IM

Table 11 Cognitive IT Data Specification

3.8.1 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data into a measure of the additional cognitive load to impose on the designer for the n-back task. If an error is detected by the IT, the IM receives the ID of the warning, which is transmitted to the ISP in order to trigger appropriate audio/visual signalling.

The Logger module stores the generated warning ID.


Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Designer Model data | Input | Static | array | OSP
Cognitive load | Output | Static | int | IT
Warning ID | Input / Output | Dynamic | int | IT / LOG

Table 12 Interaction Manager Data Specification

Figure 22 Cognitive IT Architectural Scheme


3.9 Behavioural Stress Induction IT

This tool (see Figure 23) implements a simulation of behavioural and psychological states which can be reproduced by inducing an appropriate level of stress. This is achieved with the implementation of an audio dual task, similar to that described in section 3.8, which interferes with the scenario task that the designer is accomplishing. Depending on a certain level of difficulty (based on parameters computed from the User Model data), a mathematical formula is generated and transformed into an audio message via a Text-To-Speech module. The vocal responses of the designer are acquired by a microphone and analyzed with a Speech Recognition module in order to verify the correctness of the mathematical performance. If a mistake is recognized, an appropriate audio-visual warning is generated on the ISP. If made possible by the ISP, an optional visual variant consists in visualizing the formula superimposed on the Virtual Environment.

The tool assumes the existence of a purposely developed Visual Warning Manager module in the ISP, as already specified in 3.4, in order to visualize a warning and/or to visualize the formula in the alternative visual version. Optional audio warnings can be provided via the ISP audio manager.

The required level of stress can also be induced by making the designer use this tool offline (see Figure 24), prior to the immersive experience. In this case the tool is self-standing and a typical desktop setup is used; the interaction takes place with standard tools, i.e. a visual GUI operated via mouse, keyboard and monitor. The IM is used only to encode parameters and to log relevant data.

Data | I/O Attributes | Class | Type | Linked Entity
Level of difficulty | Input | Static | int | IM (PE)
Warning ID | Output | Dynamic | int | IM
Formula | Output | Dynamic | string | IM

Table 13 Behavioural Stress Induction IT Data Specification

3.9.1 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data into a level of difficulty for the arithmetic dual task to impose on the designer. If an error is detected by the IT, the IM receives the ID of the warning, which is transmitted to the ISP in order to trigger appropriate audio/visual signalling.

The Logger module stores the generated formulas, the results and the possible warning IDs.

Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Level of difficulty | Output | Static | int | IT
Warning ID | Input / Output | Dynamic | int | IT / ISP
Formula | Input / Output | Dynamic | string | IT / ISP

Table 14 Interaction Manager Data Specification

Figure 23 Behavioural Stress Induction IT Architectural Scheme


Figure 24 Behavioural Stress Induction Offline IT Architectural Scheme


3.10 Behavioural Emotion Elicitation IT

This tool (see Figure 25) implements a simulation of behavioural and psychological states which can be reproduced by eliciting an appropriate emotion. This is achieved by reproducing particular music fragments, selected based on the type of emotion to be induced. The music can be reproduced either through a headset or by the ISP audio manager.

In the second case, the tool assumes the existence of a database of music fragments, accessible by the ISP Audio Manager, whose playback can be triggered upon reception of the ID of the fragment.

The required emotion can also be elicited by making the designer use this tool offline (see Figure 26), prior to the immersive experience. In this case the tool is self-standing and a typical desktop setup is used; the particular emotion is induced by displaying selected videos or images. The IM is used only to encode parameters.

Data | I/O Attributes | Class | Type | Linked Entity
Emotion ID | Input | Dynamic | int | IM (PE)
Music ID | Output | Dynamic | int | IM

Table 15 Behavioural Emotion Elicitation IT Data Specification

3.10.1 Interaction Manager

The role of the IM for this tool is to translate, via the Parameter Encoder, the User Model data into an identifier specifying the type of emotion to be induced. The resulting identifier of the music fragment, received from the IT, is sent to the ISP in order to trigger the playback of the fragment on the ISP audio manager.

Data | I/O Attributes | Class | Type | Linked Entity
User Model data | Input | Static | array | OSP
Emotion ID | Output | Dynamic | int | IT
Music ID | Input / Output | Dynamic | int | IT / ISP

Table 16 Interaction Manager Data Specification

Figure 25 Behavioural Emotion Elicitation IT Architectural Scheme


Figure 26 Behavioural Emotion Elicitation Offline IT Architectural Scheme


3.11 Integrated Interaction Tool

Given the high heterogeneity of the Interaction Tools, it is not feasible to realize an actual physical integration of the various tools, also because they are very likely to operate separately from each other. Nevertheless, the Interaction Tools can be thought of as logically integrated through the Interaction Manager, which serves as an abstraction layer enabling the communication between the Interaction Tools and the Immersive Simulation Platform. In the following, a complete list of the data exchanged between the Interaction Manager, the Interaction Tools and the Immersive Simulation Platform is provided, together with a summary scheme describing the overall logical architecture.

Data                                    | I/O            | Attributes | Class Type | Linked Entity
User Model data                         | Input          | Static     | array      | OSP
Designer Model data                     | Input          | Static     | array      | OSP
Head position (3)                       | Input          | Dynamic    | vector     | ISP
Hand movement direction (if available)  | Input          | Dynamic    | vector     | ISP
Car dynamic model data                  | Input          | Static     | array      | ISP (app)
Car speed                               | Input          | Dynamic    | float      | ISP (app)
Eye position                            | Input / Output | Dynamic    | vector     | ISP / IIT
Gaze direction (3)                      | Input / Output | Dynamic    | vector     | ISP / IIT
Head orientation (3)                    | Input / Output | Dynamic    | vector     | ISP / IIT
Visual cone parameters                  | Input / Output | Dynamic    | array      | IIT / ISP
Joint angles (3)                        | Input / Output | Dynamic    | array      | ISP / IIT
Constrained joint angles (2)            | Input / Output | Dynamic    | array      | IT / LOG, ISP
Warning ID (2)                          | Input / Output | Dynamic    | int        | IIT / ISP, LOG
Bracelet position                       | Input / Output | Dynamic    | vector     | ISP / LOG
Steering angle (2)                      | Input / Output | Dynamic    | float      | IT / LOG, ISP (app)
Formula (optional) (2)                  | Input / Output | Dynamic    | string     | IIT / ISP
Music ID (2)                            | Input / Output | Dynamic    | int        | IIT / ISP
Visual cone opening angle (1)           | Output         | Static     | float      | IIT
Avatar body parameters (1)              | Output         | Static     | array      | IIT
Joint angles limits (1)                 | Output         | Static     | array      | IIT
Vibration intensity                     | Output         | Dynamic    | float      | IIT
Vibration direction                     | Output         | Dynamic    | vector     | IIT
Force feedback                          | Output         | Dynamic    | float      | IIT
Finger controlled position              | Output         | Dynamic    | vector     | ISP, LOG, IIT
Cognitive load (1)                      | Output         | Static     | int        | IIT
Degree of scrambling (1)                | Output         | Static     | int        | IIT
Level of difficulty (1)                 | Output         | Static     | int        | IIT
Emotion ID (1)                          | Output         | Dynamic    | int        | IIT

Table 17 Interaction Manager Data Specification

(1) These data are gathered in the following scheme under the name Encoded Parameters.

(2) These data, when exiting from the IIT, are gathered in the following scheme under the name Computed Data.

(3) These data, transmitted by the ISP to the IIT via the IM, are gathered in the following scheme under the name Bridged Data.

(*) All the output data linked to the LOG entity are gathered in the following scheme under the name Logged Data.
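The following Python sketch illustrates one way such an abstraction layer can be organised: a publish/subscribe router through which the ITs and the ISP exchange the data items listed in Table 17 without direct knowledge of each other. The interface is a hypothetical illustration, not the VERITAS implementation.

from typing import Callable, Dict, List

class InteractionManager:
    """Routes named data items between the ISP and the registered ITs."""

    def __init__(self):
        self._routes: Dict[str, List[Callable]] = {}

    def subscribe(self, data_name, handler):
        # An IT or the ISP registers interest in a data item from Table 17
        self._routes.setdefault(data_name, []).append(handler)

    def publish(self, data_name, value):
        # The sender pushes a value; the IM forwards it to every consumer
        for handler in self._routes.get(data_name, []):
            handler(value)

# Example wiring: the ISP publishes the head orientation (a 'Bridged' item)
# and an IT consumes it; warning IDs would flow in the opposite direction.
im = InteractionManager()
im.subscribe("Head orientation", lambda v: print("IT received", v))
im.publish("Head orientation", (0.0, 15.0, 0.0))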

Figure 27 Integrated Interaction Tool Architectural Scheme

4 Conclusions
This document provides an overview of the software architecture and the specifications of the VERITAS Interaction Tools for Designer Experience that will be developed within WP 2.7.

The VERITAS Interaction Tools are interfaces developed to enhance the designer's experience to the point that the designer is able to feel, or intuitively understand, the user's disabilities. The main underlying concept is that the novel interaction tools shall provide either sensorial stimulation altered according to the disability to be simulated, or other kinds of information able to reproduce a stress equivalent to that experienced by disabled users.

The first section of this document is dedicated to the description of the general approach that has been adopted for the simulation of certain disabilities and to the definition of the basic functionalities of the different tools. The concepts of simulation through Direct Experience and through Augmented Experience are explained, and a brief overview of the different types of disabilities is given through a categorization into Physical Impairments, Cognitive Impairments, and Behavioural and Psychological Impairments.

In the second section, the details related to each novel interaction tool are provided, and the work done for the definition of their specifications and functionalities is reported.

Through an analysis of the requirements of each interaction tool, the overall software architecture has been defined; it is reported in the third section of this document. The relations between the ITs and the ISP through the Interaction Manager have been defined. The different levels of integration required by the various interaction tools lead to a very heterogeneous level of interaction between the tools and the ISP.

A summary table of the different tools and their specifications is also provided in Annex A of this document.


Annex A Overview of the Interaction Tools


Physical Interaction Tools

Visual Impairment IT

Short description: The tool focuses on the simulation of visual impairment and provides the designer with a means to perceive how a beneficiary suffering from visual impairment may perceive the environment. The tool is developed to be implemented inside an immersive Virtual Environment (VE) providing stereoscopic vision and eye tracking. The designer immersed in the VE will be able to see as a visually impaired person: the 3D images will be filtered in order to simulate the visual field limitations that affect visually impaired people. The tool will be developed as an integrated module of the Immersive Simulation Platform (ISP). The tool will also provide a 3rd-person mode allowing the designer to look at his avatar in the VE from a 3rd-person point of view, in order to better analyze the scenario.

Software / Hardware:
- Software: integrated software module within the ISP
- Hardware: head and eye tracker integrated in the ISP

Hardware Specifications: Eye and head tracker specifications must comply with the specifications for immersive environment applications.

Basic Functionalities:
- To track eye position and orientation
- To superimpose filters on the images rendered by the ISP
- To provide a 3rd-person visualization for observation from a different perspective

Operating System: According to the ISP operating system.

Batch mode: No batch mode; the tool runs only within the ISP.

Input & Output:
- Input: real-time head position and eye orientation, refresh rate 20-80 Hz
- Output: scene visualization handled by the ISP

Ability to access input & output interfaces: High-level interaction with the ISP; direct access to interfaces is not foreseen.

Parameterizing for platform: Based on the visual impairment analysis provided in D1.3.1.

User Interface: Immersive natural interaction in the VE.

Usage protocol:
1. Selection of user type and type of disabilities.
2. Performing the task in the VE experiencing the user's visual disability.
3. Optional analysis of the log file.

Table A 1 - Visual Impairment IT
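As an illustration of the filtering principle, the sketch below masks a rendered frame outside a circular region centred on the tracked gaze point, a simple tunnel-vision approximation. It uses NumPy on a plain image array; the actual filter runs inside the ISP rendering pipeline and its shape comes from the D1.3.1 models.

import numpy as np

def tunnel_vision_mask(h, w, gaze_xy, radius_px):
    """Binary mask: 1 inside the preserved visual field, 0 outside."""
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
    return (dist <= radius_px).astype(np.float32)

def apply_filter(frame, gaze_xy, radius_px):
    """frame: (h, w, 3) float image from the renderer; returns the filtered copy."""
    mask = tunnel_vision_mask(frame.shape[0], frame.shape[1], gaze_xy, radius_px)
    return frame * mask[..., None]     # black outside the simulated visual field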


Physical Interaction Tools


Kinematic Functional Limitation IT Impl.1

Short description: The tool is the first of the two tools that focus on the simulation of kinematic functional limitations and provides the designer with a means to intuitively perceive the limitations in the articular motion of the impaired user. The tool is developed to be implemented inside an immersive Virtual Environment (VE). The movements of the designer's body will be tracked by a motion tracking system, and the designer will receive on-screen warnings if he is performing movements that are not compatible with the disability of the selected user. The tool will be developed as an integrated module of the Immersive Simulation Platform (ISP). The avatar inside the VE will be directly under the control of the designer. A specific software module will constantly keep track of the joint positions of the designer's body and compare them to the joint rotation limits of a specific beneficiary. When a limit is reached, a visual warning is raised indicating on the avatar the constraint that has been violated.

Software / Hardware:
- Software: integrated software module within the ISP
- Hardware: body motion tracker integrated in the ISP

Hardware Specifications: Motion tracker specifications must comply with the specifications for immersive environment applications.

Basic Functionalities:
- To track the designer's body movements
- To compare joint positions with the selected end user's limitations
- To raise visual warnings if joint limits are exceeded

Operating System: According to the ISP operating system.

Batch mode: No batch mode; the tool runs only within the ISP.

Input & Output:
- Input: articular joint rotations
- Output: visual warnings indicating the joints that reach the maximum rotation

Ability to access input & output interfaces: High-level interaction with the ISP; direct access to interfaces is not foreseen.

Parameterizing for platform: Based on the joint motor constraints provided in D1.3.1.

User Interface: Immersive natural interaction in the VE.

Usage protocol:
1. Selection of user type and type of disabilities.
2. Performing the task in the VE while being constantly informed of the possible reaching of a motion constraint due to the limited joint rotation of the impaired user.
3. Optional analysis of the log file.

Table A 2 - Kinematic Functional Limitation IT Impl.1
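A minimal sketch of the comparison step, assuming the joint angles and limits arrive as plain dictionaries (in the real tool they come from the ISP motion tracker and the D1.3.1 constraint models):

def check_joint_limits(joint_angles, joint_limits):
    """joint_angles: {joint: angle in degrees};
    joint_limits: {joint: (min_deg, max_deg)}.
    Returns the joints whose limit is currently violated, to be highlighted
    on the avatar by the visual warning."""
    violated = []
    for name, angle in joint_angles.items():
        lo, hi = joint_limits[name]
        if not lo <= angle <= hi:
            violated.append(name)
    return violated

# Per-frame usage: raise a visual warning for each returned joint.
warnings = check_joint_limits({"elbow_r": 150.0}, {"elbow_r": (0.0, 120.0)})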


Physical Interaction Tools


Kinematic Functional Limitation IT Impl.2

Short description: The tool is the second of the two tools that focus on the simulation of kinematic functional limitations and provides the designer with a means to intuitively perceive the limitations in the articular motion of the impaired user. The tool is developed to be implemented inside an immersive Virtual Environment (VE). The movements of the designer's body will be tracked by a motion tracking system. A specific software module will constantly keep track of the joint positions of the designer's body and compare them to the joint rotation limits of a specific beneficiary. When a limit is reached, a vibratory warning is transmitted to the designer's wrist, also indicating the direction of approach to the joint limit.

Software / Hardware:
- Software: low-level control software for the encoding of the vibrotactile warnings
- Hardware: a wireless vibrotactile bracelet equipped with 4 vibration motors arranged around the wrist; body motion tracker integrated in the Immersive Simulation Platform (ISP)

Hardware Specifications (bracelet):
- Wireless Bluetooth connection (50 m range)
- Battery capacity: 8 h of operation (1 active motor at 50% duty cycle)
- Weight: 100 g

Basic Functionalities:
- To track the designer's arm movements
- To compare joint positions with the selected end user's limitations
- To raise warnings if joint limits are exceeded

Operating System: Windows based; the software interface will be defined according to the ISP OS.

Batch mode: Yes; body configuration and position data should be generated with a desktop tracker for the pre-pilot experiments foreseen in A3.6.3.

Input & Output:
- Input: articular joint rotations
- Output: pulsating vibrotactile warning indicating the direction of approach towards the constraint being reached

Ability to access input & output interfaces: Low-level control of the bracelet, imposing the frequency and duty cycle of the pulsating vibration.

Parameterizing for platform: Based on the joint motor constraints provided in D1.3.1.

User Interface: Immersive natural interaction in the VE.

Usage protocol:
1. Selection of user type and type of disabilities.
2. Performing the task in the VE while being informed through the bracelet of the possible reaching of the joint rotation limits of the impaired user.
3. Optional analysis of the log file.

Table A 3 - Kinematic Functional Limitation IT Impl.2
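The direction-encoding idea can be sketched as follows: the direction of approach to the limit selects the closest of the 4 motors, and the proximity to the limit drives the pulse duty cycle. Motor placement, duty-cycle range and the command format are illustrative assumptions, not the bracelet's real protocol.

MOTOR_ANGLES = [0.0, 90.0, 180.0, 270.0]   # assumed motor placement around the wrist (deg)

def encode_vibration(approach_dir_deg, proximity):
    """proximity in [0, 1], where 1 means the joint limit is reached.
    Returns (motor_index, duty_cycle) to be sent to the bracelet."""
    diffs = [abs((approach_dir_deg - a + 180.0) % 360.0 - 180.0)
             for a in MOTOR_ANGLES]
    motor = diffs.index(min(diffs))                     # motor closest to the direction
    proximity = max(0.0, min(1.0, proximity))
    duty = 0.2 + 0.6 * proximity                        # assumed 20%-80% duty-cycle range
    return motor, duty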


Physical Interaction Tools


Dynamic Functional Limitation IT

Short description: The tool aims at the simulation of dynamic functional limitations and provides the designer with a means to intuitively perceive the increased effort felt by the impaired user when controlling the steering wheel of a car. The tool will be responsible for computing and delivering the force profile according to the car model, adding a certain amount of force in order to provide the designer with the impaired person's feeling. A force-feedback steering wheel will be employed for the interaction. The tool will measure in real time the position of the steering wheel for the interaction with the Virtual Environment (VE). According to the driving speed and the car dynamics, the tool will compute the torque to be transmitted to the designer and add the necessary alteration of that torque according to the impaired subject to be simulated.

Software / Hardware:
- Software: low-level control software for the Haptic Steering Wheel
- Hardware: a force-feedback steering wheel able to measure the rotation of the steering wheel and to deliver programmed torques to the designer

Hardware Specifications:
- Maximum torque: 5 Nm
- No rotation limits
- Position accuracy: 0.1°

Basic Functionalities:
- To estimate the steering torque for a given driving condition
- To impose on the haptic device the sum of the estimated torque plus the addition simulating the selected impairment
- To provide the angular position of the steering wheel to command the virtual car model

Operating System: Windows based.

Batch mode: Yes; body configuration and position data should be generated with a desktop system for the pre-pilot experiments foreseen in A3.6.3.

Input & Output:
- Input: vehicle and tyre friction parameters, speed of the vehicle
- Output: torque to the designer and position of the steering wheel

Ability to access input & output interfaces: Low-level control of the force-feedback steering wheel.

Parameterizing for platform: Based on the dynamic functional limitation analysis provided in D1.3.1 and on additional pre-pilot tests for tuning the parameters.

User Interface: Immersive natural interaction with the force-feedback steering wheel.

Usage protocol:
1. Selection of user type and type of disabilities.
2. Performing the task in the VE experiencing a modified effort on the steering wheel in order to perceive the feeling of the impaired user.
3. Optional analysis of the log file of force and position.

Table A 4 - Dynamic Functional Limitation IT
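A hedged sketch of the torque computation: a simple linear self-aligning torque stands in for the real car model, and an impairment gain scales the extra effort. Both gains and the toy model itself are placeholders for the D1.3.1-based parameterization.

def steering_torque(wheel_angle_rad, car_speed, aligning_gain, impairment_gain):
    """Returns the torque (Nm) commanded to the haptic steering wheel."""
    base = aligning_gain * car_speed * wheel_angle_rad   # toy self-aligning torque model
    total = base * (1.0 + impairment_gain)               # impairment-dependent addition
    return max(-5.0, min(5.0, total))                    # clamp to the 5 Nm device limit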


Physical Interaction Tools


Control Functional Limitation IT

Short description: The tool aims at the simulation of control functional limitations of movement. The tool is developed to be implemented inside an immersive Virtual Environment (VE) as well as in a desktop application. The tool makes use of a desktop haptic interface called GRAB, able to deliver a 3-DoF force on the designer's finger. The tool will impose certain disturbances on the motion of the designer's hands and fingers in order to simulate the tremor movements associated with typical pathologies. The tool takes as input the initial settings related to the specific end user to be simulated and provides an oscillatory force on the designer's finger (or hand) that disturbs the designer's voluntary movements. The tool provides the position of the end-tip of the device as output.

Software / Hardware:
- Software: low-level control software for the GRAB device
- Hardware: GRAB haptic device, able to measure the position of the user's fingers and to deliver programmed forces to the designer's fingers

Hardware Specifications:
- DoF: 3
- Force: 5 N
- Refresh rate: 1 kHz
- Workspace: cube (60x60x60 cm)

Basic Functionalities:
- To track the designer's hand movements
- To transmit oscillatory forces in order to simulate the selected motor control limitations

Operating System: Windows based.

Batch mode: Yes; in batch mode the device simply imposes the oscillatory forces, and the trajectory is recorded and stored.

Input & Output:
- Input: initial settings
- Output: oscillatory force to the designer's finger (or hand); real-time position

Ability to access input & output interfaces: Low-level direct access to the GRAB haptic interface.

Parameterizing for platform: Based on the control functional limitation analysis provided in D1.3.1 and on additional pre-pilot tests for tuning the parameters.

User Interface: Interaction takes place through the haptic device.

Usage protocol:
1. Selection of user type and type of disabilities.
2. Performing the task in the VE experiencing the control disturbances imposed by the oscillatory forces from the haptic device.
3. Optional analysis of the log file of force and position.

Literature references or website: http://www.percro.org/index.php?pageId=GRAB&page=desc_1

Table A 5 - Control Functional Limitation IT
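The disturbance itself can be sketched as a sinusoidal force at a tremor-typical frequency, sent to the device at its 1 kHz rate. The amplitude and frequency values below are placeholders to be tuned in the pre-pilot tests, and the device call is hypothetical.

import math

def tremor_force(t, amplitude_n=2.0, freq_hz=5.0):
    """Disturbance force (N) at time t (s); 4-6 Hz is a commonly cited tremor band."""
    return amplitude_n * math.sin(2.0 * math.pi * freq_hz * t)

# 1 kHz control loop matching the device refresh rate:
# for k in range(60000):                                 # one minute of operation
#     grab.set_force_z(tremor_force(k / 1000.0))         # hypothetical device call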


Cognitive Interaction Tools


Cognitive IT

Short description: This tool implements a simulation of disabilities related to cognition by imposing an additional cognitive load through the dual-task methodology. This is achieved with the implementation of an audio version of the n-back task that interferes with the scenario task that the designer is accomplishing. Depending on the extra cognitive load to be generated (based on parameters computed from the User Model data and on the Designer Model data), a sequence of numbers is generated and transformed into an audio message via a Text-To-Speech module. The vocal responses of the designer are acquired by a microphone and analyzed with a Speech Recognition module in order to verify the correctness of the answers. The global n-back task performance is logged at the end of the task.

Software / Hardware:
- Software: software for the generation of the questions and the acquisition of the answers
- Hardware: off-the-shelf wireless USB headset

Hardware Specifications: Standard wireless headset.

Basic Functionalities:
- To provide a characterization of the designer before the simulation (Designer Model)
- To generate the arrays of numbers for the n-back task
- To convert text to speech and speech to text for asking questions and gathering answers
- To provide an analysis of the designer's answers

Operating System: Windows based.

Batch mode: Yes.

Input & Output:
- Input: automatically generated numbers
- Output: log files (CSV text files)

Ability to access input & output interfaces: Audio input/output, not necessarily related to the audio management of the Immersive Simulation Platform.

Parameterizing for platform: Not defined yet (needs tests with end users).

User Interface: GUI window with a menu to select the user group to be simulated, options to play, pause and stop the sequence, logging of data (with the possibility to store/export results), functions to train the speech recognition system, etc.

Usage protocol:
1. Characterization of the designer's cognitive abilities (before the simulation).
2. Selection of user type and type of disabilities.
3. Performing the task in the VE.
4. Optional analysis of the log file and tasks.

Table A 6 - Cognitive IT
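A minimal sketch of the n-back core, with the TTS and Speech Recognition ends replaced by plain values (the answer list stands in for the recognised transcripts); the sequence length and the simulated error rate are arbitrary:

import random

def make_sequence(length):
    """Digit sequence to be spoken to the designer via Text-To-Speech."""
    return [random.randint(0, 9) for _ in range(length)]

seq = make_sequence(20)
n = 2    # n-back level, encoded from the User Model data by the IM

# Simulated recognised answers: at step i the designer should repeat seq[i - n]
answers = [seq[i - n] if random.random() < 0.8 else random.randint(0, 9)
           for i in range(n, len(seq))]

# Global task performance, logged at the end of the task
score = sum(ans == seq[i - n] for i, ans in enumerate(answers, start=n))
print("correct answers: %d / %d" % (score, len(answers)))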


Behavioral and Psychological Interaction Tools


Behavioural Stress Induction IT

Short description: This tool simulates behavioural and psychological disabilities by inducing an appropriate level of stress. This is achieved with the implementation of an audio dual task which interferes with the scenario task that the designer is accomplishing.

Two modes of operation are foreseen:
- Online: depending on a certain level of difficulty (based on parameters computed from the User Model data), a math formula is generated and transformed into an audio message via a Text-To-Speech module. The vocal responses of the designer are acquired by a microphone and analyzed with a Speech Recognition module in order to verify the correctness of the mathematical performance. If a mistake is recognized, an appropriate audio-visual warning is generated on the Immersive Simulation Platform (ISP).
- Offline: the required level of stress is induced by having the designer use the same tool in offline mode, prior to the immersive experience. In this case the tool is self-standing and a typical desktop setup is used; the interaction takes place with standard tools such as a visual GUI, mouse, keyboard and monitor. The IM is used only to encode parameters and to log relevant data.

Software / Hardware:
- Software: software for the generation of the questions and the acquisition of the answers
- Hardware: off-the-shelf wireless USB headset

Hardware Specifications: Standard wireless headset.

Basic Functionalities:
- To generate the math formulas to be solved by the designer
- To convert text to speech and speech to text for asking questions and gathering answers
- To provide warnings if the designer's answer is wrong
- Optionally, to monitor the stress condition of the designer

Operating System: Windows based.

Batch mode: Yes.

Input & Output:
- Input: math formula
- Output: log file (CSV text file)

Ability to access input & output interfaces: Audio input/output, not necessarily related to the audio management of the Immersive Simulation Platform.

Parameterizing for platform: Not defined yet (needs tests with end users).

User Interface:
- Online version: interaction takes place in the ISP through the headset
- Offline version: GUI window

Usage protocol:
1. Selection of user type and type of disabilities.
2. Performing the task in the VE.
3. Optional analysis of the log file and tasks.

Table A 7 - Behavioural Stress Induction IT
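The online generator can be sketched as follows: the operand range grows with the encoded difficulty level, and the recognised transcript is checked against the expected result. TTS/ASR are replaced by plain strings, and all names are illustrative assumptions.

import random

def make_formula(level):
    """Returns the formula text to be spoken and its expected result."""
    hi = 10 ** level                       # level 1: operands < 10, level 2: < 100, ...
    a, b = random.randint(1, hi), random.randint(1, hi)
    return "%d + %d" % (a, b), a + b

def check_answer(expected, recognised):
    """recognised: the Speech Recognition transcript of the designer's answer."""
    try:
        return int(recognised) == expected
    except ValueError:
        return False                       # unintelligible answers count as wrong

formula, expected = make_formula(level=2)
# if not check_answer(expected, transcript): send the Warning ID to the IM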


Behavioural and Psychological Interaction Tools


Behavioural Emotion Elicitation IT

Short description: This tool simulates behavioural and psychological disabilities by eliciting an appropriate emotion. This is achieved by playing particular music fragments, which are selected based on the type of emotion to be induced. The music can be played either through a headset or by the Immersive Simulation Platform (ISP) audio manager. In the latter case, the tool assumes the existence of a database of music fragments, accessible to the ISP Audio Manager, whose playback can be triggered upon reception of the fragment ID.

Software / Hardware:
- Software: software for the selection and playback of the music fragments
- Hardware: off-the-shelf wireless USB headset

Hardware Specifications: Standard wireless headset.

Basic Functionalities:
- To play music samples according to the selected emotion to be elicited

Operating System: Windows based.

Batch mode: Yes.

Input & Output:
- Input: Emotion ID
- Output: log file (CSV text file of the music IDs)

Ability to access input & output interfaces: Audio input/output, not necessarily related to the audio management of the ISP.

Parameterizing for platform: Not defined yet (needs tests with end users).

User Interface:
- Online version: interaction takes place in the ISP through the headset
- Offline version: GUI window

Usage protocol:
1. Selection of user type and type of disabilities.
2. Performing the task in the VE.
3. Optional analysis of the log file and tasks.

Table A 8 - Behavioural Emotion Elicitation IT


References
Albert, M. L., Feldman, R. G., & Willis, A. L. (1974). The subcortical dementia of progressive supranuclear palsy. Journal of Neurology, Neurosurgery and Psychiatry, 37: 121-130.

Altmann, E. M. & Gray, W. D. (2000). An integrated model of set shifting and maintenance. In N. Taatgen & J. Aasman (Eds.), Proceedings of the third international conference on cognitive modeling (pp. 17-24). Veenendaal, The Netherlands: Universal Press.

Avizzano, C.A., Marcheschi, S., Angerilli, M., Fontana, M., & Bergamasco, M. (2003). A Multi-Finger Haptic Interface for Visually Impaired. Proceedings of the 12th IEEE Workshop on Robot and Human Interactive Communication, October 31 - November 2, 2003, Millbrae, California, USA. ISBN: 0-7803-8136-X.

Bäckman, L., Small, B.J., & Fratiglioni, L. (2001). Stability of the preclinical episodic memory deficit in Alzheimer's disease. Brain, 124(1): 96-102.

Baddeley, A. D., & Hitch, G. (1974). Working memory. Psychology of Learning and Motivation, 8, 47-89.

Baddeley, A.D., Bressi, S., Della Sala, S., Logie, R., & Spinnler, H. (1991). The decline of working memory in Alzheimer's disease. A longitudinal study. Brain, 114(6): 2521-2542.

Baddeley, A.D., Hitch, G.J., Allen, R.J. (2009). Working memory and binding in sentence recall. Journal of Memory and Language, 61, 438-456.

Bates, T. & Stough, C. (1998). Improved reaction time method, information processing speed, and intelligence. Intelligence, 26(1): 53-62.

Brand, M., Labudda, K., Kalbe, E., Hilker, R., Emmans, D., Fuchs, G., Kessler, J., Markowitsch, H.J. (2004). Decision-making impairments in patients with Parkinson's disease. Behavioural Neurology, 15(3-4): 77-85.

Cassell, K., Shaw, K., & Stern, G. (1973). A computerized tracking technique for the assessment of Parkinsonian motor disorders. Brain, 96: 815-826.

Cerella, J. (1990). Aging and information processing rate. In: J.E. Birren, K.W. Schaie (Eds.): Handbook of the psychology of aging, 3rd ed., San Diego: Academic Press, pp. 201-221.

Cerella, J., Poon, L.W., & Williams, D.M. (1980). Age and the complexity hypothesis. In: L.W. Poon (Ed.): Aging in the 1980s, Washington: American Psychological Association, pp. 332-340.

Constantinescu, D., Salcudean, S., and Croft, E. (2007). Haptic rendering of topological constraints to users manipulating serial virtual linkages. Robotic Welding, Intelligence and Automation, Lecture Notes in Control and Information Sciences, Vol. 362/2007, 51-59. DOI: 10.1007/978-3-540-73374-4_6.

Corbett, F., Jefferies, E., Lambon Ralph, M. A. (2009). Exploring multimodal semantic control impairments in semantic aphasia: Evidence from naturalistic object use. Neuropsychologia, 47, 2721-2731.

Cronin-Golomb, A., Corkin, S., & Growdon, J.H. (1994). Neuropsychologia, 32(5): 579-593.

Cronin-Golomb, A., Corkin, S., Rizzo, J.F., Cohen, J., Growdon, J.H., & Banks, K.S. (1991). Visual dysfunction in Alzheimer's disease: relation to normal aging. Ann. Neurol., 29: 41-52.

Delazer, M., Sinz, H., Zamarian, L., & Benke, T. (2007). Decision-making with explicit and stable rules in mild Alzheimer's disease. Neuropsychologia, 45: 1632-1641.

Dewick, H.C., Hanley, J.R., Davies, A.D.M., Playfer, J., & Turnbull, C. (1991). Perception and memory for faces in Parkinson's disease. Neuropsychologia, 29(8): 785-802.

Duncan, J., Emslie, H., Williams, P., Johnson, R., Freer, C. (1996). Intelligence and the frontal lobe: The organisation of goal-directed behaviour. Cognitive Psychology, 30, 257-303.

Sergi, F., Accoto, D., Campolo, D., Guglielmelli, E. (2008). Forearm orientation guidance with a vibrotactile feedback bracelet: on the directionality of tactile motor communication. 2008 IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob 2008), Scottsdale, Arizona (USA), October 19-22, 2008.

Frisoli, A., Carrozzino, M., Marcheschi, S., Salsedo, F., Bergamasco, M. (2005). Haptic systems for simulation of primary commands of cars. In: Research in Interactive Design: Proceedings of Virtual Concept 2005, Biarritz (France), November 2005. ISBN/ISSN: 2-287-48363-2.

Giffard, B., Desgranges, B., Nore-Mary, F., Lalevee, C., de la Sayette, V., Pasquier, F., & Eustache, F. (2001). The nature of semantic memory deficits in Alzheimer's disease: new insights from hyperpriming effects. Brain, 124: 1522-1532.

Giovannetti, T., Schmidt, K., Sestito, N., Libon, D., Gallo, J. (2006). Everyday action in dementia: Evidence for differential deficits in Alzheimer's disease versus subcortical vascular dementia. Journal of the International Neuropsychological Society, 12, 45-53.

Giovannetti, T., Schwartz, M.F., Buxbaum, L.J. (2007). The Coffee Challenge: A new method for the study of everyday action errors. Journal of Clinical and Experimental Neuropsychology, 29, 609-705.

Giovannetti, T., Bettcher, B. Magouirk, Brennan, L., Libon, D.J., Kessler, R.K., Duey, K. (2008). Coffee with jelly or unbuttered toast: Omissions and commissions are dissociable aspects of everyday action impairment in Alzheimer's disease. Neuropsychology, 22, 235-245.

Glisky, E.L. (2007). Changes in cognitive function in human aging. In: D.R. Riddle (Ed.): Brain Aging: Models, Methods, and Mechanisms (Frontiers in Neuroscience). Boca Raton (FL): CRC Press.

Gordon, B. & Carson, K. (1990). The basis for choice reaction time slowing in Alzheimer's disease. Brain and Cognition, 13: 148-166.

Gunzelmann, G., Gross, J.B., Gluck, K. A., & Dinges, D.F. (2009a). Sleep deprivation and sustained attention performance: Integrating mathematical and cognitive modeling. Cognitive Science, 33: 880-910.

Gunzelmann, G., Moore, R. L., Salvucci, D. D., & Gluck, K. A. (2009b). Fluctuations in alertness and sustained attention: Predicting driver performance. In Proceedings of the 9th International Conference of Cognitive Modeling (paper 239), Manchester, United Kingdom.

Hawthorn, D. (2000). Possible implications of aging for interface designers. Interacting with Computers, 12: 507-528.

Hayes, A.M., Davidson, M.C., Keele, S.W., Rafal, R.D. (1998). Toward a functional analysis of the basal ganglia. Journal of Cognitive Neuroscience, 10(2): 178-198.

Heikkilä, V.-M., Turkka, J., Korpelainen, J., Kallanranta, T., & Summala, H. (1998). Decreased driving ability in people with Parkinson's disease. J Neurol Neurosurg Psychiatry, 64: 325-330.

Heim, S., Tschierse, J., Amunts, K., Wilms, M., Vossel, S., Willmes, K., Grabowska, A., & Huber, W. (2008). Cognitive subtypes of dyslexia. Acta Neurobiol Exp (Wars), 68(1): 73-82.

Iliadou, V. & Kaprinis, S. (2003). Clinical psychoacoustics in Alzheimer's disease: central auditory processing disorders and speech deterioration. Annals of General Hospital Psychiatry, 2:12.

Diederichs, F., Fontana, M., Bencini, G., Nikolaou, S., Montanari, R., Spadoni, A., Widlroither, H., & Baldanzini, N. (2009). New HMI concept for motorcycles - the SAFERIDER approach. In HCI International 2009, pp. 358-366.

Jaeggi, S. M., Buschkuehl, M., Jonides, J., & Perrig, W. J. (2008). Improving fluid intelligence with training on working memory. Proceedings of the National Academy of Sciences, 105(19), 6829-6833.

Jaeggi, S.M., Buschkuehl, M., Perrig, W. J., Meier, B. (2010). The concurrent validity of the N-back task as a working memory measure. Memory, 18, 394-412.

Kane, M. J., Conway, A. R., Miura, T. K., Colflesh, G.J. (2007). Working memory, attentional control and the N-back task: A question of construct validity. Journal of Experimental Psychology: Learning, Memory and Cognition, 33, 615-622.

Knight, R.G., Godfrey, H.P.D., & Shelton, E.J. (1988). The psychological deficits associated with Parkinson's disease. Clinical Psychology Review, 8: 391-410.

Lima, S.D., Hale, S., & Myerson, J. (1991). How general is general slowing? Evidence from the lexical domain. Psychol Aging, 6: 416-425.

Lovett, M.C. (2002). Modeling selective attention: Not just another model of Stroop (NJAMOS). Cognitive Systems Research, 3(1): 67-76.

McGuinness, B., Barrett, S.L., Craig, D., Lawson, J., & Passmore, A.P. (2008). Attention deficits in Alzheimer's disease and vascular dementia. J Neurol Neurosurg Psychiatry, 81: 157-159.

McKinnon, M.C., & Moscovitch, M. (2007). Domain-general contributions to social reasoning: Theory of Mind and deontic reasoning reexplored. Cognition, 102, 179-218.

Meyer, D.E., Glass, J.M., Mueller, S.T., Seymour, T.L., & Kieras, D.E. (2001). Executive-process interactive control: A unified computational theory for answering 20 questions (and more) about cognitive ageing. European Journal of Cognitive Psychology, 13(1/2): 123-164.

Miyake, A., Friedman, N.P., Emerson, M.J., Witzki, A.H., Howerter, A., & Wager, T.D. (2000). The unity and diversity of executive functions and their contributions to complex frontal lobe tasks: A latent variable analysis. Cognitive Psychology, 41: 49-100.

Monacelli, A.M., Gushman, L.A., Kavcic, V., & Duffy, C.J. (2003). Spatial disorientation in Alzheimer's disease: The remembrance of things passed. Neurology, 61: 1491-1497.

Monsell, S. (2003). Task switching. Trends in Cognitive Sciences, 7(3): 134-140.

Morady, K. and Humphreys, G. W. (2009). Comparing action disorganization syndrome and dual-task load on normal performance in everyday action tasks. Neurocase, 15(1): 1-12.

Nystrom, L.E., Braver, T.S., Sabb, F.W., Delgado, M.R., Noll, D.C., Cohen, J.D. (2000). Working memory for letters, shapes and locations: fMRI evidence against stimulus-based regional organisation in human prefrontal cortex. Neuroimage, 11, 424-446.

Owen, A.M., McMillan, K. M., Laird, A.R., Bullmore, E. (2005). N-back working memory paradigm: A meta-analysis of normative functional neuroimaging studies. Human Brain Mapping, 25, 46-59.

Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1-4.

Perrig, W.J., Hollenstein, M., Oelhafen, S. (2009). Can we improve fluid intelligence with training on working memory in persons with intellectual disabilities? Journal of Cognitive Education and Psychology, 8(2), 148-167.

Perry, M.E., McDonald, C.R., Hagler, D.J., Gharapetian, L., Kuperman, J.M., Koyama, A.K., Dale, A.M., McEvoy, L.K. (2009). White matter tracts associated with set-shifting in healthy aging. Neuropsychologia, 47: 2835-2842.

Perry, R.J. & Hodges, J.R. (1999). Attention and executive deficits in Alzheimer's disease. A critical review. Brain, 122: 383-404.

Pignatti, R., Rabuffetti, M., Imbornone, E., Mantovani, F., Alberoni, M., Farina, E., & Canal, N. (2005). Specific impairments of selective attention in mild Alzheimer's disease. Journal of Clinical and Experimental Neuropsychology, 27: 436-448.

Sarter, M., Turchi, J. (2002). Age- and dementia-associated impairments in divided attention: Psychological constructs, animal models, and underlying neuronal mechanisms. Dement Geriatr Cogn Disord, 13: 46-58.

Schwartz, M.F., Montgomery, M.W., Buxbaum, L.J., Lee, S.S., Carew, T.G., Coslett, H.B., et al. (1998). Naturalistic action impairment in closed head injury. Neuropsychology, 12: 13-28.

Smith, E.E., Jonides, J., Koeppe, R.A. (1996). Dissociating verbal and spatial working memory using PET. Cerebral Cortex, 6, 11-20.

Stemme, A., Deco, G., & Busch, A. (2007). The neuronal dynamics underlying cognitive flexibility in set shifting tasks. J Comput Neurosci, 23: 313-331.

Hwang, S. & Ryu, J. (2010). The haptic steering wheel: Vibro-tactile based navigation for the driving environment. In Pervasive Computing and Communications Workshops (PERCOM Workshops), 8th IEEE International Conference on, pp. 660-665, March 29 - April 2, 2010.

Taler, V. & Phillips, N.A. (2008). Language performance in Alzheimer's disease and mild cognitive impairment: A comparative review. Journal of Clinical and Experimental Neuropsychology, 30(5), 501-556.

Verhaeghen, P. & Cerella, J. (2002). Aging, executive control, and attention: a review of meta-analyses. Neurosci Behav Rev, 26: 849-857.

Wilson, R. S., Kaszniak, A. W., Klawans, H. L., & Garron, D. C. (1980). High speed memory scanning in Parkinsonism. Cortex, 16: 67-72.

Wood, J.M. & Troutbeck, R. (1995). Elderly drivers and simulated visual impairment. Opt Vis Sci, 72(2): 115-124.

List of Cited VERITAS Deliverables

D1.3.1: Abstract physical models definition

SubProject: SP1 - User Modelling

Workpackage: WP1.3 - Physical Models

D1.5.1: Abstract Behavioural and Psychological User models definition

SubProject: SP1 - User Modelling

Workpackage: WP1.5 - Behavioural and psychological models

ID1.3.1: Report on the study of the state of the art of human physical models

SubProject: SP1 - User Modelling

Workpackage: WP1.3 - Physical Models

ID1.4.1: Report on the study of the state of the art of cognitive models

SubProject: SP1 - User Modelling

Workpackage: WP1.4 - Cognitive models

ID1.5.1: Report on the study of the state of the art of behavioural and psychological models

SubProject: SP1 - User Modelling

Workpackage: WP1.5 - Behavioural and psychological models

ID2.7.1: Interaction Tools Specification

SubProject: SP2 - Innovative VR models, tools and simulation environments

Workpackage: WP2.7 - New interaction tools for designer experience
