
CMT3321 HAPTICS REPORT

Will the Use of Haptic Feedback Be More Efficient than Visual Feedback in Path Finding and Object Discovery in the Mining of Natural Resources?
Name: Eshimokhai Charles
Number: M00476792
Lecturer: Nitish Chooramun
Lab: Anwar Bapikee

Group members

Name                 Number
Moses Arfo           M00478479
Agbam Henry          M00479994
Abdullahi Emmanuel   M00483713

CMT3321 | Novel Interactive Technologies | Eshimokhai Charles | M00476792


Abstract
Interaction between humans and systems has, over the years, been carried out using audio and visual feedback. However, the addition of tactile feedback via haptics opens a new level of interaction for people who are visually impaired, and can also be used in situations where people cannot see the items they are working with. For example, in the mining industry, workers cannot easily detect where minerals are located underground. This report outlines the benefits of implementing haptic feedback to aid the discovery of natural resources via path finding and shape detection. Prior to writing this report, research was carried out in which a testbed comprising path-finding and shape-detection tasks was developed. Different sets of users were asked to test the application, and an analysis of the results of the evaluation is detailed in this report.


1 TABLE OF CONTENTS

INTRODUCTION
CONCEPT
PROTOTYPE
    4.1.1 State diagram
    4.1.2 Prototype features
    4.1.3 Calibration
    4.1.4 Interaction with testbed
    4.1.5 Design process
    4.1.6 Libraries
EVALUATION
    5.1 Evaluation methods
        5.1.1 Observation with Think aloud
    5.2 Evaluation set up and Instrument
    5.3 Assumption pros and cons
        5.3.1 Pros
        5.3.2 Cons
    5.4 Evaluation procedure
    5.5 Data gathering
    5.6 Results
    5.7 Results analysis
        5.7.1 User 1
        5.7.2 User 2
        5.7.3 User 3
        5.7.4 Overall Result
CONCLUSION
REFERENCE


2 INTRODUCTION
Before the advent of tools, people were able to manipulate objects using their hands: by feeling the properties of an object through touch and its resistance to bending, feedback is received. Feedback is a transference of information between two objects in the process of communication. Feedback is essential at every level of communication because it is what keeps users informed about their immediate environment (Tan and Pentland, 1997). In human-computer interaction (HCI) there are numerous forms of feedback, including visual, audio and haptic. For the purpose of this research, however, we will be analysing the efficiency of two categories of feedback: visual and haptic.
Statistics from research conducted by Mauter and Katzi (2003) show that 80 to 90 percent of all feedback is received through the eyes, i.e. visual feedback. Visual feedback is the norm; however, it is of little use to the visually impaired or blind. Today, audio feedback has been integrated into multimedia systems in addition to visual feedback (Tan and Pentland, 1997). However, the human body is capable of interacting via five senses, and with computers interacting basically via the visual and audio channels, a great deal of information is held back (Mauter and Katzi, 2003).
With the introduction of haptics, this held-back information is now made available via a new channel of communication: the sense of touch. Haptics increases the means of communication between humans and computers by adding sensory feedback (Magnusson, Szymczak and Brewster, 2012). Haptic feedback refers to both tactile feedback, which is based on cutaneous input, and force feedback, which is based on kinaesthetic input (Lederman and Klatzky, 2009). Tactile feedback, also described as touch feedback, refers to the sensations felt by the skin, such as temperature (hot and cold); it also allows users to feel the roughness of a surface and vibrations (Conti et al., 2014). Good examples of tactile feedback can be found in gamepads and mobile phones. Force feedback in computing refers to the simulation of physical attributes such as vibrations and weights in gaming, allowing the user to interact with the system. It can further be defined as the physical sensation of resistance felt in the tendons and joints (Burdea, 1996).
The word haptic originates from the Greek word haptesthai, which translates into English as "sense of touch". Haptics can be defined as the science of applying tactile sensation to human interaction with computers (Brewster and Murray-Smith, 2001).



The term haptics encompasses haptic technology, which refers to interfaces for interaction via touch; haptic communication, a means by which people communicate with other beings such as animals; and haptic perception, the ability to recognise objects through touch (Sharma, 2011).
Haptic devices, or haptic interfaces, are machine-driven devices that aid communication between a user and a system (Kern, 2009). These devices afford users the opportunity to operate and manipulate a virtual environment using tangible objects. Haptic devices go beyond regular input devices such as the mouse and keyboard, which provide no form of feedback to the user: they support both input and output, giving users an avenue to both observe and react to their immediate environment (Kern, 2009; Hayward et al., 2004). Examples of haptic devices include consumer gaming peripherals such as joysticks equipped with motors and sensors that support force feedback. More advanced haptic devices are used in industrial and medical settings. An example of a sophisticated haptic device is the Phantom, which is used mostly in medical simulation and training exercises, where the pointer emulates physical sensations such as drilling and cutting (Tsai et al., 2007). According to Ferscha et al. (2008), depending exclusively on visual and audio feedback may occasionally lead to false perception.
Hence, this research focuses on a subcategory of haptic technology: vibrotactile feedback. It aims to answer the following question:
Will the use of haptic feedback be more efficient than visual feedback in path finding and object discovery in the mining of natural resources?
To answer this question, a testbed idea was first conceptualised; subsequently, a prototype of the testbed system was developed and used to collect data from a number of test users. Lastly, the data collected were analysed to provide an answer to the research question.


3 CONCEPT
Following the application development life-cycle, the first task was to come up with a scenario and identify a problem that could be solved. After brainstorming, we decided to work on the mining industry, and the research question "Will the use of haptic feedback improve the efficiency of path finding and shape detection in the mining industry compared to visual feedback?" was agreed on. The concept behind the prototype is to design a testbed with the sole purpose of providing a valid answer to the research question. To achieve this aim, two major test activities, path finding and shape detection, will be carried out. Each of these activities will subsequently be divided into three scenarios; thus the prototype will be tested using visual, haptic, and a combination of visual plus haptic feedback.
The reason for using both path finding and shape detection in this research is industry related. The mining of natural resources requires more than locating an object via path finding; resources such as gold can also be detected via shape. In order to effectively measure the efficiency of haptics in the industry, we therefore combine both shape detection and path finding.
A testbed is a platform generally used for the testing and analysis of large development projects. It supports rigorous, reusable and transparent testing and analysis of computational tools, scientific theories and new technologies (Jiang, 2009). An example of a typical testbed is the Arena web browser, created by the World Wide Web Consortium (W3C) for testing Cascading Style Sheets (CSS), HTML3 and Portable Network Graphics (PNG); it was eventually replaced by Amaya, a system for testing new web standards. To answer the research question, a total of six different testbeds were designed: three (haptic, visual, and visual + haptic) for path finding and, similarly, three (haptic, visual, and visual + haptic) for shape detection. After several design considerations and manipulations, six (6) test scenarios were produced (see Table 1), three for path finding and three for shape detection. The shapes and paths were altered to ensure reliability in the measurements carried out once the prototype was developed.
Testbed Design

- Pathfinding visual
- Pathfinding haptics
- Pathfinding visual + haptics
- Shape detection visual
- Shape detection haptics
- Shape detection visual + haptics

Table 1: testbed design concept


PATHFINDING: The aim of path finding is to locate the path from the beginning (point A) to the finish (point B). The paths will be invisible to the users, as they are covered with an opaque terrain (Basagni et al., 2004). The point here is for the user to rely on visual cues, haptic cues, or both to achieve the task while avoiding obstacles. Research recently conducted by Becker et al. (2012) tested the effectiveness of different technologies using path finding; it revealed how path finding can be used to assess and improve a class of computational models for simulating the activities and collaborations of autonomous agents (agent-based models, ABM), with regard to evaluating their effects on the exploration of an unknown terrain. This kind of test is particularly useful for assisting victims of an emergency incident: for example, in the event of a natural disaster, a mining facility damaged by an earthquake could render existing maps worthless. The path-finding approach therefore offers a worthwhile method of data collection to assess how effective vibration feedback can be, and may also provide answers to the research question.
SHAPE DETECTION: A review of the literature showed that several studies have been carried out to test haptic performance, and in most cases shape detection or a similar object-recognition activity was used. An example of this second activity is seen in research by Norman et al. (2004), in which users were asked to compare naturally shaped objects in three-dimensional (3-D) form using their senses of touch and sight. The users manoeuvred an object using haptics; thereafter, they were asked to identify which visible object had the same shape as the invisible one. Shape-detection activities use stimuli that can only be categorised based on the geometric arrangement of their features (Douglas, 2000). Just as with path finding, Acemap will use visual, haptic and visual + haptic cues to help users detect the shape of an object.
In both test scenarios, that is path finding and shape detection, different haptic and visual cues will be used in the testbed. These cues, visual and vibrotactile (via the VibroBox), will help the user detect obstacles and bends. The user will rely on the different forms of feedback, i.e. visual, haptic or both, to follow a path or detect shapes.
Visual feedback: Acemap will provide users with a testbed that uses colours as a form of feedback, helping users navigate the testbed with minimal mistakes. The colour codes provided by our prototype are green, amber and red. These colours are similar to those of a traffic light and were chosen because of what they represent and because they are easily recognised (Greenhalgh and Mirmehdi, 2012; Maldonado-Bascón et al., 2007; Norman, 2002). Norman's principles of design will be used in the development of this prototype to help users understand what is expected of them without much thinking (Preece et al., 2012).
Haptic feedback: in the second testbed, Acemap will provide users with different thresholds of vibration to indicate what action to take next while navigating the testbed. When the user is on the right path there will be no vibration. When the user is approaching a bend, a strong pulse will inform the user of the bend. Furthermore, a steady vibration will be given when the user is approaching the edges of a path. Finally, the frequency of the vibration will be intensified when the user eventually strikes or crosses the edges.
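The vibration policy above can be sketched as a simple mapping from path state to vibration frequency. This is only an illustrative sketch: the class, method and frequency values below are hypothetical placeholders, not the report's actual implementation or its calibrated thresholds.

```java
// Illustrative sketch of the vibration feedback policy described above.
// Frequency values (Hz) are hypothetical placeholders, not the calibrated
// thresholds used in the actual prototype.
class VibrationPolicy {

    public enum PathState { ON_PATH, NEAR_BEND, NEAR_EDGE, OFF_TARGET }

    /** Returns the vibration frequency (Hz) to send to the VibroBox. */
    public static int frequencyFor(PathState state) {
        switch (state) {
            case NEAR_BEND:  return 120; // strong pulse to signal a bend
            case NEAR_EDGE:  return 75;  // steady vibration near the edge
            case OFF_TARGET: return 150; // intensified: edge struck or crossed
            default:         return 0;   // on the right path: no vibration
        }
    }
}
```

Keeping the policy in one pure function like this makes it easy to recalibrate the frequencies later from the psychophysical experiment without touching the navigation logic.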
Psychophysical Analysis: as part of the measures taken to ensure that the vibration is felt by all users, a psychophysical analysis of three (3) different users will be conducted, and the resulting figures will be used to set the different vibration thresholds. Psychophysics is the quantitative analysis of the relationship between physical stimuli and the sensations they produce (Cohen and Goldszmidt, 2004). The purpose of the psychophysical analysis is to determine the relationship between the physical stimuli (vibrations) and how users perceive them (Cohen, 2004). The absolute and differential thresholds will be used to set the lower and upper limits when calibrating a section of the testbed.
Measurement variables: these are the values that will determine how well the user performs, by measuring the efficiency of each form of feedback as follows:
- Approach time: the total time taken by a user to complete the task on a given testbed, measured in seconds.
- Slip-off time: the amount of time spent off track by the user before returning to the right path; represented in yellow.
- Off target: the number of times the user went off track before completing the task; represented in red.
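As a rough sketch, the three measurement variables could be accumulated per trial as follows. The class and method names are hypothetical, not taken from the actual testbed code:

```java
// Hypothetical per-trial recorder for the three measurement variables
// (approach time, slip-off time, off-target count) described above.
class TrialMetrics {
    private long approachMillis;  // total time to complete the task
    private long slipOffMillis;   // time spent off track before returning
    private int  offTargetCount;  // number of times the user went off track

    public void addSlipOff(long millis)  { slipOffMillis += millis; }
    public void recordOffTarget()        { offTargetCount++; }
    public void finish(long totalMillis) { approachMillis = totalMillis; }

    /** Approach time in whole seconds, as logged by the testbed. */
    public long approachSeconds() { return approachMillis / 1000; }
    public long slipOffMillis()   { return slipOffMillis; }
    public int  offTargetCount()  { return offTargetCount; }
}
```

One such recorder per user per testbed condition (visual, haptic, visual + haptic) would yield exactly the rows later exported for analysis.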
A haptic device (the VibroBox) will be used as a tangible object to interact with the testbed. The VibroBox, which contains a vibrotactile actuator, will be used to transmit vibration feedback to users (refer to Appendix 1 for images of the VibroBox). A fiducial marker will be placed on the VibroBox to enable digital coupling and interaction with the testbed. This interaction will be made possible via reacTIVision, using the TUIO library to track the marker. A Java-based integrated development environment (Processing) will be used to communicate with and control the application. Furthermore, the Arduino IDE, an open-source electronics prototyping environment, will be used to connect the VibroBox to the system so that Processing can access it for the testbed (Banzi, 2014).
A database system will capture and compile the output of all six testbeds, to ensure reliable results after the evaluation. Three main values (approach time, slip-off time and off target) will be recorded for each task carried out, on the X and Y axes, combining the visual, haptic and visual + haptic conditions for the three values listed above.
Finally, with regard to Norman's (2002) principle of affordance, a navigation system (menu) will be used to control the testbed, so that users can find their way around with ease.


4 PROTOTYPE
In a bid to answer the research question above, a prototype consisting of application software (the testbed) and a hardware device (the VibroBox) was developed.
4.1.1 State diagram
The first step was to draw a state diagram; this was done to ensure that no stage or phase of the application was accidentally left out. A state diagram shows the connections between the stages of an application (Fleck, 2013). Below is the state diagram for our prototype.

Figure 2 State diagram of the game


4.1.2 Prototype features
The prototype was designed following the state diagram; the menu was necessary to ease navigation of the interface and give users some level of control (Nielsen, 1994).
Below are screenshots of the prototype.


Screenshots of the prototype: menu screen, testbed selection, calibration, testbed overlay.

4.1.3 Calibration
While developing the prototype, it was paramount to carry out a psychophysical experiment to ensure that the absolute and differential thresholds were properly defined. Gescheider (2007) defined a threshold as the point of intensity at which a person detects a stimulus; it can further be considered the point at which a change in stimulus is perceived.
The absolute threshold refers to the initial or lowest intensity that is felt by the user (Gescheider, 1997). On the other hand, the differential threshold refers to the smallest difference between two stimuli that a user can feel (Gescheider, 2013). The values of both thresholds were collected from the psychophysical experiment conducted with three (3) users. The experiment was conducted by changing the intensity of the vibration generated by the VibroBox while the device was touched by the users. The value recorded at the point where the vibration was first felt by each user was captured as the absolute threshold. Similarly, after changing the intensity of the VibroBox, the users were tested again and the differential thresholds were captured. Subsequently, the results were analysed and used to set up the calibration. This process was carried out in compliance with Norman's design principle of mapping, which mainly focuses on controls and their effects in real-time settings (Preece et al., 2012).
Below is a tabular representation of the thresholds recorded from the experiment.

USERS     Absolute threshold (Hz)   Differential threshold (Hz)
USER 1    74                        109
USER 2    70                        102
USER 3    80                        90
AVERAGE   74.6                      100.3

Table 3: Results of the psychophysical experiment


In the table above, the average absolute threshold is 74.6 Hz; this value was used to implement the absolute threshold in the prototype and sets the feedback given whenever a user slips off track. The differential threshold of 100.3 Hz will be used to determine when the user is off target.
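The calibration step amounts to averaging the per-user thresholds from Table 3 and rounding to one decimal place. The sketch below uses illustrative names, not the prototype's actual code; note that the exact arithmetic mean of the absolute thresholds, (74 + 70 + 80)/3, works out to 74.7 Hz, marginally above the 74.6 Hz quoted above.

```java
// Sketch of the calibration averaging step, assuming a simple arithmetic
// mean of the per-user thresholds (values from Table 3). Names are
// illustrative, not from the actual prototype.
class Calibration {

    /** Mean of the recorded thresholds, rounded to one decimal place. */
    public static double averageHz(double[] thresholds) {
        double sum = 0;
        for (double t : thresholds) sum += t;
        return Math.round(sum / thresholds.length * 10.0) / 10.0;
    }

    public static void main(String[] args) {
        // differential thresholds from Table 3 -> 100.3 Hz, as reported
        double offTargetHz = averageHz(new double[]{109, 102, 90});
        // absolute thresholds from Table 3 (exact mean 74.7 Hz)
        double slipOffHz = averageHz(new double[]{74, 70, 80});
        System.out.println(slipOffHz + " Hz / " + offTargetHz + " Hz");
    }
}
```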
4.1.4 Interaction with testbed
The menu navigation in the application's interface was designed to use either the keyboard or a dynamic marker, while the VibroBox is used for navigating the testbed. These choices made the application flexible and easy to use (Nielsen, 1995). The dynamic marker was used to ensure that users do not get confused by having several markers on the table, in accordance with Nielsen's (1995) heuristic of aesthetic and minimalist design. The keyboard option was added to make navigation of the testbed faster. Finally, the menus were shown as headers on every section of the interface to communicate the current state of the application, in compliance with Nielsen's heuristics and Norman's design principle of visibility (Nielsen, 1995).


Sample testbed
In order to use the VibroBox to navigate the testbed, the device was digitally coupled with a fiducial marker. This ensures the tangible object is tracked in the same space and time as the testbed. The application was designed to use different colours to indicate the user's current path. In summary, the application relies mostly on visual (colour) and vibrotactile (VibroBox) cues as its means of interaction.

Table 5 Sample VibroBox


The regular traffic-light colour code was used to represent the different forms of visual feedback. This was done in line with the design principle that users can easily adapt to a familiar environment (Nielsen, 1995). Green indicates that the user is on the right track, yellow signifies that the user is going off track, and red indicates that the user is off target. The colours and what they represent are shown below.

Go (on the right path) | Warning (slip off) | Danger (off target)

Table 6: Sample traffic colours
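The colour code amounts to a small state-to-colour lookup, mirroring the vibration policy on the visual channel. The sketch below is illustrative only; it uses packed 0xRRGGBB values, as a Processing sketch might, and the names are hypothetical:

```java
// Illustrative mapping of path states to the traffic-light colour code
// described above, using packed 0xRRGGBB values. Names and the exact
// amber shade are assumptions, not taken from the actual prototype.
class ColourCode {

    public enum PathState { ON_PATH, SLIP_OFF, OFF_TARGET }

    /** Returns the 0xRRGGBB colour used to draw the user's current state. */
    public static int colourFor(PathState state) {
        switch (state) {
            case SLIP_OFF:   return 0xFFBF00; // amber: going off track
            case OFF_TARGET: return 0xFF0000; // red: off target
            default:         return 0x00FF00; // green: on the right track
        }
    }
}
```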


4.1.5 Design process
The testbed was developed using the 64-bit version of Processing 2.1.1. This IDE was used because it contains most of the libraries needed to develop the application, and it integrates easily with the Arduino IDE, which made interaction with the VibroBox a smooth process.
4.1.6 Libraries
The following libraries were used in developing the prototype

5 EVALUATION
As defined by Preece et al. (2002), evaluation is a process of systematically collecting and analysing data that tells us what it is like for an individual user or a group of users to use a system for a specific task in a certain type of environment. The goal of an evaluation is to determine how well a system fulfils user requirements (Psathas et al., 1986). There are different types of evaluation; depending on the method used, the data collected may be qualitative or quantitative. For this research, the data will be quantitative. The aim of this evaluation is to check whether the use of haptic feedback is more efficient in path finding and shape detection when compared to visual feedback.
5.1 Evaluation methods
There are several evaluation techniques, ranging from heuristic evaluation with experts and cognitive walkthroughs to observation and think-aloud. For this evaluation, however, the team focused on combining observation with the think-aloud method.
5.1.1 Observation with Think aloud
Towards the end of the prototype's development, prior to delivery, three (3) students were asked to carry out the different tasks while under observation the entire time. As the users went through the path-finding and shape-detection testbeds, they were asked to think aloud about their activities and the reasons why they took different actions. The actions and expressions of the three users were written down in notes. The sections below give details of how the process was carried out.

5.2 Evaluation set up and Instrument


In a bid to successfully carry out the evaluation of our prototype, the team went to the school lab and made the basic arrangements to set up the location. The following instruments were used:
1. A computer system: this is among the key instruments used for the evaluation. It is the
centre of connection between all other software applications and hardware devices.
2. A webcam: this will be connected to the computer via a USB port, it will be used to track
the fiducial marker on the VibroBox.
3. A VibroBox: this tangible object is the haptic device used to communicate with the
application testbed. Furthermore, it will be used to sense the different vibrotactile feedback.
The VibroBox will be connected to the computer using a USB cable.
4. A flat screen monitor: this will serve as the virtual environment for the application testbed.
The monitor will be laid on the table.
5. A tripod stand: this will be used to hold the camera (webcam) over the monitor. The tripod
will be used to provide stability of the camera during the test.
6. The application testbed: this is the prototype as defined in the earlier part of this report.


Figure 7 Table top setup

5.3 Assumption pros and cons


The evaluation assumes that all users had normal vision with no impairments; furthermore, it assumes that the users were not colour-blind and, finally, that they had no previous experience or knowledge of haptic technologies.
5.3.1 Pros
This evaluation will help measure the performance of our prototype. It will help locate errors and
improve our prototype. Finally, it will help improve the effectiveness of the prototype.
5.3.2 Cons
The assumption that all users are normal and have no defects whatsoever might affect the results
of the evaluation.
5.4 Evaluation procedure
The evaluation was carried out in five (5) stages, in order: the instruction stage, the calibration stage, the demo stage, and finally the path-finding and shape-detection stages.
Instructions: at this point, instructions were given to the university students who were asked to test the application. A letter of consent containing the necessary instructions on how the testbed works was handed to the candidates, to explain the purpose of the test.
Calibration: at this second stage, the users were asked to calibrate the absolute threshold using the calibration tool provided in the testbed, so that the value derived from the calibration test could be used to set the absolute threshold for each user. This ensures that the vibration feedback is felt by all users during the experiment.



Demo: after carrying out the activities in stages one and two above, the users were introduced to the demo stage. As mentioned during our prototype demonstration, there was a need to show the users how the system works, so they were shown a demo video of how the testbed is used. After that, the users were introduced to the demo section of the testbed so they could get acquainted with the system and the different forms of feedback. The team then observed the users as they reacted to the different forms of feedback. The three (3) students were observed as they navigated through the testbed, to ensure they were following the instructions given at the start of the evaluation process; the experiment itself, however, is focused on the data obtained from the testbed after completion.
Pathfinding: with the setup complete and the users ready, the team introduced the users to the path-finding scenarios individually. The users were tested separately to ensure that no tester was swayed by the results of the testers before them. The first test scenario carried out by the users was the visual testbed; subsequently, the users were presented with the haptic testbed scenario and, finally, they were asked to complete the task using the visual plus haptic scenario. After all the users were tested, the measurement variables were saved automatically to a CSV file. This information will be used to plot the graphs needed to analyse the process.
Shape detection: the final procedure. At this stage the testers were introduced to the shape-detection testbed scenario, which was carried out in a similar fashion to the path-finding scenario above. To begin with, the testers used visual cues to figure out the shapes; afterwards, they had to rely on haptic cues from the VibroBox to complete the shape in the haptic testbed scenario; finally, they were given the opportunity to use visual plus haptic cues to complete a shape. With all users tested, all the necessary information, such as the measurement variables, was saved in the database.

Figure 8 users evaluating the prototype



In conclusion, the evaluation process was successful, and all initial plans to ensure that users were
tested under the same conditions were carefully followed. They were all presented with the same test
scenarios in a controlled setting. For example, the lighting in the lab was controlled so that the
reacTIVision camera could easily track the fiducial markers, and noise was kept to a minimum so the
users would not be distracted. The evaluation took place between 11:30am and 1:30pm.
After completing the process, the testers were thanked and given the contact details of the research
team in case they needed to know more about the research. Furthermore, a post-evaluation interview
was carried out with the users, and the following feedback was obtained, in addition to what the
team observed:
- The haptic feedback provided by the VibroBox was described as novel by all users, since
they had no previous experience with the device.
- The initial briefing and instructions were effective in introducing the test procedures
to the users.
- Finally, the continuous change of the start point on the different testbeds made the task
a bit more complex for the testers. However, the team explained to the users that the
changes were made to ensure a reliable result at the end of the experiment.

[Figures: screenshots of the six testbed scenarios: visual pathfinding, haptics pathfinding,
visual + haptics pathfinding, visual shape detection, haptics shape detection, and
visual + haptics shape detection.]



5.5 Data gathering
After all testing was completed, the data for the different measurement variables (approach
time, slip-off time and off-target count) stored in the Acemap database were exported to a CSV
file (Excel). These data were then analysed to determine which system (haptics or visual) is more
efficient in both the pathfinding and shape detection scenarios.
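The aggregation step can be sketched as follows; a minimal example, assuming the exported CSV has one row per trial with hypothetical column names `scenario` and `approach_time`:

```python
import csv
from collections import defaultdict

def mean_approach_times(path):
    """Group trials by scenario and return the mean approach time for each."""
    samples = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            samples[row["scenario"]].append(float(row["approach_time"]))
    return {scenario: sum(v) / len(v) for scenario, v in samples.items()}

# Tiny sample export built from the three users' pathfinding approach times.
with open("export.csv", "w", newline="") as f:
    f.write("scenario,approach_time\n"
            "pathfinding_visual,59.33\n"
            "pathfinding_visual,52.1\n"
            "pathfinding_visual,44.3\n"
            "pathfinding_haptics,48.12\n"
            "pathfinding_haptics,43.22\n"
            "pathfinding_haptics,55.01\n")

means = mean_approach_times("export.csv")
```

Run over the full export, this yields one mean per scenario, which is what the overall-result chart in section 5.7.4 plots.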

5.6 Results
As seen in the graphs below, it is easy to spot the areas where the testers went off target (circled
in red) on both the pathfinding and shape detection testbeds, and to see precisely how the testers
completed the task. The graphs do not, however, show the other measurement variables (approach time
and slip-off time).

[Figures: a user's completed haptics path and completed haptics shape, with off-target areas
circled in red.]

Below is a tabular representation of the different results gathered from the evaluation process
with respect to the different test scenarios.

Table 9.3 approach time



Table 9.4 slip off time

Table 9.5 off target count


Table 9.6 Overall Result

5.7 Results analysis
In order to properly evaluate and analyse the results derived from the experiment, each user's
results will be individually examined to determine which form of feedback is the most effective.
Thereafter, the psychophysical results recorded will be cross-analysed to determine which form of
feedback is the most efficient for pathfinding and shape detection in the mining industry.
5.7.1 User 1

[Chart: user 1's results across the six scenarios. Approach time (s): pathfinding visual 59.33,
pathfinding haptics 48.12, pathfinding visual + haptics 38.4, shape detection visual 55.23,
shape detection haptics 52.1, shape detection haptics + visual 41.27. Slip-off times: 21, 31, 16.
Off-target counts: 22, 28, 12.]

Table 10.1 chart of user 1 evaluation result




In the chart above it can be observed that the user spent a lot of time completing the tasks: on
pathfinding, they went off target 5 times with visual feedback, compared to 8 times with haptics and
4 times with the combination. In shape detection, the user made a lot of errors but completed the
task more quickly. From the think-aloud, the user said the reason for the improvement in time was
that they had already completed the pathfinding task.
5.7.2 User 2

[Chart: user 2's results across the six scenarios. Approach time (s): pathfinding visual 52.1,
pathfinding haptics 43.22, pathfinding visual + haptics 40.2, shape detection visual 49.1,
shape detection haptics 35.02, shape detection haptics + visual 31.01. Slip-off times: 12, 20, 10.
Off-target counts: 10, 11.]

Table 10.2 chart of user 2 evaluation result


From the chart above it can be seen that the user went off course twice and slipped off three times
on the visual pathfinding. The user also went off target four times with haptics, though they
completed the task faster. However, the user performed better and finished the task fastest when
visual and haptics were combined. Similar observations can be made in the shape detection scenario.



5.7.3 User 3

[Chart: user 3's results across the six scenarios. Approach time (s): pathfinding visual 44.3,
pathfinding haptics 55.01, pathfinding visual + haptics 35.11, shape detection visual 35.44,
shape detection haptics 41.33, shape detection haptics + visual 24.51. Slip-off times: 12, 10.
Off-target count: 10.]

Table 10.3 chart of user 3 evaluation result


User 3's performance is shown in the chart above; it can be seen that the user made fewer errors
when haptics and visual feedback were combined, in both pathfinding and shape detection.



5.7.4 Overall Result

[Chart: overall results averaged across the three users. Completion time (s): pathfinding visual
51.91, pathfinding haptics 48.78, pathfinding visual + haptics 37.90, shape perception visual
46.59, shape perception haptics 42.82, shape perception visual + haptics 32.26. Slip-off time and
off-target count values recovered from the chart, scenario assignment unclear: 20.33, 12.67, 12,
4.33, 3.67, 15, 9.33, 8.33, 8, 3.]

Table 10.4 chart of overall result


The chart above shows a compilation of the results of the three users: the most time was spent in
the visual scenarios and better times were achieved in the haptics scenarios, but the users made
fewer errors in the visual scenarios than in the haptics scenarios. The best performance is seen
when haptics is combined with visual feedback.
Given the statistics above, that is, having seen an improvement in the time taken to complete the
task in both pathfinding and shape detection with only a slightly higher error rate, it can be said
that the use of haptic feedback will be more efficient than visual feedback in pathfinding and
object discovery in the mining of natural resources.
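As a quick sanity check on these figures, the overall values are simply per-scenario means of the three users' results; for example, using the approach times taken from the individual charts in 5.7.1-5.7.3:

```python
from statistics import mean

# Approach times (s) for users 1-3, taken from the three user charts above.
pathfinding_visual = [59.33, 52.1, 44.3]
pathfinding_visual_haptics = [38.4, 40.2, 35.11]

print(round(mean(pathfinding_visual), 2))          # 51.91, as in the overall chart
print(round(mean(pathfinding_visual_haptics), 2))  # 37.9, as in the overall chart
```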


6 CONCLUSION
The use of visual and audio feedback has been the main channel of interaction between humans and
computers. In recent times, novel technologies have made communication via tactile stimuli
possible. This involves the use of haptics to communicate with humans via a device called the
VibroBox. This novel technology, as seen in the evaluation carried out in this report, can help the
visually impaired in navigating a path or detecting a shape. This research set out to determine
whether haptics can improve efficiency in pathfinding and shape detection in the mining industry.
Bearing this in mind, an application with six testbeds was developed and, based on the
psychophysical experiment carried out, the following was concluded.
The different testers used for the evaluation could feel and use the different types of feedback at
different points. The visual cue was effective in avoiding errors but took longer to complete the
task. On the other hand, the haptic cue was effective in time management but recorded a slightly
higher error rate. However, a combination of haptic and visual feedback proved more effective in
completing the task in good time and with marginal errors.
In conclusion, the combination of haptic and visual cues will be more effective and efficient for
pathfinding and shape detection in the mining industry.
As part of our future work, the team will research and likely improve on the current prototype by
adding features that will aid evacuation from a disaster site, such as the incident in Chile where
miners were trapped underground for months after a mine collapse.

6.1 Group contribution


My main contribution to the team was researching and defining the research question and the target
industry. Furthermore, I designed the pathfinding testbed.


