Govind Kharat
Principal
Sharadchandra Pawar College of Engineering
Otur, India
gukharat@gmail.com
In the first approach, sensor-based recognition systems depend on instrumented gloves to acquire gesture information. In general, the equipped sensors measure information related to the shape, orientation, movement, and location of the hand. Because of the glove, segmentation is easily achieved, whereas bare-hand segmentation in a vision-based system is difficult. However, the glove-based system is somewhat cumbersome, as the signer has to wear many sensors on the wrist and arms while performing a sign.
I. INTRODUCTION
Deaf-mute people make up more than 5 % of the population. Sign language is mainly employed by deaf-mutes to communicate with each other. It is a kind of language communicated through gestures and vision. As sign languages are non-verbal languages, information is conveyed visually, using a combination of manual and non-manual means of expression. Manual parameters are hand shape, hand posture, hand location, and hand motion. The non-manual parameters include head and body posture, facial expression, gaze, and lip patterns. Non-manual parameters are essential in sign language, since they carry grammatical and prosodic information. Some signs can be distinguished by manual parameters alone, while others remain ambiguous unless additional non-manual information, in particular facial expression, is made available. Sign language recognition systems are commonly classified by the medium of input: according to the media of sign language input, there are two kinds of system, those based on an instrumented glove and those based on vision [1].
II. RELATED WORK
III. SYSTEM OVERVIEW
A. Profile Setup
Initially, a profile name such as ASL is set using the SAVE PROFILE AS button; the gestures performed are then stored in the ASL profile, indicating that the signs belong to American Sign Language. A gesture is added using the ADD GESTURE button, as shown in Fig. 2. After that, the gesture assignment window opens, as shown in Fig. 3. First, click the GRAB button; simultaneously, the user has to perform the sign in front of the Leap Motion. The Leap Motion has an option to visualize the hand motion as a skeletal hand on screen, and we have utilized this option for getting the input coordinate values. The program outputs the 3D coordinate (x, y, z) values of each sign, for both the left and right hand, in the following order: centre of the palm, thumb, index finger, middle finger, ring finger, and pinky. The sign is captured and stored by assigning a gesture name and the output (such as speech) to be produced when the sign is recognized.
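The per-sign record described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the JSON layout, and the point naming are assumptions; only the tracked points (palm centre plus five fingertips, per hand) and the gesture-name/speech fields come from the text.

```python
import json

# The six tracked points per hand described in the paper.
POINTS = ["palm", "thumb", "index", "middle", "ring", "pinky"]

def make_gesture_record(name, speech, left_hand, right_hand):
    """Build one stored-gesture record.

    left_hand / right_hand: dicts mapping point name -> (x, y, z)
    coordinates as reported by the Leap Motion.
    """
    for hand in (left_hand, right_hand):
        assert set(hand) == set(POINTS), "all six points must be present"
    return {"name": name, "speech": speech,
            "left": left_hand, "right": right_hand}

def save_profile(path, profile_name, gestures):
    """Save a profile (e.g. 'ASL') with its gesture records as JSON."""
    with open(path, "w") as f:
        json.dump({"profile": profile_name, "gestures": gestures}, f, indent=2)
```

A record for sign "A" would then be `make_gesture_record("A", "A", left, right)`, saved with the others via `save_profile("ASL.json", "ASL", records)`.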
B. Recognition Setup
Here, first click the LOAD PROFILE button to load the expected profile, such as ASL, BSL, or ISL, so that signs performed in that language will be accepted and recognized. To recognize a sign, click GESTURE RECOGNITION; the signer can then perform any sign that is stored in the database (profile).
Each measure is the Euclidean distance between the stored coordinates (X_i, Y_i, Z_i) and the current coordinates (x_i, y_i, z_i) of a tracked point:

    d_i = sqrt((X_i - x_i)^2 + (Y_i - y_i)^2 + (Z_i - z_i)^2)    (6)

However, to make the system more accurate, some more distance measures of the same form are added (Eqs. (7)-(9)). When the signer performs the sign in front of the Leap Motion camera, the current hand coordinates are used to calculate d1, d2, d3, d4, d5, d6, d7, and d8 as mentioned above.
Whichever sign has the smallest measure value is output in the window, as shown in Fig. 5, and passed to a text-to-speech converter to produce audio output.
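The per-point distances above can be sketched as follows. This is a minimal illustration under stated assumptions: each of the measures is taken to be the 3D Euclidean distance between the stored and current coordinates of one tracked point; the exact composition of the authors' eight measures is not recoverable here, and the function names are hypothetical.

```python
import math

def euclidean(p, q):
    """3D Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def distance_measures(stored_points, current_points):
    """Return [d1, ..., dn]: one Euclidean distance per tracked point,
    comparing the stored sign against the coordinates currently
    reported by the sensor."""
    return [euclidean(s, c) for s, c in zip(stored_points, current_points)]
```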
IV. DISCUSSION
For the cosine similarity, the dot product of the stored and current distance arrays is divided by the product of their magnitudes as

    cos(theta) = (Array1 . ArrayCurrent) / (|Array1| |ArrayCurrent|)    (11)

where

    Array1 = [D1, D2, D3, D4, D5, D6, D7, D8]    (12)
    ArrayCurrent = [d1, d2, d3, d4, d5, d6, d7, d8]    (13)
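The cosine similarity between the stored array [D1..D8] and the current array [d1..d8] can be sketched as below; a minimal illustration of the standard formula, with a hypothetical function name.

```python
import math

def cosine_similarity(a, b):
    """cos(theta) = (a . b) / (|a| |b|) between two distance arrays,
    e.g. the stored Array1 and the current ArrayCurrent."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Identical arrays give a similarity of 1, so the stored sign whose array is most similar to the current one scores closest to 1.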
For our setup, the thresholds are kept as follows. For the Euclidean distance, the threshold is 1227: if the result of the distance measure is above 1227, the given sign is considered not recognized. The distance measure is calculated for all stored signs, and the sign giving the minimum value is considered the matched sign.
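The threshold-and-minimum matching rule above can be sketched as follows. The function name is hypothetical, and the per-sign distance measures are assumed to have been computed beforehand (e.g. with the Euclidean measures described earlier).

```python
def recognize(measures, threshold=1227.0):
    """Pick the stored sign with the minimum distance measure.

    measures: dict mapping sign name -> its distance measure against
    the current input.  Returns None ("not recognized") when even the
    best match exceeds the threshold (1227 for the paper's Euclidean
    setup).
    """
    best = min(measures, key=measures.get)
    if measures[best] > threshold:
        return None
    return best
```

For example, `recognize({"A": 30.0, "B": 900.0})` matches "A", while an input whose best measure exceeds 1227 is rejected.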
TABLE I. CORRECT RATE OF SIGN POSTURE RECOGNITION USING EUCLIDEAN DISTANCE

Recorded Sign | Total number of signs presented | Failure number of signs | Successfully recognized signs | Correct rate
C | 10 | 0 | 10 | 100 %
D | 10 | 3 | 7 | 70 %
E | 10 | 0 | 10 | 100 %
F | 10 | 0 | 10 | 100 %
G | 10 | 0 | 10 | 100 %
H | 10 | 7 | 3 | 30 %
I | 10 | 0 | 10 | 100 %
J | 10 | 0 | 10 | 100 %
K | 10 | 5 | 5 | 50 %
L | 10 | 0 | 10 | 100 %
M | 10 | 0 | 10 | 100 %
N | 10 | 0 | 10 | 100 %
O | 10 | 0 | 10 | 100 %
P | 10 | 2 | 8 | 80 %
Q | 10 | 3 | 7 | 70 %
R | 10 | 5 | 5 | 50 %
S | 10 | 5 | 5 | 50 %
T | 10 | 0 | 10 | 100 %
U | 10 | 0 | 10 | 100 %
V | 10 | 0 | 10 | 100 %
W | 10 | 0 | 10 | 100 %
X | 10 | 0 | 10 | 100 %
Y | 10 | 0 | 10 | 100 %
Z | 10 | 6 | 4 | 40 %

[The remaining rows of this table are garbled in this copy; each shows 10 presentations, with correct rates of 50 %, 60 %, and 100 % appearing among them.]
V. CONCLUSION
The Euclidean distance and cosine similarity based Indian Sign Language recognition system is presented in this paper. The system is able to recognize all alphabets A to Z and numbers 1 to 9 of Indian Sign Language, with average correct rates of 88.39 % and 90.32 %. The system has one constraint: the Leap Motion sensor has to be kept slightly inclined while the signs are performed.

TABLE II. CORRECT RATE OF SIGN POSTURE RECOGNITION USING COSINE SIMILARITY
[The per-sign rows of Table II (columns: Recorded Sign, Total number of signs presented, Failure number of signs, Successfully recognized signs, Correct rate) are garbled in this copy; each sign was presented 10 times, with correct rates between 50 % and 100 % appearing.]

REFERENCES
[1]
[2]
[3]
[4]
[5]
[6]
[7]
[8]
[9]
[10] Tan Tian Swee, Sh-Hussain Salleh, A. K. Ariff, Chee-Ming Ting, Siew Kean Seng, and Leong Seng Huat, "Malay Sign Language Gesture Recognition System", International Conference on Intelligent and Advanced Systems (ICIAS 2007), pp. 982-985, 2007.
[11] Asha Thalange and Shantanu Dixit, "Effect of Thinning Extent on ASL Number Recognition Using Open-finger Distance Feature Measurement Technique", International Conference on Signal Processing And Communication Engineering Systems (SPACES), pp. 39-43, 2015.
[12] P. V. V. Kishore, M. V. D. Prasad, Ch. Raghava Prasad, and R. Rahul, "4-Camera Model for Sign Language Recognition Using Elliptical Fourier Descriptors and ANN", International Conference on Signal Processing And Communication Engineering Systems (SPACES), pp. 34-38, 2015.
[13] Lucas Rioux-Maldague and Philippe Giguère, "Sign Language Fingerspelling Classification from Depth and Color Images using a Deep Belief Network", Canadian Conference on Computer and Robot Vision, pp. 92-97, 2014.
[14] A. S. Elons, Menna Ahmed, Hwaidaa Shedid, and M. F. Tolba, "Arabic Sign Language Recognition Using Leap Motion Sensor", 9th International Conference on Computer Engineering & Systems (ICCES), pp. 368-673, 2014.
[15] Wikipedia, "Leap Motion", Wikipedia.org. [Online]. Available: https://en.wikipedia.org/wiki/Leap_Motion [Last modified: 26 July 2015, 07:16].
[16] Neha V. Tavari and A. V. Deorankar, "Indian Sign Language Recognition based on Histogram of Oriented Gradient", International Journal of Computer Science and Information Technologies, Vol. 5 (3), pp. 3657-3660, 2014.