
Human Following Robot based on Control of Particle Distribution with Integrated Range Sensors


Fumiaki Hoshino, Kazuyuki Morioka
Dept. of Electrical Engineering, School of Science and Technology,
Meiji University, Kanagawa, 214-8571, Japan
Email: ce11079@meiji.ac.jp, morioka@isc.meiji.ac.jp

Abstract—In this study, a human-following robot based on a laser range scanner and a Kinect sensor is developed for tracking and identifying a target human. Most tracking systems based on laser range scanners suffer from occlusions and from the harmful effects of noise. Human tracking systems generally use a particle filter to cope with these problems. However, if several people are present in the measurement range of the laser scanner, the particles are sometimes distributed over only one particular person. Using a particle filter with control of the particle distribution, together with sensor integration between the laser range finder (LRF) and the Kinect, this study aims to identify the target human and to achieve human-following behavior by the robot. Tracking experiments with several people and human-following experiments were performed, and some of the results are provided in this paper.

I. INTRODUCTION

Robot technologies for security, nursing care, housework, rescue and so on have attracted great expectations recently. In particular, human-friendly robots are expected to become widely used and to cooperate with humans in the near future. In order to support humans, a robot has to approach a person, follow the person's movement and maintain that condition. Such robots are called "human-following robots", and several of them have been developed so far [1][2][3][4].

Generally, human beings learn about the environment around them by following the behavior of others as they grow up. Human-following is therefore also one of the essential behaviors through which robots can acquire intelligence. In particular, mobile robots have to know the structures, maps and objects of human living environments in order to cooperate with humans, and human-following behavior can be an efficient way for mobile robots to learn them in unknown and dynamic environments. This study aims to develop human-following robots that support humans and learn the environment without losing the target human. One of the most important issues is detecting and tracking the position of the target human. External sensors are usually mounted on mobile robots for human detection, and laser range scanners have recently become popular for this purpose because they acquire accurate range data on a two-dimensional plane at a high sampling rate. Okusako developed a mobile robot with a laser range scanner for following a human [1]; the legs of the target human were tracked using the range data. Oinaga developed a human-following robot that interacts with an intelligent space [2]; laser range scanners were attached to the mobile robot and also distributed in the intelligent space, connected via a network, and they tracked the target human cooperatively. Chung also developed a mobile robot with a laser range scanner for following a human [3]; in this robot, the range-data distributions of human legs were analyzed carefully, and the target human was detected and tracked according to the analysis. Morioka developed an intelligent-space-based human-following robot [4]; an intelligent space with distributed network cameras detected and tracked the target human and a mobile robot, so that the mobile robot was able to follow the target human in the intelligent space without carrying any sensors. Since a laser range scanner acquires only range data, it is difficult to keep tracking and identifying the target human while separating him or her from other people. Tracking with cameras, on the other hand, is less robust than tracking with laser range scanners because of lighting conditions.

In this study, a human-following robot based on a laser range scanner and a Kinect sensor is developed for tracking and identifying the target human. As described above, a laser range scanner acquires range data on a 2D plane, whereas a Kinect sensor acquires a range image, so human detection can be performed with an approach different from that of laser range scanners. However, a Kinect sensor cannot monitor a wide area, which is not sufficient for human-following. This study therefore aims to achieve human-following behavior based on the integration of a laser range scanner and a Kinect sensor. In this paper, human tracking with a standalone laser range scanner is introduced first. Next, a tracking algorithm based on reliability control of the laser range scanner and the Kinect sensor is proposed. Finally, the algorithm is implemented on the robot and some experimental results are shown.

II. A HUMAN-FOLLOWING ROBOT WITH A LASER RANGE SCANNER AND A KINECT SENSOR

A. Outline

In this study, a mobile robot with two external sensors is developed as a human-following robot. The mobile robot is a Pioneer3-DX made by MobileRobots Inc. One sensor is a laser range scanner, a UTM-30-LX made by Hokuyo Automatic Co., and the other is a Kinect sensor made by Microsoft Corp. Both sensors are mounted on the mobile robot.



A Kinect sensor can keep tracking the target human without losing it more reliably than a laser range scanner, but it cannot monitor a wide area, so the robot cannot follow a human with a Kinect sensor alone. Conversely, the tracking performance of a laser range scanner is inferior to that of a Kinect sensor because it acquires only range data, but it can monitor a wider area. Therefore, a human-tracking system that combines these advantages should be built. For target tracking in a human-following robot, the target human must be distinguished from other people, and the two types of sensor are given different roles to achieve this. The laser range scanner observes the surroundings of the mobile robot widely and is used to detect new humans around the robot. The Kinect sensor, on the other hand, observes the area around the target human intensively. When other humans detected by the laser range scanner approach the target, the detection and tracking results of the laser range scanner are given to the Kinect sensor, so that the target can be distinguished from the other people. In this chapter, the extraction of the target human with each sensor and the outline of the sensor integration are explained.

B. Human Extraction with a Laser Range Scanner

In this study, the laser range scanner is attached to the mobile robot so that a plane parallel to the floor, at a height of about 30 cm, is scanned. A set of range data acquired by the scanner is converted to x-y coordinates on the scanned plane; Fig. 1(a) shows the points converted from the range data. First, these points are clustered with a nearest-neighbor clustering method; the result is shown in Fig. 1(b). Next, points that do not belong to humans are removed, since only the legs of the target human should be extracted for tracking. Clusters of walls and other structures are removed according to the number of points in each cluster, as shown in Fig. 1(c). As a result, clusters of human legs are extracted. However, the edges of walls and other structures are also sometimes extracted as human legs, and such noise clusters must be removed. For this purpose, Hough-transform-based line detection is exploited, and clusters near the detected lines are removed as noise. After this process, only the leg clusters remain, as shown in Fig. 1(d). In this study, the leg clusters extracted by this process are tracked as the target human with a tracking algorithm based on a particle filter; the tracking method is described in Chapter III.

Fig. 1. The 2D plane image acquired with a laser range scanner: (a) raw x-y coordinates converted from range data; (b) clustered result; (c) noise clusters removed based on the number of points per cluster; (d) noise clusters near the walls removed.
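As an aside for implementers, the pipeline above can be sketched compactly. The Python fragment below is only an illustration: the clustering gap, the point-count limits and the example scan are assumed values not given in the paper, and the Hough-based wall removal of Fig. 1(d) is left as a stub.

```python
# Minimal sketch of the leg-cluster extraction of Section II.B.
import numpy as np

def cluster_points(points, gap=0.1):
    """Group consecutive scan points whose neighbours are closer than `gap` (m)."""
    clusters, current = [], [points[0]]
    for p in points[1:]:
        if np.linalg.norm(p - current[-1]) < gap:
            current.append(p)
        else:
            clusters.append(np.array(current))
            current = [p]
    clusters.append(np.array(current))
    return clusters

def extract_leg_clusters(points, min_pts=3, max_pts=15):
    """Keep only clusters whose point count is plausible for a leg (Fig. 1(c))."""
    clusters = cluster_points(points)
    legs = [c for c in clusters if min_pts <= len(c) <= max_pts]
    # Fig. 1(d): clusters lying close to Hough-detected lines (walls) would be
    # removed here; this step is omitted in the sketch.
    return legs

if __name__ == "__main__":
    scan = np.array([[1.0, 0.00], [1.0, 0.02], [1.0, 0.04],   # leg-like cluster
                     [3.0, -1.00], [3.0, -0.98]])             # small noise cluster
    print(len(extract_leg_clusters(scan)))  # -> 1
```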
C. Human Tracking with a Kinect Sensor

A Kinect sensor is a game device made by Microsoft that lets players enjoy games intuitively without controllers. Since it includes an RGB camera, an infrared camera for measuring a depth image, and a multi-array microphone, robotic systems using Kinect sensors have been developed recently. The Kinect sensor can detect humans with high precision in the camera image by combining the RGB camera and the infrared camera, and it can measure the distance between the sensor and the players. In this study, the depth data acquired by the Kinect sensor are projected onto an image, the centroid coordinate of the target human is found in the 2D image, and this coordinate is converted to the world coordinate system; the mobile robot detects and tracks this world coordinate. The Kinect sensor is installed on the mobile robot and detects the target human directly. However, because it cannot monitor a wide area, the Kinect sensor loses the human during the human-following behavior of the mobile robot. This study therefore aims to achieve more stable human-following behavior by integrating a laser range scanner, which has a wide measurement area, with a Kinect sensor, which provides precise human detection.
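The centroid-to-world conversion can be illustrated with a pinhole back-projection. The focal lengths, principal point and mounting height below are assumed nominal Kinect values; the calibration actually used on the robot is not given in the paper.

```python
# Hedged sketch of Section II.C: converting the image centroid of the detected
# person plus its depth value into a metric coordinate in the robot frame.
import numpy as np

FX = FY = 585.0          # assumed focal lengths in pixels (nominal Kinect v1)
CX, CY = 320.0, 240.0    # assumed principal point for a 640x480 depth image

def centroid_to_robot_frame(u, v, depth_m, sensor_height=0.6):
    """Back-project a pixel (u, v) with depth (m) and express it in a robot
    frame whose x axis points forward and whose z axis points up."""
    x_cam = (u - CX) * depth_m / FX   # right in the camera frame
    y_cam = (v - CY) * depth_m / FY   # down in the camera frame
    z_cam = depth_m                   # forward in the camera frame
    # Assumed mounting: camera looks straight ahead at `sensor_height` metres.
    return np.array([z_cam, -x_cam, sensor_height - y_cam])

if __name__ == "__main__":
    print(centroid_to_robot_frame(400, 260, 2.0))  # person ~2 m ahead, slightly right
```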
D. Integrating a Laser Range Scanner and a Kinect Sensor

When the laser range scanner and the Kinect sensor are integrated, the advantages of the two sensors must be fused efficiently. As long as the Kinect sensor keeps detecting the target human, the coordinate acquired by the Kinect sensor should be trusted. When the Kinect sensor does not detect the target human, the coordinate acquired by the laser range scanner should be trusted instead.



In the case that the coordinate acquired by the Kinect sensor is far from the coordinate acquired by the laser range scanner, there is a high possibility that the two sensors are detecting different humans, so the sensor-fusion algorithm evaluates whether each sensor is detecting the correct target. As shown in Fig. 2, more precise human detection based on the laser range scanner improves the human-tracking performance of the integrated method. In the next chapter, the tracking method based on a particle filter, an algorithm for estimating the next state, is described.

Fig. 2. Algorithm for selecting sensors
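A minimal sketch of this selection rule is shown below. The 0.5 m consistency threshold and the fallback to the laser-based track when the two sensors disagree are assumptions for illustration; the paper states only that a large disagreement indicates that different humans are being detected.

```python
# Sketch of the sensor-selection rule of Section II.D (Fig. 2).
import numpy as np

def select_target_coordinate(kinect_xy, lrf_xy, max_separation=0.5):
    """Return the coordinate to follow. `kinect_xy` / `lrf_xy` are 2D positions
    in the robot frame, or None when the corresponding sensor has no detection."""
    if kinect_xy is not None and lrf_xy is not None:
        if np.linalg.norm(np.asarray(kinect_xy) - np.asarray(lrf_xy)) > max_separation:
            # The sensors probably see different people; fall back to the
            # laser-based track, which keeps identities with the particle filter
            # (an assumed policy, not stated explicitly in the paper).
            return lrf_xy
        return kinect_xy          # the Kinect detection is trusted while it lasts
    if kinect_xy is not None:
        return kinect_xy
    return lrf_xy                 # may also be None: target lost

if __name__ == "__main__":
    print(select_target_coordinate((1.9, 0.1), (2.0, 0.0)))  # -> Kinect estimate
```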
III. HUMAN TRACKING WITH A LASER RANGE SCANNER

A. Outline

It is difficult to keep following the target human only by extracting leg clusters, because other objects sometimes fail to be removed and several humans may exist in the area monitored by the laser range scanner. In order to keep following the target human in such cases, a particle filter, a stochastic state-estimation algorithm based on a distribution of particles, is applied in this study.

B. Human Tracking by Particle Filter

The particle filter is a state-estimation algorithm that uses many particles and is based on the previous estimate and the current observation [5]. "Prediction", "Observation", "Estimation" and "Selection" are iterated while tracking the target objects. Particles in the next step are generated according to the likelihoods of the previous particles, so particles with higher likelihood have a larger effect on the particles of the next step. The likelihood function must be designed so that high likelihood is given to particles around the target object based on the observation; then many particles gather around the target object and tracking is achieved.

i. Algorithm of the particle filter

1) Prediction: The states of the particles at time t are predicted with a state equation from the states at the previous time t-1. In this study, Eq. (1) is used as the state equation; uniform linear motion is assumed as the target motion.

x_t = x_{t-1} + u_{t-1}
y_t = y_{t-1} + v_{t-1}
u_t = u_{t-1}
v_t = v_{t-1}    ...(1)

Here x and y are the 2-dimensional coordinates of the target human, u and v denote the x and y components of the human velocity respectively, and t represents the time step.

2) Observation: The likelihood of each particle is calculated from the observation at the present time t in the state predicted from the previous time t-1. The weight of a particle is decided from its likelihood. The details of the likelihood calculation, that is, the design of the likelihood function, are described in the next subsection.

3) Estimation: The state of the human at time t is estimated as the weighted average of the particles.

4) Selection: The particles for the next time step are regenerated in proportion to the particle weights. Particles with high weights are selected and regenerated many times, while particles with low weights disappear without being selected.
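Steps 1) to 4) can be combined into a compact loop, as in the Python sketch below. The Gaussian diffusion added in the prediction step is an assumption (the paper gives only the deterministic part of Eq. (1)); the figure of 100 particles per person follows the next subsection.

```python
# Schematic particle-filter loop following steps 1)-4) with the
# constant-velocity model of Eq. (1). State per particle: [x, y, u, v].
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, noise=0.05):
    # Eq. (1): x += u, y += v, velocities unchanged, plus a small assumed diffusion.
    particles[:, 0] += particles[:, 2]
    particles[:, 1] += particles[:, 3]
    return particles + rng.normal(0.0, noise, particles.shape)

def estimate(particles, weights):
    # Step 3): weighted average of the particle states.
    return np.average(particles, axis=0, weights=weights)

def resample(particles, weights):
    # Step 4): regenerate particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# 100 particles per person, initialised around a person at (2 m, 0 m).
particles = np.hstack([rng.normal([2.0, 0.0], 0.1, (100, 2)), np.zeros((100, 2))])
weights = np.full(100, 1.0 / 100)      # step 2), the likelihood, supplies these
particles = resample(predict(particles), weights)
print(estimate(particles, weights))
```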
ii. Design of the likelihood function for human tracking

A likelihood function must be designed for tracking human legs with the laser range scanner; it needs to give high weights to particles near the target human. This study therefore focuses on two features for human recognition. One feature is that a human has two legs and that they are close to each other. The other is that one cluster sometimes represents one human, in the case that one leg is hidden behind the other leg as seen from the laser range scanner. According to measurements of human walking, most steps are within 250 mm at about 30 cm above the floor. The likelihood function is designed as shown in Eq. (2); it depends on the number of leg clusters within a 250 mm radius of a particle.

l_i = A / (2*pi*sigma) * exp(-d_i^2 / (2*sigma^2)),
      with A = 400, sigma = 200 in the case of 2 legs,
           A = 200, sigma = 100 in the case of 1 leg;
l_i = 0 otherwise.    ...(2)

In Eq. (2), l_i represents the likelihood of particle i and d_i represents the distance between particle i and a leg cluster. "In the case of 2 legs" and "in the case of 1 leg" refer to the number of clusters within a 250 mm radius of the particle. If there is one cluster within that radius, it is assumed to represent one leg of a human, and the one-leg case of Eq. (2) is used to calculate the likelihood. Similarly, if there are two clusters within the radius, they are taken to represent two legs; in this case d_i is the distance between the particle and the closer of the two clusters. If there are three or more clusters, or no clusters, within the 250 mm radius, they are not considered to be a human and the likelihood is set to zero. Next, the weights of the particles are calculated by Eq. (3); the weights are the normalized likelihoods. Here w_i represents the weight of particle i and N represents the number of particles. In this study, 100 particles are used for tracking one person.

w_i = l_i / (sum_{j=1}^{N} l_j)    ...(3)
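The snippet below illustrates Eqs. (2) and (3). The Gaussian prefactor A/(2*pi*sigma) should be read as a plausible reconstruction rather than a confirmed detail, while the case selection by the number of clusters within 250 mm follows the text directly.

```python
# Likelihood of Eq. (2) and weight normalisation of Eq. (3).
import numpy as np

def likelihood(d_mm, n_clusters_within_250mm):
    """d_mm: distance (mm) from the particle to the nearest leg cluster."""
    if n_clusters_within_250mm == 2:      # two legs visible
        A, sigma = 400.0, 200.0
    elif n_clusters_within_250mm == 1:    # one cluster (one leg occluding the other)
        A, sigma = 200.0, 100.0
    else:                                 # zero, or three and more: not a human
        return 0.0
    return A / (2.0 * np.pi * sigma) * np.exp(-d_mm**2 / (2.0 * sigma**2))

def weights(likelihoods):
    """Eq. (3): normalise so that the weights of the N particles sum to one."""
    l = np.asarray(likelihoods, dtype=float)
    return l / l.sum() if l.sum() > 0 else np.full(len(l), 1.0 / len(l))

print(weights([likelihood(50, 2), likelihood(300, 1), likelihood(0, 0)]))
```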
IV. TRACKING TARGET HUMAN

If a human-following robot can track the target human constantly and tell the target apart from other humans when they approach, the human-following behavior can be performed accurately. However, if humans gather and one of them obtains a higher likelihood than the others, the particles become distributed around that one human and tracking of the others fails. In this study, such a biased distribution of particles is a major problem, and methods to solve it are considered below. If there is only one target human, the particles keep gathering densely and track the target accurately with the particle filter described above. However, if there are several humans close to the robot, the particles often gather around one of them and the other humans are missed.



This is because the particle filter does not spread particles equally over all targets; particles tend to gather only around the targets for which higher likelihoods are calculated. In addition, in particle-filter-based human tracking one set of particles is assigned to one human. New humans approaching the target human should therefore be tracked with new sets of particles, so that both humans are tracked and the target is kept separate from the others. This study achieves such tracking for target-following behavior with two ideas. First, new sets of particles are distributed close to the target human in order to detect new humans. Second, humans that are already being tracked by the robot must not be detected again, so particles must not be distributed over them a second time. By excluding the areas where humans are already tracked from the distribution of new particle sets, each human can be tracked by an independent set of particles. The control of the particle distribution is based on likelihood adjustment, which depends on the number of humans tracked by the robot and on changes of the human that each particle tracks.

The rough procedure of the proposed method is as follows:
1) Prediction
2) Observation
3) K-means clustering (IV.B.i)
4) Adjustment of likelihood (IV.B.ii and IV.B.iii)
5) Estimation
6) Selection
More details are described below.

A. Detection Area of New Humans

This section describes the algorithm for detecting new humans by excluding the areas around humans that are already tracked [6]. First, a set of particles for detecting a new human is distributed uniformly in the area around the target human. This set of particles is called the Detect Filter, and it performs only Prediction and Observation. When more than 5 particles have a likelihood above 0 according to Eq. (2), the Detect Filter judges that a new human has entered the detection area around the target, and its particles begin to track that human. After the detection, the set of particles makes a transition from Detect Filter to Track Filter. A Track Filter performs all the steps from 1) to 6), and labels representing the human number are assigned to its particles. At the transition to a Track Filter, a new set of particles is distributed as a new Detect Filter in order to detect further new humans. The detection area for new persons is defined by excluding the areas where Track Filters already exist, and the Detect Filter searches for new humans only in this area, so humans that have already been detected are not detected again. As a result, every time a human newly appears in the area, a separate set of particles tracks that human independently.

Fig. 3. Detection of a new human: (a) a Detect Filter is generated; (b) human tracking starts; (c) a new Detect Filter is generated, and it cannot enter the area around the Track Filter; (d) each Track Filter keeps tracking its own target.
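The Detect Filter / Track Filter management of Section IV.A can be sketched as follows. Only the promotion rule (more than 5 particles with non-zero likelihood) is taken from the text; the size of the detection area, the uniform sampling and the exclusion margin are assumptions.

```python
# Schematic Detect Filter / Track Filter management.
import numpy as np

rng = np.random.default_rng(1)

def spawn_detect_filter(center, half_size=1.5, n=100, track_filters=(), margin=0.5):
    """Spread particles uniformly around `center`, excluding neighbourhoods of
    humans that are already tracked by a Track Filter."""
    pts = rng.uniform(np.asarray(center) - half_size,
                      np.asarray(center) + half_size, (n, 2))
    for tf in track_filters:
        pts = pts[np.linalg.norm(pts - tf["position"], axis=1) > margin]
    return {"particles": pts}

def promote_if_detected(detect_filter, likelihoods, track_filters, next_label):
    """Turn a Detect Filter into a Track Filter once >5 particles see a human."""
    if np.count_nonzero(likelihoods) > 5:
        best = int(np.argmax(likelihoods))
        track_filters.append({"label": next_label,
                              "position": detect_filter["particles"][best]})
    return track_filters

track_filters = [{"label": 0, "position": np.array([2.0, 0.0])}]   # the target
df = spawn_detect_filter((2.0, 0.0), track_filters=track_filters)
fake_l = rng.uniform(0, 1, len(df["particles"]))                    # stand-in likelihoods
print(len(promote_if_detected(df, fake_l, track_filters, next_label=1)))
```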
B. Adjustment of Likelihood

i. K-means Clustering

Even if each human has its own set of particles, the distribution of particles sometimes becomes biased and mixed with the other sets, because a particle may start tracking a different human from the one it tracked in the previous step, depending on the likelihood calculation. As shown in Fig. 4(b), the particles should instead be distributed evenly over the humans.

Fig. 4. Particle distribution: (a) biased distribution; (b) even distribution.

In this study, control of the particle distribution is therefore applied to keep the number of particles on each human approximately equal. To achieve this control, the likelihood of the particles is adjusted according to the number of particles in each Track Filter and to changes of the label that each particle carries. When one human is tracked by more particles than another, the particle distribution is biased; a Track Filter with too many particles should lose some of them, so the likelihood of its particles is decreased. When the label of a particle changes to another label, a bias may be starting to develop, so the likelihood of that particle should also be decreased. Conversely, a particle whose label is the same as in the previous step is considered to keep tracking the same human, and its likelihood should be increased. To evaluate the number of particles and the labels per group, the mixed particles are first divided into as many groups as the number of humans currently tracked. For this, the particles are clustered with the K-means algorithm, which can classify them into an arbitrary number of groups. The procedure is as follows; here, the cluster label is the label finally assigned by the k-means clustering, and each particle carries such a label as described above.



(1) The value k is defined. Each particle is given a cluster label n (n = 1, 2, ..., k) as the initial value of its cluster label.
(2) The centroid coordinates of the clusters with the same cluster label are calculated.
(3) The cluster label of each particle is changed to the label of the cluster with the nearest centroid. This is applied to all particles.
(4) If any cluster label has changed, the algorithm is repeated from step (2); if there is no change, the algorithm ends.

In this study, k is the number of humans tracked by Track Filters. For each particle, the initial cluster label of the k-means clustering in each time step is its cluster label from the previous step.
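A direct implementation of steps (1) to (4), with k equal to the number of tracked humans and the previous-step labels as initialisation, might look as follows.

```python
# Sketch of the k-means step of Section IV.B.i over the particle positions.
import numpy as np

def kmeans_particles(positions, labels, k, max_iter=50):
    """Iterate steps (2)-(4): recompute centroids, reassign each particle to the
    nearest centroid, and stop when no label changes."""
    labels = labels.copy()
    for _ in range(max_iter):
        centroids = np.array([positions[labels == n].mean(axis=0)
                              if np.any(labels == n) else positions.mean(axis=0)
                              for n in range(k)])
        new_labels = np.argmin(
            np.linalg.norm(positions[:, None, :] - centroids[None, :, :], axis=2),
            axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Two tracked humans, 100 particles each (illustrative positions).
pos = np.vstack([np.random.default_rng(2).normal([0.0, 0.0], 0.1, (100, 2)),
                 np.random.default_rng(3).normal([1.5, 0.0], 0.1, (100, 2))])
prev = np.array([0] * 100 + [1] * 100)        # labels from the previous step
print(np.bincount(kmeans_particles(pos, prev, k=2)))
```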
ii. Adjustment of Likelihood compared with the Standard Number of Particles

One hundred particles are generated per human in this study, so it is desirable that each human is tracked by one hundred particles. If many more particles track one human, the tracking performance for multiple humans usually gets worse. This paper therefore adjusts the likelihoods so that the number of particles per human stays close to a standard number, which is one hundred as described above. If a cluster has more than one hundred particles, the number of particles in that cluster needs to be decreased; if a cluster has fewer than one hundred, it needs to be increased. By increasing and decreasing the particle likelihoods with Eq. (4), the number of particles in each cluster converges to the standard number, and tracking failures caused by biased particles are removed. Here P_0 is the standard number of particles per human and P_{i-cl} is the number of particles in the cluster to which the i-th particle belongs.

l_i = l_i * (1 + (P_0 - P_{i-cl}) / P_0)    ...(4)

iii. Adjustment of Likelihood based on changes of label

The labels of particles change to other labels as a result of the clustering. This paper therefore also adjusts the likelihood of particles according to whether their cluster label has changed. If the current cluster label of a particle equals its previous label, the particle is considered to keep tracking the same human between the two steps, so its likelihood should be raised because the tracking is successful. If the current cluster label differs from the previous label, the particle is considered to have switched to a different human, so its likelihood should be lowered. The likelihood is adjusted with Eq. (5) and Eq. (6). Here N_s is the number of particles whose labels changed and N_t is the number of particles whose labels did not change. Eq. (5) is applied to particles whose labels did not change, and Eq. (6) to particles whose labels changed.

l_i = l_i * (1 + N_s / N_t)    ...(5)

l_i = l_i * (1 - (N_t - N_s) / N_t)    ...(6)
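The adjustments can be combined as in the sketch below, which applies Eq. (4) per cluster and Eqs. (5)/(6) per particle. The exact form of Eq. (6) follows the reconstruction given above and should be treated with some caution.

```python
# Likelihood adjustments of Eqs. (4)-(6) for the control of particle distribution.
def adjust_for_standard_number(l_i, particles_in_cluster, standard=100):
    """Eq. (4): decrease l_i in over-populated clusters, increase it otherwise."""
    return l_i * (1.0 + (standard - particles_in_cluster) / standard)

def adjust_for_label_change(l_i, changed, n_changed, n_unchanged):
    """Eq. (5) for particles that kept their label, Eq. (6) for those that changed."""
    if n_unchanged == 0:
        return l_i
    if changed:
        return l_i * (1.0 - (n_unchanged - n_changed) / n_unchanged)   # Eq. (6)
    return l_i * (1.0 + n_changed / n_unchanged)                       # Eq. (5)

l = adjust_for_standard_number(0.8, particles_in_cluster=130)   # cluster too large
print(adjust_for_label_change(l, changed=True, n_changed=10, n_unchanged=90))
```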
V. EXPERIMENT

A. Human Tracking with a Laser Range Scanner

i. Experimental Method

Experiments were performed to evaluate human tracking by the proposed method. In these experiments the laser range scanner was placed in the environment rather than on the robot, in order to evaluate only the tracking performance, independently of the human-following performance. Two humans walked and passed each other 10 times in front of the sensor, and it was checked when mis-tracking first happened and how many times it happened. Two types of passing were performed: in one, the two humans passed each other parallel to the laser direction of the range scanner; in the other they passed vertically (perpendicular) to the laser direction. The latter movements include occlusions of each human by the other. The following three likelihood-adjustment settings were compared:
(1) particle filter only;
(2) adjustment of likelihood as in subsection IV.B.ii;
(3) adjustment of likelihood as in subsections IV.B.ii and IV.B.iii.
At the beginning of the experiments, the Track Filters of the two humans had 100 particles each.

ii. Results

Five sets of walking and passing were performed; one set consists of passing each other 10 times. Table 1 shows when mis-tracking first happened and how many times it happened during the 10 passes, averaged over the 5 sets. Fig. 5 shows an example of failed tracking and Fig. 6 shows an example of successful tracking.

Table 1. Comparison of the results of the three likelihood calculations

                                              (1)    (2)    (3)
  When mis-tracking first happened  Vertical  1.2    2.0    4.6
                                    Parallel  2.0    4.4    none
  Number of mis-trackings           Vertical  6.2    5.0    1.8
                                    Parallel  7.4    4.6    0

As shown in Table 1, mis-tracking happened less often when the proposed likelihood adjustments were applied, which shows that the proposed methods are effective for human tracking. In particular, almost perfect tracking was achieved when the humans passed each other parallel to the laser direction. Such parallel movements often appear when a robot follows a human along a building corridor, so the results also indicate the effectiveness of the method for the human-following task. The calculation time of method (1) was 9.845 s for 100 loops, while the calculation times of methods (2) and (3), which include the likelihood adjustment, were 9.846 s and 9.860 s respectively; the computational cost therefore does not change significantly.



Fig. 5. Failed tracking (likelihood calculation (1)); x-y trajectories (mm) of Target 0 and Target 1.

Fig. 6. Successful tracking (likelihood calculation (3)); x-y trajectories (mm) of Target 0 and Target 1.

B. Human-Following with Two Types of Sensors

i. Experimental Method

Human-following experiments were performed. The target human walked along a rectangular path of 8 m x 10 m, and the mobile robot detected and followed the target. For the robot control, the relative coordinates of the target human with respect to the robot were used as the reference.
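The paper does not specify the control law, only that the relative coordinate of the target is used as the reference. The proportional follow controller below is therefore a generic illustration with assumed gains, not the authors' controller.

```python
# Generic proportional follow controller driven by the relative target coordinate.
import math

def follow_command(target_x, target_y, desired_distance=1.0,
                   k_lin=0.5, k_ang=1.5, v_max=0.6, w_max=1.0):
    """Return (linear, angular) velocity commands from the target position
    expressed in the robot frame (x forward, y left, metres)."""
    distance = math.hypot(target_x, target_y)
    bearing = math.atan2(target_y, target_x)
    v = max(-v_max, min(v_max, k_lin * (distance - desired_distance)))
    w = max(-w_max, min(w_max, k_ang * bearing))
    return v, w

print(follow_command(2.0, 0.3))   # target ~2 m ahead, slightly to the left
```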
ii. Results

The human-following behavior in successful cases with sensor integration is shown in the following figures. Fig. 7 shows example trajectories of the target human and of the human-following robot during the tracking task. According to these figures, the mobile robot correctly followed the coordinates of the target human. Fig. 8 shows the transitions of the relative angle at which the target human was detected by the robot in the successful cases. In Fig. 8, "Only Kinect" denotes the case using only the Kinect sensor, and "Integrated Sensors" denotes the case of sensor integration between the Kinect sensor and the laser range scanner.

Fig. 7. Trajectories of the target human and the human-following robot (x-y coordinates in mm)

Fig. 8. Angle (degrees) between the target human and the human-following robot over time, for "Only Kinect" and "Integrated Sensors"

The results show that human following with sensor integration can be achieved over wider angles than with the Kinect sensor alone. This means that human following with only a Kinect sensor restricts the movement of the target human, even though the Kinect provides good detection and tracking performance. In order to achieve human following without restricting the target's movement, sensors that can monitor wide areas, such as laser range scanners, must be installed on the robot. Improving the tracking performance of the laser range scanner therefore contributes directly to the realization of the human-following robot.

VI. SUMMARY

In this study, a human extraction method using a laser range scanner and a control of the particle distribution based on likelihood adjustment were proposed to improve human tracking in a human-following robot. The integration of a laser range scanner and a Kinect sensor was also introduced. Experimental results show that the proposed tracking with the laser range scanner works well, and preliminary human-following experiments indicated that the sensor integration is needed for unrestricted movement of the target human. As future work, the switching and weighting of the two sensors should be reconsidered for a more meaningful sensor integration, and the size and shape of the detection area for new humans should also be examined. In the camera-ready paper, more details of the sensor integration and of the human-following results will be described.

REFERENCES
[1] S. Okusako and S. Sakane, "Human Tracking with a Mobile Robot using a Laser Range-Finder", Journal of the Robotics Society of Japan, Vol. 24, No. 5, pp. 605-613, 2006 (in Japanese).
[2] K. Morioka, Y. Oinaga, and Y. Nakamura, "Control of Human-Following Robot Based on Cooperative Positioning with an Intelligent Space", IEEJ Transactions on Electronics, Information and Systems, Vol. 131, No. 5, pp. 1050-1058, 2011 (in Japanese).
[3] Y. Yoo and W. Chung, "Detection and Following of Human Legs using the SVDD (Support Vector Data Description) Scheme for a Mobile Robot with a Single Laser Range Finder", International Conference on Electrical, Control and Computer Engineering, Pahang, Malaysia, 2011.
[4] K. Morioka, J.-H. Lee, and H. Hashimoto, "Human-Following Mobile Robot in a Distributed Intelligent Sensor Network", IEEE Transactions on Industrial Electronics, Vol. 51, No. 1, pp. 229-237, 2004.
[5] M. Isard and A. Blake, "CONDENSATION - Conditional Density Propagation for Visual Tracking", International Journal of Computer Vision, Vol. 29, No. 1, pp. 5-28, 1998.
[6] S. Kamijo and M. Sakauchi, "Simultaneous Tracking of Pedestrians and Vehicles by the Spatio-Temporal Markov Random Field Model", IEEE International Conference on Systems, Man and Cybernetics, pp. 3732-3737, 2003.

