
Term Paper for EE 105

On

Extended Version of a Dynamic Map Generating Rescuer Surveillance Robot

Submitted by

Shoaib Bin Masud

ID: 1294625
Abstract:

This term paper presents the design of an HMI-enabled, dynamic map-generating robot that can offer assistance during events such as earthquakes and structural collapses, providing video and audio feedback where and when it is too risky for a human being to offer help. With the capability of being operated both manually and autonomously, this robotic framework can prove very useful in rescue operations and disaster management. The design proposed here is meant to generate a definite map of the traversed area dynamically, and the robot can also follow a specific map to reach a destination. Accordingly, an HMI (Human Machine Interface) has been developed to control the physical movement of the navigator from a moderate distance. Effective sensory modules and a stable control mechanism have been developed, along with consistent video feedback. In addition, the system is provided with the intelligence to detect human presence based on audio-visual feedback and an efficient self-protective design. The most distinctive feature of the system is that if the communication link with the host breaks down, the robot returns to its starting point along the traversed pathway, which avoids the risk of losing the device. The system has been tested in different set-ups under several challenging environments, and the results obtained affirm its reliability.

I. Introduction:
The main objective of this paper is to propose a design that generates dynamic maps of the traversed surface and helps the rescue team obtain the locations of victims. The generated map can be stored in a database so that it can be used later to search the location. Similar map-generation tasks have been presented in [7], [8], and [9], which describe methods for developing indoor surveillance maps using different mathematical topologies. The live video capture of this project enables the rescuers to observe what is happening inside the wreckage. The navigator is deliberately equipped with effective sensory devices to sense environmental parameters such as smoke, fire, or magnetic values, and a self-protective infrastructure has been constructed as well. Moreover, as the explorer can reach places where human beings cannot, it can act as a relief worker. There may be multiple receivers of the video stream of the rescue task, and human presence can be detected on the basis of audio-visual feedback. The Bluetooth communication protocol is used to send commands to the rescuer robot, whereas similar tasks have been undertaken for indoor surveillance in [4] and [5] by means of GPS and MSN modules. This Bluetooth-based communication provides a wireless interface over a range of around 60-65 feet. For longer-range communication, RF or GPRS signals can be used.
II. Proposed Design and Implementation:

Fig. 1. Functional Design of the Overall System

Fig. 1 shows the functional design of the overall presented system, which incorporates distance-control feedback using encoder modules, precision control using a gyroscope, audio feedback using a tone-generating device, environmental feedback from the relevant sensing devices, and a wireless communication interface through the HMI data-transfer unit.

A. Hardware Design and Implementation:

1. Two quadrature encoder modules are attached to the shafts of the motor wheels and count the number of wheel rotations while the robot moves along a surface. Point-to-point distances are then measured from the rotation counts, providing distance-control feedback.

2. The gyroscopic control unit keeps the robotic structure aligned with its initial reference axis so that angular deviations are minimized, which is indispensable for generating dynamic maps and for retracing traversed maps accurately. Precise position control of the structure is achieved through this gyroscopic control scheme.

3. The tone-generating unit helps locate a stranded human being from a moderate distance. This unit consists of both a transmitter and a receiver of high-frequency sound waves, so that it can send signals to stranded people and also receive signals from the outcry of victims.
4. An effective module consisting of an oxygen sensor, an ultrasonic sound sensor, an infrared proximity light sensor, a smoke detector, and a magnetic compass has been integrated into the system model to detect and process environmental parameters and other relevant sensing values.

5. An HMI has been developed using the Bluetooth communication protocol (HC-05 module), which enables wireless command transmission from the controller to the exploring robot.
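The paper does not specify the exact command set sent over the HC-05 link, but the idea can be sketched with a hypothetical single-byte protocol; the command names and byte values below are purely illustrative, not the authors' actual encoding:

```python
# Hypothetical single-byte command protocol for the HC-05 Bluetooth serial
# link; the actual command set used in the paper is not stated.
COMMANDS = {
    "forward": b"F",
    "backward": b"B",
    "left": b"L",
    "right": b"R",
    "stop": b"S",
}

def encode_command(name):
    """Return the byte the HMI would write to the Bluetooth serial port."""
    try:
        return COMMANDS[name]
    except KeyError:
        raise ValueError(f"unknown command: {name!r}")
```

On the host side such bytes would be written to the serial port exposed by the HC-05 pairing, and the Arduino firmware would dispatch on each received byte.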

Fig. 2. The Structural View of the Tested System with its Mechanical Segments

B. Software Development
The software design consists of three inter-dependent operational segments:

1. Firmware of the Device Movement: The firmware has been developed on the Arduino platform. The overall code is composed of three separate files: an .ino file for the main program, a .h file for the header directives, and a .cpp file for configuration code along with self-defined functions. The .h file contains the customized header libraries needed to execute the main program efficiently; customizing the header libraries ensures that the program is tailored to this particular system. The .cpp file contains the configured pin numbers of the Arduino breakout board to which the external sensing modules and circuitry are attached.

2. Controller Software of the HMI Mechanism: The controller software of the system has been written in Processing (version 2.2.1). The software (with the .pde extension) consists of two fragments: one for the initializations and the other for the main program. The software is compiled on the Java platform.

3. Smart Phone Application for the User Interface: An Android application named IP Webcam is used to provide instantaneous video and pictorial feedback from the navigator. This application requires portable Wi-Fi hotspot technology to transfer the stream from the source to the receivers. Multiple devices connected to the same IP (Internet Protocol) address can act as receivers.

III. PROPOSED MECHANISM OF ROBOT CONTROL AND MAP GENERATION


This section elaborates the technical aspects of the demonstrated work:

A. Design of the HMI System


A dynamic animated profile has been introduced so that the controller can effectively operate the overall task from a distant place. There is a virtual robotic field consisting of square grids in which an observer can watch a dot representing the robot moving vertically and horizontally. This arena is accompanied by a virtual space representing the virtual movement. The provision of a simultaneous virtual profile of the system is not included in the works described in [2]-[9]. The domains presented in [4], [7], [8], and [9] provide scope for consistent map generation and data acquisition, but still include no locomotive virtual robot. The animated profile has been formed through interpolation of linear graphical segments, and the movement is accomplished by rotating the structure about its center of gravity. The position of the moving object is updated from its previous angular position as x = x0 + r cos(θ) and y = y0 + r sin(θ), where θ is the accounted angular deviation, (x0, y0) is the previous position, and r is the computed radius of confinement of the rotation. The virtual arena has been designed through (m × n) row-column matrix computations. In Fig. 3, a circle is shown located beside the object. When the robot is in its calibration state, the circle turns red, which indicates that no command can be executed at that moment; the green circle indicates the ready-to-go phase of the navigator.
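The angular update above can be illustrated with a short sketch of rotating one vertex of the animated profile about the center of gravity; `rotate_point` and its parameter names are illustrative, not code from the actual Processing HMI:

```python
import math

def rotate_point(x, y, cx, cy, dtheta):
    """Rotate a vertex (x, y) of the animated profile about the center of
    gravity (cx, cy) by the angular deviation dtheta, using the update
    x' = cx + r*cos(phi + dtheta), y' = cy + r*sin(phi + dtheta)."""
    r = math.hypot(x - cx, y - cy)      # radius of confinement to rotation
    phi = math.atan2(y - cy, x - cx)    # previous angular position
    return cx + r * math.cos(phi + dtheta), cy + r * math.sin(phi + dtheta)
```

For example, rotating the vertex (1, 0) about the origin by 90° moves it to (0, 1).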

Fig. 3. Animated Version of the System; (a) Red Dotted Representation of the Calibration State, (b) Green Dotted Depiction of the
Command Execution Phase
Fig. 4. The Crossed Maps with Precise Point-to-Point Distances, Labeling of Checkpoints and Respective Cumulative Distances

B. Algorithm for Dynamic Map Generation and Automatic Return


1. Dynamic Map Generation: In order to execute the controller's instructions accurately, the robotic system has been given a built-in capability of creating maps automatically. When the robot moves across the surface, the encoder modules attached to the motor wheels count the number of rotations. Simultaneously, the gyro sensor tries to align the axis of the robot with its reference line; the difference between the reference angle and the actual one gives the angular displacement. Through simple arithmetic computations, the distance and angle from one point to another are evaluated. The measured distances and angles are saved in two arrays. Combining these arrays yields a sequence of (X, Y) coordinate points. A code fragment draws lines joining these points, and eventually a map is created similar to the sample shown in Fig. 4.
The implemented topology for dynamic map generation is presented in Algorithm 1.

Algorithm 1: Dynamic Map Generation

Array of coordinates → A
Array index → I
State variable → X (X = 0: previous state; X = 1: change of state)
Operation-complete variable → OC (OC = 0: operation running; OC = 1: operation terminated)
Connection variable → C (C = 0: connection OK; C = 1: connection break)
Present coordinate point → (r, θ)

INITIALIZE:
    I ← 0; X ← 0; OC ← 0; C ← 0
START:
    Read the value of X
    if (X == 0) and (OC == 0) and (C == 0) then
        A[I][0] ← r; A[I][1] ← θ
        I ← I + 1
    end if
    if (OC == 0) or (C == 0) then
        for J ← 0; J < I; J ← J + 1 do
            x ← A[J][0] · cos(A[J][1])
            y ← A[J][0] · sin(A[J][1])
            x ← x + x0; y ← y + y0
            x0 ← x; y0 ← y
            Plot (x, y) and connect with the previous point
        end for
    end if
    Goto START
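The plotting loop of Algorithm 1 reduces to accumulating the per-leg displacements. A minimal Python sketch, assuming each recorded leg is a (distance, angle) pair (function and variable names are illustrative):

```python
import math

def build_map(segments):
    """Accumulate (distance, angle) pairs recorded by the encoder and the
    gyroscope into absolute (x, y) map coordinates, following Algorithm 1."""
    x0, y0 = 0.0, 0.0
    points = [(x0, y0)]
    for r, theta in segments:
        x0 += r * math.cos(theta)   # x <- A[J][0]*cos(A[J][1]) + x0
        y0 += r * math.sin(theta)   # y <- A[J][0]*sin(A[J][1]) + y0
        points.append((x0, y0))     # plot and connect with the previous point
    return points
```

For example, two 100 cm legs, the second after a 90° turn, yield the points (0, 0), (100, 0), and (100, 100).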

2. Automatic Return: The promising feature of the proposed system is its capability of regenerating the traversed maps, so that path feedback enables the explorer to return to its initial position.

Algorithm 2: Proportional Control Algorithm

Present angular state of the robot → X0
Previous angular state of the robot → X1
Error → E
Base velocity of the robot → VBase
Velocity correction → ΔV
Constant velocity → C
Proportional constant → Kp
Left wheel speed → VL
Right wheel speed → VR

INITIALIZE:
    X0 ← 0; X1 ← 0; E ← 0; VBase ← 0; ΔV ← 0
START:
    X0 ← present reading
    E ← X1 − X0
    ΔV ← Kp · E
    VL ← VBase + ΔV
    VR ← VBase − ΔV
    X1 ← X0
    Goto START
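A minimal single-step sketch of the proportional correction, assuming the standard law ΔV = Kp · E with the declared proportional constant Kp (names are illustrative):

```python
def p_control_step(x_now, x_prev, v_base, kp):
    """One iteration of the proportional wheel-speed correction: the angular
    error E = X1 - X0 is scaled by Kp and applied differentially."""
    error = x_prev - x_now             # E <- X1 - X0
    dv = kp * error                    # velocity correction, delta-V = Kp * E
    return v_base + dv, v_base - dv    # (VL, VR)
```

For instance, with VBase = 100, Kp = 2, and a 5° drift (X0 = 5, X1 = 0), the left wheel slows to 90 while the right wheel speeds up to 110, steering the robot back toward the reference axis.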

3. Precise Robot Control Algorithm


In order to ensure precise control over the physical movement of the structure, a stable proportional (P) control topology has been applied. The systems presented in [2], [3], and [6] include obstacle avoidance but no stable control mechanism for the physical locomotion, whereas this paper presents a control scheme to regulate the robotic movement. The proposed topology is represented in Algorithm 2.

The systematic procedure of the controllable robotic movement involves two particular mechanisms:
1. Encoder-Based Movement Mechanism: The dynamics of the encoder module are used both for moving the structure to a distant location during forward movement and for retracing the path during the reverse operation. The encoder is connected to a hardware interrupt pin of the microcontroller board, and an interrupt is generated at each falling edge. For the specified robotic wheels, one full rotation has been found mathematically equivalent to a distance of 15.7 cm. This distance corresponds to approximately 10450 counts, which implies 665.5478 counts per cm of distance covered by the wheel. Hence, for any distance x in cm, this constant is multiplied by the measured cm value to obtain the required number of rotational counts.
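Under the constants stated above, the distance-to-count conversion is a single multiplication; rounding to the nearest whole count is an assumption:

```python
COUNTS_PER_CM = 665.5478   # 15.7 cm per wheel revolution ~ 10450 encoder counts

def distance_to_counts(distance_cm):
    """Convert a commanded distance in cm into an encoder-count setpoint."""
    return round(COUNTS_PER_CM * distance_cm)
```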
2. Gyroscope-Based Movement Mechanism: The gyroscope-integrated IMU sensor MPU-6050 has been utilized to detect the attitude and the tri-axial references of the exploring object so that the point-to-point angular deviations during movement along a surface can be measured. Gyroscopes work on the principle of Coriolis acceleration, which deals with the three Euler angles referred to as pitch (X-axis), yaw (Y-axis), and roll (Z-axis). The I2C (Inter-Integrated Circuit) communication protocol is used to interface the sensor with the microcontroller. Moreover, the raw data collected from the gyroscope are not very stable; hence the sensor's built-in DMP (Digital Motion Processor) feature is used to derive stable output.
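As a small illustration of scaling the raw gyroscope data, the MPU-6050 datasheet gives a sensitivity of 131 LSB per °/s at the ±250 °/s full-scale range; that range setting is an assumption here, since the paper does not state which full-scale range was configured:

```python
GYRO_SENSITIVITY = 131.0   # LSB per deg/s at the +/-250 deg/s full-scale range

def raw_gyro_to_dps(raw):
    """Scale a signed 16-bit raw MPU-6050 gyroscope sample to deg/s."""
    return raw / GYRO_SENSITIVITY
```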

IV. RESULT AND ANALYSIS


The developed system has been tested in environmental set-ups in which artificially uneven, damaged surfaces were laid out. Prior to the tests, some essential numerical conversions were computed, such as converting the coded speed value 255 into 39.3 cm/s. One revolution of the wheel corresponds to 15.7 cm, and the rated maximum speed is noted to be 150 rpm (revolutions per minute). Hence, through a simple calculation, the speed equivalent to the maximum coded value of 255 is (15.7 × 150) ÷ 60 ≈ 39.3 cm/s. The overall evaluation of the system consists of several cases, and the relative performance is reported in terms of the angular deviation (AD) and the mean angular deviation (MAD), both in degrees:

MAD = (1/n) Σ |α − β|

where α = observed angle, β = estimated reference angle, and n = number of observations.
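The speed conversion and the MAD metric can be expressed compactly; parameter names are illustrative, and the exact full-scale speed works out to 39.25 cm/s, which the text rounds to 39.3:

```python
def coded_speed_to_cms(coded, wheel_circum_cm=15.7, max_rpm=150, max_coded=255):
    """Map a coded speed value (0-255) to cm/s; full scale is
    15.7 cm/rev * 150 rpm / 60 s = 39.25 cm/s."""
    return (wheel_circum_cm * max_rpm / 60.0) * (coded / max_coded)

def mean_angular_deviation(observed, reference):
    """MAD = (1/n) * sum(|alpha_i - beta_i|) over n observations."""
    return sum(abs(a - b) for a, b in zip(observed, reference)) / len(observed)
```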
1) Case-1: Effect of Speed Variation on Mean Angular Deviation: A number of pathways with several angles of curvature (θ, in degrees) have been explored by the robot.
TABLE I
EVALUATED MADS DUE TO SPEED VARIATION

Speed (cm/s)   MAD (θ = 0°)   MAD (θ = 30°)   MAD (θ = 45°)   MAD (θ = 60°)
39.3                5               6                6               8
30.8                7               9               10              14
24.6                8              13               14              16
19.3               11              14               15              18
15.4               12              15               17              21

2) Case-2: Effect of Distance Variation on Mean Angular Deviation: The robot has been made to explore different ranges of distances along uneven surfaces with zigzag turnings. From a particular starting position, the robot has been driven along these paths and also tested for the automatic return to its initial position.
TABLE II
EVALUATED MADS DUE TO DISTANCE VARIATION

Travelled distance (cm)   MAD (θ = 0°)   MAD (θ = 30°)   MAD (θ = 45°)   MAD (θ = 60°)
500                            3               5                6               8
800                            5               6                7               9
1000                           8               9               10              12
1200                           9              12               13              15
1500                          10              15               18              20

3) Case-3: Effect of Speed Variation on Point-to-Point Movement: In addition to Case-1 discussed earlier, another performance analysis was conducted for the point-to-point deviations of the system due to speed variations. As an example, the robot was made to navigate along the map shown in Fig. 5. The deviations for point-to-point movement along this map are listed in Table III.

Fig. 5. A Specific Map Provided to Evaluate the Point-to-Point Performance Analysis


TABLE III
EVALUATED DEVIATIONS FOR POINT-TO-POINT MOVEMENT

Speed (cm/s)   AD 0-1   AD 1-2   AD 2-3   AD 3-4   AD 4-5   AD 5-6   AD 6-7   MAD
39.3              4        4        3        3        2        2        2     2.86
30.8              5        6        5        4        4        3        3     4.29
24.6              7        7        6        5        5        5        4     5.57
19.3              8        8        7        7        5        5        5     6.43
15.4             10        9        8        8        7        7        6     7.86

4) Case-4: Effect of Light Variation on Mean Angular Deviation: The system has been subjected to variations of luminous intensity (the wavelength-weighted power of the light source) in candela per m², and its performance in terms of mean deviations has been evaluated five consecutive times in three situations, as shown in Fig. 6.

Fig. 6. MAD versus luminous intensity (cdm−2) for a testing distance of 1000 cm

5) Case-5: Effect of Sound Variation on Human Existence Detection: The robot is supposed to detect the location of stranded people on the basis of audio-visual feedback, provided by the live video streaming of the camera and by audio signal reception through a receiver such as a microphone.

Fig. 7. Accuracy of human detection (%) versus sound intensity (dB)


V. CONCLUSION
In this paper, a robotic system aimed at rescue operations has been presented with multi-dimensional features and facilities. The objective is to contribute to automated life-saving initiatives in the event of earthquakes and structural collapses. The developed framework comprises map-exploring ability, control in both manual and autonomous modes, an HMI provision, storage of the traversed path for future use, a stable robot control algorithm, detection of human presence, following an allocated map, live video feedback, reporting of environmental parameters, and automatic return to the initial place to avoid the risk of loss.

References
[1] “One year after the Rana Plaza tragedy: where do we stand? The victims, the sector and the value chain,”
April 2015. [Online]. Available: www.cpd.org.bd
[2] S. Harmon, “The ground surveillance robot (GSR): An autonomous vehicle designed to transit unknown
terrain,” IEEE Trans. Robotics and Automation, vol. 3, no. 3, pp. 266–279, June 1987.
[3] F. Beainy and S. Commuri, “Development of an autonomous ATV for real-life surveillance operations,” in
Proc. 17th Mediterranean Conference on Control and Automation, June 2009, pp. 904–909.
[4] B. Doroodgar and G. Nejat, “A hierarchical reinforcement learning based control architecture for semi-
autonomous rescue robots in cluttered environments,” in Proc. IEEE Conference on Automation Science
and Engineering (CASE), Aug 2010, pp. 948–953.
[5] C. Ye, S. Ma, and B. Li, “Design and basic experiments of a shape-shifting mobile robot for urban search
and rescue,” in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct 2006,
pp. 3994–3999.
[6] H.-T. Lee, W.-C. Lin, C.-H. Huang, and Y.-J. Huang, “Wireless indoor surveillance robot,” in Proc. SICE
Annual Conference (SICE), Sept 2011, pp. 2164–2169.
[7] A. Kolling and S. Carpin, “Extracting surveillance graphs from robot maps,” in Proc. IEEE/RSJ International
Conference on Intelligent Robots and Systems, Sept 2008, pp. 2323–2328.
[8] H. Mano, K. Miyazawa, R. Chatterjee, and F. Matsuno, “Autonomous generation of behavioral trace maps
using rescue robots,” in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct
2009, pp. 2809–2814.
[9] T. Matsuo, K. Tanaka, and N. Abe, “Automatic map generation with mobile robot,” in Proc. IEEE
International Conference on Systems, Man, and Cybernetics, vol. 4, 1999, pp. 680–685 vol.4.
[10] S. A. Fattah et al., “Dynamic map generating rescuer offering surveillance robotic system with
autonomous path feedback capability,” in Proc. 2015 IEEE Region 10 Humanitarian Technology Conference
(R10-HTC), Cebu City, 2015, pp. 1–6.
