1. Teleoperators.
2. Numerically Controlled Milling Machines.
• The first robots essentially combined the
mechanical linkages of the teleoperator with the
autonomy and programmability of CNC
machines.
• Teleoperation means operating a machine at a distance; a device or machine operated by a person from a distance is called a teleoperator.
Introduction - Robotics
MILESTONES IN THE HISTORY OF ROBOTICS
1986 — the underwater robot, Jason, of the Woods Hole Oceanographic Institution, explores the
wreck of the Titanic, found a year earlier by Dr. Robert Ballard.
1988 — Staubli Group purchases Unimation from Westinghouse
1993 — the experimental robot, ROTEX, of the German Aerospace Agency (DLR) was flown
aboard the space shuttle Columbia and performed a variety of tasks under both teleoperated
and sensor-based offline programmed modes
1996 — Honda unveils its Humanoid robot, a project begun in secret in 1986
1997 — the first robot soccer competition, RoboCup-97, is held in Nagoya, Japan and draws 40
teams from around the world
1997 — the Sojourner mobile robot travels to Mars aboard NASA‘s Mars PathFinder mission
2001 — Sony begins to mass produce the first household robot, a robot dog named Aibo
2001 — the Space Station Remote Manipulation System (SSRMS) is launched in space on board
the space shuttle Endeavour to facilitate continued construction of the space station
2001 — the first telesurgery is performed when surgeons in New York performed a laparo-
scopic gall bladder removal on a woman in Strasbourg, France
2001 — robots are used to search for victims at the World Trade Center site after the
September 11th tragedy
2002 — Honda‘s Humanoid Robot ASIMO rings the opening bell at the New York Stock
Exchange on February 15th
Robotics is a multidisciplinary field, drawing on Mechanical Engineering, Electrical Engineering, Electronics Engineering, Computer Engineering, Physics and Biology, and Mathematics.
Features of a Robot:
Throughput
Accuracy
Quality
Consistency
Reliability
Safety
WHY ROBOTICS?
• Robots are real
• They are diverse in form and function
• Their functionality is increasing rapidly
• They offer solutions to important problems of our time
ESSENTIAL CHARACTERISTICS
• Sensing: First of all, a robot has to be able to sense its surroundings. Robot sensors include light sensors (eyes), touch and pressure sensors (hands), chemical sensors (nose), hearing and sonar sensors (ears), and taste sensors (tongue).

Sensors and actuators form a cycle that represents this flow.
ASIMOV'S LAWS OF ROBOTICS
• A robot may not harm a human or, through inaction, allow a human to come to harm.
• A robot must obey the orders given by human beings, except when such orders conflict with the
First Law.
• A robot must protect its own existence as long as it does not conflict with the First or Second
Laws.
A FEW ROBOT EXAMPLES
• Decontaminating robot for rescuing nuclear plants from hazardous tasks
• Welding robot for repetitive kinds of work
• Scrumbat, a menial-task robot
REASONS FOR USING ROBOTS IN INDUSTRY
Material handling
Material transfer
Machine loading/unloading
Spot welding
Spray coating
Assembly
Inspection
COUNTRIES WITH ROBOTIC APPLICATIONS
• Japan
• US
• Germany
• Sweden
• France
• Great Britain
• Italy
SUMMARY
The downside is that robotics affects the labour pool and increases the educational requirements
for manufacturing personnel.
Robots are used in most industries and will be used even more in the decades to come.
Mechanisms
Mechanisms - Robotics
ROBOT PARTS
Base
Shoulder
Elbow
Wrist
Tool-plate
End-effectors
DEFINITIONS - TERMINOLOGY
Inverse Kinematics :
Use of the kinematics equations to determine the joint parameters that
provide a desired position for each of the robot's end-effectors.
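As a concrete illustration, closed-form inverse kinematics for a hypothetical planar two-link arm can be sketched as follows. The link lengths l1, l2 and the chosen elbow branch are assumptions, not from the slides; forward kinematics is included to check the solution:

```python
import math

def ik_2link(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar 2-link arm."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2), l1 + l2 * math.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

q1, q2 = ik_2link(1.2, 0.5)
px, py = fk_2link(q1, q2)
```

Running IK and then FK on the result recovers the requested end-effector position, which is a useful sanity check for any IK routine.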
ROBOTIC ARM
Parallel Manipulators :
• Links are arranged in a closed kinematic chain rather than an open one.
• Great structural rigidity and great accuracy can be achieved.
SCARA Configuration
Cylindrical Configuration
Cartesian / Rectangular Configuration
JOINT ARM/ARTICULATED ROBOT
• Also called a revolute, or anthropomorphic, manipulator.
• When all joints are revolute, it is known as an articulated-coordinate robot.
• This is the most widely used arm configuration because of its flexibility in reaching any part of the
working envelope.
• This flexibility allows complex applications such as spray painting and welding to be implemented
successfully.
POLAR/SPHERICAL CONFIGURATION
• The second (elbow) joint is replaced with a revolute joint.
• The work envelope generated in this case is the volume between two concentric spheres.
• These robots can generate a large working envelope.
• The robots can allow large loads to be lifted.
• The semi-spherical operating volume leaves a considerable space near to the base that cannot be
reached.
• This design is used where a small number of vertical actions is adequate: the loading and unloading of
a punch press is a typical application.
CARTESIAN / RECTANGULAR CONFIGURATION
• The three principal axes of control are linear and at right angles to each other.
• Due to their rigid structure they can manipulate high loads, so they are commonly used for pick-and-place operations, machine tool loading, and in fact any application that uses many moves in the X, Y, Z planes.
• These robots occupy a large space, giving a low ratio of operating volume to robot size. They may require some form of protective covering.
WORKSPACE
Def - The workspace of a manipulator is the total volume swept out by the end effector as the manipulator
executes all possible motions, constrained by the mechanical arm.
Reachable Workspace :
Set of points reachable by the manipulators.
Dexterous Workspace :
The set of points the manipulator can reach with an arbitrary orientation of the end effector.
Singularity: Kinematic singularity is a point in the workspace where the robot loses its ability to move the
end effector in some direction no matter how it moves its joints. It typically occurs when two of the robot's
joints line up, making them redundant.
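The "joints line up" condition can be tested numerically. For a hypothetical planar two-link arm (all names and link lengths below are assumptions), the Jacobian determinant equals l1*l2*sin(q2), which vanishes exactly when the elbow is fully stretched or folded:

```python
import math

def jacobian_2link(q1, q2, l1=1.0, l2=1.0):
    """Jacobian of the planar 2-link forward kinematics w.r.t. (q1, q2)."""
    j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
    j12 = -l2 * math.sin(q1 + q2)
    j21 = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    j22 = l2 * math.cos(q1 + q2)
    return ((j11, j12), (j21, j22))

def is_singular(q1, q2, tol=1e-9):
    (a, b), (c, d) = jacobian_2link(q1, q2)
    # det(J) = l1 * l2 * sin(q2): zero when the arm is stretched or folded.
    return abs(a * d - b * c) < tol
```

Near such a configuration the arm cannot generate end-effector velocity in one direction, no matter how fast the joints move.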
Point to Point:
• A discrete set of points.
• The points are stored using a teach pendant.
• No control over the end-effector path between points.
Continuous Path:
• The entire path is controlled.
• Commanded to follow a contour or a straight line.
• Velocity and acceleration are controlled.
• Sophisticated controllers and software are required.
ROBOT SYSTEMS AND SPECIFICATIONS
Cycle time:
The time required to complete one periodic operation.
Load Bearing Capacity: Load bearing capacity is the maximum weight-carrying capacity of the robot. Serial
robots that carry large weights, but must still be precise, are heavy and expensive, with poor (low) payload to
weight ratios.
Accuracy: Accuracy is the ability of a robot to go to the specified position without making a mistake. It is
impossible to position a machine exactly. Accuracy is therefore defined as the ability of the robot to position
itself to the desired location with the minimal error (usually 0.001 inch).
Repeatability: Repeatability is the ability of a robot to repeatedly position itself when asked to perform a task
multiple times. Accuracy is an absolute concept, repeatability is relative. Note that a robot that is repeatable
may not be very accurate. Likewise, an accurate robot may not be repeatable.
Precision: Precision is the 'fineness' with which a sensor can report a value. For example, a sensor that reads
2.1178 is more precise than a sensor that reads 2.1 for the same physical variable. Precision is related to
significant figures. The number of significant figures is limited by the least precise number in a system of
sensing or string of calculations.
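The accuracy/repeatability distinction can be made concrete with a short sketch (the target and readings below are hypothetical): accuracy measures the offset of the mean reached position from the target, while repeatability measures the spread of positions about their own mean:

```python
import math

def accuracy_and_repeatability(target, positions):
    """Accuracy: distance of the mean reached position from the target.
    Repeatability: spread (std. dev.) of positions around their own mean."""
    n = len(positions)
    mean = sum(positions) / n
    accuracy = abs(mean - target)
    repeatability = math.sqrt(sum((p - mean) ** 2 for p in positions) / n)
    return accuracy, repeatability

# Hypothetical readings: the robot is repeatable (tight spread) but not accurate.
acc, rep = accuracy_and_repeatability(100.0, [103.1, 103.0, 102.9, 103.0])
```

Here the robot lands in a tight 0.2 mm band but 3 mm from the target: repeatable, yet inaccurate, exactly the case the text describes.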
Three days in a row I bicycled the exact same route to and from work and I was amazed that my bike computer
recorded the following data:

Day | Distance (km) | Time (hrs:min:sec)
• Pneumatic cylinders (air pressure): a pneumatically driven robot is similar to one with a hydraulic drive system; it can carry less weight, but is more compliant (less rigid against disturbing forces). (Example: a human arm model with McKibben muscles.)
Spherical Wrist
• First three axes are used to establish the position of the wrist.
• The resolution of a robot is a feature determined by the design of the control unit and is mainly dependent
on the position feedback sensor.
• The programming resolution is the smallest allowable position increment in robot programs and is
referred to as the basic resolution unit (BRU).
• For example, assume that an optical encoder which emits 1000 pulses per revolution of the shaft is directly
attached to a rotary axis. This encoder will emit one pulse for each of 0.36° of angular displacement of the
shaft.
• The unit 0.36° is the control resolution of this axis of motion. Angular increments smaller than 0.36° cannot
be detected. Best performance is obtained when programming resolution is equal to control resolution. In
this case both resolutions can be replaced with one term: the system resolution
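The encoder arithmetic above can be captured in a one-line helper (a sketch; the 1000-pulse figure is the example from the text):

```python
def control_resolution(pulses_per_rev):
    """Smallest detectable angular increment (degrees) for a shaft-mounted encoder."""
    return 360.0 / pulses_per_rev

res = control_resolution(1000)  # the 1000-pulse encoder from the text: 0.36 deg/pulse
```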
Sensing
Sensing - Robotics
SENSE - DECIDE - ACT
To make a robot effective, it must interact well with its environment and respond to control commands.
PERCEPTION INTRODUCTION - CHALLENGES
• Dealing with real-world situations
• Reasoning about a situation
• Cognitive systems have to interpret situations based on uncertain and only partially available information
• They need ways to learn functional and contextual information (semantics /understanding)
COMMON SENSORS
CLASSIFICATION OF SENSORS
What:
Proprioceptive sensors
Measure values internal to the system (robot), e.g. motor speed, wheel load, heading of the robot, battery status.
Exteroceptive sensors
Acquire information from the robot's environment, e.g. distances to objects, intensity of the ambient light, extraction of features from the environment.
How:
Passive sensors
Measure energy coming from the environment; very much influenced by the environment.
Active sensors
Emit their own energy and measure the reaction; better performance, but some influence on the environment.
ENCODERS
An electro-mechanical device that converts the linear or angular position of a shaft into an analog or digital signal, making it a linear/angular transducer.
HEADING SENSORS
Definition:
Heading sensors determine the robot's orientation and inclination with respect to a given reference. They can be proprioceptive (gyroscope, accelerometer) or exteroceptive (compass, inclinometer).
Together with appropriate velocity information, they allow integrating the movement into a position estimate. This procedure is called dead reckoning (from 'deduced reckoning' in ship navigation).
Sensor types:
Gyroscope Accelerometer
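Dead reckoning as described can be sketched by integrating speed and heading samples over time. The units, sample rate, and data below are assumptions for illustration:

```python
import math

def dead_reckon(x, y, steps, dt=1.0):
    """Integrate (speed, heading) samples into a position estimate.
    Heading is in radians, measured from the +x axis."""
    for speed, heading in steps:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y

# Drive 2 m east, then 3 m north, one sample per second.
pos = dead_reckon(0.0, 0.0, [(1.0, 0.0), (1.0, 0.0), (1.0, math.pi / 2),
                             (1.0, math.pi / 2), (1.0, math.pi / 2)])
```

Because each step adds sensor error, the estimate drifts over time, which is why dead reckoning is usually fused with external references.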
MEMS ACCELEROMETER
A spring-like structure connects the device to a seismic mass vibrating in a capacitive divider that converts the displacement into an electric signal.
Can be multi-directional
Can sense up to 50 g
Applications
• Dynamic acceleration
• Static acceleration (inclinometer)
• Airbag sensors (±35 g)
• Control of video games (e.g., Nintendo Wii)
INERTIAL MEASUREMENT UNIT
It uses gyroscopes and accelerometers to estimate the relative position (x, y, z), orientation (roll, pitch, yaw),
velocity, and acceleration of a moving vehicle with respect to an inertial frame.
• In order to estimate the motion, the gravity vector must be subtracted and the initial velocity has to be
known.
• After long periods of operation, drift occurs: an external reference is needed to cancel it.
RANGE SENSORS
Sonar
Structured light
RANGE SENSORS – TIME OF FLIGHT
• Ultrasonic sensors and laser range sensors make use of the propagation speed of sound or electromagnetic waves, respectively.
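The time-of-flight principle reduces to distance = speed x time / 2, since the wave travels out and back. A minimal sketch with assumed echo times:

```python
def tof_distance(round_trip_time_s, wave_speed_m_s):
    """Distance from a time-of-flight echo: the wave travels out and back,
    so the one-way distance is speed * time / 2."""
    return wave_speed_m_s * round_trip_time_s / 2.0

sonar = tof_distance(0.01, 343.0)    # 10 ms ultrasonic echo in air (~343 m/s)
laser = tof_distance(20e-9, 3.0e8)   # 20 ns laser pulse (speed of light)
```

The nanosecond-scale timings needed for laser ranging are one reason laser range finders require much faster (and costlier) electronics than sonar.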
VELODYNE LASER RANGE FINDER
The Velodyne HDL-64E uses 64 laser emitters.
• Turn-rate up to 15 Hz
• The field of view is 360° in azimuth and 26.8° in elevation
• Angular resolution is 0.09° and 0.4° respectively
• Delivers over 1.3 million data points per second
• The distance accuracy is better than 2 cm and can measure depth up to 50 m
• This sensor was the primary means of terrain map construction and obstacle detection for all the top
DARPA 2007 Urban Challenge teams. However, the Velodyne is currently still much more expensive
than SICK laser range finders (SICK ~$2,000-4,000, Velodyne ~$40,000-80,000).
STRUCTURED LIGHT-KINECT SENSOR
Major components:
• IR Projector
• IR Camera
• VGA Camera
• Microphone Array
• Motorized Tilt
ROBOTIC SENSORS
• Position sensors: are used to monitor the position of joints.
• Range sensors: measure distances from the reference point to other points of importance.
• Velocity sensors: are used to estimate the speed with which a manipulator is moved.
SENSORS MOUNTED ON ROBOTS
Encoder
Potentiometer
IR sensors
LVDT
Ultrasonic sensor
Actuation
Actuation - Robotics
CLASSIFICATION OF ACTUATORS
Hydraulic Actuators :
• A hydraulic actuator consists of a cylinder or fluid motor that uses hydraulic power to facilitate mechanical
operation.
• The mechanical motion gives an output in terms of linear, rotary, or oscillatory motion.
Pneumatic Actuators :
• A pneumatic actuator converts energy formed by vacuum or compressed air at high pressure into either
linear or rotary motion.
Electric Actuators :
• An electric actuator is powered by a motor that converts electrical energy into mechanical torque.
• The electrical energy is used to actuate equipment such as multi-turn valves.
• It is one of the cleanest and most readily available forms of actuator because it does not directly involve oil
or other fossil fuels.
HYDRAULIC ACTUATORS
• Hydraulic actuators are rugged and suited for high-force applications.
• A hydraulic actuator can hold force and torque constant without the pump supplying more fluid or
pressure due to the incompressibility of fluids.
• High Power.
Hydraulic Cylinder
Robot using Hydraulic actuation
PNEUMATIC ACTUATORS
• Pneumatic actuators provide accurate linear motion.
• Typical pneumatic actuator applications involve areas of extreme temperatures.
• Simplicity.

Control

• For this robot, the outputs are the positions and joint velocities of the end effector.
• The input variables are basically the joint torques and forces τ.
Control - Robotics
MODELING
The system‘s mathematical model is obtained typically via one of the two following techniques :
• The dynamic model of robot manipulators is derived in the analytic form using basically the laws of
mechanics.
• Model is an n DOF system (multivariable nonlinear system).
Robustness : Faculty of a control system to cope with errors due to neglected dynamics.
Parametric identification : The objective is to obtain the numeric values of different physical
parameters.
CONTROL SPECIFICATIONS
Stability :
Consists in the property of a system by which it keeps working at a certain regime, or 'close' to it, 'forever'.
Lyapunov stability theory.
Input-Output stability theory.
Motion tracking :
Tracking control in joint coordinates.
Point to Point motion.
Trajectory (Continuous) motion.
Intelligence
Intelligence - Robotics
THREE WAVES OF ARTIFICIAL INTELLIGENCE - DARPA
(Slide topics: the first wave of AI and its features; the self-driving cars challenge; the second wave of AI: natural data, manifolds, manifold features and concepts, neural nets, errors in neural networks, applications, and challenges of the second wave; the third wave of AI and model generation.)
ARTIFICIAL INTELLIGENCE
Artificial Intelligence (AI) is a way of making a computer, a computer-controlled robot, or software think
intelligently, in a manner similar to the way intelligent humans think.
Areas of AI :
Natural language processing : to communicate with humans.
Knowledge representation : to store and retrieve information.
Automated reasoning : to use the stored information to take decisions and to draw new conclusions.
Machine learning : to adapt to new circumstances, to detect and extrapolate patterns.
Vision : To identify objects in the environment.
NATURAL LANGUAGE PROCESSING
Natural language processing (NLP) is a field of computer science, artificial intelligence, and computational linguistics
concerned with the interactions between computers and human (natural) languages.
Machine learning is the subfield of computer science that gives computers the ability to learn without being explicitly
programmed.
Approaches :
Artificial Neural Networks : Computations are structured in terms of an interconnected group of artificial neurons,
processing information using a connectionist approach to computation.
Deep learning : Consists of multiple hidden layers in an artificial neural network. This approach tries to model the
way the human brain processes light and sound into vision and hearing.
Reinforcement learning : It is concerned with how an agent ought to take actions in an environment so as to
maximize some notion of long-term reward.
Genetic algorithms : A search heuristic that mimics the process of natural selection, and uses methods such as
mutation and crossover to generate new genotypes in the hope of finding good solutions to a given problem.
Support vector machines : Support vector machines (SVMs) are a set of related supervised learning methods used
for classification and regression.
ARTIFICIAL NEURAL NETWORK (ANN)
A computing system made up of a number of simple, highly interconnected processing elements, which
process information by their dynamic state response to external inputs.
Highly used in classification and object-detection problems.
It can have any number of hidden layers, depending on the application.
The input is a set of parameters and the output is the classification, while the outputs of the hidden layers are not directly interpretable.
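A forward pass of such a network can be sketched in a few lines. The weights below are hypothetical hand-picked values, with two inputs, one hidden layer of two sigmoid units, and one sigmoid output:

```python
import math

def forward(x, w_hidden, w_out):
    """One hidden layer with sigmoid units, then a sigmoid output unit."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    # Each hidden unit takes a weighted sum of the inputs through a sigmoid.
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    # The output unit does the same over the hidden activations.
    return sigmoid(sum(wo * h for wo, h in zip(w_out, hidden)))

# Hypothetical weights: two inputs, two hidden units, one output.
y = forward([1.0, 0.0],
            w_hidden=[[2.0, -1.0], [-1.0, 2.0]],
            w_out=[1.5, -1.5])
```

Training (e.g. backpropagation) would adjust `w_hidden` and `w_out`; only the inference step is shown here.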
Vision

COMPUTER VISION | CHALLENGES
• Viewpoint changes
• Illumination changes
• Inherent ambiguities: many different 3D scenes can give rise to a particular 2D picture
Vision - Robotics
COMPUTER VISION | APPLICATIONS
Recognition
Motion capture
Augmented reality
Medical imaging
COMPUTER VISION FOR ROBOTICS
Images are enormously descriptive, but there is a lot of data to process (human vision involves 60 billion neurons!).
Vision provides humans with a great deal of useful cues; the aim is to explore the power of vision towards intelligent robots.
Cameras:
• Vision is increasingly popular as a sensing modality:
• descriptive
• compactness, compatibility
• low cost
• HW advances are necessary to support the processing of images
THE CAMERA | IMAGE FORMATION
Pinhole model:
Captures beam of rays – all rays through a single point (note: no lens!)
The point is called Center of Projection or Optical Center
The image is formed on the Image Plane
We will use the pinhole camera model to describe how the image is formed
WHY USE A LENS?
The ideal pinhole:
only one ray of light reaches each point on the film, so the image can be very dim; a tiny pinhole also gives rise to diffraction effects.
Making the pinhole bigger (i.e. the aperture) makes the image blurry.
• When the object is placed behind the picture plane, the perspective will show object reduced in size.
• When the object is placed in front of the picture plane, the perspective will show object enlarged in
size.
• When the picture plane coincides with the object, the perspective will show the true size of the object.
DISTORTION AND RADIAL DISTORTION
In geometric optics, distortion is a deviation from rectilinear projection, a projection in which straight
lines in a scene remain straight in an image. It is a form of optical aberration.
The camera matrix does not account for lens distortion as an ideal pinhole camera does not have a lens.
To accurately represent a real camera, the camera model includes the radial and tangential lens
distortion.
Radial Distortion
Radial distortion occurs when light rays bend more near the edges of a lens than they do at its optical
center. The smaller the lens, the greater the distortion.
TANGENTIAL DISTORTION
Tangential distortion occurs when the lens and the image plane are not parallel. The tangential distortion
coefficients model this type of distortion.
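Both distortion terms are commonly combined in the Brown-Conrady model used by calibration toolboxes. A sketch, where the coefficient values are made up for illustration (k1, k2 radial; p1, p2 tangential):

```python
def distort(x, y, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) lens distortion
    to normalized image coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

center = distort(0.0, 0.0, k1=-0.2, k2=0.05, p1=0.001, p2=0.001)
edge = distort(0.5, 0.5, k1=-0.2, k2=0.05, p1=0.001, p2=0.001)
```

Note the distortion vanishes at the optical center and grows with r²; a negative k1, as here, pulls edge points inward, i.e. barrel distortion.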
TYPES OF DISTORTION
• Barrel distortion: When straight lines are curved inwards in the shape of a barrel, this type of aberration is called "barrel distortion". Commonly seen on wide-angle lenses, barrel distortion happens because the field of view of the lens is much wider than the size of the image sensor and hence it needs to be "squeezed" to fit. As a result, straight lines are visibly curved inwards, especially towards the extreme edges of the frame.
• Pincushion distortion: Pincushion distortion is the exact opposite of barrel distortion – straight lines are curved outwards from the center. This type of distortion is commonly seen on telephoto lenses, and it occurs due to image magnification increasing towards the edges of the frame from the optical axis. This time, the field of view is smaller than the size of the image sensor and it thus needs to be "stretched" to fit. As a result, straight lines appear to be pulled upwards in the corners.
• Mustache distortion: A mixture of both types, sometimes referred to as mustache (moustache) distortion or complex distortion, is less common but not rare. It starts out as barrel distortion close to the image center and gradually turns into pincushion distortion towards the image periphery, making horizontal lines in the top half of the frame look like a handlebar mustache. Its characteristics are indeed complex and can be quite painful to deal with. While this type of distortion can potentially be fixed, it often requires specialized software.
The world points are transformed to camera coordinates using the extrinsic parameters. The camera
coordinates are mapped into the image plane using the intrinsic parameters.
CAMERA CALIBRATION PARAMETERS
Extrinsic Parameters
The extrinsic parameters consist of a rotation, R, and a
translation, t. The origin of the camera's coordinate
system is at its optical center and its x- and y-axis define
the image plane.
Intrinsic Parameters
The intrinsic parameters include the focal length, the
optical center, also known as the principal point, and
the skew coefficient.
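The two parameter sets compose into a single world-to-pixel mapping; a minimal sketch assuming an ideal pinhole (no distortion, zero skew) and made-up intrinsics:

```python
def project(point_w, R, t, fx, fy, cx, cy):
    """Project a 3D world point into pixel coordinates: extrinsics (R, t)
    map world -> camera, intrinsics (fx, fy, cx, cy) map camera -> image."""
    # World to camera coordinates: Xc = R * Xw + t
    xc = [sum(R[i][j] * point_w[j] for j in range(3)) + t[i] for i in range(3)]
    # Perspective division, then scale by focal length and shift to the principal point.
    u = fx * xc[0] / xc[2] + cx
    v = fy * xc[1] / xc[2] + cy
    return u, v

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # camera at the world origin, no rotation
uv = project([0.2, -0.1, 2.0], I3, [0, 0, 0], fx=800, fy=800, cx=320, cy=240)
```

Camera calibration is exactly the inverse problem: estimating R, t, fx, fy, cx, cy (and the distortion coefficients) from observed projections of known points.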
COMPUTER STEREO VISION
Computer stereo vision is the extraction of 3D information from digital images, such as those obtained by a CCD camera.
By comparing information about a scene from two vantage points, 3D
information can be extracted by examination of the relative positions
of objects in the two panels. This is similar to the biological process
Stereopsis.
In traditional stereo vision, two cameras, displaced horizontally from
one another are used to obtain two differing views on a scene, in a
manner similar to human binocular vision.
By comparing these two images, the relative depth information can be obtained in the form of a disparity map, which encodes the difference in horizontal coordinates of corresponding image points. The values in this disparity map are inversely proportional to the scene depth at the corresponding pixel location.
(Diagram: relationship of image displacement to depth with stereoscopic images, assuming flat co-planar images.)
PRE-PROCESSING STEPS-STEREO VISION
For a human to compare the two images, they must be superimposed in a stereoscopic device, with the
image from the right camera being shown to the observer's right eye and from the left one to the left eye.
• The image must first be undistorted, such that barrel distortion and tangential distortion are removed.
This ensures that the observed image matches the projection of an ideal pinhole camera.
• The image must be projected back to a common plane to allow comparison of the image pairs, known as
image rectification.
• An information measure which compares the two images is minimized. This gives the best estimate of
the position of features in the two images, and creates a disparity map.
• Optionally, the received disparity map is projected into a 3D point cloud. By utilizing the cameras'
projective parameters, the point cloud can be computed such that it provides measurements at a known
scale.
IMAGE IN STEREO VISION
DISPARITY
Disparity refers to the distance between two corresponding points in the left and right images of a stereo pair. In the standard diagram there is a labelled point X (ignore X1, X2 & X3); by following the dotted line from X to OL you see the intersection point with the left-hand image plane at XL. The same principle applies to the right-hand image plane.
Disparity Map/Image
If you were to perform this matching process for every pixel in the
left hand image, finding its match in the right hand frame and
computing the distance between them you would end up with an
image where every pixel contained the distance/disparity value
for that pixel in the left image.
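Given calibrated cameras, the inverse proportionality between disparity and depth is Z = f*B/d (focal length in pixels, baseline in metres). A sketch with assumed values:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth is inversely proportional to disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, 12 cm baseline.
near = depth_from_disparity(64.0, focal_px=800.0, baseline_m=0.12)
far = depth_from_disparity(8.0, focal_px=800.0, baseline_m=0.12)
```

Note how the same one-pixel disparity error matters far more at long range, where disparities are small, than close up.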
STRUCTURE FROM MOTION
Structure from motion (SfM) is a photogrammetric range imaging technique for estimating three-
dimensional structures from two-dimensional image sequences that may be coupled with local motion
signals. It is studied in the fields of computer vision and visual perception.
In biological vision, SfM refers to the phenomenon by which humans (and other living creatures) can
recover 3D structure from the projected 2D (retinal) motion field of a moving object or scene.
Motion Planning
Motion Planning - Robotics
MOTION PLANNING
Goals:
• Collision-free trajectories.
• Robot should reach the goal location as fast as possible.
Dynamic Environments:
How to react to unforeseen obstacles?
• Efficiency
• Reliability
Configuration Space: A configuration describes the pose of the robot, and the configuration space C is the set
of all possible configurations.
For example: If the robot is a single point (zero-sized) translating in a 2-dimensional plane (the workspace), C
is a plane, and a configuration can be represented using two parameters (x, y).
If the robot is a 2D shape that can translate and rotate, the workspace is still 2-dimensional. However, C is the
special Euclidean group SE(2) = R² × SO(2) (where SO(2) is the special orthogonal group of 2D rotations), and
a configuration can be represented using 3 parameters (x, y, θ).
If the robot is a solid 3D shape that can translate and rotate, the workspace is 3-dimensional, but C is the
special Euclidean group SE(3) = R³ × SO(3), and a configuration requires 6 parameters: (x, y, z) for translation,
and Euler angles (α, β, γ).
If the robot is a fixed-base manipulator with N revolute joints (and no closed-loops), C is N-dimensional.
Target space:
• Target space is a linear subspace of free space which denotes where we
want the robot to move to.
• To solve this problem, the robot goes through several virtual target
spaces, each of which is located within the observable area (around the
robot). A virtual target space is called a sub-goal.
REWARD-BASED ALGORITHMS
Reward-based search:
Reward-based algorithms assume that the robot in each state (position and internal state, including
direction) can choose between different actions (motion).
However, the result of each action is not definite. In other words, outcomes (displacement) are partly
random and partly under the control of the robot. The robot gets positive reward when it reaches the
target and gets negative reward if it collides with an obstacle.
These algorithms try to find a path which maximizes cumulative future rewards. The Markov decision
process (MDP) is a popular mathematical framework that is used in many reward-based algorithms.
The advantage of MDPs over other reward-based algorithms is that they generate the optimal path.
The disadvantage of MDPs is that they limit the robot to choosing from a finite set of actions. Therefore, the
path is not smooth (similar to grid-based approaches). Fuzzy Markov decision processes (FMDPs) are an
extension of MDPs which generate smooth paths using a fuzzy inference system.
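A deterministic grid-world version of value iteration can be sketched as follows; the rewards, discount factor, and 4x4 layout are assumptions for illustration. Each free cell's value converges to the discounted reward of the shortest obstacle-free path to the goal:

```python
def value_iteration(rows, cols, goal, obstacles, gamma=0.9, iters=100):
    """Deterministic grid MDP: reward 1 at the goal, -1 at obstacles;
    value iteration propagates discounted reward through free cells."""
    V = {(r, c): 0.0 for r in range(rows) for c in range(cols)}
    for _ in range(iters):
        for cell in V:
            if cell == goal:
                V[cell] = 1.0
                continue
            if cell in obstacles:
                V[cell] = -1.0
                continue
            r, c = cell
            # Best value reachable in one move (up/down/left/right).
            moves = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            V[cell] = gamma * max(V[m] for m in moves if m in V)
    return V

V = value_iteration(4, 4, goal=(3, 3), obstacles={(1, 1), (2, 2)})
```

The optimal policy then simply moves each step to the neighbouring cell of highest value, which is how the "maximize cumulative future reward" path emerges.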
POTENTIAL FIELD ALGORITHM
One approach is to treat the robot's configuration as a point (like a charged particle) in a potential field that
combines attraction to the goal and repulsion from obstacles.
The resulting trajectory is output as the path. This approach has the advantage that the trajectory is
produced with little computation. However, the robot can become trapped in local minima of the potential field,
and fail to find a path.
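One gradient-descent step on such a combined potential can be sketched as follows. The gains, step size, and influence radius are made-up values, and the quadratic attractive term with a bounded-range repulsive term is one common choice, not the only one:

```python
import math

def potential_field_step(pos, goal, obstacles, step=0.05,
                         k_att=1.0, k_rep=0.01, influence=0.2):
    """One gradient-descent step on an attractive-plus-repulsive potential."""
    gx = k_att * (pos[0] - goal[0])   # gradient of 0.5*k_att*||p - goal||^2
    gy = k_att * (pos[1] - goal[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 1e-9 < d < influence:      # repulsion acts only near the obstacle
            coef = -k_rep * (1.0 / d - 1.0 / influence) / d ** 3
            gx += coef * dx
            gy += coef * dy
    return (pos[0] - step * gx, pos[1] - step * gy)

# With no obstacle near the straight line, the robot descends to the goal.
p = (0.0, 0.0)
for _ in range(300):
    p = potential_field_step(p, goal=(1.0, 0.0), obstacles=[(0.5, 0.5)])
```

If the obstacle sat exactly on the line to the goal, the attractive and repulsive gradients could cancel and the robot would stall: the local-minimum failure mode noted above.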
Sampling-based algorithms:
Sampling-based algorithms represent the configuration space with a roadmap of sampled configurations.
A basic algorithm samples N configurations in C, and retains those in Cfree to use as milestones. A
roadmap is then constructed that connects two milestones P and Q if the line segment PQ is completely in
Cfree. Again, collision detection is used to test inclusion in Cfree. To find a path that connects S and G, they
are added to the roadmap. If a path in the roadmap links S and G, the planner succeeds, and returns that
path. If not, the reason is not definitive: either there is no path in Cfree, or the planner did not sample
enough milestones.
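The milestone-sampling scheme above can be sketched as below. For determinism the samples here are a fixed grid rather than random configurations, and the circular obstacles, radii, and connection radius are assumptions:

```python
import math

def collision_free(p, obstacles, radius=0.15):
    """A configuration is free if it lies outside every circular obstacle."""
    return all(math.hypot(p[0] - ox, p[1] - oy) > radius for ox, oy in obstacles)

def segment_free(p, q, obstacles, checks=20):
    """Approximate edge check: sample points along the segment PQ."""
    return all(collision_free((p[0] + (q[0] - p[0]) * i / checks,
                               p[1] + (q[1] - p[1]) * i / checks), obstacles)
               for i in range(checks + 1))

def roadmap_connected(start, goal, samples, obstacles, connect_radius=0.15):
    """Keep free milestones, link nearby ones whose segment is free,
    then search the roadmap from start to goal."""
    nodes = [p for p in [start, goal] + samples if collision_free(p, obstacles)]
    edges = {i: [] for i in range(len(nodes))}
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            near = math.hypot(nodes[i][0] - nodes[j][0],
                              nodes[i][1] - nodes[j][1]) < connect_radius
            if near and segment_free(nodes[i], nodes[j], obstacles):
                edges[i].append(j)
                edges[j].append(i)
    frontier, seen = [0], {0}          # graph search from the start node
    while frontier:
        i = frontier.pop()
        if nodes[i] == goal:
            return True
        frontier += [j for j in edges[i] if j not in seen]
        seen.update(edges[i])
    return False

# Deterministic grid of milestones stands in for random samples.
grid = [(i / 10, j / 10) for i in range(11) for j in range(11)]
found = roadmap_connected((0.0, 0.0), (1.0, 1.0), grid, obstacles=[(0.5, 0.5)])
```

As the text notes, failure is inconclusive: with random sampling, a returned False may simply mean too few milestones were drawn, not that no path exists.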