
Robotics and Computer-Integrated Manufacturing 32 (2015) 25–36


Computer vision technology for seam tracking in robotic GTAW and GMAW

Yanling Xu a,*, Gu Fang b, Na Lv a, Shanben Chen a, Ju Jia Zou b

a School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai, China
b School of Engineering, University of Western Sydney, Sydney, Australia

ARTICLE INFO

Article history:
Received 1 January 2014
Received in revised form 1 September 2014
Accepted 1 September 2014

Keywords:
Robot welding
Vision technology
Seam tracking
Image processing
GTAW/GMAW

ABSTRACT

Due to the ever increasing demand for precision in robotic welding automation and its inherent technical difficulties, seam tracking has become a research hotspot. This paper introduces research on the application of computer vision technology for real-time seam tracking in robotic gas tungsten arc welding (GTAW) and gas metal arc welding (GMAW). The key aspect in using vision techniques to track welding seams is to acquire clear real-time weld images and to process them accurately, which is directly related to the precision of seam tracking. In order to further improve the accuracy of seam tracking, a special vision system was first designed that can acquire clear and steady real-time weld images. By analyzing the features of the weld images, a new and improved edge detection algorithm was proposed to detect the edges in weld images and to extract the seam and pool characteristic parameters more accurately. The image processing precision was verified through experiments. Results showed that the precision of this vision based tracking technology can be controlled to be within ±0.17 mm and ±0.3 mm in robotic GTAW and GMAW, respectively.

© 2014 Published by Elsevier Ltd.

* Corresponding author. E-mail address: xuyanling991@sina.com (Y. Xu).
http://dx.doi.org/10.1016/j.rcim.2014.09.002

1. Introduction

At present, a growing number of welding robots have been applied in modern industrial automated production; however, most of them are teach-and-playback robots. They all have one vital weakness, namely, they lack the ability to self-rectify deviations during the robotic welding process. That is, teach-and-playback robots require a great deal of time to be taught in advance when the work-pieces are changed. Even for the same work-pieces, the seam position is often disturbed due to distortion, the way heat spreads, variability of the gap, staggered edges, etc., which will affect the quality of the weld formation. This weakness is particularly evident when the welding precision requirement is high. Therefore, it is necessary to develop a technology for real-time seam tracking for welding robots.

Currently, real-time seam tracking is realized mainly by using various sensors in the robotic arc welding process, such as arc sensors [1–4], acoustic sensors [5,6], electromagnetic sensors [7], ultrasonic sensors [8,9] and vision sensors [10–15]. Among these sensors, the vision sensor is the most commonly used in robotic welding [16,17]. The vision sensor has more advantages than other sensors, such as rich visual information, non-contact operation, fast response and high precision, so it has become the research focus in robotic seam tracking. Kawahara et al. described a tracking control system for arc welding using an image sensor [10]. Chen et al. established a computer vision system by utilizing composite filtering technology, and captured clear weld pool images during robotic pulsed gas tungsten arc welding (GTAW) [11]. Shen et al. studied the weld pool control of a welding robot with computer vision to control weld penetration [12]. Kong et al. gave an Al alloy weld pool control method for a welding robot with passive vision [13]. Xu et al. presented a real-time seam tracking control technology for the welding robot GTAW process based on a passive vision system [14]. Ye et al. developed a robust algorithm for weld seam extraction based on prior knowledge of the weld seam in robotic GMAW [15]. According to the light source used, vision systems can be divided into two categories: the active vision system and the passive vision system. Because the passive vision system is cheaper and can obtain enough seam information in robotic seam tracking, this paper chooses the passive vision sensor system for robotic seam tracking.

Robotic arc welding mainly includes robotic gas tungsten arc welding (GTAW) and gas metal arc welding (GMAW). Although vision based techniques for weld seam tracking have made some significant achievements, existing methods could not meet the higher precision requirements of some welding processes. In particular, during the welding of sheet metal without a groove there is a higher precision requirement that existing methods could not address.

This paper focuses on addressing the real-time seam tracking problem in robotic GTAW and GMAW of sheet metals. To achieve this, two problems need to be solved, namely (1) how to capture clear welding images and (2) how to accurately extract characteristic parameters from real-time seam tracking images through image processing. Both aspects are crucial to effectively improve the accuracy of seam tracking in robotic GTAW and GMAW with a passive vision system.

2. Design of vision system

For the seam tracking of a teach-and-playback welding robot, the vision system plays a key role. In this paper, a passive vision system was purposely designed for seam tracking. Fig. 1 shows the structure design (right) and a picture of the passive vision system (left) (Patent number: 2012101210118579).

When using a vision sensor in robotic welding, the dimmer-filter system is the key component. It often needs to be removed from the CCD camera during welding environment identification and initial seam position guiding, while it must be placed under the CCD camera during welding. In the passive vision system of this paper, a motorized mechanism can remove the dimmer-filter system automatically over long distances before welding and place it back during welding for seam tracking. This automated mechanism allows the system to work in adverse and dangerous conditions, such as poisonous or nuclear radiation environments. The designed automatic transmission system includes a DC micro-motor, a pair of gear racks and a pair of linear guide rails.

Once the system is installed onto the robot, a calibration is performed to establish the intrinsic and extrinsic parameters of the camera [18]. The results of the camera calibration are shown in Table 1. From Table 1, it can be seen that the pixel error range is within 0.33–0.62 pixel. This pixel error translates to a precision range of within 0.03–0.05 mm in spatial measurements. This precision can satisfactorily meet the seam-tracking requirement of a welding robot. Meanwhile, deformation inevitably exists in the coordinate system of the image plane relative to the absolute coordinate system, so the calibration is necessary before any experiments. The relation between the image plane coordinates and the absolute coordinates can be calculated using Eq. (1) [12]:

x_real = { -b + [ b^2 + 4k (x_image - x_image0) d_real ]^(1/2) } / (2k)
y_real = (y_image - y_image0) d_real^2 / (x_real k' d_real + b')                 (1)

where (x_image, y_image) and (x_real, y_real) are the image plane coordinates and the real world coordinates, respectively; (x_image0, y_image0) are the x, y coordinates of the origin of the absolute coordinate system O(0, 0) on the image plane coordinate system; k, k' and b, b' are the scale factors and the intercepts; and d_real is the interval of the calibration panes.

To eliminate the disturbance of arc light during robotic seam tracking, a dimmer-filter system is used. To perform a proper filtering of the light, the light spectrum of the welding arc is analyzed.
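To make the coordinate conversion of Eq. (1) concrete, the following minimal sketch evaluates the equation as reconstructed above for a single image point. It assumes the calibration constants k, k', b, b' and the pane interval d_real have already been determined; the function name and argument layout are illustrative only and are not taken from the paper.

```python
import math

def image_to_real(x_img, y_img, x_img0, y_img0, k, k_prime, b, b_prime, d_real):
    """Convert an image-plane point to workpiece coordinates per Eq. (1).

    k, b           : scale factor and intercept for the x direction
    k_prime, b_prime : scale factor and intercept for the y direction (k', b')
    d_real         : interval of the calibration panes (mm)
    """
    # x_real follows from the quadratic relation in Eq. (1); the discriminant is
    # assumed non-negative for points inside the calibrated field of view
    x_real = (-b + math.sqrt(b * b + 4.0 * k * (x_img - x_img0) * d_real)) / (2.0 * k)
    # y_real rescales the vertical image offset by the local magnification
    y_real = (y_img - y_img0) * d_real ** 2 / (x_real * k_prime * d_real + b_prime)
    return x_real, y_real
```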

Fig. 1. Design of a passive vision system.

Table 1
Calibration results of the CCD.

Calibration results after optimization (with uncertainties):
Focal length fc:     [3229.14870  3234.68237] ± [235.17548  224.40216]
Principal point cc:  [179.10083  193.67640] ± [144.48150  96.16236]
Skew alpha_c:        [0.00000] ± [0.00000]; angle of pixel axes = 90.00000 ± 0.00000 degrees
Distortion kc:       [0.88165  -15.19803  -0.01775  -0.03744  0.00000] ± [0.66612  18.69930  0.02328  0.03856  0.00000]
Pixel error err:     [0.33404  0.61929]
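Calibration results of the kind listed in Table 1 (focal length, principal point, distortion coefficients and per-axis pixel error) can be obtained with a standard checkerboard procedure. The sketch below uses OpenCV and assumes a folder of checkerboard images taken with the welding CCD; the board geometry and file path are placeholders, not values from the paper, and the hand-eye calibration of [18] is a separate step not shown here.

```python
import glob
import cv2
import numpy as np

# Checkerboard geometry (placeholder: 9 x 6 inner corners, 10 mm squares)
BOARD = (9, 6)
SQUARE_MM = 10.0

objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)

for path in glob.glob("calib/*.png"):            # placeholder image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if not found:
        continue
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
    obj_points.append(objp)
    img_points.append(corners)

# rms is the reprojection (pixel) error, comparable to the last row of Table 1
rms, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("focal lengths:", mtx[0, 0], mtx[1, 1])
print("principal point:", mtx[0, 2], mtx[1, 2])
print("distortion:", dist.ravel())
print("RMS pixel error:", rms)
```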

Fig. 2. Distribution map of arc spectrum character of GTAW/GMAW: (a) pulsed GTAW for Al alloy, (b) pulsed GMAW for Q235 steel.

Fig. 3. The robotic GTAW system.

Fig. 4. The robotic GMAW system.



Fig. 5. The flow chart of the seam tracking control program for robotic GTAW and GMAW.

Table 2
The relevant technological parameters of the experiment for Al alloy GTAW.

Pulse frequency: 2 Hz                   Duty cycle of pulse duration: 50 percent
Peak current: 230 A                     Argon flow: 15 L/min
Base current: 30 A                      Work-piece (LF6): 3 mm
Feed speed: 10 mm/s                     Welding speed: 3 mm/s
Tungsten anode diameter: 3.2 mm         Shielding gas: 99.99% Ar

Table 3
The relevant technological parameters of the experiment for Q235 steel GMAW.

Welding materials: Q235 steel           Weld frequency: 100 Hz
Type of welding seam: Butt joint        Welding speed: 7 mm/s
Pulse frequency: 100 Hz                 Wire diameter: 1.2 mm
Welding current: 270 A                  Shielding gas: 92% Ar + 8% CO2
Welding volts: 17 V                     Gas flow: 16 L/min
Feed speed: 2 mm/s                      Work-piece: 2.5 mm

Taking the pulsed GTAW of Al alloy and the pulsed GMAW of Q235 steel as examples, Fig. 2 shows the distribution map of the arc spectrum character. From the figure, it can be seen that the electric arc light in the range of 620–700 nm is the weakest and most stable, and it can therefore be used to capture welding images. Further experiments show that an optical filter with a central wavelength of 660 nm is suitable for our welding system.

3. Robotic arc welding system

Robotic GTAW and GMAW are the two most commonly used arc welding methods. Robotic GTAW uses high-frequency arc initiation, while robotic GMAW uses contact arc initiation. Because the way of arc ignition is different, the two welding systems are also different.

3.1. Robotic GTAW system

The robotic GTAW system is shown in Fig. 3. It includes six parts: the robot arm, the robot controller, the vision system, the isolation unit, the weld power supply and the host computer. The isolation unit (interface box) is a self-made circuit that isolates the host computer from the welder. Because of the high frequency and high voltage in robotic GTAW, the isolation unit is very important.

3.2. Robotic GMAW system

Fig. 4 is a schematic diagram of the robotic GMAW system. It consists of four parts: the robotic system, the vision system, the weld power supply and the host computer. The robotic system is a six-degree-of-freedom FANUC industrial robot. The weld power supply consists of a Lincoln AC welding machine and a wire feeder.

Fig. 5 is the flow chart of the seam tracking control program for robotic GTAW and GMAW. For seam tracking, the processing steps mainly include image capturing, image selecting and image processing.
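The control flow of Fig. 5 can be summarised as a loop that repeatedly captures a batch of frames, keeps the sharpest one, extracts the pool and seam parameters and sends a lateral correction to the robot. The sketch below is only an outline of that loop; the camera, image-processing and robot interfaces are passed in as callables and are hypothetical placeholders, not an API described in the paper.

```python
import time

CYCLE_S = 0.2          # correction interval: 5 corrections per second (GMAW tracking)
FRAMES_PER_CYCLE = 5   # frames captured within one correction interval

def tracking_loop(grab_frame, select_best, compute_deviation, send_offset, in_progress):
    """Outline of the seam-tracking cycle of Fig. 5 (all arguments are callables
    supplied by the camera, image-processing and robot interfaces)."""
    while in_progress():
        t0 = time.time()
        frames = [grab_frame() for _ in range(FRAMES_PER_CYCLE)]
        best = select_best(frames)            # gradient criterion of Eqs. (2)/(3)
        deviation = compute_deviation(best)   # image processing of Section 5
        send_offset(deviation)                # lateral correction of the torch path
        # keep the 0.2 s rhythm of the correction cycle
        time.sleep(max(0.0, CYCLE_S - (time.time() - t0)))
```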

Fig. 6. The welding image during the GTAW process.

Fig. 7. The welding current waveform of GMAW.

4. Capturing and selecting high quality images

To perform real-time seam tracking effectively during the robotic welding process, whether for robotic GTAW or GMAW, the real-time welding seam images must be captured clearly and processed accurately. In this paper, analyses have been done to determine the parameters that affect the image capturing. The method is developed based on Al alloy GTAW and low carbon steel GMAW. The relevant technological parameters of the experiments are shown in Tables 2 and 3, respectively.

4.1. Image capturing

4.1.1. Image capturing of GTAW

For the images captured during robotic GTAW, the quality of image acquisition is associated not only with the dimmer-filter system, but also with other factors, such as the value of the base current and the time of capturing the image. Through a large number of welding experiments, the parameters of Al alloy pulsed GTAW were ascertained. Using these parameters, clear and high quality images can be captured. For example, for the dimmer-filter system, the wavelength is about 660 nm, and the two dimmer glasses have attenuations of 89% and 96%, respectively. The image is captured when the current is at the base current, whose value is 30 A. The best time to capture images is 50 ms after the fall edge from the peak level to the background level. For the detailed image capturing method, refer to the paper by Xu et al. [19]. Fig. 6 shows the welding image of Al alloy during robotic GTAW obtained by using the passive vision system.

4.1.2. Image capturing of GMAW

Compared with robotic GTAW, it is more difficult to acquire clear welding images in robotic GMAW. For GMAW, the welding frequency is much higher than in GTAW. In GTAW, which has a low pulse frequency of 1–10 Hz, clear images can be acquired during the time when the current is at its lowest [19]. However, for the GMAW process, the welding current pulse is at a much higher frequency (around 100 Hz). Fig. 7 shows the welding current waveform of the GMAW process. From Fig. 7, it can be seen that the current pulse frequency is around 100 Hz, and a current cycle is around 10 ms. Therefore, if the same image capturing method were used in GMAW as in GTAW, the images would have to be captured at around 200 fps. This would require a very expensive camera device. The acquisition of clear welding images is always technologically difficult in robotic GMAW.

In this paper, a USB CCD camera is used that has a maximum image collection frequency of around 30 Hz. In the actual experiment, the collection frequency is chosen to be 25 fps. This means that the time duration between two consecutive images is 40 ms, so each image could be exposed over up to four current periods. Therefore it is impossible to capture weld images for GMAW only during the base current period. In the robotic GMAW process, clear images are acquired mainly by selecting an appropriate dimmer-filter system. As mentioned in Section 2, a narrow-band filter with a pass wavelength of 660 nm is chosen together with a dimmer glass with an attenuation of 94%. Fig. 8 shows one of the weld seam images captured using this dimmer-filter system. Although the image of GMAW is not as good as that of GTAW, the edges of the seam and pool can still be extracted.

4.2. Image selection

As the quality of welding images is affected by many factors, it is difficult to ensure that every image captured is clear even when the same parameters are used for capturing images. Processing an image without useful information not only is a waste of time but also could potentially lead to wrong results being derived. Therefore, a method is required to quickly determine which images are useful for further processing.

4.2.1. Image selection of GTAW

In this paper, the welding pulse frequency is 2 Hz in robotic GTAW, i.e., the cycle time of the welding current is 500 ms, and the duration of the base current is 250 ms in a current cycle. Fig. 9 shows the timing of image capturing for GTAW. From the above analysis, clear images are acquired during the base current stage. The start time of capturing an image is T + 50 ms, where T is the fall edge moment from peak level to base level. Fig. 10 shows four seam images of Al alloy captured in real-time within 200 ms during the base current stage. Each image is captured at a 50 ms interval.
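For the GTAW timing described above (2 Hz pulse, 250 ms base-current stage, first frame 50 ms after the falling edge, 50 ms spacing), the capture instants inside one current cycle can be written down directly. The snippet below is a small sketch of that schedule; the falling-edge timestamp T is assumed to be provided by the welding power source, and the function name is illustrative.

```python
def gtaw_capture_times(t_fall, first_delay=0.050, interval=0.050, frames=4):
    """Return the capture instants (s) inside one base-current stage.

    t_fall      : falling-edge moment T from peak to base current
    first_delay : 50 ms delay before the first frame
    interval    : 50 ms spacing between consecutive frames
    """
    return [t_fall + first_delay + i * interval for i in range(frames)]

# Example: falling edge at t = 1.000 s -> frames at 1.05, 1.10, 1.15, 1.20 s
print(gtaw_capture_times(1.000))
```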

Fig. 8. The welding image of low carbon steel for GMAW.

Fig. 9. The time of capturing image for GTAW.

Due to the slow speed in robotic GTAW, the deviation is detected once every current cycle. The deviation is acquired from one of the four images. However, there are some differences between the images at the different moments of the same base current stage. We must choose one out of every four images to process, so how to select the image is critical. According to the analysis of the images, it is not difficult to find that the larger the change of the gray value at the seam and pool edges, the better the result of the edge extraction should be. In order to distinguish the largest change of the gray level, one line is drawn through the weld pool in every image, as shown in Fig. 11(a). Fig. 11(b) and (c) show the pixel distribution along the line and the gray gradient information in the image, respectively.

The average value of the intensity gradient for the two peak values, i.e., the two edge points, along the line in the weld image is given in Eq. (2), as shown in Fig. 11(c):

G_GTAW = (|G_a| + |G_b|) / 2                 (2)

where G_GTAW is the average value of the intensity gradient of the two edge points in the image, and G_a and G_b are the intensity gradient values of the two edge points along the line, respectively. The larger the G_GTAW, the better the image quality. From G_GTAW, we can determine which image quality is good enough to provide sharp edges at object boundaries, and we can easily choose the best image from the four images.

4.2.2. Image selection of GMAW

For GMAW, the welding speed is generally faster than in GTAW. During the actual seam tracking for robotic GMAW, the seam deviation is detected five times every second, which means that the robot trajectory is adjusted every 0.2 s. However, five images are captured within 0.2 s, and it is unnecessary to process each image. Therefore, we need to choose the best image out of every five images. Fig. 12 shows the five serial seam images of Q235 steel captured during 0.2 s of the GMAW process.

For the GMAW images, this paper takes the same approach to select the useful image, as shown in Fig. 13(a)–(c). According to G_GMAW in Eq. (3), we can easily choose the best image as the useful image from every five images:

G_GMAW = (|G_a| + |G_b| + |G_c| + |G_d|) / 4                 (3)

5. Image processing

In this paper, the aim of the image processing is to obtain the weld pool center O(a_x, a_y) and the weld seam centerline equation y = kx + b. To achieve this, a method is developed by integrating several image processing algorithms, such as image restoration, image smoothing, edge detection, false edge removal and edge scanning. By calculating the distance from the pool center to the seam centerline, the deviation between the welding wire and the weld seam centerline can be acquired. Fig. 14 shows the definition of these geometrical characteristic parameters for pulsed GTAW and GMAW. Compared with robotic GTAW, clear weld images of GMAW are more difficult to capture, which increases the difficulty of processing the image. Therefore, this paper focuses on introducing the image processing of robotic GMAW, as this image processing method can be easily adopted for GTAW applications.

In image processing, if the whole welding image is processed, it will be a huge waste of time, as only the immediate neighborhood of the seam is required for tracking purposes. In order to improve the speed of image processing, two windows are introduced to focus the processing only on the required locations. The windows are created to cover the areas of the pool and the seam. As Fig. 15 shows, windows 1 and 2 are the areas of the weld pool and the seam, respectively.

In addition to the image window creation, several other operations are performed, including image restoration, image smoothing, edge detection, false edge removal and edge scanning. Fig. 16 is the flow chart of the image processing.

5.1. Edge detection – an improved method

In the image processing, edge detection is the most vital step. Among the numerous available edge detection algorithms, each algorithm has its advantages and disadvantages in certain situations.

Fig. 10. The real-time weld seam image of Al alloy within 200 ms during GTAW process.

Fig. 11. (a) The welding image of GTAW, (b) the pixel distribution along the line, (c) the gray gradient information along the line.
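A minimal sketch of the gradient criterion of Eqs. (2) and (3) is given below: a scan line is drawn through the pool region as in Fig. 11, the gray-level gradient along it is computed, the strongest edge responses are averaged, and the frame with the largest average is kept. The row index of the scan line and the use of the top-n gradient samples as "edge points" are simplifying assumptions made here for illustration (two peaks for GTAW, four for GMAW).

```python
import numpy as np

def edge_gradient_score(gray, row, n_peaks=2):
    """Average |gradient| of the n_peaks strongest edge responses on one scan line.

    gray    : 2-D grayscale weld image (numpy array)
    row     : index of the line drawn through the weld pool
    n_peaks : 2 for G_GTAW (Eq. 2), 4 for G_GMAW (Eq. 3)
    """
    line = gray[row, :].astype(np.float64)
    grad = np.abs(np.gradient(line))          # gray-gradient along the line
    peaks = np.sort(grad)[-n_peaks:]          # strongest edge responses
    return peaks.mean()

def select_best_image(frames, row, n_peaks=2):
    """Keep the frame whose scan-line edge gradient score is largest."""
    scores = [edge_gradient_score(f, row, n_peaks) for f in frames]
    return frames[int(np.argmax(scores))]
```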

Fig. 12. The real-time welding image of Q235 steel during GMAW process.

Some do well in edge detection while others are good at noise suppression. Taking weld pool image processing as an example, some typical algorithms, such as Roberts, Sobel, Prewitt, Laplacian and Canny, are used to detect the pool edge. The results of the edge detection are shown in Fig. 17. By comparison among the different algorithms, we can see that the traditional Canny algorithm is clearly superior to the other algorithms in the accuracy and continuity of edge detection [20].

However, the traditional Canny algorithm has some shortcomings when detecting the edges of the welding image. First, the traditional Canny algorithm uses a Gaussian function to smooth images, which cannot remove local noise and may detect fake edges or lose local edges whose intensity values change slowly. Second, the high and low thresholds for detecting image edges need to be set manually.
32 Y. Xu et al. / Robotics and Computer-Integrated Manufacturing 32 (2015) 2536

Fig. 13. (a) The welding image of GMAW, (b) the pixel distribution along the line, (c) the gray gradient information along the line.

Fig. 14. The definition of characteristic parameters for welding image: (a) pulsed GTAW, (b) pulsed GMAW.

Fig. 15. Extraction of small window: (a) pulsed GTAW, (b) pulsed GMAW.
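Restricting the processing to the two windows of Fig. 15 is a plain cropping operation. The sketch below cuts a pool window and a seam window out of a frame; the window coordinates are illustrative assumptions only, since the actual window placement depends on the torch position in the image.

```python
import numpy as np

def extract_windows(gray, pool_box, seam_box):
    """Crop the pool window (window 1) and seam window (window 2).

    Each box is (top, bottom, left, right) in pixel coordinates.
    """
    t, b, l, r = pool_box
    pool_roi = gray[t:b, l:r]
    t, b, l, r = seam_box
    seam_roi = gray[t:b, l:r]
    return pool_roi, seam_roi

# Illustrative window positions (not values from the paper)
pool_roi, seam_roi = extract_windows(np.zeros((480, 640), np.uint8),
                                     pool_box=(200, 320, 220, 420),
                                     seam_box=(330, 460, 260, 380))
```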

However, the robotic arc welding process involves many uncertain aspects, such as the unstable electric arc and constant changes of external factors. Even if the process is the same, the welding images acquired may be different. Therefore, to make the Canny edge detection effective, a method of automatically determining the two thresholds used in Canny needs to be implemented for robot welding automation.

An improved Canny edge detection algorithm is proposed in this paper to address the two problems mentioned above. First, in the improved Canny algorithm, the Gaussian filter is replaced with a nonlinear anisotropic diffusion filter, which can locally smooth images, avoid excessive smoothing and retain large scale edge information. Second, the values of the high and low thresholds are obtained automatically by using the Otsu algorithm in the proposed method, and the edges are detected and connected with the acquired thresholds [19]. Fig. 18 is the comparison between the traditional Canny algorithm and the improved Canny algorithm for edge detection. The results show that the improved Canny algorithm improves the edge detection of different weld images and has better precision and adaptability in edge detection during automated robotic welding processes.

Fig. 16. The specific flow chart of image processing.

5.2. Characteristic parameters extraction

By using the above image processing method, the pool center coordinate O(a_x, a_y) can be extracted. The same method applies to the seam processing. We can also fit two line functions from the arrays of left-edge and right-edge points:

f_left(x) = k1 x + b1
f_right(x) = k2 x + b2,        0 <= x <= n_width                 (4)

where f_left(x) and f_right(x) are the left-edge and right-edge functions, respectively. From the two functions, we can obtain the seam centerline y = ((k1 + k2)/2) x + (b1 + b2)/2. Then the deviation d(t) between the pool center point and the seam centerline can be calculated; the calculation equation is given in Eq. (5):

d(t) = | ((k1 + k2)/2) a_x - a_y + (b1 + b2)/2 | / [ ((k1 + k2)/2)^2 + 1 ]^(1/2)                 (5)

5.3. Image processing flow

With the above steps, the complete processing flow for the seam and pool images in robotic GMAW is shown in Fig. 19. The methods of welding image processing developed for the GMAW images are also applicable to GTAW images.

6. Experimental verification

It is generally known that the robotic arc welding process is a very complex and variable process, which often affects the welding image quality and the precision of processing.

Fig. 17. The comparison diagram of several typical algorithms of edge detection: (a) Original image, (b) Roberts, (c) Sobel, (d) Prewitt, (e) Laplacian, (f) Canny.

Fig. 18. Comparison of Canny and improved Canny: (a) Canny, (b) improved Canny.
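Section 5.1 replaces the Gaussian pre-filter with nonlinear anisotropic diffusion and derives the two Canny thresholds automatically with Otsu's method. The sketch below is one possible realisation of that idea using a basic Perona-Malik diffusion step and OpenCV; the number of iterations, the diffusion constants and the low/high threshold ratio are assumptions for illustration, not the parameters used by the authors.

```python
import cv2
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=30.0, gamma=0.15):
    """Basic Perona-Malik diffusion: smooths flat regions, preserves edges."""
    u = img.astype(np.float64)
    for _ in range(n_iter):
        # finite-difference gradients towards the four neighbours
        n = np.roll(u, -1, axis=0) - u
        s = np.roll(u, 1, axis=0) - u
        e = np.roll(u, -1, axis=1) - u
        w = np.roll(u, 1, axis=1) - u
        # conduction coefficients: small across strong edges, large in flat areas
        cn, cs = np.exp(-(n / kappa) ** 2), np.exp(-(s / kappa) ** 2)
        ce, cw = np.exp(-(e / kappa) ** 2), np.exp(-(w / kappa) ** 2)
        u += gamma * (cn * n + cs * s + ce * e + cw * w)
    return np.clip(u, 0, 255).astype(np.uint8)

def improved_canny(gray):
    """Edge detection with diffusion pre-filtering and Otsu-derived thresholds."""
    smoothed = anisotropic_diffusion(gray)
    # Otsu's threshold serves as the high threshold; low = half of it (assumed ratio)
    high, _ = cv2.threshold(smoothed, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return cv2.Canny(smoothed, 0.5 * high, high)
```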

Fig. 19. The flow diagram of image processing during the robot GMAW process.
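Following the flow of Fig. 19, once edge points of the seam have been obtained by the edge scan, Eq. (4) fits the left and right edges as straight lines and Eq. (5) gives the distance from the pool centre to the seam centreline. The sketch below implements these two equations directly with NumPy; the edge-point arrays and the pool centre are assumed to come from the preceding processing steps.

```python
import numpy as np

def seam_deviation(left_pts, right_pts, pool_center):
    """Deviation d(t) between the pool centre and the seam centreline.

    left_pts, right_pts : arrays of (x, y) edge points from the edge scan
    pool_center         : (ax, ay), centre of the weld pool
    """
    # Eq. (4): straight-line fits y = k1*x + b1 and y = k2*x + b2
    k1, b1 = np.polyfit(left_pts[:, 0], left_pts[:, 1], 1)
    k2, b2 = np.polyfit(right_pts[:, 0], right_pts[:, 1], 1)

    # seam centreline y = ((k1 + k2)/2) x + (b1 + b2)/2
    k = 0.5 * (k1 + k2)
    b = 0.5 * (b1 + b2)

    # Eq. (5): point-to-line distance from the pool centre to the centreline
    ax, ay = pool_center
    return abs(k * ax - ay + b) / np.sqrt(k ** 2 + 1.0)
```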

Fig. 20. The experimental setup of GTAW and GMAW for seam tracking: (a) pulsed GTAW, (b) pulsed GMAW.

Fig. 21. The experimental verification of image processing precision: (a) pulsed GTAW, (b) pulsed GMAW.

Experiments are conducted to verify the proposed method. Experiments of straight seam welding using GTAW and GMAW are performed to evaluate the precision of the above image processing methods.

Fig. 20 shows the experimental setup of GTAW (Fig. 20(a)) and GMAW (Fig. 20(b)). The robot is programmed to move along the seam center in the experiment. The passive vision system captures and processes the images in real-time, and calculates the deviation from the pool center to the seam center line. Fig. 21 shows the state of the work-pieces used in the experiment after welding.

As the robot is taught along the seam center in the experiments, the deviation values measured in the experiments can be approximately regarded as the precision of the image processing. The real-time deviation data are shown in Figs. 22 and 23 for robotic GTAW and GMAW. The results in Fig. 22 show that the precision range of the image processing of GTAW can be controlled to be within about ±0.17 mm [19]. From Fig. 23, we can see that the deviation range from the wire to the seam center is within ±0.30 mm. That means the precision range of the image processing for robotic GTAW and GMAW can be controlled to be within ±0.17 mm and ±0.3 mm, respectively.

Fig. 22. The precision range of the image processing of robotic GTAW.

Fig. 23. The precision range of the image processing of robotic GMAW.

7. Conclusions

To improve the welding precision in GTAW and GMAW, a vision based seam tracking technique is introduced in this paper. The vision-based seam tracking method consists of the design of the vision system, image capturing, image selecting and the image processing method. Experimental results show that the developed method is feasible and sufficient to meet the specific precision requirements of some applications in robotic seam tracking. The following conclusions can be drawn:

(1) In order to meet the needs of robot seam tracking, a special passive vision system was designed and the weld system platforms were built for robotic GTAW and GMAW.
(2) Using the purpose-designed passive vision system, clear weld images were acquired, which is the necessary prerequisite to improve the precision of seam tracking during the robotic arc welding process.
(3) By utilizing the gray gradient information along a line drawn through the weld image, the useful image can be quickly and reliably selected during the real-time image processing.
(4) An improved Canny algorithm for image processing was proposed that can accurately extract the characteristic parameters of welding images.
(5) The precision of the image processing approach was verified in robotic GTAW and GMAW by experiments. It can be controlled to be within ±0.17 mm and ±0.3 mm, respectively, which can satisfy the quality demand of seam forming in robotic arc welding.
Acknowledgements

This work is partly supported by the Australian Research Council and Lincoln Electric Company (Australia) under project ID LP0991108, and by the National Natural Science Foundation of China under Grant Nos. 51405298 and 61374071.

References

[1] Kim JW, Na SJ. A self-organizing fuzzy control approach to arc sensor for weld joint tracking in gas metal arc welding of butt joints. Weld Res Suppl 1993;2:60–6.
[2] Jeong SK, Lee GY, Lee WK, Kim SB. Development of high speed rotating arc sensor and seam tracking controller for welding robots. Ind Electron 2001;2:845–50.
[3] Shi YH, Yoo WS, Na SJ. Mathematical modeling of rotational arc sensor in GMAW and its applications to seam tracking and endpoint detection. Sci Technol Weld Joining 2006;11:723–30.
Fig. 23. The precision range of the image processing of robotic GMAW.

[4] Xu YL, Zhong JY, Ding MY, Chen HB, Chen SB. The acquisition and processing of real-time information for height tracking of robotic GTAW process by arc sensor. Int J Adv Manuf Technol 2013;65:1031–43.
[5] Estochen EL, Neuman CP. Application of acoustic sensors to robotic seam tracking. Ind Electron 1984;3:219–24.
[6] Lv N, Xu YL, Zhang ZF, Wang JF, Chen B, Chen SB. Audio sensing and modeling of arc dynamic characteristic during pulsed Al alloy GTAW process. Sens Rev 2013;33:141–56.
[7] Kim JW, Shin JH. A study of a dual-electromagnetic sensor system for weld seam tracking of I-butt joints. Proc Inst Mech Eng Part B: J Eng Manuf 2003;217:1305–13.
[8] Maqueira B, Umeagukwu CI, Jarzynski J. Application of ultrasonic sensors to robotic seam tracking. IEEE Trans Rob Autom 1989;5:337–44.
[9] Mahajan A, Figueroa F. Intelligent seam tracking using ultrasonic sensors for robotic welding. Robotica 1997;15:275–81.
[10] Kawahara M. Tracking control system using image sensor for arc welding. Automatica 1983;19:357–63.
[11] Chen SB, Zhang Y, Qiu T, Lin T. Robotic welding systems with vision sensing and self-learning neuron control of arc weld dynamic process. J Intell Rob Syst 2003;36:191–208.
[12] Shen HY, Ma HB, Lin T, Chen SB. Research on weld pool control of welding robot with computer vision. Ind Robot 2007;34:467–75.
[13] Kong M, Chen SB. Al alloy weld pool control of welding robot with passive vision. Sens Rev 2009;29:28–37.
[14] Xu YL, Yu HW, Zhong JY, Lin T, Chen SB. Real-time seam tracking control technology during welding robot GTAW process based on passive vision sensor. J Mater Process Technol 2012;212:1654–62.
[15] Ye Z, Fang G, Chen SB, Dinham M. A robust algorithm for weld seam extraction based on prior knowledge of weld seam. Sens Rev 2013;33:125–33.
[16] Ye Z, Fang G, Chen SB, Zou JJ. Passive vision based seam tracking system for pulse-MAG welding. Int J Adv Manuf Technol 2013;67:1987–96.
[17] Dinham M, Fang G. Autonomous weld seam identification and localisation using eye-in-hand stereo vision for robotic arc welding. Rob Comput Integr Manuf 2013;29(5):288–301.
[18] Dinham M, Fang G. A low cost hand-eye calibration method for arc welding robots. In: Proceedings of the 2009 IEEE international conference on robotics and biomimetics (ROBIO 2009), Guilin, Guangxi, China; December 18–22, 2009. p. 1889–1893.
[19] Xu YL, Yu HW, Zhong JY, Lin T, Chen SB. Real-time image capturing and processing of seam and pool during robotic welding process. Int J: Ind Robot 2012;39:513–23.
[20] Canny J. A computational approach to edge detection. IEEE Trans Pattern Anal Mach Intell 1986;8:679–98.
