ARTICLE INFO

Article history: Received 1 January 2014; Received in revised form 1 September 2014; Accepted 1 September 2014

Keywords: Robot welding; Vision technology; Seam tracking; Image processing; GTAW/GMAW

ABSTRACT

Due to the ever increasing demand for precision in robotic welding automation and its inherent technical difficulties, seam tracking has become a research hotspot. This paper introduces research on the application of computer vision technology for real-time seam tracking in robotic gas tungsten arc welding (GTAW) and gas metal arc welding (GMAW). The key aspect in using vision techniques to track welding seams is to acquire clear real-time weld images and to process them accurately, which is directly related to the precision of seam tracking. In order to further improve the accuracy of seam tracking, a special vision system that can acquire clear and steady real-time weld images was designed first. By analyzing the features of weld images, a new and improved edge detection algorithm was proposed to detect the edges in weld images and more accurately extract the seam and pool characteristic parameters. The image processing precision was verified through experiments. Results showed that the precision of this vision based tracking technology can be controlled to be within ±0.17 mm and ±0.3 mm in robotic GTAW and GMAW, respectively.

http://dx.doi.org/10.1016/j.rcim.2014.09.002
0736-5845/© 2014 Published by Elsevier Ltd.
Y. Xu et al. / Robotics and Computer-Integrated Manufacturing 32 (2015) 25–36
During the welding of sheet metal without a groove there is a higher precision requirement that existing methods could not address. This paper focuses on addressing the real-time seam tracking problem in robotic GTAW and GMAW of sheet metals. To achieve this, two problems need to be solved, namely (1) how to capture clear welding images and (2) how to accurately extract characteristic parameters from real-time seam tracking images through image processing. Both aspects are crucial to effectively improving the accuracy of seam tracking in robotic GTAW and GMAW with a passive vision system.

2. Design of vision system

For the seam tracking of a teach-and-playback welding robot, the vision system plays a key role. In this paper, a passive vision system was purposely designed for seam tracking. Fig. 1 shows the designed vision system.

Fig. 1. The designed passive vision system (components: CCD, lens, micro-motor, dimmer-filter, gear, reflector, rack).

… radiation environments. The designed automatic transmission system includes a DC micro-motor, a pair of gear racks and a pair of linear guide rails.

Once the system is installed onto the robot, a calibration is performed to establish the intrinsic and extrinsic parameters of the camera [18]. The results of the camera calibration are shown in Table 1, from which it can be seen that the pixel error range is within 0.33–0.62 pixel. This pixel error translates to a precision range of within 0.03–0.05 mm in spatial measurements, which can satisfactorily meet the seam-tracking requirement of a welding robot.

Meanwhile, deformation inevitably exists in the coordinate system of the image plane relative to the absolute coordinate system, so calibration is necessary before any experiments. The relation between the image plane coordinates and the absolute coordinates can be calculated using Eq. (1) [12]:

  x_real = [−b ± (b² − 4k(x_image − x_image0))^(1/2)] / (2k)   (1)

with an analogous expression for y_real.
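Eq. (1) relates the image-plane coordinate to the absolute coordinate through a quadratic-root expression. Assuming that quadratic form, the conversion can be sketched as follows; the coefficients k, b and the reference pixel x_image0 below are illustrative values, not the paper's calibration results:

```python
import math

def real_to_image(x_real, x_image0, k, b):
    """Forward mapping implied by the quadratic relation assumed for Eq. (1):
    x_image = x_image0 - k*x_real**2 - b*x_real."""
    return x_image0 - k * x_real ** 2 - b * x_real

def image_to_real(x_image, x_image0, k, b):
    """Invert the mapping by solving k*x**2 + b*x + (x_image - x_image0) = 0
    and keeping the smaller-magnitude (physically meaningful) root."""
    disc = b * b - 4.0 * k * (x_image - x_image0)
    r1 = (-b + math.sqrt(disc)) / (2.0 * k)
    r2 = (-b - math.sqrt(disc)) / (2.0 * k)
    return min(r1, r2, key=abs)

# Round-trip check with illustrative coefficients.
k, b, x0 = 0.001, -2.0, 320.0
x_img = real_to_image(5.0, x0, k, b)
recovered = image_to_real(x_img, x0, k, b)
print(abs(recovered - 5.0) < 1e-9)
```

The round-trip check is a cheap way to confirm that the chosen root of the quadratic is the right one for the working range of the camera.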
Table 1. Calibration results of the CCD.
Fig. 2. Distribution map of arc spectrum characteristics of GTAW/GMAW: (a) pulsed GTAW for Al alloy, (b) pulsed GMAW for Q235 steel.
Fig. 3. The robotic GTAW system (vision sensor, robot controller, teach box, UP6 robot, positioner, interface box, gas source, host computer, water tank, weld power source).
Fig. 4. The robotic GMAW system (host computer, wire feeder, robot controller, visual sensor, FANUC robot).
Fig. 5. The flow chart of the seam tracking control program for robotic GTAW and GMAW (welding condition setting and parameter selection from the welding database; image capture and preprocessing (windowing, restoration, trajectory smoothing); edge detection; deviation calculation; display and robot control until arc off).
Table 2. The relevant technological parameters of the experiment for Al alloy GTAW.
  Pulse frequency: 2 Hz
  Duty cycle of pulse duration: 50%
  Peak current: 230 A
  Base current: 30 A
  Argon flow: 15 L/min
  Feed speed: 10 mm/s
  Welding speed: 3 mm/s
  Tungsten anode diameter: 3.2 mm
  Shielding gas: 99.99% Ar
  Work-piece (LF6) thickness: 3 mm

Table 3. The relevant technological parameters of the experiment for Q235 steel GMAW.
  Welding material: Q235 steel
  Type of welding seam: butt joint
  Weld frequency: 100 Hz
  Pulse frequency: 100 Hz
  Welding current: 270 A
  Welding voltage: 17 V
  Welding speed: 7 mm/s
  Feed speed: 2 mm/s
  Wire diameter: 1.2 mm
  Shielding gas: 92% Ar + 8% CO2
  Gas flow: 16 L/min
  Work-piece thickness: 2.5 mm
Robotic GTAW and GMAW are the two most commonly used arc welding methods. Robotic GTAW uses high-frequency arc initiation, while robotic GMAW uses contact arc initiation. Because the arc ignition methods differ, the two welding systems are also different.

3.1. Robotic GTAW system

The robotic GTAW system is shown in Fig. 3. It includes six parts: the robot arm, the robot controller, the vision system,

Fig. 4 is a schematic diagram of robotic GMAW. It consists of four parts: the robotic system, the vision system, the weld power and the host computer. The robotic system is a six-degree-of-freedom FANUC industrial robot. The weld power consists of a Lincoln AC welding machine and a wire feeder.

Fig. 5 is the flow chart of the seam tracking control program for robotic GTAW and GMAW. For seam tracking, the image processing steps mainly include image capturing, image selecting and image processing.
Fig. 6. The welding image of Al alloy during robotic GTAW (labeled: work-piece, tungsten electrode, weld pool, nozzle, weld seam, welding wire).
Fig. 7. The welding current waveform of GMAW (current (A) versus time (s)).

4. Capturing and selecting high quality images

To perform real-time seam tracking effectively during the robotic welding process, whether for robotic GTAW or GMAW, the real-time welding seam images must be captured clearly and processed accurately. In this paper, analyses have been done to determine the parameters that affect the image capturing. The method is developed based on Al alloy GTAW and low carbon steel GMAW. The relevant technological parameters of the experiments are shown in Tables 2 and 3, respectively.

4.1. Image capturing

4.1.1. Image capturing of GTAW

For the images captured during robotic GTAW, the quality of image acquisition is associated not only with the dimmer-filter system but also with other factors, such as the value of the base current and the time of capturing the image. Through a large number of welding experiments, the parameters of Al alloy pulsed GTAW were ascertained; using these parameters, clear and high quality images can be captured. For example, for the dimmer-filter system, the wavelength is about 660 nm, and the two dimmer glasses have attenuations of 89% and 96%, respectively. The image is captured when the current is at the base current, whose value is 30 A. The best time for capturing images is 50 ms after the fall edge from the peak level to the background level. For the detailed image capturing method, refer to the papers by Xu et al. [19]. Fig. 6 shows the welding image of Al alloy during robotic GTAW obtained using the passive vision system.

4.1.2. Image capturing of GMAW

Compared with robotic GTAW, it is more difficult to acquire clear welding images in robotic GMAW, because the GMAW welding frequency is much higher. In GTAW, which has a low pulse frequency of 1–10 Hz, clear images can be acquired during the time when the current is at its lowest [19]. However, for the GMAW process, the welding current pulse is at a much higher frequency (around 100 Hz). Fig. 7 shows the welding current waveform of the GMAW process: the current pulse frequency is around 100 Hz, and a current cycle is around 10 ms. Therefore, if the same image capturing method were used in GMAW as in GTAW, images would have to be captured at around 200 fps, which would require a very expensive camera. The acquisition of clear welding images has always been technologically difficult in robotic GMAW.

In this paper, a USB CCD camera is used that has a maximum image collection frequency of around 30 Hz. In the actual experiment, the collection frequency is chosen to be 25 fps, so the time duration between two consecutive images is 40 ms. This means that each image could be exposed over up to four current periods, and it is therefore impossible to capture weld images for GMAW only during the base current period. In the robotic GMAW process, clear images are acquired mainly by selecting an appropriate dimmer-filter system. As mentioned in Section 2, one narrow-band filter with a pass wavelength of 660 nm is chosen together with a dimmer glass with an attenuation of 94%. Fig. 8 shows one of the weld seam images captured using this dimmer-filter system. Although the GMAW image is not as good as the GTAW one, the edges of the seam and pool can still be extracted.

4.2. Image selection

As the quality of welding images is affected by many factors, it is difficult to ensure that every image captured is clear, even when the same parameters are used for capturing images. Processing an image without useful information is not only a waste of time but could also lead to wrong results being derived. Therefore, a method is required to quickly determine which images are useful for further processing.

4.2.1. Image selection of GTAW

In this paper, the welding pulse frequency is 2 Hz in robotic GTAW, i.e., the cycle time of the welding current is 500 ms, and the base current lasts 250 ms in each cycle. Fig. 9 shows the image capturing time for GTAW. From the above analysis, clear images are acquired during the base current period. The start time of capturing an image is T + 50 ms, where T is the moment of the fall edge from peak level to base level. Fig. 10 shows four seam images of Al alloy captured in real time within 200 ms during the base current stage; each image is captured at a 50 ms interval.

Due to the slow speed in robotic GTAW, the deviation is detected once every current cycle. The deviation is acquired from one of
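The capture-timing reasoning in Section 4 reduces to simple arithmetic; the sketch below reproduces it with the values given in the text (a 100 Hz GMAW pulse, a 25 fps camera, and the GTAW schedule of one frame every 50 ms after the fall edge moment T):

```python
# GMAW: to catch the base-current phase of every 10 ms current cycle,
# the camera would need roughly two frames per cycle.
pulse_hz = 100
cycle_ms = 1000 / pulse_hz             # 10 ms per current cycle
required_fps = 2 * pulse_hz            # ~200 fps, hence the expensive-camera problem
camera_fps = 25
frame_interval_ms = 1000 / camera_fps  # 40 ms between consecutive frames
cycles_per_exposure = frame_interval_ms / cycle_ms  # each frame spans ~4 cycles

# GTAW: 2 Hz pulse -> 500 ms cycle, 250 ms at base current.
# Frames are taken at T + 50, T + 100, T + 150, T + 200 ms after the fall edge.
fall_edge_T = 0.0
capture_times_ms = [fall_edge_T + 50 * i for i in range(1, 5)]

print(required_fps, frame_interval_ms, cycles_per_exposure, capture_times_ms)
```

The last line shows why GTAW can be sampled within a single base-current window while GMAW cannot: four full current cycles fit inside one 40 ms exposure.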
Fig. 11. (a) The welding image of GTAW (labeled: nozzle, electric arc, wire, weld pool, weld seam, work-piece), (b) the pixel distribution along the line, (c) the gray gradient information along the line.
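The gray gradient along a scan line (panels (b) and (c) of Fig. 11) also gives a quick way to judge whether an image carries useful edge information before full processing, as discussed in Section 4.2. A minimal NumPy sketch, where the scan row and the gradient threshold are illustrative assumptions:

```python
import numpy as np

def is_useful(image, row, grad_threshold=40.0):
    """Judge whether a weld image is worth processing by checking the
    gray-gradient along one scan line (strong edges -> large gradient)."""
    line = image[row, :].astype(float)
    grad = np.abs(np.diff(line))   # first-difference gradient along the line
    return grad.max() >= grad_threshold

# Synthetic example: a dark seam gap inside a bright work-piece row.
img = np.full((10, 100), 200.0)
img[:, 45:55] = 20.0               # seam region is much darker
print(is_useful(img, row=5))                         # strong edge -> True
print(is_useful(np.full((10, 100), 200.0), row=5))   # featureless frame -> False
```

A frame that fails this test (e.g. one saturated by the arc) can be discarded immediately instead of being passed to the edge detector.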
Some do well in edge detection while others are good at noise rejection. Taking weld pool image processing as an example, some typical algorithms, such as Roberts, Sobel, Prewitt, Laplacian and Canny, are used to detect the pool edge. Results of the edge detection are shown in Fig. 17. By comparison among the different algorithms, we can see that the traditional Canny algorithm is clearly superior to the other algorithms in accuracy and continuity of edge detection [20].

However, there are some shortcomings in the traditional Canny algorithm for detecting the edges of welding images. First, the traditional Canny algorithm uses a Gaussian function to smooth images, which cannot remove local noise and may detect fake edges, or may lose local edges where the intensity values change slowly. Second, the high and low thresholds for detecting image edges need to be set manually. However, the robotic arc welding
Fig. 13. (a) The welding image of GMAW, (b) the pixel distribution along the line, (c) the gray gradient information along the line.
Fig. 14. The definition of characteristic parameters for the welding image: (a) pulsed GTAW, (b) pulsed GMAW.
Fig. 15. Extraction of small window: (a) pulsed GTAW, (b) pulsed GMAW.
process involves many uncertain aspects, such as the unstable electric arc and constant changes of external factors. Even if the process is the same, the welding images acquired may be different. Therefore, to make Canny edge detection effective, a method of automatically determining the two thresholds used in Canny needs to be implemented for robot welding automation.
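One standard way to remove this manual step is to derive both hysteresis thresholds from an automatically computed global threshold such as Otsu's. A minimal NumPy sketch (taking the low threshold as half the Otsu value is an illustrative assumption, not necessarily the ratio used in this paper):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the gray level maximizing between-class variance."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist.astype(float) / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()     # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0  # class means
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2       # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def canny_thresholds(gray, ratio=0.5):
    """Derive the (low, high) hysteresis pair from the Otsu threshold."""
    high = otsu_threshold(gray)
    return ratio * high, high

# Bimodal test data: half dark (~30), half bright (~200).
img = np.concatenate([np.full(500, 30), np.full(500, 200)])
low, high = canny_thresholds(img)
print(low, high)
```

Because the thresholds follow the image's own histogram, they adapt frame by frame to arc brightness changes instead of relying on a fixed manual setting.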
An improved Canny edge detection algorithm is proposed in this paper to address the two problems mentioned above. First, in the improved Canny algorithm, the Gaussian filter is replaced with a nonlinear anisotropic diffusion filter, which locally smooths images, avoiding excessive smoothing while retaining large-scale edge information. Second, the values of the high and low thresholds are obtained automatically using the Otsu algorithm, and the edges are detected and connected with the acquired thresholds [19]. Fig. 18 compares the traditional Canny algorithm and the improved Canny algorithm. The results show that the improved Canny algorithm improves the edge detection of different weld images and has better precision and adaptability during automated robotic welding processes.

Fig. 16. The specific flow chart of image processing (image window creation, edge detection, false edge removal).

5.2. Characteristic parameters extraction

By using the above image processing method, the pool center coordinate O(a_x, a_y) can be extracted. The same method applies to the seam processing. We can also fit two line functions from the arrays of left-edge and right-edge points:

  f_left(x) = k1·x + b1,  f_right(x) = k2·x + b2,  0 ≤ x ≤ n_width   (4)

where f_left(x) and f_right(x) are the left-edge and right-edge functions, respectively. From the two functions, we can obtain the seam center line y = ((k1 + k2)/2)x + (b1 + b2)/2. Then the deviation d(t) between the pool center point and the seam center line can be calculated with Eq. (5):

  d(t) = |((k1 + k2)/2)·a_x − a_y + (b1 + b2)/2| / [((k1 + k2)/2)² + 1²]^(1/2)   (5)

6. Experimental verification

It is generally known that the robotic arc welding process is a very complex and variable process, which often affects the welding

Fig. 17. The comparison diagram of several typical edge detection algorithms: (a) original image, (b) Roberts, (c) Sobel, (d) Prewitt, (e) Laplacian, (f) Canny.
Fig. 18. Comparison of the traditional Canny and improved Canny algorithms: (a) Canny, (b) improved Canny.
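The nonlinear anisotropic diffusion filter that replaces Gaussian smoothing in the improved Canny algorithm is commonly implemented in the Perona-Malik form. A minimal sketch of one variant follows; the iteration count, the exponential conductance function and the parameters kappa and gamma are illustrative choices, not necessarily those used in this paper:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=30.0, gamma=0.2):
    """Perona-Malik diffusion: smooth flat regions while preserving strong edges.
    The conductance g = exp(-(|grad|/kappa)**2) shrinks toward 0 at large gradients,
    so diffusion nearly stops across genuine edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        p = np.pad(u, 1, mode='edge')          # replicate-border padding
        dn = p[:-2, 1:-1] - u                  # differences to the four neighbours
        ds = p[2:, 1:-1] - u
        de = p[1:-1, 2:] - u
        dw = p[1:-1, :-2] - u
        u += gamma * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return u

# An isolated speckle is flattened while the surrounding level is barely touched.
noisy = np.full((8, 8), 100.0)
noisy[4, 4] = 130.0
smoothed = anisotropic_diffusion(noisy)
print(abs(smoothed[4, 4] - 100.0) < 30.0)   # speckle amplitude reduced
```

This is the property the improved algorithm relies on: local noise such as arc speckle is diffused away, while the large-scale seam and pool edges keep their gradient for the subsequent Canny stage.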
Fig. 19. The flow diagram of image processing during the robot GMAW process (window extraction and edge fitting).
Fig. 20. The experimental system setup of GTAW and GMAW for seam tracking: (a) pulsed GTAW, (b) pulsed GMAW.
Fig. 21. The experimental verification of image processing precision: (a) pulsed GTAW, (b) pulsed GMAW.
image quality and the precision of processing. Experiments are conducted to verify the proposed method. Experiments of straight seam welding using GTAW and GMAW are performed to evaluate the precision of the above image processing methods.

Fig. 20 shows the experimental setup of GTAW (Fig. 20(a)) and GMAW (Fig. 20(b)). The robot is programmed to move along the seam center in the experiment. The passive vision system captures and processes the images in real time, and calculates the deviation from the pool center to the seam center line. Fig. 21 shows the state of the welding piece used in the experiment after welding.

As the robot is taught along the seam center in the experiments, the deviation value of the experiments can be approximately seen as the precision of the image processing. The real-time deviation data are shown in Figs. 22 and 23 for robotic GTAW and GMAW. The results in Fig. 22 show that the precision range of the image processing of GTAW can be controlled to be within about ±0.17 mm [19]. And from Fig. 23, we can see that the deviation range from the wire to the seam center is within ±0.30 mm. That means the precision range of the image processing for robotic GTAW and GMAW can be controlled to be within ±0.17 mm and ±0.3 mm, respectively.

7. Conclusions

To improve welding precision in GTAW and GMAW, a vision based seam tracking technique is introduced in this paper. The vision-based seam tracking method consists of the design of the vision system, image capturing, image selecting and the image processing method. Experimental results have shown that the developed method is feasible and sufficient to meet the specific precision requirements of some applications in robotic seam tracking. The following conclusions were drawn:

(1) In order to meet the needs of robot seam tracking, a special passive vision system was designed and the weld system platforms were built for robotic GTAW and GMAW.
(2) Using the purpose-designed passive vision system, clear weld images were acquired, which is the necessary prerequisite to improving the precision of seam tracking during the robotic arc welding process.
(3) By utilizing the gray gradient information along the line in the weld image, useful images can be quickly and reliably selected during real-time image processing.
(4) An improved Canny image processing algorithm was proposed that can accurately extract the characteristic parameters of welding images.
(5) The precision of the image processing approach was verified in robotic GTAW and GMAW by experiments. It could be controlled to be within ±0.17 mm and ±0.3 mm, respectively, which can satisfy the quality demand of seam forming in robotic arc welding.
Acknowledgements

This work is partly supported by the Australian Research Council and Lincoln Electric Company (Australia) under project

Fig. 22. The precision range of the image processing of robotic GTAW (y-axis: the deviation between torch and seam center d(t) (mm); marked point X: 35, Y: 0.1652).
Fig. 23. The precision range of the image processing of robotic GMAW (y-axis: the deviation between wire and seam center d(t) (mm); x-axis: time (s); marked points X: 41, Y: 0.2998 and X: 112, Y: −0.2923).
[4] Xu YL, Zhong JY, Ding MY, Chen HB, Chen SB. The acquisition and processing of real-time information for height tracking of robotic GTAW process by arc sensor. Int J Adv Manuf Technol 2013;65:1031–43.
[5] Estochen EL, Neuman CP. Application of acoustic sensors to robotic seam tracking. Ind Electron 1984;3:219–24.
[6] Lv N, Xu YL, Zhang ZF, Wang JF, Chen B, Chen SB. Audio sensing and modeling of arc dynamic characteristic during pulsed Al alloy GTAW process. Sens Rev 2013;33:141–56.
[7] Kim JW, Shin JH. A study of a dual-electromagnetic sensor system for weld seam tracking of I-butt joints. Proc Inst Mech Eng Part B: J Eng Manuf 2003;217:1305–13.
[8] Maqueira B, Umeagukwu CI, Jarzynski J. Application of ultrasonic sensors to robotic seam tracking. IEEE Trans Rob Autom 1989;5:337–44.
[9] Mahajan A, Figueroa F. Intelligent seam tracking using ultrasonic sensors for robotic welding. Robotica 1997;15:275–81.
[10] Kawahara M. Tracking control system using image sensor for arc welding. Automatica 1983;19:357–63.
[11] Chen SB, Zhang Y, Qiu T, Lin T. Robotic welding systems with vision sensing and self-learning neuron control of arc weld dynamic process. J Intell Rob Syst 2003;36:191–208.
[12] Shen HY, Ma HB, Lin T, Chen SB. Research on weld pool control of welding robot with computer vision. Ind Robot 2007;34:467–75.
[13] Kong M, Chen SB. Al alloy weld pool control of welding robot with passive vision. Sens Rev 2009;29:28–37.
[14] Xu YL, Yu HW, Zhong JY, Lin T, Chen SB. Real-time seam tracking control technology during welding robot GTAW process based on passive vision sensor. J Mater Process Technol 2012;212:1654–62.
[15] Ye Z, Fang G, Chen SB, Dinham M. A robust algorithm for weld seam extraction based on prior knowledge of weld seam. Sens Rev 2013;33:125–33.
[16] Ye Z, Fang G, Chen SB, Zou JJ. Passive vision based seam tracking system for pulse-MAG welding. Int J Adv Manuf Technol 2013;67:1987–96.
[17] Dinham M, Fang G. Autonomous weld seam identification and localisation using eye-in-hand stereo vision for robotic arc welding. Rob Comput Integr Manuf 2013;29(5):288–301.
[18] Dinham M, Fang G. A low cost hand-eye calibration method for arc welding robots. In: Proceedings of the 2009 IEEE international conference on robotics and biomimetics (ROBIO 2009), Guilin, Guangxi, China; December 18–22, 2009. p. 1889–93.
[19] Xu YL, Yu HW, Zhong JY, Lin T, Chen SB. Real-time image capturing and processing of seam and pool during robotic welding process. Int J: Ind Robot 2012;39:513–23.
[20] Canny J. A computational approach to edge detection. IEEE Trans Pattern Anal Mach Intell 1986;8:679–98.