0% found this document useful (0 votes)
41 views15 pages

Xiao Et Al. - 2019 - An Adaptive Feature Extraction Algorithm For Multiple Typical Seam Tracking Based On Vision Sensor I

Uploaded by

chendaoming99
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
41 views15 pages

Xiao Et Al. - 2019 - An Adaptive Feature Extraction Algorithm For Multiple Typical Seam Tracking Based On Vision Sensor I

Uploaded by

chendaoming99
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 15

Sensors and Actuators A 297 (2019) 111533

Contents lists available at ScienceDirect

Sensors and Actuators A: Physical


journal homepage: www.elsevier.com/locate/sna

An adaptive feature extraction algorithm for multiple typical seam


tracking based on vision sensor in robotic arc welding
Runquan Xiao a , Yanling Xu a,∗ , Zhen Hou a , Chao Chen a , Shanben Chen a,b
a
Intelligentized Robotic Welding Technology Laboratory, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai, 200240, PR
China
b
Shanghai Key Laboratory of Materials Laser Processing and Modification, School of Materials Science and Engineering, Shanghai Jiao Tong University,
Shanghai, 200240, PR China

a r t i c l e i n f o a b s t r a c t

Article history: Intelligent robotic welding is an indispensable part of modern welding manufacturing, and vision-based
Received 28 April 2019 seam tracking is one of the key technologies to realize intelligent welding. However, the adaptability and
Received in revised form 16 July 2019 robustness of most image processing algorithms are deficient during welding practice. To address this
Accepted 30 July 2019
problem, an adaptive feature extraction algorithm based on laser vision sensor is proposed. According
Available online 9 August 2019
to laser stripe images, typical welding seams are classified into continuous and discontinuous welding
seams. A Faster R-CNN model is trained to identify welding seam type and locate laser stripe ROI auto-
Keywords:
matically. Before welding, initial welding point is determined through point cloud processing to realize
Robotic welding
Adaptability feature extraction
welding guidance. During seam tracking process, the seam edges are achieved by a two-step extraction
Seam tracking algorithm, and the laser stripe is detected by Steger algorithm. Based on the characteristics of two kinds of
Image processing welding seams, the corresponding seam center extraction algorithms are designed. And a prior model is
Vision sensor proposed to ensure the stability of the algorithms. Test results prove that the algorithm has good adapt-
ability for multiple typical welding seams and can maintain satisfying robustness and precision even
under complex working conditions.
© 2019 Elsevier B.V. All rights reserved.

1. Introduction edge features, and Soares et al. [4] employed LSD algorithm to
extract the center of welding seam. As for butt welding seams, Xu
Intelligent welding manufacturing technology has become one et al. [5] proposed an improved Canny algorithm feature extraction
of the main research fields in advanced manufacturing [1]. Fur- algorithm to extract the edges of welding seam and welding pools.
thermore, intelligent robotic welding technology is one of the key And Jin et al. [6] utilized Otus threshold segmentation method and
technologies of intelligent welding manufacturing. Seam tracking Canny algorithm to extract tube sheet welding seams center.
is a major problem in intelligent robotic welding, and vision sens- While passive vision is susceptible to ambient illumination, and
ing technology is an efficient method of dealing with it [2]. With intricate filtering processes are needed to reduce arc interference
the help of vision-based seam tracking technology, it is able for [7]. In contrast, active vision has the advantages of high stabil-
traditional “teach and playback” robots to overcome disturbances ity with the use of auxiliary light sources. And seam tracking is
during welding practice and to meet the requirements of high- mainly realized via extracting the point on the stripe that repre-
quality welding. On the basis of lighting sources, the vision sensing sents welding seam center. To extract feature points of V-shaped
technology can be divided into passive vision and active vision. welding seams, improved Otus algorithm and line detection algo-
Passive vision uses arc as the light source, and its images contain rithm were employed by Jawad et al. [8]. As for butt welding seams,
abundant information. Welding seam tracking is mainly realized by Fan, J et al.extracted the butt welding center and laser stripe by
comparing the deviation between welding seam center and weld- row scanning and column scanning respectively [9], and a precise
ing pool (or welding gun). To track V-shaped welding seams, Canny Hough transform algorithm was designed by Xue B et al. [10] to
algorithm and Hough change were used by Guo et al. [3] to extract extract seam center. Furthermore, Fang et al. [11] proposed a two-
step feature-extraction method to detect center of fillet welding
seam.
However, the algorithms mentioned above mainly focused on
∗ Corresponding author.
single type of welding seam, which limited their scope of appli-
E-mail addresses: [email protected], [email protected] (Y. Xu).

https://doi.org/10.1016/j.sna.2019.111533
0924-4247/© 2019 Elsevier B.V. All rights reserved.
2 R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533

Fig. 2. Seam tracking system procedure.

A 660 ± 10 nm band filter was utilized to ensure that the vision


system can obtain ideal laser stripe images.
The vision sensor needs to be calibrated to determine the con-
version relationship between the coordinates of laser stripe pixels
in image coordinate system and the 3D coordinates in robot base
coordinate system, which is given by Eq. 1.

Pb = b T t t T c c T I PI (1)

Where: PI ——Homogeneous coordinates of pixels in image


coordinate system; c T I ——Transformation matrix from laser
plane coordinate system to camera coordinate system; t T c
——Transformation matrix from camera coordinate to tool coor-
dinate system; b T t ——Transformation matrix from tool coordinate
system to robot base coordinate system; Pb ——Homogeneous coor-
dinates of points in robot base coordinate system;
Fig. 1. Vision sensor for seam tracking.
With the help of Matlab calibration toolbox, the transformation
matrix t T c can be obtained. And c T I is acquired by fitting the laser
cation. Meanwhile, these algorithms lack discussion of robustness plane equation. b T t is determined based on the robot pose when
under strong welding noise, such as welding dust, welding spatter taking each picture.
and strong arc. In this paper, a robotic seam tracking system was established,
Taking into account the problems aforementioned above, Fan and the procedure is shown in Fig. 2. At the beginning, the system
et al. [12] innovatively regarded the distance between stripes and establishes TCP/IP communication with robot and checks whether
the main line as features, and SVM was used to classify features to the camera is calibrated. After system initialization, select point
detect seam types. The algorithm still maintained good effect on cloud acquisition function to obtain initial welding point, and
presence of welding spatter, but it has not been applied in welding select seam tracking function to realize seam tracking. Fig. 3 is
seam tracking process yet. Li et al. developed a string method to the interface of the software, including function button area, status
detect seam types [13]. Feature points were extracted by line scan- monitoring area, parameter setting area and image display area.
ning and similarity discrimination. It realized seam tracking in butt
weld, fillet weld and V-shape weld. On the other hand, it needs to
define a character for each type of welding seam, which may reduce 3. Classification of welding seam
the applicability of algorithm.
Based on laser vision sensor, a feature extraction algorithm for On the basis of laser stripe images, typical welding seams are
multiple typical welding seams is developed. The typical welding divided into two categories. One is continuous welding seam whose
seams are classified into continuous and discontinuous welding laser stripe images consist of several connected lines, such as fillet
seams depending on laser stripe images. It is proposed to use Faster welding seam, V-shaped welding seam and so on, as outlined in
R-CNN algorithm to automatically identify the type of welding Fig. 4. Another is discontinuous welding seam whose laser stripe
seams and locate ROI (region of interest) of laser stripes. Before in images are broken at welding seam edges, such as butt welding
welding, initial welding point extraction is achieved via point cloud seam, V-shaped welding seam with large gap, etc. as outlined in
processing. In the course of welding, edge extraction algorithm and Fig. 5.
Steger algorithm are combined to extract feature points of laser Different features need to be extracted for two types of weld-
stripes. Finally, the adaptability of the algorithm directing at typi- ing seams, so it is necessary to judge their types. For the sake of
cal welding seams is discussed. And experiments are conducted to increasing computation efficiency, it is needed to determine laser
verify the robustness and accuracy of the algorithm. strips ROI. Generally, the seam type and ROI are preset manually as
prior knowledge before seam tracking. However, laser stripe may
move out of ROI when welding gesture changes. And in some special
2. Vision sensor for seam tracking cases, for instance, when the gap of welding seam changes, the type
of welding seam may change. In these cases, it is difficult for the
A laser vision sensor was developed independently, which com- system to make adjustments in time, which may reduce welding
posed of a CCD camera and single line laser, as shown in Fig. 1. quality.
R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533 3

Fig. 3. Main interface of the seam tracking software.

Fig. 4. Continuous welding seam.


Fig. 5. Discontinuous welding seam.

This paper used an object detection algorithm to autonomously


determine the parameters mentioned above. System extracts dif- detection algorithms, such as HOG, Haar-like, SIFT, and HOG-SVM,
ferent features according to detection results, and the boundary box need manually design and select features and have complicated
of detection results can be employed as image processing ROI. In training process. Moreover, the trained models perform poorly in
the course of welding, the detection algorithm is applied to mon- case of large change of target shape, complex background or insuf-
itor seam type and location of ROI to keep the feature extraction ficient light, which makes it difficult to discriminate laser stripes
algorithm stable. from environmental noise. As seen in Fig. 6, there is a large error in
As collected images are grayscale, welding noise and laser stripe detecting laser stripe images of fillet welding seam with welding
are both expressed as white stripes in images. The traditional target spatter when using HOG-SVM algorithm [14].
4 R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533

Table 1
Training Parameters.

Parameters Value

Batch size 1
Max step 12500
Dropout keep probability 1.0
The score threshold for NMS 0.0
The IoU threshold for NMS 0.7
The initial learning rate 0.001
The momentum value 0.9

The images obtained after data augmentation were shown in


Fig. 8.
After data augmentation, 2000 welding seam pictures of each
type were acquired, which were marked as “Discontinues” and
“Continues” respectively. The images of each type are randomly
divided into training set and test set in a ratio of 7:3, and the two
data sets are mutually exclusive.

Fig. 6. Feature detection with HOG-SVM algorithm. 3.3. Training result of Faster R-CNN
In order to extract high quality features, this paper proposes
The training parameters utilized in this paper were noted in
to use Faster R-CNN algorithm [15] to classify laser stripe images,
Table 1. The curve of training error during training process is pre-
which is superior to traditional recognition algorithm in detection
sented in Fig. 9. The horizontal axis represents training steps, and
speed and detection accuracy.
the vertical axis represents the error between output results of cur-
rent network and the real results. After 12,500 iterations, the model
3.1. Faster R-CNN algorithm
basically converged.
Laser stripe images of four types of welding seams were tested
Faster R-CNN algorithm generates 9 anchors of different scales
by the trained model and the results were shown in Fig. 10. Tak-
for each pixel and calculates the Intersection over Union (IoU) value
ing Fig. 10a as an example, the green boundary box reprented the
of the anchor and real target. It can effectively separate detection
position of laser stripe. The c̈ontinuousr̈eferred to classification
target from background and eliminate interference of noise. The
result and 99% was confidence level, indicating the probability of
algorithm procedure is illustrated in Fig. 7:
the welding seam being continuous welding seam is 99%. It proves
In this paper, a pre-trained convolutional network model ResNet
that even under complex conditions, the model can still identify
is adopted to extract features. And transfer learning method is
and locate laser stripe correctly. The matching rate of the anchor
employed to speed up training and improve the performance of
and target reaching 95% is regarded as a successful detection, and
model. The boundary box of detection result is filtered by Non-
the precision P = 0.966, recall R = 0.971.
maximum suppression (NMS) algorithm. The accuracy of the model
The model is applied to test the laser stripe images of lap welding
is evaluated by precision P and recall R, defined as follows:
seam and half V-shape welding seam, which were not involved in
TP the training data set, and the results are displayed in Fig. 11. It can
P= (2)
TP + FP be seen that the model can still make accurate detection, which
TP proves that the trained Faster R-CNN model has excellent capability
R= (3) of generalization.
TP + FN
Where: TP ——number of positive samples judged as positive sam-
4. Robotic welding guidance
ple; FN ——number of positive samples judged as negative sample;
FP ——number of negative samples judged as positive sample; TN Initial welding point guidance and real-time seam tracking
——number of negative samples judged as negative sample should be an integrated and consecutive process, so as to reach
actual welding requirements in robotic autonomous welding. In
3.2. Data set of laser stripe images this paper, the vision sensor is employed to obtain point cloud data
to reconstruct 3D welding seam. The initial welding point guid-
To ensure robustness of the trained model, laser stripe images ance is realized by point cloud feature extraction. The entire guiding
of two types of continuous welding seams, involving V-shaped process is depicted in Fig. 12.
welding seam and fillet welding seam, and two types of discon- Taking fillet welding seam and butt welding seam as instances,
tinuous welding seams, involving butt welding seam and large and generally considering the point on the seam closest to origin
gap V-shaped welding seam, were collected to train the model. of robot base coordinate system is the initial welding point, the
These images were taken at different angles, under different light- feature extraction process is described as follows.
ing conditions, and with welding noises such as arc, reflections and As mentioned above, 3D coordinates of points on the laser stripe
splashes. can be obtained with Eq. (1). Therefore, by scanning the workbench,
In order to improve generalization of the model and prevent 3D point cloud data of the workpiece can be collected, as shown in
over-fitting, this paper adopted following data augmentation meth- Fig. 13.
ods: Since the point cloud is obtained by scanning the workbench,
it is first necessary to separate the workpiece from workbench.
1) Randomly changed the orientation of images; Zhang et al. [16] adopted KD-Tree background model to filter out
2) Added random perturbations to images by using gauss noise; background points. Considering that the working plane is relatively
3) In the HSV space, changed the contrast of images at random; horizontal and the workpiece surface is generally higher than work-
4) Arbitrarily scaled images; ing plane, simple z coordinate segmentation can be employed to
R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533 5

Fig. 7. Schematic diagram of the Faster R-CNN algorithm.

Fig. 8. Effect of data augmentation.

Fig. 9. Training error of Faster R-CNN.

extract workpiece data. By eliminating the points, whose z coordi- Continuous welding seam is the line of intersection of two work-
nate value are less than a specified threshold, the workpiece point piece planes. By fitting the two planes’ equations and calculating
cloud can be acquired. And a statistical filter is applied to elim- the line of intersection, seam equation can be achieved. Firstly, use
inate outliers. The average distance of 100 neighboring points is local surface fitting method to estimate normal vector at each point.
calculated at each point, and the points outside standard range are 80 neighboring points of each point are integrated to fit a plane, and
considered to be an outlier. the normal vector of this plane is regarded as the normal vector
6 R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533

Fig. 10. Detection results of different welding seams.

of the point. Then, planes are segmented with K-means clustering of two planes to obtain the seam equation, marked in red in Fig. 14b,
algorithm. Points in the same plane have similar normal vectors. and the initial welding point is in green.
Therefore, points can be classified into k classes on the basis of Discontinuous welding seam is the midline of welding seam
lighting sources. The cluster number K depends on the character- edges. Therefore, point cloud edge extraction algorithm [17] is
istics of welding seam point cloud. At last, fit plane equations with adopted to extract 3D edge points. Then, project the detected points
RANSAC algorithm on each class, and calculate the intersection line onto x–y plane, and fit a 2D line equation. Therefore, the points of
two edges can be separated by this line. Finally, fit the two edges
R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533 7

Fig. 11. Detection results of welding seams not in the training data sets.

Welding smoke, spatter and arc light may pollute images in


the course of welding process, making it more difficult for feature
extraction. While laser stripe and welding seam edges can be less
disturbed, so it’s considered to extract the stripe and welding seam
edges to extract the seam center. The center of continuous weld-
ing seam is the intersection point of stripe lines. And the center of
discontinuous welding seam is the midpoint of two discontinuity
points. In this paper, points on the stripe which represents the cen-
ter of welding seam are extracted through image processing, and
corresponding 3D coordinates can be calculated via Eq. (1).

5.1. Feature extraction algorithms


Fig. 12. Robotic welding guidance process.

Steger algorithm is utilized to extract stripe center. In the nor-


equation in 3D space via RANSAC algorithm, and the results are mal direction of laser stripe center points, the first derivative of
marked as the red line in Fig. 15b, and the initial welding point is gray value is 0 and the gray value is the maximum. Therefore, the
presented in green. second-order Taylor expansion of gray value is calculated in the
To verify the guidance precision, 10 sets of initial welding point normal direction of each pixel, and if the zero point of its first order
guidance experiments were performed on butt welding seam and derivative is within the pixel, it is considered to be a center point
fillet welding seam respectively. Move the end-effector (welding [20]. Gaussian function and its derivatives g (x, y; ) are utilized
touch) of robot to the initial point accurately and record the coor- to convolve the image. The standard deviation  depends on line
dinates of robot as the actual position. The Euclidean distance width ω,  ≥ √ ω
is required. Fig. 17 displayed the effect of the
2 3
between the calculated position and actual position is taken as
algorithm.
guidance error, and the results are in Fig. 16. It can be seen that the
The edges of welding seam are extracted via Facet model which
error of fillet welding seam is between 0.25 mm and 0.45 mm. And
of high precision. The Facet model is to fit a polynomial f () for
the error of butt welding seam is between 0.3 mm and 0.55 mm. The
each pixel to approximate the distribution of gray value in its
errors are all less than 0.6 mm, which have met actual production
neighborhood [21]. In the gradient direction, if f () satisfies the
demand [18,19]. 
following edge criteria: 1) Module value f is greater than a specified
threshold. 2) Zero point
 of the second derivative  ˆ locates within
5. Laser stripe feature points extraction the pixel, that is  ˆ  < 12 . 3) The third derivative is negative. It is
considered to be an edge point.
Laser stripe images can directly reflect the shape of welding Due to the fact that Facet model needs to fit a polynomial for
seam section. The characteristic points of welding seam are rep- each pixel, it is difficult to meet real-time requirements of welding.
resented as feature points of stripe in image, such as discontinuity Therefore, fast wavelet transform algorithm is adopted to roughly
points and inflection points. By extracting these feature points, the extract edge points. The image is decomposed to several scales via
center of welding seam can be obtained to realize seam tracking. Mallat algorithm [22]. If the modulus value of point is a local max-

Fig. 13. Point cloud images of continue and discontinue welding seams.
8 R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533

Fig. 14. Fillet welding seam feature extraction.

Fig. 15. Butt welding seam feature extraction.

imum in direction of phase angle and is greater than a specified


threshold, it can be regarded as an edge point. 
1, 0 ≤ t < 0.5
In this paper, Harr wavelet base: (t) = −1, 0.5 ≤ t < 1
0, otherwise
is proposed to extract rough edge points, which can improve
computational efficiency owing to its simple structure. In 5*5
neighborhood of each rough edge point, the binary cubic Facet
model is fitted to extract accurate edge points.
Fig. 18 displayed the effect of edge extraction algorithm. Fig. 18b
is the result of decomposing Fig. 18a into 2 scales. The white stripes
are welding seam edges. And Facet models were fit in the corre-
sponding positions of original image to obtain the precise edge
points (green points in Fig. 18c). By the two-step edge extrac-
tion method, the image processing efficiency can be significantly
improved, and the extraction accuracy can be guaranteed at the
same time.
To weaken the influence of noise on line fitting result, M- Fig. 16. Guidance error of fillet welding seam and butt welding seam.
estimation algorithm is employed to extract sample outliers, which
are given a small weight to reduce fitting error.
2) Noise filtering. The image may be corrupted by noise, which will
5.2. Welding seam center extraction increase the number of false center points detected in step 1.
Therefore, mean filtering is adopted to filter false center points.
5.2.1. Continuous welding seam center extraction The points, whose mean gray value in 3*3 neighborhood is less
The center of continuous welding seam is expressed as inflec- than a specified threshold, are eliminated. And a prior model is
tion point of laser stripe. In this paper, the center of laser stripe proposed to further filter the false center points, which will be
is extracted by Steger algorithm, and the coordinate of welding discussed in section 5.3.
seam center is extracted through calculating the intersection of two 3) By fitting laser stripes’ equations, and calculating the intersec-
adjacent stripes. Taking fillet welding seam as an example, the con- tions of adjacent straight lines, the center of welding seam can
tinuous welding seam center extraction algorithm is described in be obtained.
detail as follows, and the flow chart is demonstrated in Fig. 19.
5.2.2. Discontinuous welding seam center extraction
1) Center points extraction. The image is convolved with Gaus- Two discontinuity point of laser stripe are the edges of welding
sian kernel function and its derivatives to obtain Hessian matrix seam, and the center of discontinuous welding seam is expressed as
for each pixel, and Steger algorithm is utilized to extract center the midpoint of the two discontinuity points. In this paper, welding
points. seam edges are extracted through edge extraction algorithm, and
R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533 9

Fig. 17. Extraction result of laser strip center.

Fig. 18. Extraction results of welding seam edges.

Fig. 19. Procedure of fillet welding seams center extraction.

Fig. 20. ROI selection diagram of butt welding seams.

the center of laser stripe is extracted with Steger algorithm. The 1) ROI selection. Two ROI regions, including laser stripes and
intersections between laser stripe and welding seam edges are cal- welding seams respectively, need to be selected before feature
culated, then the center position of welding seam can be obtained. extraction. The identified result of Faster R-CNN is utilized as
Taking butt welding seam as an example, the flow of discontinu- laser stripe ROI directly (Fig. 20a). And the ROI of welding seam
ous welding seam center point extraction is described in detail as is selected on the side near the arc. Since during welding process,
follows: arc light can help to illuminate welding seam, which makes it
10 R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533

3) Extraction of welding seam edges. Before edge extraction,


logarithmic transformation and normalization are utilized to
enhance original image if the average gray value of image is
lower than a threshold. The image is decomposed into 2 scales
by Harr wavelet to extract rough edge points, and Facet model is
calculated at these points. The points satisfying the edge criteria
are preserved as precise edge points. Similarly, the prior model
is provided to filter the false edge points.
4) Edges’ equations fitting. The gradients of points on the two edges
Fig. 21. Laser stripe centerline extraction of butt welding seam. are opposite. So, they can be easily distinguished. The edges
extraction process is demonstrated in Fig. 22a. The calculated
seam center is presented as red point in Fig. 22b.
easier to extract welding seam edges. But considering the influ-
ence of laser and arc, the welding seam ROI should not be too
close to them. So, Eq. (4) is given to select welding seam ROI, the
⎧ is as xshown in Fig. 20b.
result 5.3. False feature points filtering

⎪ X=
0

⎪ 4 During welding process, the change of welding torch’s gesture
⎨ h
Y = y0 + − w is generally slow compared with the image sampling frequency
2 (4)

⎪ x0 (14 Hz). So, it can be considered that the adjacent frames are

⎪ W =
⎩ 2 approximately the same. Thence, the positions of laser stripe and
H = 2w seam edges in adjacent frames are essentially unchanged. There-
Where: (X, Y ) ——Image coordinates of the upper left corner fore, the gradient direction of stripe and edge in the previous frame
of welding seam ROI; (x0 , y0 ) ——Image coordinates of the can be treated as reference of the next frame to filter welding
upper left corner of laser stripe ROI ; (W, H) ——the width and noise.
height of welding seam ROI; h ——the height of laser stripe When extracting stripe center, the eigenvector corresponding
ROI; w ——the width of welding seam to the maximum eigenvalue of Hessian matrix is defined as the
gradient direction. The center points on the same segment have
2) Laser stripe extraction. The extraction process of stripe center is the same gradient direction and are different from other segments.
similar to which of continuous welding seam. Only the stripes’ Therefore, the gradient information can be used to filter the false
equations on both sides of welding seam are fitted, as Fig. 21. center points. The range and gradient direction of each segment

Fig. 22. Extraction the center of butt welding seam.


R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533 11

Fig. 23. Filtering false center points.

are defined as a prior model, as Eq.5. Only the points satisfying the
model are preserved. The filtering process is illustrated in Fig. 23.

Range = Xmin Xmax
Segment i : (5)
Angle = min max

Where, i is the sequence number of segments; Range represents the


location range of each segment; Angle represents the angle range
of gradient direction.

When extracting edge points, the  that maximizes f (0) is
regarded as the gradient direction. Similarly, the gradient direction
of two edges and their approximate position can be defined as a
prior model. And to ensure the accuracy of the model, the model will
be updated with the gradient information extracted from current
frame.

6. Experimental verification and analysis

The adaptability, robustness and accuracy of the feature extrac-


Fig. 24. Experimental system of robotic seam tracking.
tion algorithm are validated under actual welding conditions. The
experimental system, as shown in Fig. 24, includes industrial com-
puter, vision sensor, FANUC welding robot, Fronius CMT welding
power supply, etc., and Table 2 is the welding parameters.

Fig. 25. Extraction process of feature points for V-shaped welding seam with small gap.
12 R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533

Fig. 26. Extraction procedure of large gap V-shaped welding seam.

Table 2 is used to extract the seam center. Taking V-shaped welding seam
Welding parameters.
with different gaps as an example, the extraction procedures are
Parameters Value Parameters Value presented as follows.
Material Q235 steel plate Welding speed 4 mm/s When the gap is small, the corresponding laser stripe image con-
Groove type Fillet and butt Welding wire 1.2 mm sists of four connected lines, which is continuous welding seam, as
Pulse frequency 125HZ Protective gas 80%Ar+20%CO2 mentioned in Fig. 4b. When calculating the intersections of adja-
Welding current 150A Gas-flow rate 14 L/min cent straight lines, only the intersection of the middle two lines is
Dry elongation 15mm
preserved. The extraction process is demonstrated in Fig. 25.
When the gap gets larger, the laser stripe is broken in the middle,
6.1. Algorithm adaptability verification so the type of welding seam changes into discontinuous, as men-
tioned in Fig. 5b. Thus, the seam edges and the laser stripes that
The adaptability verification test is conducted on multi-type intersecting seam edges are extracted. The extraction procedure is
welding seams. The trained Faster R-CNN model is used to classify demonstrated in Fig. 26.
welding seams into continuous welding seams or discontinuous Moreover, the center extraction result of lap welding seam,
welding seams, and the corresponding feature extraction algorithm which is continuous welding seam, is shown in Figs. 27 and 28
R. Xiao et al. / Sensors and Actuators A 297 (2019) 111533 13
Fig. 27. Center extraction results of overlap welding seam.
Fig. 28. Center extraction results of half V-shaped welding seam.
Fig. 29. Extraction results of continuous welding seam with noise.
presents the result of half V-shaped welding seam center extraction, which is a discontinuous welding seam. It can be seen that the proposed feature extraction algorithm can be efficiently applied to multiple typical welding seams.
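The two center-extraction rules above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the line fits and coordinates are made-up examples. For a continuous seam the center is the intersection of the two middle stripe lines; for a discontinuous seam it is the midpoint of the stripes' intersections with the two seam edge lines:

```python
import numpy as np

def fit_line(points):
    """Least-squares fit y = a*x + b to stripe center points ((N, 2) array)."""
    a, b = np.polyfit(points[:, 0], points[:, 1], 1)
    return a, b

def intersect(l1, l2):
    """Intersection of two non-parallel lines given as (slope, intercept)."""
    (a1, b1), (a2, b2) = l1, l2
    x = (b2 - b1) / (a1 - a2)
    return np.array([x, a1 * x + b1])

# Continuous seam (small gap): intersection of the two middle stripe lines.
left = fit_line(np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]))   # y = x
right = fit_line(np.array([[2.0, 2.0], [3.0, 1.0], [4.0, 0.0]]))  # y = -x + 4
center_cont = intersect(left, right)                              # ~ (2, 2)

# Discontinuous seam (large gap): midpoint of the two stripe/edge intersections.
p1 = intersect((1.0, 0.0), (-1.0, 4.0))   # left stripe meets left edge line
p2 = intersect((-1.0, 10.0), (1.0, 2.0))  # right stripe meets right edge line
center_disc = (p1 + p2) / 2.0
print(center_cont, center_disc)
```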

6.2. Algorithm robustness verification

The algorithm is applied to laser stripe images with welding noise to verify its robustness. With the use of the prior model, the algorithm can reasonably filter out false center points and obtain accurate equations of the laser stripes. Therefore, when the stripes are disturbed by welding noise, such as arc light, welding spatter and joint reflection, the algorithm still performs well, as displayed in Fig. 29.

The prior model also constrains the approximate position and direction of the seam edges, and the stripe and edges are less disturbed by welding noise. Thus, by properly selecting the ROI and accurately extracting the edges, the algorithm still works under arc interference and welding spatter, as displayed in Fig. 30.

Fig. 30. Extraction results of discontinuous welding seam with noise.
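The prior-model filtering described above can be sketched as gating candidate stripe center points by their distance to the line carried over from previous frames; the threshold and the line model here are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def filter_by_prior(candidates, prior_line, max_dist=3.0):
    """Keep candidate center points within max_dist pixels of the prior line.

    candidates: (N, 2) array of (x, y) points; prior_line: (a, b) for y = a*x + b.
    """
    a, b = prior_line
    x, y = candidates[:, 0], candidates[:, 1]
    # Point-to-line distance for the line a*x - y + b = 0.
    dist = np.abs(a * x - y + b) / np.sqrt(a * a + 1.0)
    return candidates[dist <= max_dist]

prior = (0.5, 10.0)            # line fitted in the previous frame
pts = np.array([[0.0, 10.5],   # near the prior line -> kept
                [10.0, 15.2],  # near -> kept
                [20.0, 60.0]]) # spatter reflection far away -> rejected
print(filter_by_prior(pts, prior))
```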
Generally, the laser stripe and seam edges are relatively stable features in the image during welding. Meanwhile, the gradient features are restricted by the prior model, so the stability of feature extraction can be guaranteed. Moreover, the algorithm consumes less than 20 ms per image, which meets real-time welding requirements.

6.3. Tracking accuracy verification

In order to verify the accuracy of the feature extraction algorithm, butt welding seams (Fig. 31a) and fillet welding seams (Fig. 31b) were tracked.
Fig. 31. Seam tracking accuracy verification test.



Experimental results indicate that, under actual welding conditions, the maximum error is 0.47 mm and the average error is 0.29 mm for continuous welding seam tracking. For discontinuous welding seam tracking, the maximum error is 0.46 mm and the average error is 0.28 mm. Both meet actual welding requirements.

7. Conclusions

In order to improve the adaptability and robustness of feature extraction algorithms under complex welding conditions, an adaptive feature extraction algorithm is proposed in this paper. The following conclusions are drawn:

1 A robotic seam tracking system was established based on a laser vision sensor, and its software was developed independently.
2 Typical welding seams are divided into continuous and discontinuous welding seams in accordance with their laser stripe images. A Faster R-CNN model is trained to automatically distinguish the welding seam types and determine the laser stripe ROI. Experimental results prove that the trained model still works even under complex welding conditions.

Fig. 32. Procedure of seam tracking.

3 To realize initial welding point guidance, 3D point cloud data are used to reconstruct the welding seam. Through point cloud processing, the equations and initial points of the welding seams can be obtained. Guidance test results prove that the extraction error is less than 0.6 mm, meeting actual production demands.
4 For continuous welding seams, the center coordinates of the welding seam are extracted by calculating the intersection point of two adjacent stripes. For discontinuous welding seams, a two-step method is used to quickly extract the seam edges. By calculating the midpoint of the two intersections of the laser stripes and seam edges, the center coordinates of the welding seam can be obtained. A prior model is applied to filter noise, which ensures the algorithm can be applied under complex welding conditions. Experiments indicate that the algorithm is suitable for multiple typical welding seams.

Fig. 33. Tracking accuracy of fillet welding seam.

5 Accuracy verification results demonstrate that the maximum tracking errors for continuous and discontinuous welding seams are 0.47 mm and 0.46 mm respectively, and the average errors are 0.29 mm and 0.28 mm respectively. The results confirm that the feature extraction algorithm is accurate enough for most robotic welding tasks.

Acknowledgements

This work is partly supported by the National Natural Science Foundation of China under Grant Nos. 61873164 and 51575349, and the Shanghai Natural Science Foundation (18ZR1421500).

Fig. 34. Tracking accuracy of butt welding seam.


The tracking procedure is shown in Fig. 32. First, the seam type is identified by the trained Faster R-CNN model. Then, the initial welding point is determined by point cloud processing. Finally, the coordinates of the welding seam centers are extracted through image processing and sent to the robot to realize seam tracking.

Two hundred sampling points were selected on each kind of welding seam to verify the tracking accuracy. Laser stripe images and robot positions were stored at these sampling points. A carefully taught welding path was treated as the actual path, and the coordinates of the sampling points on this path were recorded. Welding seam centers were extracted by the algorithm, and the Euclidean distance between the actual position of each sampling point and the calculated result was regarded as the tracking error. The results are presented in Figs. 33 and 34.
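The accuracy metric described above — the per-point Euclidean distance between the taught path and the extracted seam centers, summarized by its maximum and average — can be reproduced as follows; the sample coordinates are illustrative, not measured data:

```python
import numpy as np

def tracking_errors(actual, extracted):
    """Max and mean Euclidean distance between taught path and extracted centers."""
    err = np.linalg.norm(np.asarray(actual) - np.asarray(extracted), axis=1)
    return err.max(), err.mean()

# Illustrative sampling points (mm): taught path vs. algorithm output.
actual = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
extracted = np.array([[0.0, 0.3], [1.0, 0.1], [2.0, 0.2]])
mx, mean = tracking_errors(actual, extracted)
print(round(mx, 2), round(mean, 2))  # 0.3 0.2
```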

Biographies

Runquan Xiao was born in China in 1995. He received the bachelor's degree in materials processing engineering from Harbin Institute of Technology, Harbin, China, in 2017. He has been studying for the Ph.D. degree in the Intelligentized Robotic Welding Technology Laboratory at Shanghai Jiao Tong University since September 2017. His research interests are machine vision and intelligent robotic welding.

Yanling Xu was born in China in 1980. He received the Ph.D. degree in materials processing engineering from Shanghai Jiao Tong University, Shanghai, China, in 2012. Since April 2014, he has worked as a lecturer in the Intelligentized Robotic Welding Technology Laboratory at Shanghai Jiao Tong University. He is mainly engaged in research on robotic welding automation technology, machine vision sensing technology, and wire and arc additive manufacturing.

Zhen Hou was born in China in 1986. He received the M.S. and Ph.D. degrees in control theory and application from Harbin Institute of Technology, Harbin, China, in 2009 and 2011. He has been studying for the Ph.D. degree in the Intelligentized Robotic Welding Technology Laboratory at Shanghai Jiao Tong University since April 2015. His research interests are robotic welding automation technology and machine vision sensing technology.

Chao Chen was born in China in 1994. He received the bachelor's degree in material forming and control engineering from Wuhan University of Technology, Wuhan, China, in 2016. He has been studying for the Ph.D. degree in the Intelligentized Robotic Welding Technology Laboratory at Shanghai Jiao Tong University since September 2016. His research interests are the internet of things, deep learning, and intelligent robotic welding.

Shanben Chen received the B.S. degree in industry automation from Dalian Jiao Tong University, Dalian, China, in 1982, and the M.S. and Ph.D. degrees in control theory and application from Harbin Institute of Technology, Harbin, China, in 1987 and 1991, respectively. Since April 2000, he has been the Special Professor of the Cheung Kong Scholar Program of the Ministry of Education of P. R. China and the Li Ka Shing Foundation, Hong Kong, with Shanghai Jiao Tong University, Shanghai, China. He is the author or coauthor of ten academic books and more than 300 journal papers, including those published in the IEEE Transactions on Automatic Control, the IEEE Transactions on Systems, Man, and Cybernetics, the Welding Journal, the Journal of Intelligent and Robotic Systems, and the International Journal of System Sciences. His current research interests include intelligentized welding manufacturing, intelligentized technologies for welding robots, intelligent control of welding dynamic processes, and modeling and control of complex systems.