Increasing the Accuracy for Computer Vision Systems by Removing the Noise
Zhenzhou Wang
State Key Lab for Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China
e-mail: [email protected]
Abstract-Robots rely on computer vision systems to obtain the environmental information. As a result, the accuracy of the computer vision systems is essential for the control of the robots. Many computer vision systems make use of the markers of well-designed patterns to calculate the system parameters. Undesirably, noise exists universally, which decreases the calibration accuracy and consequently decreases the accuracy of the computer vision systems. In this paper, we propose a pattern modeling method to remove the noise by decreasing the degree of freedom of the total calibration markers to one. The theorem is proposed and proved. The proposed method can be readily adopted by different computer vision systems, e.g. structured light based computer vision systems and stereo vision based systems.

Keywords-degree of freedom; noise; pattern modeling; computer vision

I. INTRODUCTION

Computer vision systems are becoming more and more popular in robotics. It is believed that the next generation of robots should rely mainly on computer vision systems to control their behaviors, just as human beings do. Many computer vision systems [1]-[9] utilize a well-designed pattern for calibration. Structured light based computer vision systems project the designed patterns onto the surface of the object to be measured. From the distorted pattern, the surface profile can be calculated based on the system parameters computed from a set of calibrated markers. Thus, the calibration accuracy is critical for the measurement accuracy of structured light based computer vision systems. Traditionally, these computer vision systems use as many calibration markers as they can, until the improvement in calibration accuracy reaches zero growth.

The known markers on the designed pattern are used to calculate the system parameters. Before they can be used for calibration, the two-dimensional coordinates of these points in the camera view need to be computed, and noise is introduced when the coordinates of the points are computed as the Gaussian mean of all the corner or bright pixels. Consequently, the noise causes calibration error. To remove the error caused by the noise, we propose a pattern modeling method in this paper to reduce or eliminate the noise by reducing the degree of freedom of all the calibration markers to one. Experimental results show that the proposed method is effective in removing the noise in the developed structured light computer vision systems [1]-[4], [8], [9]. The accuracy improvement is significant in the sense of mean squared errors. We also use the proposed method to remove the noise for camera calibration [10], and its effectiveness is likewise verified by the experimental results. Since camera calibration is fundamental to most computer vision applications, the proposed method has the potential to benefit them greatly.

II. THE PROPOSED PATTERN MODELING METHOD

The pattern modeling method is implemented based on the distances between the center point and the other points and contains the following steps:

Step 1: Model the rays with the points in the designed pattern and the projection center C(x_c, y_c, z_c). The unit of the modeled points can be chosen as pixel or mm, depending on the convenience of the application. The modeled rays are formulated by Eq. (1):

\frac{x_i^m - x_c}{x_i - x_c} = \frac{y_i^m - y_c}{y_i - y_c} = \frac{z_i^m - z_c}{z_i - z_c} = t^m        (1)
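As a concrete illustration of Step 1, the following sketch (Python/NumPy, not part of the paper) places one modeled point on every ray of Eq. (1): each ray runs from the projection center C through the corresponding point of the designed pattern, and a single shared parameter t^m fixes where the modeled point lies on it. The grid pattern, units and all names are assumptions for illustration only.

```python
import numpy as np

def model_rays(center, pattern_points, t_m):
    """Place one modeled point on each ray of Eq. (1).

    center         : (3,) projection center C = (x_c, y_c, z_c)
    pattern_points : (N, 3) points of the designed pattern, (x_i, y_i, z_i)
    t_m            : scalar ray parameter t^m shared by all points
    Returns (N, 3) modeled points (x_i^m, y_i^m, z_i^m).
    """
    center = np.asarray(center, dtype=float)
    pattern_points = np.asarray(pattern_points, dtype=float)
    # Eq. (1) rearranged: p_i^m = C + t^m * (p_i - C) for every marker i.
    return center + t_m * (pattern_points - center)

# Example: a 3x3 grid pattern on the plane z = 100 (arbitrary units).
grid = np.array([[x, y, 100.0] for x in (0, 10, 20) for y in (0, 10, 20)])
modeled = model_rays(center=(10.0, 10.0, 0.0), pattern_points=grid, t_m=0.8)
print(modeled.shape)  # (9, 3)
```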
\begin{bmatrix} x_i' \\ y_i' \\ z_i' \end{bmatrix} = m \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34} \end{bmatrix} \begin{bmatrix} x_i^m \\ y_i^m \\ z_i^m \\ 1 \end{bmatrix}        (6)
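The surviving text does not describe how the entries a_jk in Eq. (6) are obtained, so the short sketch below (Python/NumPy, illustrative only) shows just the mechanics the equation implies: the modeled points are written in homogeneous form and multiplied by the 3x4 matrix, scaled by the constant m. The placeholder matrix values are assumptions, not calibration results.

```python
import numpy as np

def apply_eq6(A3x4, modeled_points, m=1.0):
    """Mechanics of Eq. (6): [x_i', y_i', z_i']^T = m * A3x4 @ [x_i^m, y_i^m, z_i^m, 1]^T.

    A3x4           : (3, 4) matrix of the entries a_jk in Eq. (6)
    modeled_points : (N, 3) modeled points (x_i^m, y_i^m, z_i^m) from Step 1
    m              : the scalar constant that multiplies the product
    Returns (N, 3) transformed points (x_i', y_i', z_i').
    """
    pts = np.asarray(modeled_points, dtype=float)
    homogeneous = np.hstack([pts, np.ones((pts.shape[0], 1))])  # append 1 to each point
    return m * (homogeneous @ np.asarray(A3x4, dtype=float).T)  # (N, 4) @ (4, 3) -> (N, 3)

# Placeholder matrix: identity rotation plus a small translation, purely for illustration.
A3x4 = np.hstack([np.eye(3), np.array([[1.0], [-2.0], [0.5]])])
print(apply_eq6(A3x4, [[0.0, 0.0, 100.0], [10.0, 10.0, 100.0]]))
```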
A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34} \\ a_{41} & a_{42} & a_{43} & a_{44} \end{bmatrix}        (9)
where m is a constant and N denotes the total number of points in the registered pattern. After pattern modeling, all the computed points in the captured pattern are replaced with the modeled points in the ideal pattern. Then they are used for system calibration.

Figure 2. Differences after modeling: (a) x coordinate differences; (b) y coordinate differences.
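The equations that pin down the single remaining degree of freedom are not present in the extracted text, so the sketch below shows only one plausible reading of the replacement step described above: the free quantity is taken to be the shared ray parameter t^m of Eq. (1), estimated by least squares against the noisy computed points, after which the noisy points are replaced by the modeled ones. The closed-form estimate and the function names are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

def fit_shared_parameter(center, pattern_points, noisy_points):
    """Least-squares estimate of the single shared parameter t^m.

    Minimizes sum_i || (C + t * (p_i - C)) - q_i ||^2 over the scalar t,
    where p_i are the ideal pattern points and q_i the noisy computed points.
    """
    c = np.asarray(center, dtype=float)
    d = np.asarray(pattern_points, dtype=float) - c   # ray directions p_i - C
    r = np.asarray(noisy_points, dtype=float) - c     # observed offsets q_i - C
    # Setting the derivative of the quadratic cost to zero gives
    # t = sum(d_i . r_i) / sum(d_i . d_i).
    return float(np.sum(d * r) / np.sum(d * d))

def denoise_pattern(center, pattern_points, noisy_points):
    """Replace the noisy computed points with modeled points on the rays."""
    t = fit_shared_parameter(center, pattern_points, noisy_points)
    c = np.asarray(center, dtype=float)
    return c + t * (np.asarray(pattern_points, dtype=float) - c)
```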
The camera captures the projected pattern as shown in Fig. 1(b). These laser dots are segmented by the method proposed in [11], [12]. The coordinates of the bright laser dots are calculated as the Gaussian mean of the bright pixels. To demonstrate the noise, we select 44 points around the center laser dot and plot the differences of the x coordinates and the y coordinates between adjacent laser dots. Fig. 1(c) and (d) show the calculated x coordinate differences and y coordinate differences respectively. Based on the designed pattern, the differences should be constant in the absence of noise, while noise adds random variations. These random variations are the introduced noise. We plot the differences of the x and y coordinates for these 45 points before modeling and after modeling in Fig. 2(a) and (b) respectively. As can be seen, the noise (random variation) is eliminated successfully. For the modeled results shown in Fig. 2, the computed mean squared errors (MSE) are 2.8061 and 2.0916 for the x and y coordinates respectively. We then search for a new projection center in a small range [-5, 5] in three dimensions and find the center that yields the minimum MSE. The MSEs are reduced to 2.2204 and ...

IV. EXPERIMENTAL RESULTS

Firstly, we model a camera calibration pattern [10] in Fig. 4. We see obvious mismatches between the modeled y coordinates and the original y coordinates. The mean squared errors are 1.8155 and 2.8142 respectively. We then search around the projection center within the range [-5, 5] in three dimensions. We find the new center that yields the minimum MSEs, which are reduced to 1.074 and 1.6262 respectively. The new modeling results are shown in Fig. 5. It is seen that the modeled points and the original ones match significantly better than those in Fig. 4.
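The two numerical ingredients of the experiments above can be sketched as follows (Python/NumPy, illustrative assumptions throughout): the adjacent-coordinate differences that expose the noise, and a brute-force search over candidate projection centers within [-5, 5] along each dimension for the center whose re-modeled points give the minimum mean squared error against the original computed coordinates. The integer grid step and the single-parameter re-modeling are assumptions carried over from the earlier sketches.

```python
import numpy as np

def adjacent_differences(points):
    """x and y coordinate differences between adjacent markers.

    Without noise these differences are constant for a regular pattern;
    the random variation around that constant is the noise (cf. Fig. 1(c), (d)).
    """
    pts = np.asarray(points, dtype=float)
    return np.diff(pts[:, 0]), np.diff(pts[:, 1])

def model_with_center(center, pattern_points, noisy_points):
    """Re-model the pattern for a given candidate projection center.

    Uses the same single-parameter least-squares fit as the earlier sketch:
    t = sum((p_i - C) . (q_i - C)) / sum(||p_i - C||^2).
    """
    c = np.asarray(center, dtype=float)
    d = np.asarray(pattern_points, dtype=float) - c
    r = np.asarray(noisy_points, dtype=float) - c
    t = np.sum(d * r) / np.sum(d * d)
    return c + t * d

def search_projection_center(center0, pattern_points, noisy_points, radius=5, step=1.0):
    """Brute-force search in [-radius, radius]^3 around the initial center.

    Keeps the candidate whose re-modeled points give the smallest summed
    x/y mean squared error against the original computed coordinates.
    """
    center0 = np.asarray(center0, dtype=float)
    noisy = np.asarray(noisy_points, dtype=float)
    offsets = np.arange(-radius, radius + step, step)
    best_center, best_mse = center0, np.inf
    for dx in offsets:
        for dy in offsets:
            for dz in offsets:
                c = center0 + np.array([dx, dy, dz])
                modeled = model_with_center(c, pattern_points, noisy)
                mse = np.mean((modeled - noisy) ** 2, axis=0)  # per-coordinate MSE
                total = mse[0] + mse[1]                        # x and y, as reported
                if total < best_mse:
                    best_center, best_mse = c, total
    return best_center, best_mse
```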
(x_i', y_i', z_i'), i = 1, ..., 28, in the world coordinate system by the phase shift profilometry 3D imaging system.
REFERENCES