Week 7: Camera Calibration

Introduction to Camera Calibration

Robotic & Vision Lab RoVis

Which parameters do you want to obtain?

• Image formation is the mapping of scene points to the image plane.


© Peter Corke
Queensland University of Technology
CRICOS No. 00213J

No unique inverse
Worksheet 2.1

• F = 5 mm
• Z = 1 meter
• A tree is 2 meters high and 0.5 meters wide. What size does the tree appear in the image?
• Sketch the diagram.
• Answer: ___________
Answer: Worksheet 2.1

• F = 5 mm, Z = 1 meter
• A tree is 2 meters high and 0.5 meters wide. What size does the tree appear in the image?
• Answer: using y = F·Yc/Zc
• Height: y = F·Y/Z = (5×10⁻³ × 2 m) / 1 m = 10 mm
• Width: x = F·X/Z = (5×10⁻³ × 0.5 m) / 1 m = 0.0025 m = 2.5 mm
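The arithmetic above can be sketched in a few lines (a Python sketch, not part of the original slides; the function name `project_size` is ours):

```python
def project_size(F, Z, size):
    """Pinhole model: image size = F * object size / depth Z."""
    return F * size / Z

F = 5e-3   # focal length: 5 mm
Z = 1.0    # depth: 1 m
print(project_size(F, Z, 2.0) * 1e3)   # tree height in the image, in mm
print(project_size(F, Z, 0.5) * 1e3)   # tree width in the image, in mm
```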
Camera model | Intrinsic parameters | extrinsic parameters | projection

Picture element (pixel) based image

• [x, y]ᵀ is in meters; [·]ᵀ denotes matrix transposition
• [u, v]ᵀ is in pixels, which is easy to measure from the camera
• Each pixel is Sx wide and Sy high
• Typically Sx = Sy = 5.46 µm (1 µm = 1×10⁻⁶ m = 1 micron)

(Figure: a 1024×768-pixel image; each pixel is Sx = 5.46 µm wide and Sy = 5.46 µm high.)

CCD: charge-coupled device (Ch2. Cameras v.9e2). Sensor: http://en.wikipedia.org/wiki/Charge-coupled_device
Camera model | Intrinsic parameters | extrinsic parameters | projection

Answer: Worksheet 2.2

• For an image of 1280×1024 pixels, Sx = Sy = 5.2 µm.
• What is the size of the image (CMOS or CCD) sensor?
• Width: 1280 × 5.2 µm = 1280 × 5.2×10⁻⁶ m = 0.00666 m = 6.66 mm
• Height: 1024 × 5.2 µm = 1024 × 5.2×10⁻⁶ m = 0.00532 m = 5.32 mm

(Image: OVT CameraChip OV9620/9120, a CMOS webcam camera chip; www.ovt.com)


Camera model | Intrinsic parameters | extrinsic parameters | projection

We have the following conversion formulas (Figure 11.8: converting coordinates (x, y) → (u, v)):

$$x = (u - u_0)\,s_x \;\Rightarrow\; u = \frac{x}{s_x} + u_0 = \frac{fX}{Z s_x} + u_0$$

$$y = (v - v_0)\,s_y \;\Rightarrow\; v = \frac{y}{s_y} + v_0 = \frac{fY}{Z s_y} + v_0$$

$$\begin{bmatrix} Zu \\ Zv \\ Z \end{bmatrix} =
\begin{bmatrix} f/s_x & 0 & u_0 \\ 0 & f/s_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = K \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}$$

K: the camera's intrinsic parameter matrix


Simplified intrinsic parameters Mint in pixels
(Sx = Sy, and F/Sx = f in pixels)
• 3D = (Xc, Yc, Zc) in meters
• Image = (u, v) in pixels
• s = arbitrary scaling factor

$$\begin{bmatrix} s\,u \\ s\,v \\ s \end{bmatrix} = M_{int}\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}, \qquad
M_{int} = \begin{bmatrix} f & 0 & o_x \\ 0 & f & o_y \\ 0 & 0 & 1 \end{bmatrix}$$

image center = (ox, oy) (in pixels)
f = focal length (in pixels)
Worksheet 2.4: 3D projection onto the image (Ch1_e1.m)
• sx = sy = 5.4 µm
• (ox, oy) = (512, 384) in pixels
• f = 4.3 mm / 5.4 µm ≈ 800 pixels
• [Xc, Yc, Zc]ᵀ = [0.02, 0.05, 1.2]ᵀ meters

$$\begin{bmatrix} s\,u \\ s\,v \\ s \end{bmatrix} = M_{int}\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
= \begin{bmatrix} 800 & 0 & 512 \\ 0 & 800 & 384 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} 0.02 \\ 0.05 \\ 1.2 \end{bmatrix}$$

• [u, v]ᵀ = ?
• Answer: ___________

• Write a pseudocode (or MATLAB) program to calculate the projected point [u, v]ᵀ from [Xc, Yc, Zc]ᵀ based on the above camera intrinsic parameters.
Answer: Worksheet 2.4, 3D projection onto the image
• sx = sy = 5.4 µm, (ox, oy) = (512, 384) in pixels, f = 4.3 mm / 5.4 µm ≈ 800 pixels
• [Xc, Yc, Zc]ᵀ = [0.02, 0.05, 1.2]ᵀ meters

$$\begin{bmatrix} s\,u \\ s\,v \\ s \end{bmatrix} = M_{int}\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
= \begin{bmatrix} 800 & 0 & 512 \\ 0 & 800 & 384 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} 0.02 \\ 0.05 \\ 1.2 \end{bmatrix}
= \begin{bmatrix} 630.4 \\ 500.8 \\ 1.2 \end{bmatrix}$$

%matlab
k = [800 0 512; 0 800 384; 0 0 1];
x = [0.02 0.05 1.2]';
im = k*x        % im = [630.4; 500.8; 1.2]
u = im(1)/im(3) % u = 525.3333
v = im(2)/im(3) % v = 417.3333

• Answer: [u, v]ᵀ ≈ [525.3, 417.3]ᵀ
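The requested program can also be sketched in Python (our translation of the slide's MATLAB, assuming the same parameters f = 800 and (ox, oy) = (512, 384); the function name `project` is ours):

```python
import numpy as np

def project(M_int, P_c):
    """Project a 3D camera-frame point to pixel coordinates [u, v]."""
    s_uv = M_int @ P_c          # homogeneous image point [s*u, s*v, s]
    return s_uv[:2] / s_uv[2]   # divide out the scale s

M_int = np.array([[800, 0, 512],
                  [0, 800, 384],
                  [0,   0,   1]], dtype=float)
P_c = np.array([0.02, 0.05, 1.2])  # camera-frame point, meters
print(project(M_int, P_c))         # ≈ [525.33, 417.33]
```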
CCD is glued to the back of a camera (W2 begins)

• Ideally the CCD center is at [512, 384] (the image center) of a 1024×768-resolution camera.
• Because of manufacturing faults, the CCD center may not be at the image center.
• Example: two cameras of the same model.

(Figure: CCD sensor placement in Camera 1 vs. Camera 2; in Camera 2 the sensor is slightly shifted leftward and upward.)
Worksheet 2.5
Because of manufacturing faults the CCD center may not be at the image center; see (Ch1_e2.m). X, Y, Z are in 3D; u, v are in 2D.
• sx = sy = 5.4 µm
• (ox, oy) = (600, 400) in pixels
• f = 4.3 mm / 5.4 µm ≈ 800 pixels
• [Xc, Yc, Zc]ᵀ = [0.02, 0.05, 1.2]ᵀ meters

$$\begin{bmatrix} s\,u \\ s\,v \\ s \end{bmatrix} = M_{int}\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
= \begin{bmatrix} 800 & 0 & 600 \\ 0 & 800 & 400 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} 0.02 \\ 0.05 \\ 1.2 \end{bmatrix}$$

• [u, v]ᵀ = ??
• Answer: [u, v]ᵀ = ____________ (the image center is shifted to (ox, oy) = (600, 400))
• Conclusion: a 3D object point can have different image positions for different cameras (different Mint) placed at the same position.
Answer: Worksheet 2.5
Because of manufacturing faults the CCD center may not be at the image center; example (Ch1_e2.m). X, Y, Z are in 3D; u, v are in 2D.
• sx = sy = 5.4 µm
• (ox, oy) = (600, 400) in pixels
• f = 4.3 mm / 5.4 µm ≈ 800 pixels
• [Xc, Yc, Zc]ᵀ = [0.02, 0.05, 1.2]ᵀ meters

$$\begin{bmatrix} s\,u \\ s\,v \\ s \end{bmatrix} = M_{int}\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
= \begin{bmatrix} 800 & 0 & 600 \\ 0 & 800 & 400 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} 0.02 \\ 0.05 \\ 1.2 \end{bmatrix}
= \begin{bmatrix} 736 \\ 520 \\ 1.2 \end{bmatrix}$$

• [u, v]ᵀ = [736/1.2, 520/1.2]ᵀ ≈ [613.3, 433.3]ᵀ
• Conclusion: a 3D object point can have different image positions for different cameras (different Mint) placed at the same position. (The image center is shifted to (ox, oy) = (600, 400).)
Tutorial 2.2
Given: X, Y, Z are in 3D; u, v are in 2D.
• Focal length is 3 mm, sx = sy = 3 µm
• (ox, oy) = (620, 410) in pixels
• A 3D point is at X = 0.015 meters, Y = 0.02 meters, and Z = 2 meters away from the camera center.
• Fill in the blanks (__?):

$$\begin{bmatrix} s\,u \\ s\,v \\ s \end{bmatrix} = M_{int}\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
= \begin{bmatrix} \_? & 0 & \_? \\ 0 & \_? & \_? \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} \_? \\ \_? \\ 2 \end{bmatrix}$$

• [u, v]ᵀ = ??
• Answer: [u, v]ᵀ = ____________?
Answer: Tutorial 2.2
• Focal length is 3 mm, sx = sy = 3 µm, so f = 3 mm / 3 µm = 1000 pixels
• (ox, oy) = (620, 410) in pixels
• The 3D point is at X = 0.015 meters, Y = 0.02 meters, and Z = 2 meters away from the camera center.

$$\begin{bmatrix} s\,u \\ s\,v \\ s \end{bmatrix} = M_{int}\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
= \begin{bmatrix} 1000 & 0 & 620 \\ 0 & 1000 & 410 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} 0.015 \\ 0.02 \\ 2 \end{bmatrix}$$

%matlab
m = [1000 0 620; 0 1000 410; 0 0 1];
p = [0.015 0.02 2]';
x = m*p;
u = x(1)/x(3)   % u = 627.5
v = x(2)/x(3)   % v = 420

• Answer: [u, v]ᵀ = [627.5, 420]ᵀ
Complete camera model
Suppose the camera now has a pose expressed relative to another (world) coordinate frame. The general formula transforming from the world frame is:

$$\begin{bmatrix} su \\ sv \\ s \end{bmatrix}_{3\times 1}
= K_{int\,(3\times 3)}\; M_{ext\,(3\times 4)}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}_{4\times 1}
= P_{(3\times 4)} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}_{4\times 1}$$

where Kint is the intrinsic parameter matrix, and Mext = [R | t] (taken from the homogeneous transform [R t; 0₁ₓ₃ 1]) is the extrinsic parameter matrix, used to transform object coordinates from the world frame to the camera frame.
Example 11.2: A point P has world coordinates P(16, 0, −1). Camera {C} has f = 512 and (ox, oy) = (256, 256). The camera pose relative to the vehicle center is given by the matrix H_c^v, and the vehicle pose relative to the world frame by H_v^w. Find the pixel coordinates of point P in the camera. Multiplying the given transforms, the camera pose in the world frame is:

$$H_c^w = H_v^w \, H_c^v = \begin{bmatrix} 0 & 0 & 1 & 6 \\ -1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 3 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad
R_c^w = \begin{bmatrix} 0 & 0 & 1 \\ -1 & 0 & 0 \\ 0 & -1 & 0 \end{bmatrix}$$

$$M_{ext} = \left[\, R^T \;\middle|\; -R^T (6;\,0;\,3) \,\right]
= \begin{bmatrix} 0 & -1 & 0 & 0 \\ 0 & 0 & -1 & 3 \\ 1 & 0 & 0 & -6 \end{bmatrix}$$

$$\begin{bmatrix} su \\ sv \\ s \end{bmatrix} = K_{int}\, M_{ext} \begin{bmatrix} 16 \\ 0 \\ -1 \\ 1 \end{bmatrix}
= \begin{bmatrix} 2560 \\ 4608 \\ 10 \end{bmatrix}
\;\Rightarrow\;
\begin{bmatrix} u \\ v \end{bmatrix} = \begin{bmatrix} 256 \\ 460.8 \end{bmatrix}$$
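The full pipeline K·Mext·P of the example can be checked numerically (a Python sketch using the matrices derived in the example above):

```python
import numpy as np

# Intrinsics: f = 512, (ox, oy) = (256, 256).
K = np.array([[512, 0, 256],
              [0, 512, 256],
              [0,   0,   1]], dtype=float)

# Extrinsics: [R^T | -R^T t] for the camera pose in the world frame.
M_ext = np.array([[0, -1,  0,  0],
                  [0,  0, -1,  3],
                  [1,  0,  0, -6]], dtype=float)

P_w = np.array([16, 0, -1, 1], dtype=float)  # homogeneous world point
s_uv = K @ M_ext @ P_w                       # [s*u, s*v, s]
u, v = s_uv[:2] / s_uv[2]
print(u, v)  # 256.0 460.8
```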


Tutorial 2.9
• Suppose that the intrinsic parameters of our camera can be represented by

$$M_{int} = \begin{bmatrix} f & 0 & o_x \\ 0 & f & o_y \\ 0 & 0 & 1 \end{bmatrix}$$

• The camera coordinate center is the same as the world center, hence

$$M_{ext} = [\,I \mid 0\,] = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}$$

• There are two points P1, P2 in world coordinates:
• P1 = [0.1, 0.2, 1]ᵀ meters; its image is at I1 = [1000, 2000]ᵀ
• P2 = [0.3, 0.4, 1]ᵀ meters; its image is at I2 = [3000, 2200]ᵀ
• Compute the matrix M_int. (Hint: image Ii = M_int · M_ext · Pi)
Answer: Tutorial 2.9
With I = M_int · M_ext · P and M_ext = [I | 0]:

$$\begin{bmatrix} s_1 \cdot 1000 \\ s_1 \cdot 2000 \\ s_1 \end{bmatrix}
= \begin{bmatrix} f & 0 & o_x \\ 0 & f & o_y \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} 0.1 \\ 0.2 \\ 1 \end{bmatrix}
\quad\text{and}\quad
\begin{bmatrix} s_2 \cdot 3000 \\ s_2 \cdot 2200 \\ s_2 \end{bmatrix}
= \begin{bmatrix} f & 0 & o_x \\ 0 & f & o_y \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} 0.3 \\ 0.4 \\ 1 \end{bmatrix}$$

From the last row of each matrix equation: s1 = 1, s2 = 1.
1000 = f·0.1 + ox, hence ox = 1000 − f·0.1    (i)
3000 = f·0.3 + ox    (ii)
Substituting (i) into (ii): 3000 = f·0.3 + 1000 − f·0.1, so 2000 = f·0.2 and f = 10000.
Substituting back into (i): ox = 1000 − 10000·0.1 = 0.
Since 2000 = f·0.2 + oy, oy = 2000 − 10000·0.2 = 0.

$$M_{int} = \begin{bmatrix} 10000 & 0 & 0 \\ 0 & 10000 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
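The hint Ii = M_int·M_ext·Pi can be turned into a small linear solver for (f, ox, oy). Below is a Python sketch with synthetic, self-consistent data of our own (not the tutorial's numbers); the function name `solve_intrinsics` is ours:

```python
import numpy as np

# With Z = 1 and M_ext = [I | 0]:  u = f*(X/Z) + ox  and  v = f*(Y/Z) + oy,
# so each point correspondence gives two linear equations in (f, ox, oy).
def solve_intrinsics(points, pixels):
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(points, pixels):
        A.append([X / Z, 1, 0]); b.append(u)
        A.append([Y / Z, 0, 1]); b.append(v)
    sol, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return sol  # (f, ox, oy)

# Synthetic data generated with f = 1000, (ox, oy) = (900, 1800):
pts = [(0.1, 0.2, 1.0), (0.3, 0.4, 1.0)]
pix = [(1000, 2000), (1200, 2200)]
print(solve_intrinsics(pts, pix))  # ≈ [1000., 900., 1800.]
```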
Camera calibration: Direct method
How to calibrate the camera?

$$x = K\,[R \mid t]\,X$$

$$\begin{bmatrix} su \\ sv \\ s \end{bmatrix}
= \begin{bmatrix} * & * & * & * \\ * & * & * & * \\ * & * & * & * \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

James Hays
How do we calibrate a camera?
Known 2d Known 3d world
image coords locations
880 214 312.747 309.140 30.086
43 203 305.796 311.649 30.356
270 197 307.694 312.358 30.418
886 347 310.149 307.186 29.298
745 302 311.937 310.105 29.216
943 128 311.202 307.572 30.682
476 590 307.106 306.876 28.660
419 214 309.317 312.490 30.230
317 335 307.435 310.151 29.318
783 521 308.253 306.300 28.881
235 427 306.650 309.301 28.905
665 429 308.069 306.831 29.189
655 362 309.671 308.834 29.029
427 333 308.255 309.955 29.267
412 415 307.546 308.613 28.963
746 351 311.036 309.206 28.913
434 415 307.518 308.175 29.069
525 234 309.950 311.262 29.990
716 308 312.160 310.772 29.080
602 187 311.988 312.709 30.514

Unknown Camera Parameters

Known 2D image coords [su, sv, s]ᵀ; known 3D world locations [X, Y, Z, 1]ᵀ:

$$\begin{bmatrix} su \\ sv \\ s \end{bmatrix}
= \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

First, work out where X, Y, Z projects to under a candidate M:

$$su = m_{11}X + m_{12}Y + m_{13}Z + m_{14}$$
$$sv = m_{21}X + m_{22}Y + m_{23}Z + m_{24}$$
$$s = m_{31}X + m_{32}Y + m_{33}Z + m_{34}$$

Two equations per 3D-point correspondence:

$$u = \frac{m_{11}X + m_{12}Y + m_{13}Z + m_{14}}{m_{31}X + m_{32}Y + m_{33}Z + m_{34}}, \qquad
v = \frac{m_{21}X + m_{22}Y + m_{23}Z + m_{24}}{m_{31}X + m_{32}Y + m_{33}Z + m_{34}}$$
Unknown Camera Parameters

Next, rearrange into a form where all M coefficients are individually stated in terms of X, Y, Z, u, v. This allows us to form the least-squares matrix:

$$(m_{31}X + m_{32}Y + m_{33}Z + m_{34})\,u = m_{11}X + m_{12}Y + m_{13}Z + m_{14}$$
$$(m_{31}X + m_{32}Y + m_{33}Z + m_{34})\,v = m_{21}X + m_{22}Y + m_{23}Z + m_{24}$$

$$m_{31}uX + m_{32}uY + m_{33}uZ + m_{34}u = m_{11}X + m_{12}Y + m_{13}Z + m_{14}$$
$$m_{31}vX + m_{32}vY + m_{33}vZ + m_{34}v = m_{21}X + m_{22}Y + m_{23}Z + m_{24}$$
Unknown Camera Parameters

Moving everything to one side:

$$0 = m_{11}X + m_{12}Y + m_{13}Z + m_{14} - m_{31}uX - m_{32}uY - m_{33}uZ - m_{34}u$$
$$0 = m_{21}X + m_{22}Y + m_{23}Z + m_{24} - m_{31}vX - m_{32}vY - m_{33}vZ - m_{34}v$$
Unknown Camera Parameters

• Finally, solve for M's entries using linear least squares.
• Method 1 – Ax = b form (fix m34 = 1):

$$\begin{bmatrix}
X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -u_1X_1 & -u_1Y_1 & -u_1Z_1 \\
0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -v_1X_1 & -v_1Y_1 & -v_1Z_1 \\
& & & & & \vdots \\
X_n & Y_n & Z_n & 1 & 0 & 0 & 0 & 0 & -u_nX_n & -u_nY_n & -u_nZ_n \\
0 & 0 & 0 & 0 & X_n & Y_n & Z_n & 1 & -v_nX_n & -v_nY_n & -v_nZ_n
\end{bmatrix}
\begin{bmatrix} m_{11} \\ m_{12} \\ \vdots \\ m_{33} \end{bmatrix}
= \begin{bmatrix} u_1 \\ v_1 \\ \vdots \\ u_n \\ v_n \end{bmatrix}$$

%matlab
M = A\Y;
M = [M; 1];
M = reshape(M,[],3)';
Unknown Camera Parameters

• Or, solve for M's entries using total linear least squares.
• Method 2 – Ax = 0 form:

$$\begin{bmatrix}
X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -u_1X_1 & -u_1Y_1 & -u_1Z_1 & -u_1 \\
0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -v_1X_1 & -v_1Y_1 & -v_1Z_1 & -v_1 \\
& & & & & \vdots \\
X_n & Y_n & Z_n & 1 & 0 & 0 & 0 & 0 & -u_nX_n & -u_nY_n & -u_nZ_n & -u_n \\
0 & 0 & 0 & 0 & X_n & Y_n & Z_n & 1 & -v_nX_n & -v_nY_n & -v_nZ_n & -v_n
\end{bmatrix}
\begin{bmatrix} m_{11} \\ m_{12} \\ \vdots \\ m_{34} \end{bmatrix}
= \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$

%matlab
[U, S, V] = svd(A);
M = V(:,end);
M = reshape(M,[],3)';
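Method 2 can be sketched end to end (a Python version of the SVD approach above; the point data is synthetic and the function name `calibrate_dlt` is ours):

```python
import numpy as np

def calibrate_dlt(world_pts, image_pts):
    """Estimate the 3x4 projection matrix M from >= 6 correspondences
    by solving Ax = 0 with SVD (Method 2, up to scale)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.array(A, float))
    # Right singular vector of the smallest singular value:
    return Vt[-1].reshape(3, 4)

# Synthetic check: project points with a known M, then recover it.
M_true = np.array([[800, 0, 512, 10],
                   [0, 800, 384, 20],
                   [0,   0,   1,  2]], float)
rng = np.random.default_rng(0)
world = rng.uniform(0, 1, (8, 3))
homog = np.hstack([world, np.ones((8, 1))])
proj = homog @ M_true.T
image = proj[:, :2] / proj[:, 2:3]
M_est = calibrate_dlt(world, image)
M_est /= M_est[2, 3] / M_true[2, 3]   # fix the overall scale (and sign)
print(np.allclose(M_est, M_true, atol=1e-5))  # True
```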
Known 2D image coords | Known 3D world locations
1st point: 880 214 (u1, v1) | 312.747 309.140 30.086 (X1, Y1, Z1)
43 203 | 305.796 311.649 30.356
270 197 | 307.694 312.358 30.418
886 347 | 310.149 307.186 29.298
745 302 | 311.937 310.105 29.216
943 128 | 311.202 307.572 30.682
476 590 | 307.106 306.876 28.660
419 214 | 309.317 312.490 30.230
317 335 | 307.435 310.151 29.318
… | …

Projection error is defined by two equations – one for u and one for v. Substituting the 1st point into the first two rows of the Ax = 0 system:

$$\begin{bmatrix}
312.747 & 309.140 & 30.086 & 1 & 0 & 0 & 0 & 0 & -880\cdot 312.747 & -880\cdot 309.140 & -880\cdot 30.086 & -880 \\
0 & 0 & 0 & 0 & 312.747 & 309.140 & 30.086 & 1 & -214\cdot 312.747 & -214\cdot 309.140 & -214\cdot 30.086 & -214 \\
& & & & & \vdots \\
X_n & Y_n & Z_n & 1 & 0 & 0 & 0 & 0 & -u_nX_n & -u_nY_n & -u_nZ_n & -u_n \\
0 & 0 & 0 & 0 & X_n & Y_n & Z_n & 1 & -v_nX_n & -v_nY_n & -v_nZ_n & -v_n
\end{bmatrix}
\begin{bmatrix} m_{11} \\ m_{12} \\ \vdots \\ m_{34} \end{bmatrix}
= \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$
Known 2D image coords | Known 3D world locations
880 214 | 312.747 309.140 30.086
2nd point: 43 203 (u2, v2) | 305.796 311.649 30.356 (X2, Y2, Z2)
270 197 | 307.694 312.358 30.418
886 347 | 310.149 307.186 29.298
745 302 | 311.937 310.105 29.216
943 128 | 311.202 307.572 30.682
476 590 | 307.106 306.876 28.660
419 214 | 309.317 312.490 30.230
317 335 | 307.435 310.151 29.318
… | …

Projection error is defined by two equations – one for u and one for v. Adding the 2nd point contributes two more rows:

$$\begin{bmatrix}
312.747 & 309.140 & 30.086 & 1 & 0 & 0 & 0 & 0 & -880\cdot 312.747 & -880\cdot 309.140 & -880\cdot 30.086 & -880 \\
0 & 0 & 0 & 0 & 312.747 & 309.140 & 30.086 & 1 & -214\cdot 312.747 & -214\cdot 309.140 & -214\cdot 30.086 & -214 \\
305.796 & 311.649 & 30.356 & 1 & 0 & 0 & 0 & 0 & -43\cdot 305.796 & -43\cdot 311.649 & -43\cdot 30.356 & -43 \\
0 & 0 & 0 & 0 & 305.796 & 311.649 & 30.356 & 1 & -203\cdot 305.796 & -203\cdot 311.649 & -203\cdot 30.356 & -203 \\
& & & & & \vdots \\
X_n & Y_n & Z_n & 1 & 0 & 0 & 0 & 0 & -u_nX_n & -u_nY_n & -u_nZ_n & -u_n \\
0 & 0 & 0 & 0 & X_n & Y_n & Z_n & 1 & -v_nX_n & -v_nY_n & -v_nZ_n & -v_n
\end{bmatrix}
\begin{bmatrix} m_{11} \\ m_{12} \\ \vdots \\ m_{34} \end{bmatrix}
= \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$

(Method 1) Step 2: find intrinsic/extrinsic parameters

• Let's define the following vectors:

• The solutions are as follows (see the book chapter for details):

• The remaining parameters are then easily computed ....

(Method 1) Exercise
%matlab
% Selected image corner coordinates (pixels), detected with the
% minimum-eigenvalue corner detector:
fineC = [1161 2857; 2370 2698; 1661 1611; 1417 2279; 2361 2882; 2058 2412;
         1540 2853; 2218 2272; 2390 2317; 1904 3022; 2060 2232; 2617 1732;
         2403 2121; 2597 1948; 2052 2586; 1791 1985; 2083 1666; 1544 1930;
         1664 2123; 2535 2761; 1930 1833; 1278 1625; 2202 2638; 2067 2045;
         2382 2510; 2196 2819; 2237 1886; 1910 2532; 1026 3082; 2075 1857;
         2244 1686; 1654 2599];

% World coordinates measured from the calibration object (in mm):
WorldCor = [0 84 56; 168 0 112; 28 0 252; 0 28 140; 168 0 84; 112 0 140;
            0 0 28; 140 0 168; 168 0 168; 84 0 28; 112 0 168; 196 0 252;
            168 0 196; 196 0 224; 112 0 112; 56 0 196; 112 0 252; 0 0 196;
            28 0 168; 196 0 112; 84 0 224; 0 56 252; 140 0 112; 112 0 196;
            168 0 140; 140 0 84; 140 0 224; 84 0 112; 0 112 28; 112 0 224;
            140 0 252; 28 0 84];

I = imread('Img.jpg');
I = rgb2gray(I);
imshow(I);

C = corner(I, 'MinimumEigenvalue', 40, 'QualityLevel', 0.4);
imshow(I);
hold on
plot(fineC(:,1), fineC(:,2), 'r*');

% Build the 2n x 12 measurement matrix P (two rows per correspondence):
n = 32;
P(1:2*n, 1:12) = 0;
j = 1;
for i = 1:2:64
    P(i,1) = WorldCor(j,1); P(i+1,5) = WorldCor(j,1);
    P(i,2) = WorldCor(j,2); P(i+1,6) = WorldCor(j,2);
    P(i,3) = WorldCor(j,3); P(i+1,7) = WorldCor(j,3);
    P(i,4) = 1;             P(i+1,8) = 1;
    P(i,9:12)   = P(i,1:4) * -1 * fineC(j,1);
    P(i+1,9:12) = P(i,1:4) * -1 * fineC(j,2);
    j = j + 1;
end

% Eigendecomposition of P'*P to find 'm': the column of V with the
% minimum eigenvalue (equivalent to the SVD null-space solution).
U = transpose(P) * P;
[V, D] = eig(U);
M = zeros(3,4);
j = 0;
for i = 1:1:3
    M(i,1) = V(j+1,1);
    M(i,2) = V(j+2,1);
    M(i,3) = V(j+3,1);
    M(i,4) = V(j+4,1);
    j = j + 4;
end

A1(1) = M(1,1); A1(2) = M(1,2); A1(3) = M(1,3);
A2(1) = M(2,1); A2(2) = M(2,2); A2(3) = M(2,3);
A3(1) = M(3,1); A3(2) = M(3,2); A3(3) = M(3,3);
B = M(:,4);

f = norm(A3);
R3 = (1/f) * A3;                  % rotation matrix row 3
e = dot(A2, transpose(R3));
temp = A2 - e * R3;
d = norm(temp);
R2 = (1/d) * temp;                % rotation matrix row 2
c = dot(A1, transpose(R3));
b = dot(A1, transpose(R2));
temp2 = A1 - (b * R2) - (c * R3);
a = norm(temp2);
R1 = (1/a) * temp2;               % rotation matrix row 1

% Normalizing values with respect to f, since f = 1 in matrix K.
% These are the intrinsic parameters:
alpha = a / f;
skew = b / f;
u_0 = c / f;
beta_sinTheta = d / f;
v_0 = e / f;
f = f / f;
K = [alpha skew          u_0;
     0     beta_sinTheta v_0;
     0     0             f];      % intrinsic parameter matrix

Camera calibration using a checkerboard (Zhang's method)

Camera Calibration
http://www.vision.caltech.edu/bouguetj/calib_doc/
Stereo vision
• Stereo matching is the process of taking two or more images and reconstructing the 3D shape of an object by finding corresponding points in the images and converting pixel coordinates into 3D coordinates.
• Typically there are two cameras, and we find the corresponding points in the left and right camera images, as shown.
If the coordinate origin is taken as the left camera
If the coordinate origin is the point b midway between the two cameras
If the two cameras are parallel and separated by a fixed baseline b, then:

The projected (mm) coordinates of point P on the left camera:
$$(X_L, Y_L, Z_L) = (X + b/2,\; Y,\; Z)$$
The projected (mm) coordinates of point P on the right camera:
$$(X_R, Y_R, Z_R) = (X - b/2,\; Y,\; Z)$$
The pixel coordinates of P on the left and right cameras:
$$u_l = \frac{f}{Z}\left(X + \frac{b}{2}\right), \qquad u_r = \frac{f}{Z}\left(X - \frac{b}{2}\right)$$
The real-world coordinates of P computed from the pixel coordinates:
$$Z = \frac{bf}{u_L - u_R} = \frac{bf}{d}, \qquad
X = \frac{b(u_L + u_R)}{2(u_L - u_R)} = \frac{b(u_L + u_R)}{2d}, \qquad
Y = \frac{by}{u_L - u_R} = \frac{by}{d}$$
where d = u_L − u_R is the difference between the two image points (the disparity).
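The disparity-to-depth formulas above can be sketched directly (a Python sketch of the parallel-camera model; the numeric example is our own):

```python
def triangulate(uL, uR, y, b, f):
    """Recover (X, Y, Z) from left/right pixel coordinates for two
    parallel cameras with baseline b and focal length f (in pixels)."""
    d = uL - uR                    # disparity
    Z = b * f / d
    X = b * (uL + uR) / (2 * d)
    Y = b * y / d
    return X, Y, Z

# Example: b = 0.1 m, f = 800 px, a point at X = 0.2 m, Y = 0.1 m, Z = 2 m.
b, f, X, Y, Z = 0.1, 800.0, 0.2, 0.1, 2.0
uL = f * (X + b/2) / Z
uR = f * (X - b/2) / Z
y = f * Y / Z
print(triangulate(uL, uR, y, b, f))  # ≈ (0.2, 0.1, 2.0)
```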
Stereo vision
• Triangulate on two images of the same point to recover depth.
• Feature matching across views
• Calibrated cameras
(Figure: left and right views separated by the baseline; matching correlation windows across scan lines.)
Pixel matching

For each epipolar line:
  For each pixel in the left image:
  • compare with every pixel on the same epipolar line in the right image
  • pick the pixel with the minimum match cost

This leaves too much ambiguity, so:
Improvement: match windows
(Seitz)
Correspondence Using Correlation

(Figure: left and right images with a scanline; the SSD error is plotted as a function of disparity.)

Sum of Squared (Pixel) Differences

(Figure: corresponding m×m windows wL at (xL, yL) in IL and wR at (xL − d, yL) in IR.)

wL and wR are corresponding m-by-m windows of pixels. We define the window function:

$$W_m(x, y) = \left\{(u, v) \;\middle|\; x - \tfrac{m}{2} \le u \le x + \tfrac{m}{2},\; y - \tfrac{m}{2} \le v \le y + \tfrac{m}{2}\right\}$$

The SSD cost measures the intensity difference as a function of disparity:

$$C_r(x, y, d) = \sum_{(u,v)\,\in\, W_m(x,y)} \left[\,I_L(u, v) - I_R(u - d, v)\,\right]^2$$
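The SSD window search can be sketched in a few lines (a Python sketch, not from the slides; the function name `ssd_disparity` and the synthetic images are ours):

```python
import numpy as np

def ssd_disparity(IL, IR, x, y, m, d_max):
    """Find the disparity d in [0, d_max] minimizing the SSD cost between
    the m x m window at (x, y) in IL and the window at (x - d, y) in IR."""
    h = m // 2
    wL = IL[y-h:y+h+1, x-h:x+h+1].astype(float)
    costs = []
    for d in range(d_max + 1):
        wR = IR[y-h:y+h+1, x-d-h:x-d+h+1].astype(float)
        costs.append(np.sum((wL - wR) ** 2))
    return int(np.argmin(costs))

# Synthetic check: IR is IL shifted left by 3 columns, i.e. IR(u-3, v) = IL(u, v),
# so the true disparity is 3.
rng = np.random.default_rng(1)
IL = rng.integers(0, 256, (20, 40))
IR = np.roll(IL, -3, axis=1)
print(ssd_disparity(IL, IR, x=20, y=10, m=5, d_max=8))  # -> 3
```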

