3DCV_lec01_camera
1
Augmented Vision
• Head: Prof. Didier Stricker
• Founded in July 2008
• 60 PhD students and full-time researchers
• 3 strongly connected research areas
Computer Vision
Virtual and Augmented Reality
Body Sensor Networks
2
3D Computer Vision
Lecture:
• Room: 01-006
• Time: Wednesday, 10:00 to 11:30
• In-person lecture
• SWS: 2V+1Ü
• Credit points: 4 CP
• Language: English
3D Computer Vision
Exercise:
• Room: 11-243
• Time: Thursday, 15:30 to 17:00 (tomorrow!)
• Q&A-Style session with tutors
• Quizzes
• Programming exercise
• Final presentation
3D Computer Vision
Quizzes:
• After every lecture (11:30) in OLAT.
• Usually 1 week to complete each quiz (check deadlines in OLAT).
Exercise:
• Large programming task
• Available on 30th of October (11:30)
• Deadline on 16th of January (23:59)
• Final presentation of your solution
• (Optional) Introduction to Python & Jupyter Notebook
• Available on OLAT after this lecture
3D Computer Vision
Exam:
• 28.02.2025, 13:30, 46-210 tentative
• 27.03.2025, 13:30, 46-210 tentative
• Tutors:
• Marcel Rogge
• Katharina Bendig
• Topics:
Camera pose
• Introduction: what is a camera? What is an image?
• Camera model and camera calibration
• Fitting and parameter estimation
• 2D image transformations (mappings) and panoramas
• Two cameras: epipolar geometry and triangulation
3D reconstruction
• Multiple-view reconstruction
• Depth maps and multiple-view stereo reconstruction
• Structured light: laser, coded light
Texturing
8
Overview
• Digital cameras
• Types of sensors
• Color
9
Augmented Vision: Maintenance
1998-2000
Augmented Reality: Marker-less tracking
2002, 2004
Augmented Things
The universal interface for the Internet of Things
13
Autonomous Driving
14
Body tracking
15
Example
3D model
http://youtu.be/Hg792_Mh1Cg
Theta camera 360° (RICOH)
• Image stack (1 cm, 60 images)
Source: http://cvpr2019.thecvf.com
• Today
23
Overview
• Digital cameras
• Types of sensors
• Color
24
Today: The Camera
25
How do we see the world?
26
Slide by Steve Seitz
Pinhole camera
27
Slide by Steve Seitz
Pinhole camera model
• Pinhole model:
• Captures pencil of rays – all rays through a single point
• The point is called Center of Projection (focal point)
• The image is formed on the Image Plane
28
Slide by Steve Seitz
Dimensionality Reduction Machine (3D to 2D)
3D world 2D image
Point of observation
• Many-to-one: any point along the same ray maps to the same point in the image
• Points → points
• But the projection of points on the focal plane is undefined
32
Perspective distortion
• Problem for architectural photography: converging verticals
33
Source: F. Durand
Perspective distortion
• What does a sphere project to?
34
Image source: F. Durand
Perspective distortion
• What does a sphere project to?
35
Perspective distortion
• The exterior columns appear bigger
• The distortion is not due to lens flaws
• Problem pointed out by Da Vinci
36
Slide by F. Durand
Perspective distortion: People
37
Modeling projection
38
Source: J. Ponce, S. Seitz
Modeling projection
• Projection equations (see the sketch below)
• Compute intersection with Π’ of ray from P = (x,y,z) to O
• Derived using similar triangles
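The equations themselves did not survive the text export; as a sketch of the standard similar-triangles result (with the center of projection O at the origin, the optical axis along z, and writing f for the distance from O to the image plane Π′ placed in front of O so the image is not inverted):

\[
x' = f\,\frac{x}{z}, \qquad y' = f\,\frac{y}{z}
\]

so P = (x, y, z) projects to (x′, y′) = (f·x/z, f·y/z) on Π′. Points farther away (larger z) project closer to the principal point, which is the familiar perspective foreshortening: distant objects appear smaller.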
Overview
• Digital cameras
• Types of sensors
• Color
40
Building a real camera
41
Camera Obscura
42
Source: A. Efros
Home-made pinhole camera
35mm-pinhole-camera
Why so blurry?
43
Slide by A. Efros http://www.debevec.org/Pinhole/
Shrinking the aperture
Diffraction effects
45
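A rough rule of thumb for the diffraction limit (added here for context; the exact numbers are not on the slide): the blur spot (Airy disk) produced by a circular aperture has a diameter of roughly

\[
d \approx 2.44\,\lambda\,N
\]

where λ is the wavelength and N the f-number (for a pinhole, the distance-to-diameter ratio). At λ ≈ 550 nm and N = 32 this is already ≈ 43 µm, so shrinking the aperture further blurs the image instead of sharpening it.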
The reason for lenses
46
Slide by Steve Seitz
Adding a lens
47
Slide by Steve Seitz
Adding a lens
focal point
48
Slide by Steve Seitz
Adding a lens
“circle of confusion”
49
Slide by Steve Seitz
Thin lens formula
1/D′ + 1/D = 1/f   (object distance D, image distance D′, focal length f)
http://www.cambridgeincolour.com/tutorials/depth-of-field.htm
Slide by A. Efros
55
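A quick worked example of the thin lens formula (illustrative numbers, not from the slide): a lens with f = 50 mm focused on an object at D = 2 m forms a sharp image at

\[
\frac{1}{D'} = \frac{1}{f} - \frac{1}{D} = \frac{1}{50\,\mathrm{mm}} - \frac{1}{2000\,\mathrm{mm}} \;\Rightarrow\; D' \approx 51.3\,\mathrm{mm},
\]

slightly more than one focal length behind the lens; as D → ∞, D′ → f, which is why distant scenes are in focus when the sensor sits in the focal plane.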
How can we control the depth of field?
• Changing the aperture size affects depth of field
• A smaller aperture increases the range in which the object is approximately in focus
• But a small aperture reduces the amount of light – need to increase the exposure time
At f/5.6, the flowers are isolated from the background.
At f/32, the background is distracting.
56
Flower images from Wikipedia http://en.wikipedia.org/wiki/Depth_of_field Slide by A. Efros
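To make the light-loss trade-off concrete (a back-of-the-envelope addition, not from the slide): the admitted light scales with the aperture area, i.e. with 1/N² for f-number N, so stopping down from f/5.6 to f/32 reduces the light by

\[
\left(\frac{32}{5.6}\right)^2 \approx 33\times \quad (\text{about 5 stops}),
\]

which has to be compensated by a correspondingly longer exposure time or a higher ISO setting.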
Varying the aperture
58
Slide by A. Efros
Field of View
61
Source: F. Durand
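The field-of-view figure is not reproduced in this text version; the usual relation it illustrates is, for a sensor of width d and focal length f,

\[
\mathrm{fov} = 2\,\arctan\!\left(\frac{d}{2f}\right),
\]

e.g. (illustrative numbers) d = 36 mm with f = 50 mm gives fov ≈ 39.6°, while shorter focal lengths give a wider field of view and longer ones a narrower, more telescopic view.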
Real lenses
62
Lens Flaws: Chromatic Aberration
63
Lens flaws: Vignetting
64
Radial Distortion
• Caused by imperfect lenses
• Deviations are most noticeable for rays that pass through the edge of the lens
65
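A minimal sketch of the polynomial radial distortion model that is commonly used to describe this effect (the specific parameterization and the coefficient values below are illustrative assumptions, not taken from the slide):

import numpy as np

def apply_radial_distortion(x, y, k1, k2):
    # x, y: normalized image coordinates (relative to the principal point,
    # divided by the focal length); k1, k2: radial distortion coefficients
    r2 = x**2 + y**2
    factor = 1.0 + k1 * r2 + k2 * r2**2
    return x * factor, y * factor

# Illustrative example: a negative k1 models barrel distortion,
# pulling points near the image border toward the center.
xd, yd = apply_radial_distortion(0.8, 0.6, k1=-0.2, k2=0.05)
print(xd, yd)   # 0.68 0.51

Camera calibration (a later topic in this course) estimates k1 and k2 together with the other camera parameters, so the distortion can be undone before further processing.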
Overview
• Digital cameras
• Types of sensors
• Color
66
Digital camera
67
Slide by Steve Seitz
CCD vs. CMOS
• CCD: transports the charge across the chip and reads it at one corner of the array. An analog-to-digital converter (ADC) then turns each pixel's value into a digital value by measuring the amount of charge at each photosite and converting that measurement to binary form.
• CMOS: uses several transistors at each pixel to amplify and move the charge using more traditional wires. The CMOS signal is digital, so it needs no ADC.
http://electronics.howstuffworks.com/digital-camera.htm
68
http://www.dalsa.com/shared/content/pdfs/CCD_vs_CMOS_Litwiller_2005.pdf
High Dynamic Range Images
• CCD/CMOS sensors have a given dynamic range for capturing the luminance of the scene – LDR (Low Dynamic Range) cameras
• Problem: the dynamic range of the scene illumination is higher than the dynamic range of the camera
• This results in:
• Underexposed areas
• Saturated areas
69
High Dynamic Range Images
• High dynamic range image
- Standard RGB images are encoded with 8-bit integers per channel (values 0–255)
- HDR: 12-, 16- or 32-bit floating point
See: www.OpenEXR.com
70
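A hedged sketch of how a higher dynamic range image can be obtained from an LDR sensor by exposure bracketing (simplifying assumptions: a linear sensor response and perfectly aligned images; this is an illustration, not a method prescribed by the slide):

import numpy as np

def merge_exposures(images, exposure_times):
    # images: list of aligned, linear LDR images scaled to [0, 1]
    # exposure_times: corresponding exposure times in seconds
    acc = np.zeros_like(images[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        # hat-shaped weight: trust mid-range pixels, down-weight
        # underexposed and saturated ones
        w = 1.0 - np.abs(2.0 * img - 1.0)
        acc += w * (img / t)
        weight_sum += w
    # floating-point relative radiance map
    return acc / np.maximum(weight_sum, 1e-8)

The merged result is a floating-point radiance map, matching the 16- or 32-bit HDR encodings (e.g. OpenEXR) mentioned above.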
Color sensing in camera: Color filter array
Bayer grid
Estimate missing components from neighboring values (demosaicing)
• 3 CCD camera
• requires three chips and precise alignment
• More expensive
CCD(R)
CCD(G)
CCD(B)
72
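A minimal bilinear-demosaicing sketch for the Bayer grid above (the RGGB layout and the use of SciPy for the interpolation are assumptions for illustration, not specified on the slide):

import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    # raw: 2D float mosaic with an RGGB Bayer layout:
    #   R G R G ...
    #   G B G B ...
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True     # red sites
    masks[0::2, 1::2, 1] = True     # green sites on red rows
    masks[1::2, 0::2, 1] = True     # green sites on blue rows
    masks[1::2, 1::2, 2] = True     # blue sites
    kernel = np.array([[1., 2., 1.],
                       [2., 4., 2.],
                       [1., 2., 1.]])
    for c in range(3):
        vals = np.where(masks[..., c], raw, 0.0)
        num = convolve(vals, kernel, mode='mirror')
        den = convolve(masks[..., c].astype(float), kernel, mode='mirror')
        # keep measured samples, fill the missing ones from their neighbors
        rgb[..., c] = np.where(masks[..., c], raw, num / den)
    return rgb

Real cameras use more sophisticated, edge-aware demosaicing; interpolation errors are one source of the color artefacts listed on the next slide.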
Issues with digital cameras
• Noise
• low light is where you most notice noise
• light sensitivity (ISO) / noise tradeoff
• stuck pixels
• In-camera processing
• Over-sharpening can produce halos
• Blooming
• Charge overflowing into neighboring pixels
• Color artefacts
• Purple fringing from microlenses, artefacts from Bayer patterns
• white balance
73
Slide by Steve Seitz
How to build your own camera
• https://www.youtube.com/watch?v=PaXweP73NT4
• https://www.instructables.com/DIY-Image-Sensor-and-Digital-Camera/
74
Historical context
• Pinhole model: Mozi (470-390 BCE), Aristotle (384-322 BCE)
• Principles of optics (including lenses): Alhacen (965-1039 CE)
Figure: Alhacen’s notes
76