Computational Imaging

Lecture 05: Cameras II—Image Signal Processing

Qilin Sun(孙启霖)

香港中文大学(深圳)

点昀技术(Point Spread Technology)


Today's Topic: Bayer Domain Processing



Bayer pattern

wikipedia
High-dimensional integration over angle, wavelength, time: the plenoptic function [Adelson 1991]
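In Adelson and Bergen's formulation, the plenoptic function is the 7D function P(θ, φ, λ, t, V_x, V_y, V_z): the radiance arriving at viewpoint (V_x, V_y, V_z), from direction (θ, φ), at wavelength λ and time t.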
Approaches to capturing color: field sequential, multiple sensors, vertically stacked



Gabriel Lippmann

Ø Notable French inventor

Ø Nobel Prize for color photography in 1908: a volume emulsion capturing interference

Ø Today, this process is most similar to volume holography!

Ø Also invented integral imaging (will hear more…)



beam splitter prism

Philips / wikipedia



Stacked Sensor

Foveon X3

Sigma SD9



OmniVision:
RGB + near IR!



Other Wavelengths

Ø Thermal IR

Ø Often use germanium optics (transparent in the IR)

Ø Sensors don't use silicon: indium, mercury, lead, etc.

FLIR Systems
RAW image (dcraw -D) vs. JPEG image



Ø Dark Count Correction
Ø Hot/Dead Pixel Removal
Ø Vignetting Correction
Ø White Balance
Ø Auto Exposure
Ø Distortion Correction
Ø Anti-aliasing
Ø Demosaicking
Ø Edge Enhancement
Ø Color Correction
Ø Tone Mapping
Ø Compression



ISP pipeline block diagram (stage order):
Sensor Input → Dead Pixel Correction → Black Level Compensation → Lens Shading Correction → Anti-aliasing Noise Filter → AWB Gain Control → Noise Filter for Chroma → CFA Interpolation → Gamma Correction → Color Correction Matrix → Color Space Conversion → Noise Filter for Luma/Chroma (non-local means denoise, bilateral filter) → Edge Enhancement → False Color Suppression → Hue/Saturation Control → Contrast/Brightness Control → Data Formatter
Marc Levoy, CS 448
EXIF metadata (Exchangeable Image File Format)



Source: sensor manufacturing defects
Types of defective pixels:
Ø dead (always low)
Ø hot (always high)
Ø stuck (always a value between low and high)



Ø The differences between the center pixel and its 8 neighboring pixels are calculated; if the pixel differs from all of them by more than a threshold, it is treated as defective.



Ø Correction: interpolate with a gradient-based filter

Ø Compute the gradient along each direction (horizontal, vertical, and the two diagonals)

Ø The output replaces the defective pixel with the average of its two neighbors along the direction of minimum gradient (see the sketch below)
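A minimal NumPy sketch of this detection-plus-correction step, assuming same-color neighbors two pixels apart in the Bayer mosaic; the lecture's exact weights are not given, so the hypothetical threshold `thresh` and the gradient definitions here are illustrative.

```python
import numpy as np

def dead_pixel_correction(raw, thresh=30):
    """Detect and repair dead/hot/stuck pixels in a single-channel Bayer mosaic (sketch)."""
    src = raw.astype(np.int32)
    out = src.copy()
    H, W = raw.shape
    for y in range(2, H - 2):
        for x in range(2, W - 2):
            p0 = src[y, x]
            # 8 same-color neighbors (two pixels away on a Bayer mosaic)
            nb = [src[y-2, x-2], src[y-2, x], src[y-2, x+2],
                  src[y,   x-2],              src[y,   x+2],
                  src[y+2, x-2], src[y+2, x], src[y+2, x+2]]
            # defective if it differs from *all* 8 neighbors by more than thresh
            if all(abs(p0 - n) > thresh for n in nb):
                # directional gradients: vertical, horizontal, two diagonals
                dv  = abs(2 * p0 - src[y-2, x]   - src[y+2, x])
                dh  = abs(2 * p0 - src[y,   x-2] - src[y,   x+2])
                dd1 = abs(2 * p0 - src[y-2, x-2] - src[y+2, x+2])
                dd2 = abs(2 * p0 - src[y-2, x+2] - src[y+2, x-2])
                d = min(dv, dh, dd1, dd2)
                # interpolate along the direction of minimum gradient
                if d == dv:
                    out[y, x] = (src[y-2, x] + src[y+2, x]) // 2
                elif d == dh:
                    out[y, x] = (src[y, x-2] + src[y, x+2]) // 2
                elif d == dd1:
                    out[y, x] = (src[y-2, x-2] + src[y+2, x+2]) // 2
                else:
                    out[y, x] = (src[y-2, x+2] + src[y+2, x-2]) // 2
    return out.astype(raw.dtype)
```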



Source: circuit design (the sensor output is not purely zero even in complete darkness)

Correction: subtract per-channel offsets, where R_offset, Gr_offset, Gb_offset, B_offset, α, β are configurable parameters (see the sketch below).
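Since the correction formula itself is not reproduced above, the sketch below follows one common convention (an assumption, not taken from the lecture) in which each Bayer channel gets its own offset and α, β couple the green channels to the corrected R and B values; an RGGB layout is assumed.

```python
import numpy as np

def black_level_correction(bayer, r_off, gr_off, gb_off, b_off, alpha=0.0, beta=0.0):
    """Per-channel black level correction on an RGGB Bayer mosaic (hedged sketch)."""
    out = bayer.astype(np.float32)
    R,  Gr = out[0::2, 0::2], out[0::2, 1::2]   # assumes RGGB layout
    Gb, B  = out[1::2, 0::2], out[1::2, 1::2]
    R -= r_off
    B -= b_off
    # alpha/beta couple the green channels to the already-corrected R and B
    Gr += -gr_off + alpha * R / 256.0
    Gb += -gb_off + beta * B / 256.0
    return np.clip(out, 0, None)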



Non-local means
Ø Most image details occur repeatedly
Ø Each color indicates a group of squares
in the image which are almost
indistinguishable
Ø Image self-similarity can be used to
eliminate noise
Ø It suffices to average the squares which resemble each other
Image and movie denoising by nonlocal means
Buades, Coll, Morel, IJCV 2006
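A naive single-channel non-local means sketch of the idea above: each pixel is replaced by a weighted average of pixels whose surrounding patches look similar. The parameters (`patch`, `search`, `h`) are illustrative, and this direct implementation is far too slow for production use.

```python
import numpy as np

def nlm_denoise(img, patch=3, search=10, h=10.0):
    """Naive non-local means for a single-channel float image (sketch)."""
    pad = patch // 2
    padded = np.pad(img.astype(np.float32), pad + search, mode='reflect')
    H, W = img.shape
    out = np.zeros((H, W), dtype=np.float32)
    for y in range(H):
        for x in range(W):
            cy, cx = y + pad + search, x + pad + search
            ref = padded[cy-pad:cy+pad+1, cx-pad:cx+pad+1]
            weights, acc = 0.0, 0.0
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = padded[cy+dy-pad:cy+dy+pad+1, cx+dx-pad:cx+dx+pad+1]
                    d2 = np.mean((ref - cand) ** 2)    # patch distance
                    w = np.exp(-d2 / (h * h))          # similarity weight
                    weights += w
                    acc += w * padded[cy+dy, cx+dx]
            out[y, x] = acc / weights
    return out
```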



BM3D (Block-Matching and 3D filtering)



Approach
Ø Pixels are covered by color filters
Ø Twice as many green pixels as red or blue ones
Ø Green is interpreted as "intensity"
Ø Missing color values are interpolated for every pixel
Ø More in a later lecture
Ø Note: camera specs report the total number of pixels in a sensor
Ø i.e. an image recorded by a 1-chip camera consists of only 1/3 measured values; 2/3 are interpolated



Automated selection of key camera control values
Ø auto-focus
Ø auto-exposure
Ø auto-white-balance



Target: removing unrealistic color casts



Ø Estimate the color temperature first

Ø where R_avg, G_avg, B_avg are the per-channel averages in the Bayer domain, and the digital gain is applied to all channels to keep the overall image brightness constant (see the sketch below).
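A gray-world sketch of the gains implied above, assuming an RGGB Bayer mosaic: the green channel is left unchanged and R/B are scaled toward the green average, which roughly preserves overall brightness. Any extra digital gain mentioned on the slide is not modeled here.

```python
import numpy as np

def gray_world_awb(bayer):
    """Gray-world white balance gains for an RGGB Bayer mosaic (sketch)."""
    R = bayer[0::2, 0::2].mean()
    G = 0.5 * (bayer[0::2, 1::2].mean() + bayer[1::2, 0::2].mean())
    B = bayer[1::2, 1::2].mean()
    r_gain, g_gain, b_gain = G / R, 1.0, G / B   # green left unchanged
    out = bayer.astype(np.float32)
    out[0::2, 0::2] *= r_gain
    out[1::2, 1::2] *= b_gain
    return out, (r_gain, g_gain, b_gain)
```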



Ø Use scene mode
Ø Use gray world assumption (R = G = B) in sRGB space
Ø really, just R = B; ignore G
Ø Estimate the color temperature in a given image
Ø apply pre-computed matrices to get sRGB for T1 and T2
Ø calculate the average values of R, B
Ø solve for α, and use it to interpolate the matrices (or 1/T)



Ø Goal: well-exposed image (not a very well defined goal!)
Ø Possible parameters to adjust
Ø Exposure time
Ø Longer exposure time leads to a brighter image, but also to motion blur
Ø Aperture (f-number)
Ø Larger aperture (smaller f-number) lets in more light, making the image brighter, but also makes the depth of field shallower
Ø Phone cameras often have fixed aperture
Ø Analog and digital gain
Ø Higher gain makes image brighter but amplifies noise as well
Ø ND filters on some cameras



Ø Cumulative distribution function (CDF) of image intensity values
Ø P percent of image pixels have an intensity below a level Y



Ø Adjustment examples
Ø P = 0.995, Y = 0.9
Ø max 0.5% of pixels are saturated (highlights)
Ø P = 0.1, Y = 0.1
Ø max 10% of pixels are under-exposed (shadows)
Ø Auto-exposure somewhere in between, e.g., P = 0.9, Y = 0.4

(Example images: highlights-protecting exposure, auto-exposure, shadows-protecting exposure)
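A sketch of the percentile test described above: compare the P-th percentile of normalized luma against the target level Y and derive a multiplicative exposure adjustment. The control loop that feeds this back into exposure time, aperture, or gain is omitted.

```python
import numpy as np

def exposure_adjustment(luma, p=0.9, y_target=0.4):
    """Auto-exposure metric (sketch). 'luma' is assumed normalized to [0, 1].
    p=0.995/y=0.9 protects highlights, p=0.1/y=0.1 protects shadows, as on the slide."""
    luma = np.clip(luma.astype(np.float32), 1e-6, 1.0)
    y_p = np.quantile(luma, p)     # intensity below which fraction p of pixels fall
    return y_target / y_p          # >1 means increase exposure, <1 means decrease
```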


Ø Passive autofocus: contrast detection

Ø an image sensor will see an object as contrasty if it’s in focus, or of low contrast if it’s not
Ø move the lens until the image falling on the sensor is contrasty
Ø compute a contrast score from the local gradients of pixel values (see the sketch below)
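A minimal focus-measure sketch for contrast-detection autofocus: the sum of squared local gradients, which peaks when the image is sharpest. The lens-sweep logic (`capture_at`, `lens_positions`) is hypothetical.

```python
import numpy as np

def focus_measure(gray):
    """Contrast-detection focus score (sketch): sum of squared local gradients."""
    gy, gx = np.gradient(gray.astype(np.float32))
    return float(np.sum(gx * gx + gy * gy))

# Hypothetical usage: sweep lens positions and keep the sharpest one, e.g.
# best = max(lens_positions, key=lambda p: focus_measure(capture_at(p)))
```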



Passive autofocus: contrast detection

cinema lenses
do not autofocus

(howstuffworks.com)

Ø requires repeated measurements as the lens moves; measurements are captured using the main sensor
Ø equivalent to depth-from-focus in computer vision
Ø slow, requires hunting, suffers from overshooting (it's OK if still cameras overshoot, but video cameras shouldn't)
Ø AI servo (Canon) / Continuous servo (Nikon)
Ø predictive tracking so focus doesn’t lag axially moving objects
Ø continues as long as shutter is pressed halfway
Ø focusing versus metering
Ø autofocus first, then meter on those points
Ø "trap focus"
Ø trigger a shot if an object comes into focus (Nikon)
Ø depth of field focusing
Ø find closest and furthest object; set focus and N accordingly
Ø overriding autofocus
Ø manually triggered autofocus (AF-ON in Canon)
Ø all autofocus methods fail if object is textureless!



(Goldberg)

Ø SONAR = Sound Navigation and Ranging


Ø Polaroid system used ultrasound (50 kHz)
Ø well outside human hearing (20 Hz to 20 kHz)
Ø limited range, stopped by glass
Ø hardware salvaged and re-used in amateur robotics



Ø LIDAR = Light Detection and Ranging
Ø PointSpread Tech OKulo uses 940 nm infrared light
Ø light travels about 1 foot per nanosecond, so accuracy requires fast circuitry (±5 mm is typical)
Ø works in bright scenes
The technology depicted here is
not used on any current consumer
camera. Don’t confuse it with the
LED that turns on to illuminate a
dark scene and thereby assist
many cameras with phase or
contrast based passive
autofocusing. That LED is
sometimes called the “autofocus
assist light”.

Ø infrared (IR) LED flash reflects from subject


Ø angle of returned reflection depends on distance
Ø fails on dark or shiny objects



Source: irradiance at a pixel is proportional to
Ø projected area of the aperture as seen from the pixel
Ø projected area of the pixel as seen from the aperture
Ø 1/distance² from the aperture to the pixel
Combining all of these: light falls off as cos⁴θ
Types of lens shading:
Ø luma shading
Ø color shading
(Figure: color shading vs. luma shading vs. ideal)



Multiply each pixel by a position-dependent gain

Ø Take an image under uniform illumination and divide it into m×n blocks. Each block's four corner points get a correction coefficient; store the coefficients in a look-up table (LUT).
Ø From the pixel location, determine which block the pixel falls into, then fetch that block's four coefficients from the LUT (or use surface fitting instead).
Ø Compute the pixel's correction gain by interpolating the four coefficients.
Ø Multiply the pixel by the correction gain (see the sketch below).
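A sketch of the mesh-based correction described above, for a single-channel raw image: `gain_lut` holds one gain per block corner ((m+1)×(n+1) values) and each pixel's gain is bilinearly interpolated from its block's four corners. How the LUT is calibrated from the flat-field image is left out.

```python
import numpy as np

def lens_shading_correction(img, gain_lut):
    """Apply a mesh gain LUT with per-pixel bilinear interpolation (sketch)."""
    H, W = img.shape
    m, n = gain_lut.shape[0] - 1, gain_lut.shape[1] - 1
    # fractional block coordinates of every pixel
    yy = np.linspace(0, m, H)
    xx = np.linspace(0, n, W)
    y0 = np.floor(yy).astype(int).clip(0, m - 1)
    x0 = np.floor(xx).astype(int).clip(0, n - 1)
    fy = (yy - y0)[:, None]
    fx = (xx - x0)[None, :]
    g00 = gain_lut[y0][:, x0]        # four corner gains of each pixel's block
    g01 = gain_lut[y0][:, x0 + 1]
    g10 = gain_lut[y0 + 1][:, x0]
    g11 = gain_lut[y0 + 1][:, x0 + 1]
    gain = (g00 * (1 - fy) * (1 - fx) + g01 * (1 - fy) * fx
            + g10 * fy * (1 - fx) + g11 * fy * fx)
    return img * gain
```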



Source: sampling a high-frequency image at too low a sampling rate (aliasing)
Two layers of birefringent material
Ø split one ray into 4 rays

(Figure: without anti-aliasing filter vs. with anti-aliasing filter)



Source: sensor color filter array

Ø Advantage: Easy to implement

Ø Disadvantage: Fails at sharp edges
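Assuming the simple method referred to here is bilinear interpolation, a sketch for an RGGB mosaic: each missing sample is the average of its nearest same-color neighbors, implemented with two small convolution kernels. It is easy to write but blurs and fringes at sharp edges.

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(bayer):
    """Bilinear demosaicking of an RGGB Bayer mosaic (sketch)."""
    H, W = bayer.shape
    r = np.zeros((H, W)); g = np.zeros((H, W)); b = np.zeros((H, W))
    r[0::2, 0::2] = bayer[0::2, 0::2]
    g[0::2, 1::2] = bayer[0::2, 1::2]
    g[1::2, 0::2] = bayer[1::2, 0::2]
    b[1::2, 1::2] = bayer[1::2, 1::2]
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0   # green kernel
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0   # red/blue kernel
    return np.dstack([convolve(r, k_rb), convolve(g, k_g), convolve(b, k_rb)])
```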



Ø Use bilateral filtering
Ø avoid interpolating across edges

Adaptive Demosaicking. Ramanath, Snyder, JEI 2003



Ø Predict edges and adjust
Ø assumptions
Ø luminance correlates with RGB
Ø edges = luminance change
Ø When estimating G at R
Ø if the measured R differs from the bilinearly estimated R
Ø → the luminance changes there
Ø Correct the bilinear estimate by the difference between the estimate and the real value (see the sketch below)

High-Quality Linear Interpolation for Demosaicing of Bayer-Patterned Color Images. Malvar, He, Cutler, ICASSP 2004
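A sketch of that correction for one case (G at an interior red pixel), following the gradient-correction idea of Malvar et al.: start from the bilinear G estimate and add a fraction (α = 1/2 in the paper) of the difference between the measured R and its bilinear estimate from same-color neighbors.

```python
import numpy as np

def green_at_red(bayer, y, x, alpha=0.5):
    """Gradient-corrected G estimate at an interior red pixel (y, x) of an RGGB mosaic."""
    g_bilinear = (bayer[y-1, x] + bayer[y+1, x] +
                  bayer[y, x-1] + bayer[y, x+1]) / 4.0
    r_bilinear = (bayer[y-2, x] + bayer[y+2, x] +
                  bayer[y, x-2] + bayer[y, x+2]) / 4.0
    # add a fraction of the red "Laplacian" (measured R minus its bilinear estimate)
    return g_bilinear + alpha * (bayer[y, x] - r_bilinear)
```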
Color Correction



Source:
Ø Spectral characteristics of the optics (lens, filters)
Ø Lighting source variations such as daylight, fluorescent, or tungsten
Ø Characteristics of the sensor's color filters

The Color Correction Matrix (CCM) converts the sensor RGB color space into sRGB (or another target color space).
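A sketch of applying a CCM to a linear RGB image in [0, 1]. The matrix values shown are illustrative only (a real CCM comes from calibration against a color chart under the target illuminant); rows sum to 1 so neutral gray is preserved.

```python
import numpy as np

def apply_ccm(rgb, ccm):
    """Apply a 3x3 color correction matrix to an HxWx3 linear RGB image (sketch)."""
    h, w, _ = rgb.shape
    out = rgb.reshape(-1, 3) @ np.asarray(ccm, dtype=np.float32).T
    return np.clip(out, 0, 1).reshape(h, w, 3)

# Illustrative matrix (not from the lecture); each row sums to 1.
ccm_example = [[ 1.6, -0.4, -0.2],
               [-0.3,  1.5, -0.2],
               [ 0.0, -0.6,  1.6]]
```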



Edge Enhancement



Ø Unsharp masking algorithm:
Ø Gain. This controls the extent to which contrast in the edge detected area is
enhanced.
Ø Radius or aperture. This affects the size of the edges to be detected or enhanced,
and the size of the area surrounding the edge that will be altered by the
enhancement. A smaller radius will result in enhancement being applied only to
sharper, finer edges, and the enhancement being confined to a smaller area
around the edge.
Ø Threshold. Where available, this adjusts the sensitivity of the edge detection
mechanism. A lower threshold results in more subtle boundaries of color being
identified as edges. A threshold that is too low may result in some small parts of
surface textures, film grain or noise being incorrectly identified as being an edge.
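A sketch of unsharp masking with the three controls above, assuming a float image in [0, 1]: `radius` is used as the Gaussian blur sigma, `threshold` zeroes out low-contrast detail so noise and grain are not amplified, and `gain` scales what remains.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, gain=1.0, radius=2.0, threshold=0.02):
    """Unsharp masking with gain, radius, and threshold controls (sketch)."""
    blurred = gaussian_filter(img.astype(np.float32), sigma=radius)
    detail = img - blurred                                   # high-frequency mask
    detail = np.where(np.abs(detail) >= threshold, detail, 0.0)
    return np.clip(img + gain * detail, 0.0, 1.0)
```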



Ø Extract the edge map from the Y channel (of YCbCr)

Ø The filter is applied over a 3×5 region

Ø The extracted edge map is computed from this neighborhood, where i and j range from −2 to 2 in steps of 1


Ø The edge map (EM) is further modified through a lookup table (EMLUT),
where m1 and m2 are the gains for the two thresholds; when −x1 < x ≤ x2, the pixel is treated as noise rather than an edge.
Ø There are two methods to calculate the Y value:



Ø The Avg is calculated from the gradients of the neighboring pixels: find the minimum gradient, then Avg takes the value along that direction.

Ø The EMLUT(x) output is further clipped before being added to the original Y channel.

Ø To avoid overshoot on edges, the Y' is further clipped for output (see the sketch below).
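Since the EMLUT formula is not reproduced above, the sketch below is one plausible reading: edge-map values inside the noise band (−x1, x2] are zeroed, values outside it are scaled by the gains m1/m2, the result is clipped, and the clipped amount is added to Y, which is then clipped again to avoid overshoot.

```python
import numpy as np

def edge_enhance_y(y, em, m1, m2, x1, x2, clip_lo, clip_hi, y_min=0, y_max=255):
    """Edge enhancement on the Y channel via an edge-map lookup (hedged sketch)."""
    em = em.astype(np.float32)
    lut = np.where(em <= -x1, m1 * em,
          np.where(em > x2, m2 * em, 0.0))     # noise band mapped to zero
    lut = np.clip(lut, clip_lo, clip_hi)       # clip the EMLUT output
    return np.clip(y.astype(np.float32) + lut, y_min, y_max)
```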



False Color Suppression



Source:
Ø the demosaicking phase, where very fine details are resolved
Ø lens, optical filter, dynamic range compression, etc.
Conducted on the chroma (Cb, Cr) channels; luma (Y) is not affected.
Ø The edge map acquired from Edge Enhancement is taken by absolute value, EM' = abs(EM), and then clipped to the range [edge_min, edge_max].



Ø The chroma gain is calculated from the clipped edge map.

Ø False color suppression is applied to the chroma channels when the edge response is larger than edge_min (see the sketch below).
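A hedged sketch of the suppression step (the slide's gain formula is not reproduced): map the clipped absolute edge map to a chroma gain that falls from 1 at edge_min to 0 at edge_max, so chroma is pulled toward neutral (128) only where the edge response is strong; luma is untouched.

```python
import numpy as np

def false_color_suppression(cb, cr, em, edge_min, edge_max):
    """Attenuate chroma near strong edges using the edge map (hedged sketch)."""
    em_abs = np.clip(np.abs(em.astype(np.float32)), edge_min, edge_max)
    gain = (edge_max - em_abs) / float(edge_max - edge_min)   # 1 at edge_min, 0 at edge_max
    cb_out = gain * (cb.astype(np.float32) - 128.0) + 128.0
    cr_out = gain * (cr.astype(np.float32) - 128.0) + 128.0
    return cb_out, cr_out
```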



Brightness/Contrast Control



Ø Brightness and contrast are both applied to the luma channel.

Ø Brightness is an additive offset that ranges over [-128, 127].

Ø Contrast is the adjustment ratio and ranges over [0, 1.992x] (see the sketch below).
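A hedged sketch of the two adjustments (the exact formulas are not reproduced above): brightness as an additive offset on 8-bit luma, contrast as a rescaling about mid-gray, with the result clipped to [0, 255].

```python
import numpy as np

def brightness_contrast(y, brightness=0, contrast=1.0):
    """Brightness/contrast control on an 8-bit luma channel (hedged sketch)."""
    y = y.astype(np.float32) + brightness         # additive brightness offset
    y = (y - 128.0) * contrast + 128.0            # contrast about mid-gray
    return np.clip(y, 0, 255)
```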



Gamma and Tone Mapping



Ø Goal: accurately reproduce relative scene luminances on a display screen
Ø absolute luminance is impossible to reproduce
Ø humans are sensitive to relative luminance anyway
Ø in some workflows, pixel value is made proportional to scene luminance, in
other systems to perceived brightness
Ø in CRTs, luminance was proportional to voltage^γ with γ ≈ 2.5, so TV cameras were designed to output voltage ∝ scene luminance^(1/γ)
Ø pixel value ∝ luminance^(1/2.5) is roughly perceptually uniform, so it's a good space for quantization, for example in JPEG files



Ø Manual editing
Ø capture image in RAW mode, then fiddle with the histogram in Photoshop, dcraw, Canon Digital Photo Professional, etc.
Ø to expand contrast, apply an S-curve to pixel values (cambridgeincolour.com)
Ø Gamma transform (in addition to the RAW→JPEG gamma)
Ø output = input^γ (for 0 ≤ I_i ≤ 1)
Ø simple but crude
(Example images: original, γ = 0.5, γ = 2.0)
Ø Global versus local transformations
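A sketch of the global transforms above for a float image in [0, 1]: a plain gamma curve and a simple logistic S-curve for expanding mid-tone contrast (the S-curve's exact shape is illustrative, not the one used in the lecture).

```python
import numpy as np

def gamma_transform(img, gamma):
    """Global gamma: output = input**gamma for input in [0, 1]; gamma < 1 brightens."""
    return np.clip(img, 0.0, 1.0) ** gamma

def s_curve(img, strength=5.0):
    """Logistic S-curve that expands mid-tone contrast, rescaled so 0 -> 0 and 1 -> 1."""
    x = np.clip(img, 0.0, 1.0)
    s = 1.0 / (1.0 + np.exp(-strength * (x - 0.5)))
    lo = 1.0 / (1.0 + np.exp(strength * 0.5))
    hi = 1.0 / (1.0 + np.exp(-strength * 0.5))
    return (s - lo) / (hi - lo)
```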



Thank You!

Qilin Sun(孙启霖)

香港中文大学(深圳)

点昀技术(Point Spread Technology)
