Unit - 3
COMPUTER VISION
17AIX07
ALIGNMENT
January | 2024
CONTENTS
Points and Patches
Edges
Lines
Segmentation
Pose Estimation
Points and Patches
Points
• Interest Points
• Corner Points
• Edge Points
• Scale-Invariant Feature Transform (SIFT) Points
Patches
• Image Patches
• Descriptor Patches
• Texture Patches
• Matching Patches
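To make these ideas concrete, here is a minimal sketch (not part of the original slides) that detects SIFT interest points and matches their patch descriptors between two images with OpenCV; the file names and the 0.75 ratio threshold are illustrative assumptions, and cv2.SIFT_create requires OpenCV 4.4 or later.

```python
# Minimal sketch: SIFT interest points + patch-descriptor matching.
# File names are placeholders; requires opencv-python >= 4.4.
import cv2

img1 = cv2.imread("scene_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene_b.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()                        # scale-invariant detector/descriptor
kp1, des1 = sift.detectAndCompute(img1, None)   # interest points + 128-D descriptors
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep matches that pass Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = []
for pair in matches:
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])
print(f"{len(kp1)} and {len(kp2)} keypoints, {len(good)} good matches")
```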
Applications
1. Object Recognition
2. Image Stitching
3. Machine Learning
4. Texture Analysis
Edges
Definition
In computer vision, edges are locations where the image intensity changes sharply, typically at the boundaries of objects or surfaces. Detecting edges is a basic step in many image-analysis pipelines.
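As a hedged illustration, the following sketch runs the Canny detector on a smoothed grayscale image; the file name and the threshold values are assumptions chosen for illustration.

```python
# Minimal sketch of gradient-based edge detection with the Canny detector.
import cv2

img = cv2.imread("input.jpg", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(img, (5, 5), 1.4)            # suppress noise before differentiation
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)
cv2.imwrite("edges.png", edges)
```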
Applications
1. Object Recognition
2. Image Segmentation
3. Feature Extraction
4. Image Stitching
5. Quality Inspection
6. Gesture Recognition
Lines
Definition
In computer vision, lines are significant geometric features that represent straight or curved
paths within an image. Detecting lines is a fundamental aspect of image processing and
computer vision, contributing to tasks such as object recognition, scene analysis, and image
understanding.
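As an illustrative sketch, the probabilistic Hough transform below detects straight line segments on a Canny edge map; the image path and all parameter values are assumptions.

```python
# Minimal sketch: straight-line detection with the probabilistic Hough transform.
import cv2
import numpy as np

img = cv2.imread("road.jpg", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                        threshold=80, minLineLength=30, maxLineGap=10)
out = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(out, (x1, y1), (x2, y2), (0, 0, 255), 2)   # draw detected segments
cv2.imwrite("lines.png", out)
```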
Applications
1. Document Analysis
2. Robotics
3. Gesture Recognition
Segmentation
Definition
Segmentation partitions an image into regions whose pixels share properties such as colour, intensity, or texture, so that objects and structures can be analysed separately. Common approaches include the following.
Clustering
Utilizes clustering algorithms, such as k-means, to group pixels with similar features into clusters, representing different segments (a short k-means sketch follows this list).
Edge-Based Segmentation
Detects edges and boundaries between different regions in an image, defining the limits of distinct segments.
Watershed Segmentation
Mimics the behavior of water flowing along gradients in an image. Watershed lines mark boundaries between segments.
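The sketch below illustrates the clustering approach with k-means on RGB pixel values; k = 4 and the file names are illustrative assumptions.

```python
# Minimal sketch of clustering-based segmentation: k-means on RGB pixels.
import cv2
import numpy as np

img = cv2.imread("scene.jpg")
pixels = img.reshape(-1, 3).astype(np.float32)

k = 4
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5,
                                cv2.KMEANS_RANDOM_CENTERS)

# Replace each pixel by its cluster centre to visualise the segments.
segmented = centers[labels.flatten()].reshape(img.shape).astype(np.uint8)
cv2.imwrite("segmented.png", segmented)
```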
Applications
1. Object Recognition
2. Autonomous Vehicles
3. Facial Recognition
4. Video Surveillance
5. Robotics
Split and Merge
Definition
Split and merge is a region-based segmentation technique: the image is recursively split into sub-regions until each is sufficiently homogeneous, and adjacent homogeneous regions are then merged into larger segments.
Merging
• After the splitting phase, regions are
Recursive Splitting Homogeneity Criteria
examined for homogeneity.
The image is initially The decision to split or merge • Adjacent regions that satisfy the
split into smaller sub- regions is based on homogeneity homogeneity criteria are merged to
regions. This splitting criteria, which can include color form larger, more homogeneous
process is applied consistency, intensity similarity, segments.
recursively until certain or texture uniformity within a • The merging process continues
criteria are met or a region. until stopping criteria are satisfied.
predefined segmentation
level is reached.
Splitting
Stopping Criteria • The image is initially considered as a Region Merging
single region. If the region does not After splitting, regions are merged if
The process continues until stopping
meet the homogeneity criteria, it is they satisfy the homogeneity criteria.
criteria are met. Stopping criteria could
split into smaller sub-regions. Merging involves combining adjacent
include reaching a specified number of •
The splitting process is applied regions to create larger, more
segments, achieving a desired level of
recursively until stopping criteria are coherent segments.
homogeneity, or other application-specific
met.
requirements.
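The following toy sketch illustrates the splitting half of the procedure on a grayscale array: a region is split recursively while its intensity range exceeds a homogeneity threshold, and each accepted region is filled with its mean value (merging of adjacent regions is omitted for brevity). The threshold value is an illustrative assumption.

```python
# Toy sketch of the recursive splitting step of split-and-merge.
import numpy as np

def split_and_fill(img, out, r0, r1, c0, c1, thresh=20):
    region = img[r0:r1, c0:c1]
    if region.size == 0:
        return
    # Homogeneity criterion: intensity range within the region.
    if region.max() - region.min() <= thresh or min(r1 - r0, c1 - c0) <= 2:
        out[r0:r1, c0:c1] = region.mean()        # leaf: treat as one segment
        return
    rm, cm = (r0 + r1) // 2, (c0 + c1) // 2      # split into four quadrants
    for rs, re, cs, ce in [(r0, rm, c0, cm), (r0, rm, cm, c1),
                           (rm, r1, c0, cm), (rm, r1, cm, c1)]:
        split_and_fill(img, out, rs, re, cs, ce, thresh)

img = np.random.randint(0, 256, (64, 64)).astype(np.float32)
out = np.zeros_like(img)
split_and_fill(img, out, 0, img.shape[0], 0, img.shape[1])
```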
Applications
1. Image Segmentation
2. Texture Analysis
3. Object Recognition
4. Remote Sensing
Mean Shift and Mode Finding
Definition
Mean shift is a non-parametric, mode-seeking technique: each pixel or feature point is iteratively shifted toward the densest nearby region of feature space, so that points converge to the modes (peaks) of the underlying distribution. Mode finding identifies those peaks directly, for example from histograms.
Histogram Analysis
Mode finding often involves analyzing histograms to identify peaks, which correspond to the modes in the distribution of pixel values or feature responses.
Intensity Peaks
For grayscale images, mode finding can identify intensity levels that occur most frequently, providing insights into the overall brightness or contrast.
Dominant Colors
In image processing, mode finding can be applied to identify dominant colors within an image. This is useful in tasks like color-based image segmentation.
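As a brief illustration, the sketch below applies OpenCV's mean-shift filtering, which pulls pixels toward nearby colour modes, and reports the most frequent grayscale intensity from a histogram; the radii and the file names are illustrative assumptions.

```python
# Minimal sketch: mean-shift filtering plus a histogram-mode check.
import cv2
import numpy as np

img = cv2.imread("scene.jpg")
# Spatial radius 15 px, colour radius 30: pixels converge to nearby colour modes.
filtered = cv2.pyrMeanShiftFiltering(img, sp=15, sr=30)
cv2.imwrite("meanshift.png", filtered)

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
hist = np.bincount(gray.ravel(), minlength=256)
print("most frequent intensity (histogram mode):", int(hist.argmax()))
```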
Normalized Cuts
Definition
Normalized cuts treats segmentation as graph partitioning: the image graph is split so that the cost of the cut is normalized by the total connection strength of each partition, which favours balanced, coherent segments over isolated pixels.
Graph Representation
Represent the image as a graph, where pixels are nodes, and edges
represent pairwise relationships. Edge weights typically capture the
similarity between pixel intensities or feature vectors.
Affinity Matrix
Construct an affinity matrix based on the pairwise similarities. Common
choices for similarity measures include Gaussian functions or gradients.
Spectral Decomposition
Decompose the affinity matrix into its eigenvectors and eigenvalues. The
eigenvectors capture the structural information of the graph.
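A small NumPy sketch of this recipe on a tiny synthetic image follows: Gaussian affinities between pixel intensities, the symmetric normalized Laplacian, and a two-way partition from the second-smallest eigenvector; the image size and sigma are illustrative assumptions, and the spatial term of the affinity is omitted for brevity.

```python
# Minimal normalized-cut-style sketch: affinity matrix, normalized Laplacian,
# and a 2-way split from the Fiedler vector.
import numpy as np

img = np.random.rand(16, 16)                    # stand-in for a grayscale patch
vals = img.ravel()

# Affinity matrix: similarity of pixel intensities (spatial term omitted).
sigma = 0.1
W = np.exp(-((vals[:, None] - vals[None, :]) ** 2) / (2 * sigma ** 2))

# Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
d = W.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(len(vals)) - D_inv_sqrt @ W @ D_inv_sqrt

# The eigenvector of the second-smallest eigenvalue gives the 2-way cut.
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]
labels = (fiedler > 0).astype(int).reshape(img.shape)
print("segment sizes:", np.bincount(labels.ravel()))
```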
Applications
1. Image Segmentation
2. Object Recognition
3. Video Segmentation
4. Scene Understanding
5. Image Editing
Graph Cuts and Energy-Based Methods
Definition
Graph cuts and energy-based methods formulate vision problems as energy minimization: pixels become nodes of a graph, candidate labellings are scored by an energy function, and the lowest-energy labelling is found, often via a minimum cut (maximum flow) on that graph.
Energy Function
Define an energy function that quantifies the goodness or badness of a particular configuration. The energy function typically consists of data and smoothness terms.
Data Term
Measures the agreement between the observed data and the predicted configuration. Lower energy is assigned to configurations that better match the observed data.
Smoothness Term
Encodes the preference for smooth configurations. It penalizes abrupt changes and encourages coherence in neighboring regions.
Optimization
Minimize the energy function by searching for the configuration that achieves the lowest energy. This can be done using optimization techniques such as gradient descent or graph cuts.
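To ground these terms, the sketch below evaluates (but does not minimise) a typical binary-labelling energy with a squared-difference data term and a Potts smoothness term; the class means and the weight lambda are illustrative assumptions, and in practice the minimisation would be done with graph cuts or a similar optimizer.

```python
# Minimal sketch of the energy of a binary labelling x of an image:
# E(x) = sum_p D_p(x_p) + lambda * sum_{p,q} [x_p != x_q] over 4-neighbours.
import numpy as np

def energy(img, labels, fg_mean, bg_mean, lam=2.0):
    # Data term: squared difference to the mean of the assigned class.
    data = np.where(labels == 1, (img - fg_mean) ** 2, (img - bg_mean) ** 2).sum()
    # Smoothness term (Potts model): penalise label changes between neighbours.
    smooth = ((labels[:, 1:] != labels[:, :-1]).sum()
              + (labels[1:, :] != labels[:-1, :]).sum())
    return data + lam * smooth

img = np.random.rand(32, 32)
labels = (img > 0.5).astype(int)                # a candidate configuration
print("energy of this labelling:", energy(img, labels, fg_mean=0.8, bg_mean=0.2))
```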
Applications
1. Image Segmentation
2. Interactive Segmentation
3. Video Segmentation
4. Stereo Matching
5. Object Recognition
6. Image Denoising
7. Image Restoration
8. Motion Estimation
9. Image Inpainting
2D and 3D Feature-Based Alignment
Definition
2D and 3D feature-based alignment estimates the geometric transformation, such as a translation, rotation, affine map, homography, or rigid 3D motion, that best maps corresponding features detected in one image or point set onto another.
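As an illustrative sketch of 2D feature-based alignment, the code below matches ORB keypoints between two images and estimates a homography with RANSAC; the file names, the number of features, and the RANSAC threshold are assumptions.

```python
# Minimal sketch: 2D feature-based alignment via ORB matches + RANSAC homography.
import cv2
import numpy as np

img1 = cv2.imread("frame_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_b.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# Robustly estimate the 3x3 transform that aligns img1 to img2.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
aligned = cv2.warpPerspective(img1, H, (img2.shape[1], img2.shape[0]))
cv2.imwrite("aligned.png", aligned)
```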
Applications
1. Robotics
2. 3D Reconstruction
3. Object Tracking
4. Autonomous Vehicles
5. Medical Imaging
Geometric Intrinsic Calibration
Definition
Geometric intrinsic calibration estimates a camera's internal parameters, such as focal length, principal point, and lens-distortion coefficients, so that pixel coordinates can be related accurately to directions in the 3D scene.
Key Considerations
Calibration Patterns
Known patterns, such as checkerboards, are imaged from multiple viewpoints to provide the correspondences used for calibration (see the sketch after the applications list).
Variability in Environment
Changes in lighting conditions or environmental factors can impact the accuracy of calibration.
Lens Distortion Modeling
Accurate modeling of lens distortion is crucial for precise calibration and correction.
Applications
1. Robotics
2. 3D Reconstruction
3. Computer Vision Systems
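A minimal calibration sketch follows: chessboard corners are detected in a set of images and passed to cv2.calibrateCamera to recover the intrinsic matrix and distortion coefficients; the board size and the calib/*.jpg path are placeholders for an actual capture set.

```python
# Minimal sketch of intrinsic calibration from chessboard images.
import glob
import cv2
import numpy as np

board = (9, 6)                                  # inner corners per row/column
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)   # planar grid

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.jpg"):           # assumes at least a few valid views
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Recover the camera matrix (focal lengths, principal point) and distortion.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("intrinsic matrix K:\n", K)
```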
THANK YOU