Lecture01-Part I

The document outlines a Data Science program focused on Image Processing, covering key concepts, theories, and algorithms in digital image processing. It includes a detailed course outline, essential and recommended textbooks, and an introduction to digital image processing, image sources, and the formation and representation of digital images. The course aims to develop hands-on experience with tools like MATLAB and Python's OpenCV library while fostering critical thinking about current advancements in the field.


DATA SCIENCE PROGRAM

DS617

Image Processing

Lecture 1 – Part I
Course Objectives
 Cover basic concepts, theories, and algorithms widely used in digital image processing
 Develop hands-on experience in processing digital images
 Familiarize with image processing tools (MATLAB Image Processing Toolbox / OpenCV library with Python)
 Develop critical thinking about the state of the art

Course Outline
1. Introduction To Digital Image Processing
2. Digital Image Enhancement
A. Intensity Transformations
B. Digital Image Filtering
I. In Spatial Domain
II. In Frequency Domain
3. Digital Image Restoration
4. Morphological Digital Image Processing
5. Wavelets And Multi-Resolution Processing
6. Color Digital Image Processing
7. Digital Image Segmentation
8. Representation And Description
9. Digital Image Compression
Course Essential Textbooks
 Digital Image Processing
Rafael C. Gonzalez and Richard E. Woods
4th Edition, Pearson Prentice Hall, 2018
http://www.imageprocessingplace.com/DIP-4E/dip4e_main_page.htm
Course Recommended Textbooks - 1
 Digital Image Processing Using MATLAB
Rafael C. Gonzalez, Richard E. Woods and Steven L. Eddins
2nd Edition, Pearson Prentice Hall, 2009
http://www.imageprocessingplace.com/DIPUM2E/dipum2e_main_page.htm

 Fundamentals of Digital Image Processing: A Practical Approach with Examples in MATLAB
Chris Solomon and Toby Breckon
1st Edition, Wiley-Blackwell, 2011
https://www.wiley.com/en-eg/Fundamentals+of+Digital+Image+Processing%3A+A+Practical+Approach+with+Examples+in+Matlab-p-9781119957003
Course Recommended Textbooks - 2
 Hands-On Image Processing with Python: Advanced Methods for
Analyzing, Transforming, and Interpreting Digital Images with
Expertise
Sandipan Dey
2nd Edition, Packt Publishing, 2025
Lecture Outline
 Introduction To Digital Image Processing
1. Image Definition
2. Image Sources
3. Image Formation
4. Image Processing Definition
5. Digital Image Processing Definition
6. Digital Image Definition
7. Digital Image Capturing
8. Digital Image Sensing
9. Digital Image Representation
10. Criteria For Best Image Representation
11. Digital Image Types
12. Fields Of Digital Image Processing
13. Fundamental Steps In Digital Image Processing
What Is An Image?
 An image is a projection of a 3D scene into a 2D projection plane.

 An image is defined mathematically as a 2D function of two variables, f(x,y): R² → R, where for each position (x,y) in the projection plane, f(x,y) defines the light intensity at that position.

 x and y are spatial coordinates, and the amplitude of f at any pair of coordinates (x,y) is called the intensity or gray level (scale) of the image at those coordinates.
What Is An Image? - 2
 Assume that an image's intensity values can be any real number in the range 0.0 (black) to 1.0 (white).
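As a minimal sketch of this definition, an image with real-valued intensities in [0.0, 1.0] can be modeled as a small 2D NumPy array (the array values and shape here are illustrative, not from the lecture):

```python
import numpy as np

# A tiny synthetic "image": a 2D function f(x, y) with real-valued
# intensities in [0.0, 1.0] -- 0.0 is black, 1.0 is white.
f = np.array([
    [0.0, 0.2, 0.4],
    [0.2, 0.5, 0.8],
    [0.4, 0.8, 1.0],
])

# f(x, y) gives the light intensity at spatial position (x, y).
print(f[1, 2])           # intensity at row 1, column 2 -> 0.8
print(f.min(), f.max())  # full range: 0.0 (black) to 1.0 (white)
```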
What Are Image Sources?
 The principal source of images is the electromagnetic (EM) energy spectrum, ranging from gamma rays to radio waves.
 EM waves can be visualized as propagating sinusoidal waves of varying wavelengths.
 Images based on radiation from the EM spectrum are the most familiar, especially those in the X-ray and visible bands of the spectrum.
What Are Image Sources? - 2
 The visible color spectrum is divided into six broad regions: violet, blue, green, yellow, orange, and red.
 The colors that humans perceive in an object are determined by the nature of the light reflected from the object.
 An object that reflects light relatively balanced in all visible wavelengths appears white to the observer.
 However, an object that favors reflectance in a limited range of the visible spectrum exhibits some shade of color.
 Light is a particular type of EM radiation that can be sensed by the human eye.
 Light that is void of color is called monochromatic (achromatic) light; its only attribute is its intensity, or amount.
What Are Image Sources? - 3
 Because the perceived intensity of monochromatic light varies from black through grays and finally to white, the term gray level is commonly used to denote monochromatic intensity.
 The range of measured values of monochromatic light from black to white is usually called the gray scale.
 The other type of light is chromatic (colored) light.
 Three basic quantities are used to describe the quality of a chromatic light source: radiance, luminance, and brightness.
 Radiance is the total amount of energy that flows from the light source; it is usually measured in watts.
What Are Image Sources? - 4
 Luminance, measured in lumens, gives a measure of the amount of energy an observer perceives from a light source.
 Brightness is a subjective descriptor (practically impossible to measure) of an observer's perception of light reflected from or absorbed by an object, and it is affected by the background.
What Are Image Sources? - 6
 Other important sources of images include:
 Ultrasound, used for medical diagnosis.
What Are Image Sources? - 7
 Other important sources of images include:
 Electron energy, as in scanning electron microscopy (SEM), which shows information on a material's surface composition and topography.
What Are Image Sources? - 8
 Other important sources of images include:
 Synthetic images generated by computer, used for modeling and visualization.
How Is An Image Formed?
 Most images are generated by the combination of an "illumination" source and the reflection or absorption of energy from that source by the elements of the "scene" being imaged.
 The 2D image function, f(x,y), may be mathematically characterized by two components:
1. The amount of illumination incident on the scene being viewed, i(x,y), which is determined by the illumination source.
2. The fraction of illumination reflected or absorbed by the objects in the scene, r(x,y), which is determined by the characteristics of the imaged objects.
 The two components combine as a product to form f(x,y):
f(x,y) = i(x,y) * r(x,y)
where 0 < i(x,y) < ∞ and 0 < r(x,y) < 1
(r(x,y) → 0: total absorption; r(x,y) → 1: total reflection)
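The product model above can be sketched in a few lines of NumPy (the illumination and reflectance values are made up for illustration):

```python
import numpy as np

# Illumination i(x, y): values in (0, inf); here a bright top row
# and a dim bottom row, as if the scene were unevenly lit.
i = np.array([[90.0, 90.0],
              [10.0, 10.0]])

# Reflectance r(x, y): values in (0, 1); near 0 = almost total
# absorption, near 1 = almost total reflection.
r = np.array([[0.05, 0.90],
              [0.05, 0.90]])

# The image is the element-wise product of the two components.
f = i * r
print(f[0, 1])  # 90.0 * 0.90 = 81.0: bright light on a reflective object
print(f[1, 0])  # 10.0 * 0.05 =  0.5: dim light on an absorbing object
```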
How Is An Image Formed? - 2
 To transform illumination energy into an image:
1. Incoming energy is transformed into a voltage by the combination of input electrical power and a sensor material that is responsive to the particular type of energy being detected.
2. The output voltage waveform is the response of the sensor(s), and a digital quantity is obtained from each sensor by digitizing its response.
3. The individual sensors are arranged in the form of a line or a 2D array.
What Is Image Processing?
 Changing the nature of an image in order to either:
1. Improve its pictorial information for human interpretation.
2. Render it more suitable for autonomous machine perception.
 These represent two separate but equally important aspects of image processing.
 A procedure that satisfies condition (1) (makes an image look better) may be the very worst procedure for satisfying condition (2), and vice versa.
 Humans like their images to be sharp, clear, and detailed.
 Machines prefer their images to be simple and uncluttered.
What Is Image Processing? - 2
 Examples of image processing methods satisfying (1) include:
 Enhancing the edges of an image to make it appear sharper, so that the image looks its best on the printed page.
What Is Image Processing? - 3
 Removing noise (random errors) from an image. Noise is a very common problem in data transmission because all sorts of electronic components may affect data passing through them, and the results may be undesirable. Noise takes many different forms, each type requiring a different method of removal.
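As a hedged illustration of one such removal method: a 3x3 median filter is a classic remedy for salt-and-pepper noise (OpenCV offers this as cv2.medianBlur; the NumPy-only sketch and tiny test image below are illustrative, not the lecture's code):

```python
import numpy as np

def median_filter_3x3(img):
    """Replace each interior pixel with the median of its 3x3
    neighborhood -- a classic fix for salt-and-pepper noise."""
    out = img.copy()
    for m in range(1, img.shape[0] - 1):
        for n in range(1, img.shape[1] - 1):
            out[m, n] = np.median(img[m-1:m+2, n-1:n+2])
    return out

# A flat gray patch corrupted by one "salt" (255) and one "pepper" (0) pixel.
noisy = np.full((5, 5), 100, dtype=np.uint8)
noisy[2, 2] = 255
noisy[1, 3] = 0

clean = median_filter_3x3(noisy)
print(clean[2, 2], clean[1, 3])  # both outliers restored to 100
```

The median is preferred over the mean here because a single extreme outlier cannot drag the median of nine values away from the surrounding gray level.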
What Is Image Processing? - 4
 Removing motion blur from an image. Motion blur may occur when the camera's shutter is open too long relative to the speed of the object. In photographs of fast-moving objects (vehicles, for example), the blur may be considerable.
What Is Image Processing? - 5
 Examples of image processing methods satisfying (2) include:
 Obtaining the edges of an image using edge detection algorithms for measuring objects in an image; once we have their edges, we can measure their spread and the area contained within them.
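A minimal sketch of edges-for-measurement, using simple finite differences rather than a full detector such as OpenCV's cv2.Canny (the synthetic square "object" is illustrative):

```python
import numpy as np

# A white square on a black background -- the "object" to measure.
img = np.zeros((10, 10), dtype=float)
img[3:7, 3:7] = 1.0

# Minimal edge detector: a pixel is an edge wherever intensity changes
# relative to its left or upper neighbor (first finite differences).
gx = np.abs(np.diff(img, axis=1, prepend=0))
gy = np.abs(np.diff(img, axis=0, prepend=0))
edges = (gx + gy) > 0

# With the boundary found, measurement is straightforward:
area = int(img.sum())        # pixels inside the object
print(area)                  # 16 (a 4x4 square)
print(bool(edges[3, 3]))     # True: the square's corner is on the boundary
```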
What Is Image Processing? - 6
 Removing detail from an image. For measurement or counting purposes, we may not be interested in all the detail in an image, so we simplify the image to measure the size and shape of the objects without being distracted by unnecessary detail.
What Is Digital Image Processing?
 Processing digital images by means of a digital computer.
 There is some argument about where digital image processing ends and computer vision starts.
 The continuum from image processing to computer vision can be broken up into low-, mid-, and high-level processes.
What Is Digital Image Processing? - 2

Low-Level Process       Mid-Level Process       High-Level Process
Input: Image            Input: Image            Input: Attributes
Output: Image           Output: Attributes      Output: Understanding
Examples:               Examples:               Examples:
Noise Removal,          Object Segmentation,    Scene Understanding,
Image Sharpening        Object Recognition      Autonomous Navigation

Image Processing           (Overlap)                  Computer Vision
Image Acquisition,         Edge Detection,            Object Detection,
Image Enhancement,         Region Segmentation,       Object Recognition,
Image Restoration,         Feature Extraction,        Object Tracking,
Image Compression, ...     Pattern Matching, ...      Scene Reconstruction, ...
What Is Digital Image Processing? - 3
 From raw data (bottom) to classification/decision (top), algorithm complexity increases while the amount of data decreases:
• LOW: Acquisition, Preprocessing (no intelligence)
• MEDIUM: Edges Extraction & Joining
• HIGH: Recognition, Interpretation (intelligent)
What Is A Digital Image?
 When x, y, and the amplitude values of f(x,y) are all finite, discrete quantities (usually taking on only integer values), we call the image a digital image.
 A digital image is composed of a finite number of elements, each of which has integer spatial coordinates (location) and an intensity value; these elements are referred to as picture elements, image elements, pels, and pixels.
 Pixel is the term used most widely to denote an element of a digital image.
 I(m,n) is the notation used most widely to denote a digital image, where m = 0, 1, ..., M-1 and n = 0, 1, ..., N-1 denote the pixel located at the mth row and nth column, starting from a top-left image origin.
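The I(m,n) indexing convention can be sketched with a small NumPy array (the 3x4 shape and values are illustrative):

```python
import numpy as np

# A digital image I(m, n): an M x N array of integer intensities,
# with the origin (m, n) = (0, 0) at the top-left corner.
M, N = 3, 4
I = np.arange(M * N, dtype=np.uint8).reshape(M, N)
print(I)

# I[m, n] addresses the pixel at row m, column n.
print(I[0, 0])          # top-left pixel -> 0
print(I[M - 1, N - 1])  # bottom-right pixel (m = M-1, n = N-1) -> 11
```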
How Is A Digital Image Captured?
1. Charge-Coupled Device (CCD) Camera: Such a camera has a 2D array of silicon electronic devices (sensors) whose voltage output is proportional to the intensity of light falling on them.
How Is A Digital Image Captured? - 2
2. Flat Bed Scanner: This works on a principle similar to the CCD
camera. Instead of the entire image being captured at once on a
large array, a single row of photosites (sensors) is moved across the
image, capturing it row-by-row as it moves.
How Is A Digital Image Captured? - 3
 Image Acquisition: the continuous scene, f(x,y) = i(x,y) * r(x,y), is captured and digitized into the digital image I(m,n).
How Is A Digital Image Sensed?
 The output of most sensors is an analog voltage waveform whose amplitude and spatial coordinates are continuous.
 To convert it to digital form, we have to digitize the function, making it discrete in both coordinates and amplitude.
 Digitizing the coordinate values is called image sampling.
 Digitizing the amplitude values is called image quantization.
 The quality of a digital image is determined to a large degree by the number of discrete samples (sampling rate) and discrete intensity levels (quantization levels) used in sampling and quantization, respectively.
How Is A Digital Image Sensed? - 2
 Image sampling is dividing the captured continuous image coordinates into discrete pixels of the same size and shape.
 Resolution is a measure of how much of the detail in an image can be seen or perceived.
 Sampling resolution is a measure of the smallest perceptible detail in the image, and it depends on the number of samples used.
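A small sketch of sampling and its risk: taking every 2nd pixel of a finely striped image destroys the stripes entirely, i.e., the under-sampling loss of detail described above (the synthetic striped image is illustrative):

```python
import numpy as np

# An 8x8 "image" with fine detail: alternating vertical stripes
# of black (0) and white (255), one pixel wide.
img = np.tile(np.array([0, 255], dtype=np.uint8), (8, 4))
print(img.shape)  # (8, 8)

# Sample every 2nd pixel in each direction: 8x8 -> 4x4.
sampled = img[::2, ::2]
print(sampled.shape)  # (4, 4)

# Every kept column lands on a black stripe, so the sampled image
# is constant: the stripe detail is completely lost.
print(np.unique(sampled))  # [0]
```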
How Is A Digital Image Sensed? - 3
 Image Sampling (spatial resolution = sampling locations): comparing an original image with an under-sampled version shows that under-sampling loses some image details.
How Is A Digital Image Sensed? - 4
 Image Sampling: if the sampling period (e.g., 1 mm) is at most half the minimum period of detail in the image (e.g., 2 mm), no detail is lost.
 Nyquist Rate: the sampling period must be less than or equal to half the minimum period of the image; equivalently, the sampling frequency must be greater than or equal to twice the maximum frequency.
How Is A Digital Image Sensed? - 5
 Varying Image Sampling: the same image sampled at (a) 1024x1024, (b) 512x512, (c) 256x256, (d) 128x128, (e) 64x64, and (f) 32x32.
 If the resolution is decreased too much, the checkerboard effect can occur.
How Is A Digital Image Sensed? - 6
 Image quantization is dividing the captured image's continuous range of intensity values into a discrete number of intensity (gray) levels, according to the number of bits assigned to encode each gray level.
 Quantization resolution is a measure of the smallest perceptible change in intensity level in the image, and it depends on the number of quantization levels used.
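A minimal sketch of uniform quantization of 8-bit intensities to L = 2^k gray levels (the quantize helper and sample values are illustrative, not a library function):

```python
import numpy as np

def quantize(img, k):
    """Quantize 8-bit intensities down to L = 2**k gray levels by
    mapping each value to the base value of its level."""
    L = 2 ** k
    step = 256 // L              # width of each intensity level
    return (img // step) * step  # snap each pixel to its level

img = np.array([[0, 60, 130, 255]], dtype=np.uint8)
print(quantize(img, 1))  # 2 levels: [[  0   0 128 128]]
print(quantize(img, 2))  # 4 levels: [[  0   0 128 192]]
```

Fewer levels (smaller k) means a coarser quantization resolution: with k = 1 the values 130 and 255 become indistinguishable.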
How Is A Digital Image Sensed? - 7
 Varying Image Quantization: the same image quantized to (e) 16, (f) 8, (g) 4, and (h) 2 levels.
 If the levels are not enough, false contouring can occur in smooth areas.
How Is A Digital Image Sensed? - 8
 Image Sampling & Quantization (illustrated).
How Is A Digital Image Represented?
 After image sampling and quantization, the digitized image function, f(x,y), is represented as a 2D numerical array (matrix) containing M rows and N columns, where x = 0, 1, 2, ..., M-1 and y = 0, 1, 2, ..., N-1.
 There are no restrictions placed on M and N, other than that they must be positive integers.
 Due to storage and quantizing-hardware considerations, the number of intensity levels L is typically an integer power of 2:
L = 2^k
where k is the number of bits assigned to store the intensity value.
How Is A Digital Image Represented? - 2
 Spatial resolution refers to the smallest detectable detail in an image, and it is measured by the number of dots (pixels) per unit distance, e.g., per inch (dpi).
 Intensity resolution (pixel depth) refers to the smallest detectable change in an image's intensity level, and it is measured by the number of bits, k, used to quantize intensity (usually an integer power of two due to hardware considerations).
 The spatial and intensity resolutions of an image interact in determining its size in bits and its perceived quality.
 The number of bits, b, required to store a digitized image is b = M * N * k.
 If M = N, then b = N^2 * k.
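A worked example of the storage formula (the 1024x1024, 8-bit figures are illustrative):

```python
# Storage required for a digitized image: b = M * N * k bits.
M, N = 1024, 1024   # spatial resolution: 1024 x 1024 pixels
k = 8               # intensity resolution: 8 bits per pixel

L = 2 ** k          # number of gray levels
b = M * N * k       # image size in bits

print(L)            # 256 gray levels
print(b)            # 8388608 bits
print(b // 8)       # 1048576 bytes (1 MiB)
```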
How Is A Digital Image Represented? - 3
 The discrete intensity levels are equally spaced integers in the interval [0, L-1].
 The dynamic range of an imaging system is the ratio of the maximum measurable intensity to the minimum detectable intensity level in the system and, consequently, in the output image it can produce.
 Closely associated with the dynamic range concept is image contrast, which is the difference in intensity between the highest and lowest intensity levels in an image.
 When an appreciable number of pixels in an image span a high dynamic range, we can expect the image to have high contrast.
 Conversely, an image with a low dynamic range typically has a dim, washed-out gray look and low contrast.
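Contrast, as defined above, can be computed directly as the difference between the highest and lowest intensity levels (the two tiny sample images are illustrative):

```python
import numpy as np

# Two 8-bit sample patches: one with a narrow intensity spread,
# one spanning nearly the full [0, 255] range.
low_contrast  = np.array([[100, 110], [120, 130]], dtype=np.uint8)
high_contrast = np.array([[  5, 250], [ 30, 220]], dtype=np.uint8)

def contrast(img):
    """Difference between the highest and lowest intensity levels."""
    return int(img.max()) - int(img.min())

print(contrast(low_contrast))   # 30  -> dim, washed-out gray look
print(contrast(high_contrast))  # 245 -> high contrast
```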
How Is A Digital Image Represented? - 4
 As a rule, the upper intensity limit is determined by saturation and the lower limit by noise.
 Saturation is the highest value beyond which all intensity levels are clipped.
 Noise is a variation of the intensity level from its true value by a small (random) amount, due to external or internal factors in the image-processing pipeline.
What Is The Best Digital Image Representation?
 How do we select suitable sampling and quantization resolutions for digitizing an image?
 The word "suitable" is subjective: it depends on the subject.
 Three image categories were considered: a low-detail image (a face), a medium-detail image (the cameraman), and a high-detail image (a crowd).
 Sets of these three types of images were generated by varying N and k, and observers were then asked to rank them according to their subjective quality.
What Is The Best Digital Image Representation? - 2
 The results were summarized in the form of isopreference curves in the Nk-plane.
 Each point in the Nk-plane represents an image having values of N and k equal to the coordinates of that point.
 Points lying on an isopreference curve correspond to images of equal subjective quality.
What Is The Best Digital Image Representation? - 3
 The isopreference curves tend to become more vertical as the detail in the image increases (the crowd image's curve is nearly vertical).
 So, for a fixed value of N, the perceived quality of a high-detail image is nearly independent of the number of intensity levels used.
 For the other two image categories, the perceived quality remained the same over some intervals in which the number of samples increased while the number of intensity levels actually decreased.
 The most likely reason for this result is that a decrease in k tends to increase the apparent contrast, a visual effect that humans often perceive as improved quality in an image.
 So, for images of the same size, low-detail images need more pixel depth, and as image size increases, fewer gray levels are needed.
