Chapter 1 Handouts
The function f(x, y) may be characterized by two components: (1) the amount of source illumination
incident on the scene being viewed, and (2) the amount of illumination reflected by the objects in the
scene. These are called the illumination and reflectance components and are denoted by i(x, y) and r(x, y),
respectively. The two functions combine as a product to form f(x, y):

f(x, y) = i(x, y) r(x, y)
The nature of i(x, y) is determined by the illumination source, and r(x, y) is determined by the
characteristics of the imaged objects. Figure 2 shows a simple image formation model.
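As a concrete illustration of this product model, here is a minimal NumPy sketch (the arrays are synthetic stand-ins chosen for illustration, not data from any figure in this handout) that builds an illumination gradient and a reflectance map and multiplies them:

```python
import numpy as np

# Minimal sketch of the image formation model f(x, y) = i(x, y) * r(x, y).
M, N = 256, 256

# Illumination: a smooth horizontal gradient, e.g. a scene lit from one side.
i = np.linspace(0.2, 1.0, N)[np.newaxis, :].repeat(M, axis=0)

# Reflectance: values in (0, 1); here a bright square on a darker background.
r = np.full((M, N), 0.3)
r[96:160, 96:160] = 0.9

# The observed image is the pointwise product of the two components.
f = i * r
print(f.min(), f.max())  # stays within the product of the two ranges
```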
Image sensors comprise two elements for acquiring digital images. The first is a physical device
that is sensitive to the energy radiated by the object we wish to image. The second, called a digitizer, is a
device for converting the output of the physical sensing device into digital form. For instance, in a digital
video camera, the sensors produce an electrical output proportional to light intensity. The digitizer
converts these outputs to digital data.
Specialized image processing hardware usually consists of the digitizer, plus hardware that performs
other operations, such as an arithmetic logic unit (ALU), which performs arithmetic and logical
operations in parallel on entire images. One example of how an ALU is used is in averaging images as
quickly as they are digitized, for the purpose of noise reduction.
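The averaging operation itself is simple to state in code. The sketch below simulates it in NumPy on synthetic noisy frames (the scene, noise level, and frame count are all assumptions made for illustration); a hardware ALU would perform the same accumulation on the fly as frames arrive:

```python
import numpy as np

# Sketch of frame averaging for noise reduction.
rng = np.random.default_rng(0)
clean = np.full((128, 128), 100.0)                   # hypothetical noise-free scene

K = 16                                               # number of frames to average
frames = clean + rng.normal(0, 20, (K, 128, 128))    # frames with additive Gaussian noise

avg = frames.mean(axis=0)                            # average of the K frames

# The noise standard deviation should drop by roughly sqrt(K).
print(frames[0].std(), avg.std())
```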
The computer in an image processing system is a general-purpose computer and can range from a PC to
a supercomputer. In a general-purpose image processing system, almost any well-equipped PC-type
machine is suitable for off-line image processing tasks.
Image processing software consists of specialized modules that perform specific tasks. A well-designed
package also includes the capability for the user to write code that utilizes the specialized modules.
Mass storage capability is a must in image processing applications. An image of size 1024 × 1024 pixels,
in which the intensity of each pixel is an 8-bit quantity, requires one megabyte of storage space if the
image is not compressed. Hence, providing adequate mass storage in a digital image processing system is necessary.
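The one-megabyte figure follows directly from the image dimensions and bit depth; a quick check:

```python
# Worked check of the storage figure quoted above: a 1024 x 1024 image
# with 8 bits per pixel, uncompressed.
width, height, bits_per_pixel = 1024, 1024, 8

total_bits = width * height * bits_per_pixel
total_bytes = total_bits // 8
print(total_bytes)                 # 1_048_576 bytes
print(total_bytes / 2**20, "MiB")  # exactly 1 MiB
```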
Image displays in use today are mainly color TV monitors. Monitors are driven by the outputs of image
and graphics display cards that are an integral part of the computer system. In some cases, it is necessary
to have stereo displays, and these are implemented in the form of headgear containing two small displays
embedded in goggles worn by the user.
Hardcopy devices for recording images include laser printers, film cameras, heat-sensitive devices,
inkjet units, and digital units, such as optical and CD-ROM disks.
A networking component is required for transmitting image data to remote sites. Most general-purpose
image processing systems today are equipped with networking components.
Figure 17: The electromagnetic spectrum arranged according to energy per photon
8.1. Gamma-Ray Imaging
Major uses of imaging based on gamma rays include nuclear medicine and astronomical observations. In
nuclear medicine, the approach is to inject a patient with a radioactive isotope that emits gamma rays as it
decays. Images are produced from the emissions collected by gamma ray detectors.
8.2. X-Ray Imaging
X-rays are among the oldest sources of EM radiation used for imaging. The best known use of X-rays is
medical diagnostics, but they also are used extensively in industry and other areas, like astronomy. X-rays
for medical and industrial imaging are generated using an X-ray tube, which is a vacuum tube with a
cathode and anode.
8.3. Ultraviolet Imaging
Applications of ultraviolet imaging include lithography, industrial inspection, microscopy, lasers,
biological imaging, and astronomical observations. Ultraviolet light is used in fluorescence microscopy,
one of the fastest growing areas of microscopy.
8.4. Visible and Infrared Imaging
The infrared band is often used in conjunction with visual imaging. Applications range from
pharmaceuticals and microinspection to materials characterization. Another major area of visual
processing is remote sensing, which usually includes several bands in the visual and infrared regions of
the spectrum.
8.5. Microwave Imaging
The dominant application of imaging in the microwave band is radar. The unique feature of imaging radar
is its ability to collect data over virtually any region at any time, regardless of weather or ambient lighting
conditions. Some radar waves can penetrate clouds, and under certain conditions can also see through
vegetation, ice, and extremely dry sand. In many cases, radar is the only way to explore inaccessible
regions of the Earth’s surface.
8.6. Radio Imaging
The major applications of imaging in the radio band are in medicine and astronomy. In medicine, radio
waves are used in magnetic resonance imaging (MRI). This technique places a patient in a powerful
magnet and passes radio waves through his or her body in short pulses. Each pulse causes a responding
pulse of radio waves to be emitted by the patient’s tissues. The location from which these signals
originate and their strength are determined by a computer, which produces a two-dimensional picture of a
section of the patient.
9. Image Sensing and Acquisition
An image sensor or imaging sensor is a sensor that detects and conveys the information that constitutes
an image. It does so by converting light waves (as they pass through or reflect off objects) into signals,
the small bursts of current that convey the information. The waves can be light or other electromagnetic
radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which
include digital cameras, camera modules, medical imaging equipment, night vision equipment such
as thermal imaging devices, radar, sonar, and others.
Figure 18 shows a single imaging sensor used to transform illumination energy into digital images.
Incoming energy is transformed into a voltage by the combination of input electrical power and sensor
material that is responsive to the particular type of energy being detected. The output voltage waveform is
the response of the sensor(s), and a digital quantity is obtained from each sensor by digitizing its
response.
Figure 21: Generating a digital image. (a) Continuous image. (b) A scan line from A to B in the continuous image,
used to illustrate the concepts of sampling and quantization. (c) Sampling and quantization. (d) Digital scan line.
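The two digitization steps in Figure 21, sampling (discretizing the spatial coordinate) and quantization (discretizing the amplitude), can be sketched in a few lines. In the NumPy sketch below, the continuous profile is a made-up sinusoid standing in for the scan line from A to B, and the sample count and level count are arbitrary choices:

```python
import numpy as np

def scan_line(x):
    """Hypothetical continuous intensity profile along the line A-B."""
    return 0.5 + 0.4 * np.sin(2 * np.pi * x)

num_samples = 32   # sampling: digitize the spatial coordinate
levels = 8         # quantization: digitize the amplitude

x = np.linspace(0.0, 1.0, num_samples)   # sample positions along the line
samples = scan_line(x)                   # sampled, but still real-valued

# Quantize each sample to the nearest of `levels` equally spaced values.
quantized = np.round(samples * (levels - 1)).astype(int)
print(quantized)   # the digital scan line, one integer per sample
```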
11. Spatial Resolution
Spatial resolution refers to the number of pixels used in constructing a digital image. Images with
higher spatial resolution are composed of a greater number of pixels than those with lower spatial
resolution. The detail discernible in an image depends on the spatial resolution of the sensor and
refers to the size of the smallest feature that can be detected. Figure 22 shows an image with
different spatial resolutions.
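Lowering spatial resolution can be simulated by keeping only every k-th pixel in each direction, as in this sketch (the input image is a synthetic placeholder, not the image in Figure 22):

```python
import numpy as np

rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(1024, 1024), dtype=np.uint8)

# Larger k keeps fewer pixels and therefore gives lower spatial
# resolution (coarser discernible detail).
for k in (2, 4, 8):
    low_res = image[::k, ::k]
    print(k, low_res.shape)   # (512, 512), (256, 256), (128, 128)
```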
12. Gray Level Resolution
In short, gray level resolution is determined by the number of bits per pixel. Gray level resolution
refers to the predictable or deterministic change in the shades or levels of gray in an image. The
mathematical relation between gray level resolution and bits per pixel is L = 2^k, where L refers to
the number of gray levels and k refers to the number of bits per pixel (bpp). Figure 23 shows an image
with different gray level resolutions.
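The relation L = 2^k can be demonstrated by requantizing an 8-bit image (L = 256) down to fewer levels, as in this sketch (again using a synthetic placeholder image rather than the one in Figure 23):

```python
import numpy as np

rng = np.random.default_rng(2)
image = rng.integers(0, 256, size=(452, 376), dtype=np.uint8)

for k in (7, 4, 1):                  # 128, 16, and 2 gray levels
    L = 2 ** k
    step = 256 // L                  # width of each quantization bin
    reduced = (image // step) * step # snap each pixel to its bin
    print(k, L, np.unique(reduced).size)   # distinct levels present
```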
Figure 22: An image with different spatial resolutions
Figure 23: A 452 × 376 image with gray levels of 256, 128, 64, 32, 16, 8, 4, and 2.