Uploaded by Legesse Samuel

Chapter 4

Fundamental concepts in
video

1
Outline
 Introduction to video data
 Types of Video
 Analog Video
 Digital Video
 Types of Video Display
 Types of Color Video Signals
 Different TV Standards
 Four factors of digital video

2
Introduction
 This chapter introduces the principal notions needed to understand video.
 Digital video compression will be explored in a later chapter.
 Since video is created from a variety of
sources, we begin with the signals
themselves.

3
Introduction
Video is the technology of electronically
 capturing,
 recording,
 processing,
 storing,
 transmitting, and
 reconstructing a sequence of still images
representing scenes in motion.

4
Basic concepts in Video
 Video is a series of images. When this series of images is displayed on screen at a fast rate (e.g. 30 images per second), we perceive motion.
 It projects single images at a fast rate producing the
illusion of continuous motion.
 These single images are called frames.
 The rate at which the frames are projected is
generally between 24 and 30 frames per second (fps).
 Each screen-full of video is made up of thousands of
pixels.
 A pixel is the smallest unit of an image. A pixel can
display only one color at a time.
 Your television has 720 vertical lines of pixels (from
left to right) and 486 rows of pixels (top to bottom).
 A total of 349,920 pixels (720 x 486) for a single
frame.
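The pixel count above can be verified with a quick sketch (all values taken from this slide):

```python
# Pixels in a single frame of a 720 x 486 television display, as described above.
width, height = 720, 486          # pixel columns and pixel rows
pixels_per_frame = width * height
print(pixels_per_frame)           # 349920
```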
5
Types of Video
 There are two types of video:
 Analog video is represented as a continuous
(time-varying) signal.
 Digital video is represented as a sequence of
digital images.

6
Analog Video
 Most TV is still sent and received as an analog signal.
 An analog signal f(t) samples a time-varying image.
So-called “progressive” scanning traces through a
complete picture (a frame) row-wise for each time
interval.
 A high resolution computer monitor typically uses a
time interval of 1/72 second.
 In TV, and in some monitors and multimedia
standards as well, another system, called
“interlaced” scanning is used:
 The odd-numbered lines are traced first, and then the
even-numbered lines are traced. This results in “odd” and
“even” fields —two fields make up one frame.
 In fact, the odd lines (starting from 1) end up at the middle of a line
at the end of the odd field, and the even scan starts at a half-way
point.
7
Analog Video

 Figure 5.1 shows the scheme used. First the solid (odd) lines are
traced, P to Q, then R to S, etc., ending at T; then the even field
starts at U and ends at V.
 The jump from Q to R, etc. in Figure 5.1 is called the horizontal
retrace, during which the electronic beam in the CRT is blanked.
The jump from T to U or V to P is called the vertical retrace.
 The scan lines are not horizontal because a small voltage is
applied, moving the electron beam down over time.

8
Analog Video
 Interlacing was invented because, when
standards were being defined, it was difficult to
transmit the amount of information in a full
frame quickly enough to avoid flicker; the doubled
number of fields presented to the eye reduces
perceived flicker.
 Because of interlacing, the odd and even lines
are displaced in time from each other —generally
not noticeable except when very fast action is
taking place on screen, when blurring may occur.
 Since it is sometimes necessary to change the
frame rate, resize, or even produce stills from an
interlaced source video, various schemes are
used to “de-interlace” it.
9
Analog Video
a) The simplest de-interlacing method consists
of discarding one field and duplicating the
scan lines of the other field. The information
in one field is lost completely using this
simple technique.
b) Other more complicated methods that retain
information from both fields are also
possible.
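Method (a) can be sketched as follows. Here a frame is simply a list of scan lines, and the even-indexed lines are taken as the field to keep (a minimal illustration of the idea, not a production de-interlacer):

```python
def deinterlace_line_double(frame):
    """De-interlace by discarding one field and duplicating the other's lines.

    `frame` is a list of scan lines (line 0 first); the even-indexed lines
    form the kept field.  The other field's information is lost entirely,
    as the slide notes.
    """
    kept_field = frame[0::2]          # keep every second scan line
    out = []
    for line in kept_field:
        out.extend([line, line])      # duplicate each kept line
    return out[:len(frame)]           # same number of lines as the input

frame = ["o1", "e1", "o2", "e2"]      # alternating odd/even field lines
print(deinterlace_line_double(frame)) # ['o1', 'o1', 'o2', 'o2']
```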

10
Digital video
 The advantages of digital representation for
video are many. For example:
 Video can be stored on digital devices or in
memory, ready to be processed (noise removal,
cut and paste, etc.), and integrated into various
multimedia applications;
 Direct access is possible, which makes
nonlinear video editing achievable as a simple,
rather than a complex, task;
 Repeated recording does not degrade image
quality.
 Ease of encryption and better tolerance to channel
noise
11
The disadvantages of digital video are:
 Analog-type distortions, as well as unique
digital distortions related to sampling and
quantizing, result in a variety of visible
impairments.
 Wide bandwidth requirements for
recording, distribution and transmission
necessitate sophisticated bit-rate reduction
and compression schemes to achieve
manageable bandwidths.
 Unlike analog signals, digital signals do
not degrade gracefully and are subject to
a cliff effect.
12
Types of video Display
There are two ways of displaying video on
screen:
 Interlaced scanning
 Progressive scanning

13
Interlaced Scanning
 Interlaced scanning writes every second line
of the picture during a scan, and writes the
other half during the next sweep.
 This way, only 25/30 pictures per
second are needed.
 This idea of splitting up the image into two
parts became known as interlacing, and the
split-up pictures as fields.
 Graphically, a field is basically a picture
with every second line missing.

14
Interlaced Scanning
 During the first scan the upper field is
written on screen.
 The first, 3rd, 5th, etc. line is written and
after writing each line the electron beam
moves to the left again before writing the
next line.
 At this point the picture exhibits a
"combing" effect: it looks as if you are
watching it through a comb.
 When people refer to interlacing artifacts,
or say that their picture is interlaced, this
is what they commonly mean.
 Once all the odd lines have been written,
the electron beam travels back to the
upper left of the screen and starts writing
the even lines.
 Because the phosphor takes a while to stop
emitting light, and because the human brain
is too slow, instead of seeing two fields we
see a combination of both fields: in other
words, the original picture.
15
Interlaced Scanning

16
Progressive Scanning
 PC CRT displays are fundamentally different
from TV screens.
 A monitor writes a whole picture per scan.
 Progressive scan updates all the lines on the
screen at the same time, 60 times every
second.
 This is known as progressive scanning.
 Today all PC screens write a picture like this.

17
Progressive Scanning

18
Comparison between Computer and TV
Displays

Computer                                    Television
Scans 480 horizontal lines from             Scans 525 or 625 horizontal lines
top to bottom
Scans each line progressively               Scans lines using the interlacing
                                            system
Scans a full frame at a rate of             Scans a full frame at 25-30 Hz
typically 66.67 Hz or higher
Uses the RGB color model                    Uses a limited color palette and
                                            restricted luminance (lightness
                                            or darkness)

19
Recording Video
 A CCD (Charge-Coupled Device) is a chip containing a
series of tiny, light-sensitive photosites.
 It forms the heart of all electronic and digital
cameras. CCDs can be thought of as the film of
electronic cameras.
 CCDs consist of thousands or even millions of cells,
 each of which is light-sensitive and capable of
producing varying amounts of charge in response to
the amount of light it receives.
 The electrons pass through an analog-to-digital
converter, which produces a file of encoded digital
information in which bits represent the color and
tonal values of the subject.
20
Recording Video
 A digital camera uses a lens which focuses the image
onto a Charge-Coupled Device (CCD), which then
converts the image into electrical pulses.
 These pulses are then saved into memory.
 In short, just as the film in a conventional camera
records an image when light hits it, the CCD
records the image electronically.
 The photo sites convert light into electrons.
 The performance of a CCD is often measured by
its output resolution, which in turn is a function of
the number of photo sites on the CCD's surface.

21
Recording Video

22
Chrominance
Chrominance (chroma for short) is the signal used in video
systems to convey the color information of the picture,
separately from the accompanying luma signal.

Chrominance is usually represented as two color-difference
components: U = B'–Y' (blue – luma) and V = R'–Y' (red –
luma). Each of these difference components may have scale
factors and offsets applied to it, as specified by the
applicable video standard.

Luma represents the brightness of an image.
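A minimal sketch of these difference components, assuming the common Rec. 601 luma weights (the slide does not name a standard) and ignoring the scale factors and offsets it mentions:

```python
def rgb_to_yuv(r, g, b):
    """Luma Y' plus unscaled differences U = B' - Y' and V = R' - Y'.

    r, g, b are gamma-corrected values in [0, 1]; the luma weights
    0.299/0.587/0.114 are the Rec. 601 coefficients (an assumption here,
    since the slide does not specify a standard).
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, b - y, r - y

y, u, v = rgb_to_yuv(1.0, 0.0, 0.0)           # pure red
print(round(y, 3), round(u, 3), round(v, 3))  # 0.299 -0.299 0.701
```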

23
Types of Video Signals
 Video signals can be organized in three different ways:
component video, composite video, and S-video.
 Component video
 In popular use, it refers to a type of analog video
information that is transmitted or stored as three separate
signals for the red, green, and blue image planes. Each
color channel is sent as a separate video signal.
 This kind of system uses three separate wires (and
connectors) connecting the camera or other devices
to a TV or monitor.
 Most computer systems use Component Video, with
separate R, G, and B signals.
 For any color separation scheme, Component Video gives the
best color reproduction since there is no “crosstalk” between
the three channels.
 This is not the case for S-Video or Composite Video, discussed
next. Component video, however, requires more bandwidth
and good synchronization of the three components.

24
Component video

25
Composite Video — 1 Signal
 Composite video: color (“chrominance”)
and intensity (“luminance”) signals are
mixed into a single carrier wave.
 This type of signal is used by broadcast color TVs; it is
downward compatible with black-and-white TV.
 When connecting to TVs, Composite Video uses only one
wire and video color signals are mixed, not sent
separately. The audio and sync signals are additions to
this one signal.
 Since color and intensity are wrapped into the
same signal, some interference between the
luminance and chrominance signals is
inevitable.
26
S-Video (separate video) — 2
Signals
 S-video, as a compromise, uses two wires: one for
luminance and another for a composite chrominance
signal.
 As a result, there is less crosstalk between the color
information and the crucial gray-scale information.
 The reason for placing luminance into its own part of
the signal is that black-and-white information is most
crucial for visual perception.
 In fact, humans are able to differentiate spatial
resolution in grayscale images with a much higher
acuity than for the color part of color images.
 As a result, we can send less accurate color
information than must be sent for intensity
information — we can only see fairly large blobs of
color, so it makes sense to send less color detail.
27
High definition TV (HDTV)
 HDTV refers to video having resolution substantially
higher than that of traditional television systems. HD
has one to two million pixels per frame.
 The first generation of HDTV was based on an
analog technology developed by Sony in Japan
in the late 1970s.
 Modern plasma televisions use this technology.
 It consists of 720-1080 lines and a higher
number of pixels per line (as many as 1920).
 Having a choice between progressive and
interlaced scanning is one advantage of HDTV.

28
Video Broadcasting Standards/ TV standards
 There are three different video broadcasting
standards:
 PAL,
 NTSC, and
 SECAM

29
NTSC Video
 NTSC (National Television System Committee) TV
standard is mostly used in North America and Japan. It
uses the familiar 4:3 aspect ratio (i.e., the ratio of
picture width to its height) and uses 525 scan
lines per frame at 30 frames per second (fps).
 The problem is that NTSC is an analog system. In
computer video, colors and brightness are represented
by numbers (digital). But with analog television,
everything is just voltages, and voltages are affected
by wire length, connectors, heat, cold, video tape, and
so on.
a) NTSC follows the interlaced scanning system, and each frame
is divided into two fields, with 262.5 lines/field.
b) Thus the horizontal sweep frequency is 525 × 29.97
≈ 15,734 lines/sec.
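The sweep-frequency arithmetic can be checked directly:

```python
lines_per_frame = 525
frames_per_second = 29.97          # NTSC's actual frame rate
sweep_lines_per_sec = lines_per_frame * frames_per_second
print(round(sweep_lines_per_sec))  # 15734
```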
30
PAL Video
 PAL (Phase Alternating Line) is a TV
standard widely used in Western Europe,
China, India, and many other parts of the
world.
 PAL uses 625 scan lines per frame, at
25 frames/second (40 ms/frame), with a 4:3
aspect ratio and interlaced fields.
 The broadcast TV signal uses composite video.

31
SECAM (Sequential Color with Memory)
 SECAM uses the same bandwidth as PAL but
transmits the color information sequentially.
 SECAM is very similar to PAL.
 It specifies the same number of scan lines and
frames per second.
 It is the broadcast standard for France, Russia,
and parts of Africa and Eastern Europe.

32
Television standards used in different
countries

33
HDTV vs Existing Signals (NTSC, PAL, or
SECAM)
 The HDTV signal is digital, resulting in crystal-
clear, noise-free pictures and CD-quality
sound.
 It has many viewer benefits like choosing
between interlaced or progressive scanning.

34
Video File Formats
 File formats on the PC platform are indicated by the three-letter filename
extension.
 .mov = QuickTime movie format
 .avi = Windows movie format
(Audio Video Interleave)
 .mpg = MPEG file format
(Moving Picture Experts Group)
 .3gp = 3rd Generation Partnership Project (3GPP) format
 For high-speed wireless networks
 Suitable for mobile platforms

35
Four Factors of Digital Video
 With digital video, four factors have to be kept
in mind. These are :
 Frame rate
 Spatial Resolution
 Color Resolution
 Image Quality

36
Frame Rate
 The standard for displaying any type of
non-film video is 30 frames per second
(film is 24 frames per second).
 This means that the video is made up of 30
(or 24) pictures or frames for every second
of video.
 Additionally these frames are split in half
(odd lines and even lines), to form what
are called fields.

37
Spatial Resolution
 The second factor is spatial resolution, or in other words,
"How big is the picture?" Since PC and Macintosh
computers generally have resolutions in excess of 640 by
480, most people assume that this resolution is the video
standard.
 A standard analogue video signal displays a full, overscanned
image without the borders common to computer
screens.
 The National Television Standards Committee (NTSC)
standard, used in North American and Japanese television,
uses a 768 by 484 display.
 The Phase Alternating Line (PAL) standard for European
television is slightly larger at 768 by 576.
 Most countries endorse one or the other, but never both.
 Since the resolutions of analogue video and
computers differ, conversion of analogue video to
digital video must at times take this into account.
 This can often result in down-sizing of the video
and the loss of some resolution.
38
Color Resolution
 This third factor is a bit more complex.
 Color resolution refers to the number of colors
displayed on the screen at one time.
 Computers deal with color in an RGB (red-green-blue)
format, while video uses a variety of
formats.
 One of the most common video formats is
called YUV.
 Although there is no direct correlation between
RGB and YUV, they are similar in that they both
have varying levels of color depth (maximum
number of colours).
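Color depth (the maximum number of colours) follows directly from bits per pixel; a quick sketch:

```python
def max_colors(bits_per_pixel):
    """Distinct colors representable at a given color depth."""
    return 2 ** bits_per_pixel

print(max_colors(8))   # 256
print(max_colors(24))  # 16777216 (about 16.7 million)
```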
39
Image Quality
 The last, and most important factor is video
quality.
 The final objective is video that looks
acceptable for your application.
 For some this may be 1/4 screen, 15 frames
per second (fps), at 8 bits per pixel.
 Others require full-screen (768 by 484), full-
frame-rate video, at 24 bits per pixel (16.7
million colours).
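These choices translate directly into uncompressed data rate; a minimal sketch using the figures above (no audio, no compression):

```python
def bytes_per_second(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in bytes per second."""
    return width * height * fps * bits_per_pixel // 8

# Full screen (768 x 484), full frame rate, 24 bits per pixel:
print(bytes_per_second(768, 484, 30, 24))  # 33454080 (~33 MB/s)

# Quarter screen, 15 fps, 8 bits per pixel:
print(bytes_per_second(384, 242, 15, 8))   # 1393920 (~1.4 MB/s)
```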

40
