
Chapter 1

Overview of Multimedia
Communication

1. Newspaper: Perhaps the first mass communication
medium, uses text, graphics, and images.
2. Motion pictures: Conceived in the 1830s in order to
observe motion too rapid for perception by the human
eye.
3. Wireless radio transmission: Guglielmo Marconi, at
Pontecchio, Italy, in 1895.
4. Television: The new medium for the 20th century,
established video as a commonly available medium
and has since changed the world of mass
communications.
5. The connection between computers and ideas about
multimedia covers what is actually only a short period:
Ø 1945 - Vannevar Bush wrote a landmark article describing what
amounts to a hypermedia system called Memex.
Ø 1960 - Ted Nelson coined the term hypertext.
Ø 1967 - Nicholas Negroponte formed the Architecture
Machine Group.
Ø 1968 - Douglas Engelbart demonstrated the On-Line System
(NLS), another very early hypertext program.
Ø 1969 - Nelson and van Dam at Brown University created an
early hypertext editor called FRESS.
Ø 1976 - The MIT Architecture Machine Group proposed a project
entitled Multiple Media, which resulted in the Aspen Movie Map, the
first hypermedia videodisk, in 1978.
Ø 1985 - Negroponte and Wiesner co-founded the MIT Media
Lab.
Ø 1989 - Tim Berners-Lee proposed the World Wide Web.
Ø 1990 - Kristina Hooper Woolsey headed the Apple
Multimedia Lab.
Ø 1991 - MPEG-1 was approved as an international standard for
digital video; it led to the newer standards MPEG-2, MPEG-4,
and further MPEGs in the 1990s.
Ø 1991 - The introduction of PDAs in 1991 began a new period
in the use of computers in multimedia.
Ø 1992 - JPEG was accepted as the international standard for
digital image compression; it later led to the new JPEG2000 standard.
Ø 1992 - The first MBone audio multicast on the Net was made.
Ø 1993 - The University of Illinois National Center for
Supercomputing Applications produced NCSA Mosaic, the first
full-fledged browser.
Ø 1994 - Jim Clark and Marc Andreessen created the Netscape
program.
Ø 1995 - The JAVA language was created for platform-
independent application development.
Ø 1996 - DVD video was introduced; high-quality, full-length
movies were distributed on a single disk.
Ø 1998 - XML 1.0 was announced as a W3C Recommendation.
Ø 1998 - Hand-held MP3 devices first made inroads into
consumerist tastes in the fall of 1998, with the introduction of
devices holding 32MB of flash memory.
Ø 2000 - WWW size was estimated at over 1 billion pages.
The World Wide Web (WWW) is the best example of a
hypermedia application.

Figure 1.1 Hypertext is nonlinear

Examples of typical multimedia applications include:

Ø Digital video editing and production systems.
Ø Electronic newspapers/magazines.
Ø World Wide Web.
Ø On-line reference works: encyclopedias, games, etc.
Ø Home shopping.
Ø Interactive TV.
Ø Multimedia courseware.
Ø Video conferencing.
Ø Video-on-demand.
Ø Interactive movies.
What is Multimedia?

Ø A PC vendor: a PC that has sound capability, a DVD-ROM drive,
and perhaps the superiority of multimedia-enabled microprocessors
that understand additional multimedia instructions.
Ø A consumer entertainment vendor: interactive cable TV with
hundreds of digital channels available, or a cable TV-like service
delivered over a high-speed Internet connection.
Ø A Computer Science (CS) student: applications that use multiple
modalities, including text, images, drawings (graphics), animation,
video, sound including speech, and interactivity.

Ø Related fields: graphics, HCI, visualization, computer vision,
data compression, graph theory, networking, and database systems.
Examples of how these modalities are put to use:
1. Video teleconferencing.
2. Distributed lectures for higher education.
3. Tele-medicine.
4. Co-operative work environments.
5. Searching in (very) large video and image databases for target
visual objects.
6. “Augmented” reality: placing real-appearing computer
graphics and video objects into scenes.
7. Including audio cues for where video-conference participants
are located.
8. Building searchable features into new video, and enabling
very high- to very low-bit-rate use of new, scalable multimedia
products.
9. Making multimedia components editable.
10. Building “inverse-Hollywood” applications that can recreate
the process by which a video was made.
11. Using voice-recognition to build an interactive environment,
say a kitchen-wall web browser.
Major research topics in multimedia include:
1. Multimedia processing and coding: multimedia content
analysis, content-based multimedia retrieval, multimedia
security, audio/image/video processing, compression, ...
2. Multimedia system support and networking: network
protocols, Internet, operating systems, servers and clients,
quality of service (QoS), and databases.
3. Multimedia tools, end-systems and applications:
hypermedia systems, user interfaces, authoring systems.
4. Multi-modal interaction and integration: “ubiquity” web-
everywhere devices, multimedia education including Computer
Supported Collaborative Learning, and design and applications
of virtual environments.
Examples of current multimedia research projects include:
1. Camera-based object tracking technology: tracking of the
control objects provides user control of the process.
2. 3D motion capture: used for multiple actor capture so that
multiple real actors in a virtual studio can be used to
automatically produce realistic animated models with natural
movement.
3. Multiple views: allowing photo-realistic (video-quality)
synthesis of virtual actors from several cameras or from a single
camera under differing lighting.
4. 3D capture technology: allows synthesis of highly realistic
facial animation from speech.
5. Specific multimedia applications: applications aimed at
handicapped persons with low vision capability and the elderly,
a rich field of endeavor.
6. Digital fashion: aims to develop smart clothing that can
communicate with other such enhanced clothing using wireless
communication, so as to artificially enhance human interaction in
a social setting.
7. Electronic Housecall system: an initiative for providing
interactive health monitoring services to patients in their homes.
8. Augmented Interaction applications: used to develop
interfaces between real and virtual humans for tasks such as
augmented storytelling.

The W3C has listed the following three goals for the WWW:
1. Universal access of web resources (by everyone everywhere).
2. Effectiveness of navigating available information.
3. Responsible use of posted material.

Ø 1960s - Charles Goldfarb et al. developed the Generalized
Markup Language (GML) for IBM.
Ø 1986 - The ISO released a final version of the Standard
Generalized Markup Language (SGML).
Ø 1990 - Tim Berners-Lee invented the HyperText Markup
Language (HTML), and the HyperText Transfer Protocol
(HTTP).
Ø 1993 - NCSA released an alpha version of Mosaic, based on the
version by Marc Andreessen for X-Windows; it became the first
popular browser.
Ø 1994 - Marc Andreessen et al. formed Mosaic Communications
Corporation, later the Netscape Communications Corporation.
Ø 1998 - The W3C accepted the XML version 1.0 specification as a
Recommendation; XML became the main focus of the W3C and
supersedes HTML.

HyperText Transfer Protocol (HTTP)

The basic HTTP request format:

Method URI Version
Additional-Headers
Message-body

The basic HTTP response format:

Version Status-Code Status-Phrase
Additional-Headers
Message-body

Two commonly seen status codes:
1. 200 OK - the request was processed successfully.
2. 404 Not Found - the URI does not exist.
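As a minimal sketch of such an exchange (the host name and resource path here are illustrative, not from the slides):

GET /index.html HTTP/1.1
Host: www.example.com

HTTP/1.1 200 OK
Content-Type: text/html

<HTML> ... the requested page ... </HTML>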

HyperText Markup Language (HTML)

1. Because HTML uses ASCII, it is portable to all different (possibly
binary-incompatible) computer hardware.
2. Widely used versions are HTML 4.01 (HTML4) and HTML5.
3. XHTML is a reformulation of HTML using XML.

Ø <token params> - defines the starting point of an element.
Ø </token> - marks the ending point of the element.
Ø Some elements have no ending tag.
<HTML>
<HEAD>
<TITLE>A sample web page.</TITLE>
<META NAME = "Author" CONTENT = "Cranky Professor">
</HEAD>
<BODY>
<P> We can put any text we like here, since this is a
paragraph element.
</P>
</BODY>
</HTML>

Extensible Markup Language (XML)

A typical flow for serving XML content is as follows.
First, use a global Document Type Definition (DTD) that is
already defined.
The server-side script abides by the DTD rules to generate an
XML document for the query, using data from your database.
Finally, send the user the XML Style Sheet (XSL) appropriate to the
type of device used to display the information.
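A rough sketch of the kind of XML document such a script might emit (the element names, file names, and DTD reference are illustrative only): it declares a DTD, points to an XSL style sheet, and carries the data pulled from the database.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="catalog.xsl"?>
<!DOCTYPE catalog SYSTEM "catalog.dtd">
<catalog>
  <item id="1">
    <title>Sample Title</title>
    <format>DVD</format>
  </item>
</catalog>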
Ø All tags are in lower case, and a tag that has only inline data has
to terminate itself, i.e., <token params />.
Ø Uses namespaces, so that multiple DTDs declaring different
elements but with similar tag names can have their elements
distinguished.
Ø DTDs can be imported from URIs as well.
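A small illustrative fragment (the namespace URIs and tag names are made up for this example): two vocabularies each define a title element, the prefix bound to the second namespace keeps them distinct, and the empty img element terminates itself.

<doc xmlns="http://example.org/text"
     xmlns:m="http://example.org/media">
  <title>Report</title>
  <m:title>Cover video</m:title>
  <m:img src="cover.jpg" />
</doc>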

Further XML-related specifications include:

Ø XML Protocol: used to exchange XML information between
processes.
Ø XML Schema: a more structured and powerful language for
defining XML data types (tags); a small sketch follows this list.
Ø XSL: basically CSS for XML.
Ø SMIL: Synchronized Multimedia Integration Language,
pronounced “smile”; a particular application of XML (with a globally
predefined DTD) that allows for specification of interaction among
any media types and user input, in a temporally scripted manner.
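A minimal XML Schema sketch (the clip element and its fields are illustrative, not taken from the slides):

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="clip">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="title" type="xs:string" />
        <xs:element name="duration" type="xs:duration" />
      </xs:sequence>
      <xs:attribute name="src" type="xs:anyURI" />
    </xs:complexType>
  </xs:element>
</xs:schema>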

All SMIL elements are divided into modules: sets of XML
elements, attributes and values that define one conceptual
functionality.
In the interest of modularization, not all available modules need
to be included for all applications.
A language profile specifies a particular grouping of modules,
and particular modules may have integration requirements that a
profile must follow.
Ø SMIL 2.0 has a main language profile that includes almost all
SMIL modules.

<!DOCTYPE smil PUBLIC "-//W3C//DTD SMIL 2.0//EN"
"http://www.w3.org/2001/SMIL20/SMIL20.dtd">
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
<head>
<meta name="Author" content="Some Professor" />
</head>
<body>
<par id="MakingOfABook">
<seq>
<video src="authorview.mpg" />
<img src="onagoodday.jpg" />
</seq>
<audio src="authorview.wav" />
<text src="http://www.cs.sfu.ca/mmbook/" />
</par>
</body>
</smil>
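In this example, the nested <seq> element plays the video clip and then the still image one after the other, while the enclosing <par> element plays that sequence, the audio track, and the text source at the same time.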
The categories of multimedia software tools examined here are:

1. Music Sequencing and Notation
2. Digital Audio
3. Graphics and Image Editing
4. Video Editing
5. Animation
6. Multimedia Authoring

Music Sequencing and Notation

Ø The term sequencer comes from older devices that stored
sequences of notes (“events” in MIDI).
Ø It is also possible to insert WAV files and Windows MCI
commands (for animation and video) into music tracks (MCI is
a ubiquitous component of the Windows API).

Digital Audio

q Digital Audio tools deal with accessing and editing the
actual sampled sounds that make up audio.

Ø Cool Edit: a very powerful and popular digital audio toolkit;
emulates a professional audio studio, with multitrack productions
and sound file editing, including digital signal processing effects.
Ø Sound Forge: a sophisticated PC-based program for editing
audio WAV files.
Ø Pro Tools: a high-end integrated audio production and editing
environment, with MIDI creation and manipulation and powerful
audio mixing, recording, and editing software.

Graphics and Image Editing

q Adobe Photoshop: the standard tool for graphics, image
processing, and manipulation.
Ø Allows layers of images, graphics, and text that can be
separately manipulated for maximum flexibility.
Ø The Filter Factory permits creation of sophisticated lighting-effects
filters.

q Macromedia Freehand: a text and web graphics editing
tool that supports many bitmap formats such as GIF, PNG,
and JPEG.
Video Editing

q Adobe Premiere: an intuitive, simple video editing tool.
Ø Video and audio are arranged in “tracks”.
Ø Provides a large number of video and audio tracks,
superimpositions and virtual clips.
Ø A large library of built-in transitions, filters and motions for
clips makes for effective multimedia productions with little effort.

q Final Cut Pro: a video editing tool by Apple; Macintosh
only.
Animation

Ø Java3D: an API used by Java to construct and render 3D graphics,
similar to the way in which the Java Media Framework is used
for handling media files.
1. Provides a basic set of object primitives (cube, splines, etc.)
for building scenes.
2. It is an abstraction layer built on top of OpenGL or DirectX
(the user can select which).
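A minimal sketch of the classic Java3D “hello world” scene (not from the slides; it assumes the Java3D utility packages are installed), which builds a content branch holding one primitive and renders it:

import com.sun.j3d.utils.geometry.ColorCube;
import com.sun.j3d.utils.universe.SimpleUniverse;
import javax.media.j3d.BranchGroup;

public class HelloScene {
    public static void main(String[] args) {
        // SimpleUniverse sets up the view side of the scene graph.
        SimpleUniverse universe = new SimpleUniverse();

        // The content branch holds the objects to be rendered.
        BranchGroup scene = new BranchGroup();
        scene.addChild(new ColorCube(0.3)); // built-in cube primitive

        // Point the viewer at the origin and attach the content branch.
        universe.getViewingPlatform().setNominalViewingTransform();
        universe.addBranchGraph(scene);
    }
}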
