Lecture 1

Dr. Matheel Emadaludeen
Multimedia
It can be useful to distinguish between
multimedia and multiple media. The
distinction is best understood from the point
of view of the user. It is commonplace that we
perceive different media in different ways: we
just referred to reading text, looking at
pictures, listening to sound, and so on.
Cognitive scientists call these different sorts
of perception modalities. True multimedia
requires us to combine modalities.
Multimedia
Digital multimedia is any combination of two
or more media, represented in a digital form,
sufficiently well integrated to be presented
via a single interface, or manipulated by a
single computer program.
Media
 Dynamic media: video, animation, sound. Dynamic media exhibit some change over time and are supplied with player controls.
 Static media: image, text. Static media show no change over time and have no player controls.
Multimedia
 Multimedia combines text, photographic images, video, animations and sound to provide the means to visualize and interact with information in a meaningful way.
 The most familiar example of digital multimedia is the World Wide Web, which combines the different media in a network of pages connected by links. Magazines, newspapers and books also combine text and images, and TV combines sound, image, video and text.

 There are two models currently in use for combining elements of different media types:
 Page-based
 Time-based
Multimedia
 In the page-based model, text, images and video are laid out as they are in books and magazines.
 Time-based elements such as video clips and sound are embedded in the page as if they were images, occupying a fixed area; controls may be provided within the page to start and stop their playback.
Delivery of multimedia
 Online
 Offline
Delivery of multimedia
Online delivery uses a network to send information from one computer to another. The network may be a LAN or a WAN (the Internet).
Offline delivery uses removable storage: CDs, etc.
Authoring systems
The making of multimedia requires software not only for the preparation of individual media elements, but for their integration into a finished production. Programs that allow a designer to assemble different media elements in space and time, and add interactive behavior to them, are usually called authoring systems.
Linear and Non-linear Multimedia
Hypertext and Hypermedia
Individual pages can be combined using links, allowing the user to jump from one page to another by clicking on a representation of the link.
Linked pages of text are called Hypertext.
When other media can be embedded in the linked pages, the combination is called Hypermedia.
Disadvantages of Multimedia
 High monetary cost
 May require reformatting of information
 Requires high amount of disk space
 Requires powerful computer
 Large amount of varied hardware is necessary
 CDROM drive
 Sound card
 Speakers
 Scanner

 Technical expertise needed to set up multimedia system


 Encourages reliance on technology
 Users may begin to rely on the presence of technology
 Can be seen as only source of information
Digitization of multimedia data
 In multimedia, we encounter values of several kinds that change continuously, either because they originated in physical phenomena or because they exist in some analogue representation.
 For example,
 The amplitude (volume) of a sound wave varies continuously over time, as does the amplitude of an electrical signal produced by a microphone in response to the sound wave.
 The color of the image formed inside a camera by its lens varies continuously across the image plane.
Digitization Definition
 When we have a continuously varying signal, both the value we measure and the intervals at which we can measure it can vary infinitesimally.
 In contrast, if we were to convert it to a digital signal, we would have to restrict both of these to a set of discrete values that could be represented in some fixed number of bits.
 Therefore,
 Digitization is the process of converting a signal from analogue to digital form. It consists of two steps:
Digitization Definition
Sampling, when we measure the signal's value at discrete intervals.
Quantization, when we restrict the value to a fixed set of levels.
Sampling and quantization can be carried out in either order.
The sampling rate is the number of samples in a fixed amount of time or space.
The quantization levels are the equally spaced levels to which a signal's values are restricted.
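As an illustrative sketch of these two steps (not part of the lecture), the following Python code samples a 1 Hz sine wave and quantizes each sample to a fixed set of equally spaced levels; the 8 Hz sampling rate and the 16 levels are arbitrary choices made only for this example.

import numpy as np

def digitize(signal_func, duration_s, sampling_rate_hz, num_levels):
    """Sample a continuous signal, then quantize each sample."""
    # Sampling: measure the signal's value at discrete, evenly spaced instants.
    times = np.arange(0.0, duration_s, 1.0 / sampling_rate_hz)
    samples = signal_func(times)
    # Quantization: restrict each sample to one of num_levels equally
    # spaced levels spanning the range [-1, 1].
    levels = np.linspace(-1.0, 1.0, num_levels)
    nearest = np.argmin(np.abs(samples[:, None] - levels[None, :]), axis=1)
    return times, levels[nearest]

# Illustrative use: a 1 Hz sine wave, sampled at 8 Hz, quantized to 16 levels.
times, quantized = digitize(lambda t: np.sin(2 * np.pi * 1.0 * t),
                            duration_s=1.0, sampling_rate_hz=8, num_levels=16)
print(list(zip(times.tolist(), quantized.round(3).tolist())))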
Digital versus Analogue Representation
1. Noise Resistance:
 One of the great advantages that digital representations have over analogue ones stems from the fact that only certain signal values, those at the quantization levels, are valid. If a signal is transmitted over a wire or stored on some physical medium, inevitably some random noise is introduced, either because of interference from stray magnetic fields, or simply because of the unavoidable fluctuations in thermal energy of the transmission medium. This noise will cause the signal value to be changed.
 If the signal is an analogue one, these changes will not be detectable.
 If the signal is a digital one, any minor fluctuations caused by noise will usually transform a legal value into an illegal one that lies between quantization levels; such a value can be detected and corrected by restoring it to the nearest legal value.
 Digital signals are therefore much more robust than analogue ones, and do not suffer degradation when they are copied or transmitted over noisy media.
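A minimal sketch of this argument in Python, under assumed illustrative values (eight quantization levels and Gaussian noise with a standard deviation of 0.1): because only the level values are legal, a received value that has drifted between levels can be restored to the nearest legal value, so the original digital signal is recovered exactly as long as the noise stays below half the spacing between levels.

import numpy as np

# Assumed illustrative setup: 8 equally spaced quantization levels, 0 to 7.
levels = np.arange(8, dtype=float)

rng = np.random.default_rng(0)
original = rng.choice(levels, size=20)          # a digital signal: only legal values

# Transmission adds random noise, producing "illegal" in-between values.
noisy = original + rng.normal(0.0, 0.1, size=original.shape)

# The receiver restores each value to the nearest quantization level.
restored = levels[np.argmin(np.abs(noisy[:, None] - levels[None, :]), axis=1)]

print(np.array_equal(restored, original))       # True while noise < half the level spacing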
Digital versus Analogue Representation
2. Accuracy
Some information must have been lost during the digitization process. The only meaningful measure of accuracy is how closely the original can be reconstructed. In order to reconstruct the analogue signal, the samples must be close enough together: if the original samples were too far apart, any reconstruction is going to be inadequate, because there may be details in the analogue signal that, as it were, slipped between samples.
It is easy to see that if the sampling rate is too low, some detail will be lost in the sampling.
Frequency
Any periodic waveform can be decomposed into a collection of different frequency components, each a pure sine wave.
Hence, in general, frequencies, like signals, may be either temporal or spatial. A spatially varying signal may vary in two dimensions.
Example

Consider, for example, a signal composed by adding together sine waves at frequencies of 1 Hz, 2 Hz, and 3 Hz.
Time or Spatial Domain versus Frequency Domain
 Normally, a signal is represented in the time (or spatial) domain by showing how its amplitude varies with time or space.
 In the frequency domain, we use the frequencies and amplitudes of its components to represent the signal.
 The frequency domain representation can be computed using a mathematical operation known as the Fourier Transform. The result can be shown as a graph whose horizontal axis represents frequency and whose vertical axis represents amplitude.
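As a sketch of this idea applied to the earlier example (sine waves at 1 Hz, 2 Hz and 3 Hz added together), the following Python code computes the frequency domain representation with NumPy's FFT; the 64 Hz sampling rate and one second duration are assumptions made only for this illustration.

import numpy as np

fs = 64                       # illustrative sampling rate, well above the Nyquist rate
t = np.arange(0, 1, 1 / fs)   # one second of the signal

# Time-domain representation: amplitude as a function of time.
signal = np.sin(2*np.pi*1*t) + np.sin(2*np.pi*2*t) + np.sin(2*np.pi*3*t)

# Frequency-domain representation: amplitude as a function of frequency.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1/fs)
amplitudes = 2 * np.abs(spectrum) / len(signal)

# The peaks appear at 1 Hz, 2 Hz and 3 Hz, each with amplitude close to 1.
for f, a in zip(freqs, amplitudes):
    if a > 0.1:
        print(f"{f:.0f} Hz: amplitude {a:.2f}")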
Frequency Spectrum
[Figure: a frequency spectrum with components at odd multiples of the fundamental frequency f, i.e. at ..., -9f, -7f, -5f, -3f, -f, 0, f, 3f, 5f, 7f, 9f, ...]
What is the sampling rate that guarantees accurate reconstruction of a signal?
The sampling theorem states that, if the highest frequency component of a signal is at fh, the signal can be properly reconstructed if it has been sampled at a frequency greater than 2fh. This is known as the Nyquist rate.
Example
Consider, for example, a signal composed of a single sine wave at a frequency of 1 Hz:
Example
If we sample this waveform at 2 Hz (as dictated by the Nyquist theorem), that is sufficient to capture each peak and trough of the signal.
Example
If we sample this waveform at 3 Hz, there are more than enough samples to capture the signal.
Example
If we sample this waveform at 1.5 Hz, then there are not enough samples to capture each peak and trough of the signal.
Example

Consider again the example of a signal which is composed by adding together sine waves at frequencies of 1 Hz, 2 Hz, and 3 Hz; then the sampling rate has to be 6 Hz.
Aliasing
 If we undersample a signal, that is, sample it at less than the Nyquist rate, some frequency components in the original will get transformed into other frequencies when the signal is reconstructed. This phenomenon is known as aliasing.
 Aliasing refers to an effect that causes different signals to become indistinguishable (or aliases of one another) when sampled. It also refers to the distortion or artifact that results when the signal reconstructed from samples is different from the original continuous signal.
 Aliasing can occur in signals sampled in time, for instance digital audio, and is referred to as temporal aliasing. Aliasing can also occur in spatially sampled signals, for instance digital images; this is called spatial aliasing.
 With sound, the aliasing effect is heard as distortion. In images, it is seen in the form of jagged edges. In moving pictures, temporal undersampling leads to jerkiness of motion.
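A small sketch of temporal aliasing in Python, with frequencies chosen only for illustration: a 3 Hz cosine sampled at 4 Hz, which is below its Nyquist rate of 6 Hz, produces exactly the same samples as a 1 Hz cosine, so after reconstruction the 3 Hz component appears as a 1 Hz alias.

import numpy as np

fs = 4                              # sampling rate below the Nyquist rate (6 Hz) for a 3 Hz signal
n = np.arange(16)
t = n / fs                          # sampling instants

high = np.cos(2 * np.pi * 3 * t)    # 3 Hz component, undersampled
low = np.cos(2 * np.pi * 1 * t)     # 1 Hz component, adequately sampled

# The two sets of samples are indistinguishable: at this sampling rate the
# 3 Hz wave is an alias of the 1 Hz wave (3 Hz = 4 Hz - 1 Hz).
print(np.allclose(high, low))       # True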
Under Quantization Levels
 The effects of an insufficient number of quantization levels are generally easier to grasp than those of an inadequate sampling rate. If we can only represent a limited number of different values, we will be unable to make fine distinctions among values that fall between them.
 Example:
 In an image, an insufficient number of quantization levels is as if we were forced to make do with only a few different colors. This effect is called posterization or brightness contouring, where color areas coalesce, somewhat like a cheaply printed poster.
 When sound is quantized to too few amplitude levels, the result is perceived as a form of distortion, sometimes referred to as quantization noise. It also leads to the loss of quiet passages, and a general fuzziness in the sound (rather like a mobile phone in an area of low signal strength).
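A small Python sketch of quantization noise, using assumed values (a 1 Hz sine wave, 100 samples per second, and 4, 16 or 256 quantization levels): the fewer the levels, the larger the average difference between the quantized signal and the original, which is what is perceived as distortion in sound and as posterization in images.

import numpy as np

def quantize(samples, num_levels):
    """Restrict samples in [-1, 1] to num_levels equally spaced levels."""
    levels = np.linspace(-1.0, 1.0, num_levels)
    return levels[np.argmin(np.abs(samples[:, None] - levels[None, :]), axis=1)]

t = np.arange(0, 1, 1 / 100)            # illustrative: 100 samples of one second
signal = np.sin(2 * np.pi * 1 * t)      # a 1 Hz sine wave

for num_levels in (4, 16, 256):
    error = np.abs(quantize(signal, num_levels) - signal)
    print(f"{num_levels:3d} levels: mean quantization error {error.mean():.4f}")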
Software
An essential part of our definition of multimedia is the condition that the combined media be sufficiently well integrated to be presented via a unified interface, and manipulated by a single computer program. This is in complete contrast to the situation we have just described with respect to multimedia production, with its multitude of different software tools and team input. The key to integration is a framework that can accommodate a multiplicity of media and present them to the user. Three distinct approaches can be distinguished.
Software
The first approach is the World Wide Web, which can accommodate different media and present them through a dedicated browser.
The second is an architecture comprising a format that can contain different media types, together with an API (Application Programming Interface) that provides a rich set of functions to manipulate data in that format. The prime example here is QuickTime. A QuickTime movie can contain video, sound, animation, images, virtual reality panoramas, and interactive elements, all synchronized and coordinated.
Software
The third approach is quite the opposite of the previous one: deliver the multimedia production in a 'stand-alone' form that needs no additional software to be used.
Networks
Online distribution of multimedia over LANs or the Internet is almost always based on the client/server model of distributed computation. In this model, programs called servers 'listen' on a communication channel for requests from other programs, called clients, which are generally running on a different machine elsewhere on the network. Whenever a server receives a request, it sends a response, which provides some service or data to the client.
The requests and responses conform to a protocol, a set of rules governing their format and the actions to be taken by a server or client when it receives a request or response. The most popular form of online multimedia delivery is the World Wide Web, whose implementation is an example of the client/server model. Web servers and clients communicate with each other using the HyperText Transfer Protocol, usually abbreviated to HTTP.
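As a minimal sketch of the client side of this model, the following Python code (standard library only) sends an HTTP GET request and reads the server's response; example.com is used purely as an illustrative host, and network access is assumed.

import http.client

# The client opens a connection to the server and sends a request
# formatted according to the HTTP protocol.
conn = http.client.HTTPConnection("example.com", 80, timeout=10)
conn.request("GET", "/")

# The server's response consists of a status line, headers and a body.
response = conn.getresponse()
print(response.status, response.reason)          # e.g. 200 OK
print(response.getheader("Content-Type"))        # e.g. text/html
body = response.read()
print(len(body), "bytes of body received")
conn.close()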
Reference
Nigel Chapman and Jenny Chapman, Digital Multimedia, John Wiley & Sons, Ltd., 2007.
SEMINAR
1. Computer Graphics
2. Vector and Bitmap Images
3. Text
4. Video
5. Animation
6. Sound
7. Virtual Spatial Sound
8. Multimedia and Networks
9. Scripts
10. Security of Multimedia
11. Image, Audio and Video Quality Measurements
SEMINARS
12. Intelligent MM processing
13. Semantic Audio and Video
15. 3D image construction and Processing
16. Video Conference
17. Multimedia for Learning
18. Certain aspects of MV in the art
19. Augmented reality
References
Introduction to Multimedia Systems
Multimedia: Making It Work
Digital Multimedia
Fundamentals of Multimedia