Information theory

Information theory is a mathematical framework for understanding the transmission and processing of information, primarily through Shannon's communication model, which includes components like the message source, encoder, channel, noise, decoder, and message receiver. It distinguishes between discrete and continuous signals, as well as noisy and noiseless communication, with practical applications in data compression, error correction, and cryptology. Shannon's model emphasizes the importance of accurately transmitting signals despite potential interference from noise.


Information theory is a mathematical representation of the conditions and parameters affecting the transmission and processing of information.

Shannon’s communication model

Shannon’s theory deals primarily with the encoder, channel, noise source, and decoder. As noted above, the focus of the theory is on signals and how they can be transmitted accurately and efficiently.

Shannon developed a very simple, abstract model of communication, as shown in the figure. Because his model is abstract, it applies in many situations, which contributes to its broad scope and power.

The first component of the model, the message source, is simply the entity that
originally creates the message. Often the message source is a human, but in
Shannon’s model it could also be an animal, a computer, or some other inanimate
object. The encoder is the object that connects the message to the actual physical
signals that are being sent.

For example, there are several ways to apply this model to two people having a
telephone conversation. On one level, the actual speech produced by one person
can be considered the message, and the telephone mouthpiece and its associated
electronics can be considered the encoder, which converts the speech into electrical
signals that travel along the telephone network.

The channel is the medium that carries the message. The channel might be wires,
the air or space in the case of radio and television transmissions, or fibre-optic
cable. In the case of a signal produced simply by banging on the plumbing, the
channel might be the pipe that receives the blow.
Noise is anything that interferes with the transmission of a signal. In telephone
conversations interference might be caused by static in the line, cross talk from
another line, or background sounds. Signals transmitted optically through the air
might suffer interference from clouds or excessive humidity. Clearly, sources of
noise depend upon the particular communication system. A single system may
have several sources of noise, but, if all of these separate sources are understood, it
will sometimes be possible to treat them as a single source.

The decoder is the object that converts the signal, as received, into a form that the
message receiver can comprehend. In the case of the telephone, the decoder could
be the earpiece and its electronic circuits. Depending upon perspective, the decoder
could also include the listener’s entire hearing system.

The message receiver is the object that gets the message. It could be a person, an
animal, or a computer or some other inanimate object.

Four types of communication

There are two fundamentally different ways to transmit messages:

discrete signals

continuous signals.

Discrete signals can represent only a finite number of different, recognizable states.
For example, the letters of the English alphabet are commonly thought of as
discrete signals.

Continuous signals, also known as analog signals, are commonly used to transmit
quantities that can vary over an infinite set of values—sound is a typical example.
However, such continuous quantities can be approximated by discrete signals—for
instance, on a digital compact disc or through a digital telecommunication system
—by increasing the number of distinct discrete values available until any
inaccuracy in the description falls below the level of perception or interest.
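The idea of approximating a continuous quantity with discrete values can be sketched as follows. This is a minimal illustration, not any standard library routine: the `quantize` function and the choice of a sine wave as the "analog" signal are assumptions made for the example.

```python
import math

def quantize(x, levels):
    """Map a value in [-1, 1] to the nearest of `levels` evenly spaced discrete values."""
    step = 2.0 / (levels - 1)
    return round((x + 1.0) / step) * step - 1.0

# A continuous quantity (here a sine wave) sampled at a few points.
samples = [math.sin(2 * math.pi * t / 16) for t in range(16)]

# As the number of discrete levels grows, the approximation error shrinks
# toward the level of perception or interest.
for levels in (4, 16, 256):
    err = max(abs(s - quantize(s, levels)) for s in samples)
    print(f"{levels:3d} levels -> max error {err:.4f}")
```

With 4 levels the worst-case error is about a third of the signal range; with 256 levels (as on a compact disc, which uses 16-bit samples, i.e. 65,536 levels) it is already imperceptibly small.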

Communication can also take place in the presence or absence of noise. These
conditions are referred to as noisy or noiseless communication, respectively.
There are four cases to consider:

discrete, noiseless communication;

discrete, noisy communication;

continuous, noiseless communication; and

continuous, noisy communication.

It is easier to analyze the discrete cases than the continuous cases; likewise, the noiseless cases are simpler than the noisy cases. Therefore, the discrete, noiseless case will be considered first.

Discrete, noiseless communication

From message alphabet to signal alphabet


The English alphabet is a discrete communication system. It consists of a
finite set of characters, such as uppercase and lowercase letters, digits, and various
punctuation marks. Messages are composed by stringing these individual
characters together appropriately.

For noiseless communications, the decoder at the receiving end receives exactly the characters sent by the encoder. However, these transmitted characters are typically not in the original message’s alphabet.
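A toy example may make the distinction between the message alphabet and the signal alphabet concrete. Here a four-letter message alphabet is mapped onto a binary signal alphabet with fixed-length codewords; the codebook itself is an assumption invented for the sketch, not a standard code.

```python
# Encoder: map each message character to a 2-bit binary codeword.
CODEBOOK = {"A": "00", "B": "01", "C": "10", "D": "11"}
DECODEBOOK = {bits: ch for ch, bits in CODEBOOK.items()}

def encode(message):
    """Turn a message into a string over the signal alphabet {0, 1}."""
    return "".join(CODEBOOK[ch] for ch in message)

def decode(signal):
    """Noiseless channel: split the bit stream back into 2-bit codewords."""
    return "".join(DECODEBOOK[signal[i:i + 2]] for i in range(0, len(signal), 2))

signal = encode("BADCAB")
print(signal)          # 010011100001
print(decode(signal))  # BADCAB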

Discrete, noisy communication

In the real world, however, transmission errors are unavoidable—especially given the presence in any communication channel of noise, which is the sum total of random signals that interfere with the communication signal. In order to take the inevitable transmission errors of the real world into account, some adjustment in encoding schemes is necessary.

The figure shows a simple model of transmission in the presence of noise,
the binary symmetric channel. Binary indicates that this channel transmits only two
distinct characters, generally interpreted as 0 and 1, while symmetric indicates that
errors are equally probable regardless of which character is transmitted. The
probability that a character is transmitted without error is labeled p; hence, the
probability of error is 1 − p.
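The binary symmetric channel can be simulated directly. This is a sketch under the stated assumptions (each bit flipped independently with probability 1 − p); the function name `bsc` and the parameters are invented for the example.

```python
import random

def bsc(bits, p, rng):
    """Pass bits through a binary symmetric channel that delivers each
    bit correctly with probability p and flips it with probability 1 - p."""
    return [b if rng.random() < p else 1 - b for b in bits]

rng = random.Random(0)  # fixed seed so the run is reproducible
p = 0.9
sent = [rng.randint(0, 1) for _ in range(10_000)]
received = bsc(sent, p, rng)

errors = sum(s != r for s, r in zip(sent, received))
print(f"observed error rate: {errors / len(sent):.3f}  (expected about {1 - p:.3f})")
```

Over many bits the observed error rate settles near 1 − p, regardless of whether a 0 or a 1 was sent—the defining property of the symmetric channel.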

Consider what happens as zeros and ones, hereafter referred to as bits, emerge
from the receiving end of the channel. Ideally, there would be a means of
determining which bits were received correctly. In that case, it is possible to
imagine two printouts:

10110101010010011001010011101101000010100101—Signal

00000000000100000000100000000010000000011001—Errors

Signal is the message as received, while each 1 in Errors indicates a mistake in the
corresponding Signal bit. (Errors itself is assumed to be error-free.)

Shannon showed that the best method for transmitting error corrections requires an
average length of
E = p log2(1/p) + (1 − p) log2(1/(1 − p))
bits per error-correction symbol. Thus, for every bit transmitted, at least E bits have to be reserved for error corrections.
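The formula above can be evaluated directly. The function name below is an assumption made for the sketch; the formula itself is Shannon's, with p the probability of correct transmission.

```python
import math

def error_correction_overhead(p):
    """Shannon's minimum average number of bits per error-correction symbol
    for a channel where a bit arrives correctly with probability p:
    E = p*log2(1/p) + (1-p)*log2(1/(1-p))."""
    if p in (0.0, 1.0):
        return 0.0  # a perfectly reliable (or perfectly inverted) channel needs no correction info
    q = 1.0 - p
    return p * math.log2(1.0 / p) + q * math.log2(1.0 / q)

for p in (0.99, 0.9, 0.5):
    print(f"p = {p}:  E = {error_correction_overhead(p):.4f} bits per bit")
```

Note the behavior at the extremes: a very reliable channel (p near 1) needs almost no correction overhead, while at p = 0.5 the overhead reaches a full bit per bit—the received stream carries no information at all.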

Applications of information theory


Data compression

Error-correcting and error-detecting codes

Cryptology: Cryptology is the science of secure communication. It concerns both cryptanalysis, the study of how encrypted information is revealed (or decrypted) when the secret “key” is unknown, and cryptography, the study of how information is concealed and encrypted in the first place.
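One of the simplest error-detecting codes mentioned above is the parity bit. The sketch below is a minimal illustration of the idea, not a production code; the function names are invented for the example.

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s becomes even."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """Return True if the parity is still even (no odd number of bit errors)."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])      # -> [1, 0, 1, 1, 1]
assert check_parity(word)

corrupted = word.copy()
corrupted[2] ^= 1                    # flip one bit in transit
assert not check_parity(corrupted)   # the single-bit error is detected
```

A single parity bit detects any odd number of flipped bits but cannot locate or correct them, and it misses an even number of errors; error-correcting codes add more redundancy so that errors can be pinpointed and repaired.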
