Information theory
Shannon’s theory deals primarily with the encoder, channel, noise source, and
decoder. As noted above, the focus of the theory is on signals and how they can be
transmitted accurately and efficiently.
The first component of the model, the message source, is simply the entity that
originally creates the message. Often the message source is a human, but in
Shannon’s model it could also be an animal, a computer, or some other inanimate
object. The encoder is the object that connects the message to the actual physical
signals that are being sent.
For example, there are several ways to apply this model to two people having a
telephone conversation. On one level, the actual speech produced by one person
can be considered the message, and the telephone mouthpiece and its associated
electronics can be considered the encoder, which converts the speech into electrical
signals that travel along the telephone network.
The channel is the medium that carries the message. The channel might be wires,
the air or space in the case of radio and television transmissions, or fibre-optic
cable. In the case of a signal produced simply by banging on the plumbing, the
channel might be the pipe that receives the blow.
Noise is anything that interferes with the transmission of a signal. In telephone
conversations interference might be caused by static in the line, cross talk from
another line, or background sounds. Signals transmitted optically through the air
might suffer interference from clouds or excessive humidity. Clearly, sources of
noise depend upon the particular communication system. A single system may
have several sources of noise, but, if all of these separate sources are understood, it
will sometimes be possible to treat them as a single source.
The decoder is the object that converts the signal, as received, into a form that the
message receiver can comprehend. In the case of the telephone, the decoder could
be the earpiece and its electronic circuits. Depending upon perspective, the decoder
could also include the listener’s entire hearing system.
The message receiver is the object that gets the message. It could be a person, an
animal, a computer, or some other inanimate object.
Signals fall into two broad classes: discrete signals and continuous signals.
Discrete signals can represent only a finite number of different, recognizable states.
For example, the letters of the English alphabet are commonly thought of as
discrete signals.
Continuous signals, also known as analog signals, are commonly used to transmit
quantities that can vary over an infinite set of values—sound is a typical example.
However, such continuous quantities can be approximated by discrete signals—for
instance, on a digital compact disc or through a digital telecommunication system
—by increasing the number of distinct discrete values available until any
inaccuracy in the description falls below the level of perception or interest.
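To see how such a discrete approximation works, the following Python sketch (purely illustrative; the quantize function, the choice of a sine wave, and the sampling rate are assumptions, not details from the text) rounds samples of a continuous signal to a fixed number of evenly spaced levels and reports how the worst-case error shrinks as the number of levels grows.

```python
import math

def quantize(samples, levels):
    """Round each sample in [-1, 1] to the nearest of `levels` evenly spaced values."""
    step = 2.0 / (levels - 1)
    return [round((s + 1.0) / step) * step - 1.0 for s in samples]

# A continuous quantity (here a 5 Hz sine wave) sampled 100 times per second.
samples = [math.sin(2 * math.pi * 5 * (i / 100)) for i in range(100)]

# With more levels the discrete approximation tracks the original more closely,
# until the remaining error falls below the level of perception or interest.
for levels in (4, 16, 256):
    approx = quantize(samples, levels)
    worst = max(abs(a - b) for a, b in zip(samples, approx))
    print(f"{levels:3d} levels -> worst-case error {worst:.4f}")
```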
Communication can also take place in the presence or absence of noise. These
conditions are referred to as noisy or noiseless communication, respectively.
There are four cases to consider: discrete noiseless, discrete noisy, continuous
noiseless, and continuous noisy communication.
For noiseless communications, the decoder at the receiving end receives exactly
the characters sent by the encoder. However, these transmitted characters are
typically not in the original message’s alphabet.
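As a concrete illustration, the hypothetical Python sketch below (the 5-bit code table is invented here for illustration; it is not a code described in the text) maps each letter of a small message alphabet to a fixed-length string of bits, so that what travels over the channel is 0s and 1s rather than letters, and the decoder maps the bits back.

```python
# Hypothetical fixed-length encoder: every character of a 27-symbol message
# alphabet (26 letters plus space) gets a distinct 5-bit codeword.
ALPHABET = "abcdefghijklmnopqrstuvwxyz "
CODE = {ch: format(i, "05b") for i, ch in enumerate(ALPHABET)}
DECODE = {bits: ch for ch, bits in CODE.items()}

def encode(message):
    """Translate a message into the channel alphabet (a string of bits)."""
    return "".join(CODE[ch] for ch in message)

def decode(bits):
    """Split the received bit string into 5-bit codewords and translate back."""
    return "".join(DECODE[bits[i:i + 5]] for i in range(0, len(bits), 5))

sent = encode("hello world")
assert decode(sent) == "hello world"  # noiseless channel: received exactly what was sent
print(sent)
```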
Consider what happens as zeros and ones, hereafter referred to as bits, emerge
from the receiving end of the channel. Ideally, there would be a means of
determining which bits were received correctly. In that case, it is possible to
imagine two printouts:
10110101010010011001010011101101000010100101—Signal
00000000000100000000100000000010000000011001—Errors
Signal is the message as received, while each 1 in Errors indicates a mistake in the
corresponding Signal bit. (Errors itself is assumed to be error-free.)
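If an error-free Errors string of this kind really were available at the receiving end, repairing Signal would amount to flipping exactly the flagged bits, that is, taking the exclusive-or of the two strings. A minimal Python sketch using the two printouts above:

```python
signal = "10110101010010011001010011101101000010100101"
errors = "00000000000100000000100000000010000000011001"

# Flip each Signal bit whose corresponding Errors bit is 1 (a bitwise XOR);
# the result is the bit string that the encoder actually sent.
corrected = "".join(str(int(s) ^ int(e)) for s, e in zip(signal, errors))
print(corrected)
```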
Shannon showed that the best method for transmitting error corrections requires an
average length of
E = p log2(1/p) + (1 − p) log2(1/(1 − p))
bits per error correction symbol. Thus, for every bit transmitted, at least E bits
must be reserved for error correction.
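The quantity E is the binary entropy of the error probability p. A short Python sketch (the function name correction_overhead is simply a label chosen here) evaluates the formula for a few values of p; the overhead is largest at p = 0.5, where errors are least predictable, and vanishes as p approaches 0 or 1.

```python
import math

def correction_overhead(p):
    """E = p*log2(1/p) + (1 - p)*log2(1/(1 - p)), in bits per error correction symbol."""
    if p in (0.0, 1.0):
        return 0.0  # errors (or their absence) are perfectly predictable
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

for p in (0.01, 0.1, 0.25, 0.5):
    print(f"p = {p:4.2f} -> E = {correction_overhead(p):.4f} bits")
```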