
Flores, Angelo M.

Principles of Communication
BSIT 302

07 Task Performance

Assume that there are 4 equiprobable input states, such as 𝑥1 = 35, 𝑥2 = 65, 𝑥3 = 95, and 𝑥4 = 125, and 3 equiprobable values for the channel noise, such as 𝜂1 = 5, 𝜂2 = 10, and 𝜂3 = 15. Identify the following:

a. Input Entropy

$$H(X) = -\sum_{i=1}^{n} p_i \cdot \log_2(p_i)$$

$$H(X) = -\sum_{i=1}^{4} \frac{1}{4} \cdot \log_2\!\left(\frac{1}{4}\right) = -4 \cdot \left(\frac{1}{4} \cdot \log_2\!\left(\frac{1}{4}\right)\right) = -\log_2\!\left(\frac{1}{4}\right) = -(-2) = 2$$
The input entropy is 2 bits.
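As a quick numerical check, the same computation can be reproduced in a few lines of Python (a minimal sketch; the name p_inputs is illustrative, not part of the task):

import math

# 4 equiprobable input states, each with probability 1/4
p_inputs = [1/4] * 4

# H(X) = -sum of p * log2(p) over all input states
H_X = -sum(p * math.log2(p) for p in p_inputs)
print(H_X)  # 2.0, i.e. 2 bits, matching the result above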

b. Noise Entropy

$$H = -\sum_{i=1}^{N_x} \sum_{j=1}^{N_n} P(x_i, \eta_j) \cdot \log_2\!\left[P(x_i, \eta_j)\right]$$

With 4 equiprobable inputs and 3 equiprobable noise values, there are 4 × 3 = 12 equiprobable (𝑥𝑖, 𝜂𝑗) pairs, each with joint probability 1/12:

$$H = -\sum_{i=1}^{4} \sum_{j=1}^{3} \frac{1}{12} \cdot \log_2\!\left(\frac{1}{12}\right) = -12 \cdot \frac{1}{12} \cdot \log_2\!\left(\frac{1}{12}\right) = -\log_2\!\left(\frac{1}{12}\right) = \log_2(12)$$

The noise entropy is log2(12) ≈ 3.585 bits.
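The double sum can be checked the same way; this sketch assumes the 12 joint (input, noise) outcomes are equiprobable, as in the derivation above:

import math

# 4 inputs × 3 noise values = 12 equiprobable (x, η) pairs
joint_p = [1/12] * 12

# H = -sum of P(x, η) * log2(P(x, η)) over all pairs
H_noise = -sum(p * math.log2(p) for p in joint_p)
print(H_noise)  # 3.5849..., i.e. log2(12) bits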

c. Outputs for each equiprobable input state

Output = Input + Noise, computed for every (input, noise) combination below (see the sketch after the list).

1. For x1 = 35:

• Output 1: 35 + 5 = 40
• Output 2: 35 + 10 = 45
• Output 3: 35 + 15 = 50

2. For x2 = 65:

• Output 1: 65 + 5 = 70
• Output 2: 65 + 10 = 75
• Output 3: 65 + 15 = 80

3. For x3 = 95:

• Output 1: 95 + 5 = 100
• Output 2: 95 + 10 = 105
• Output 3: 95 + 15 = 110

4. For x4 = 125:

• Output 1: 125 + 5 = 130
• Output 2: 125 + 10 = 135
• Output 3: 125 + 15 = 140
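A short Python sketch that reproduces the table above (the names inputs and noise are illustrative):

# Input states and noise values from the task
inputs = [35, 65, 95, 125]
noise = [5, 10, 15]

# Output = Input + Noise for every (input, noise) combination
for x in inputs:
    for n in noise:
        print(f"{x} + {n} = {x + n}")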

d. Output Entropy

Since all 12 outputs listed in part (c) are distinct and every (input, noise) pair is equiprobable, each output value occurs with probability 1/12:

$$H(Y) = -\sum_{i,j} \frac{1}{12} \cdot \log_2\!\left(\frac{1}{12}\right) = -\sum_{i,j} \frac{1}{12} \cdot (-3.585) = 3.585 \cdot \frac{1}{12} \cdot 12 = 3.585$$

The output entropy is approximately 3.585 bits.
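Finally, the output entropy can be verified by building the output distribution directly; a minimal sketch assuming the outputs from part (c):

import math
from collections import Counter

inputs = [35, 65, 95, 125]
noise = [5, 10, 15]

# Each (input, noise) pair is equiprobable, and all 12 sums are
# distinct, so each output value carries probability 1/12
outputs = [x + n for x in inputs for n in noise]
counts = Counter(outputs)
total = len(outputs)

H_Y = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(round(H_Y, 3))  # 3.585 bits, matching the result above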
