RSP Question Bank Final (1)
1. Course Description
Course Overview
This is a fundamental course in signal processing and communication engineering. It
provides a foundation in the theory and applications of probability and stochastic
processes, and an understanding of the mathematical techniques relating to random processes
in the areas of signal processing, detection and estimation theory, and communications. The
course also focuses on the application of statistical techniques to the study of random signals
and noise. It forms the basis for the study of advanced subjects such as Analog
and Digital Communications, Radar Communications, Cellular and Mobile Communications,
Digital Image Processing, Speech Processing and Machine Learning.
Course Pre/corequisites
The course has no specific prerequisites or corequisites.
CO2 Distinguish temporal and spectral characteristics of stochastic processes L3
CO3 Apply the concepts of correlation and covariance functions to solve the engineering problems of signal processing systems L3
CO4 Analyze the random signal response of linear systems with the help of temporal and spectral characteristics L4
CO5 Develop the knowledge of random noise and model practical noisy sources L6
Department of Electronics and Communication Engineering
5. Course Syllabus
UNIT I:
Concepts of Random variables: single and multiple random variables- definition and
classification, joint distribution and density functions, properties, jointly Gaussian random
variables, central limit theorem, joint moments, joint characteristic functions, sum of random
variables, Transformations of a random variable.
UNIT II:
Random Process - Temporal Characteristics: Random process concept, distribution and
density functions, statistical independence, stationarity, time averages and ergodicity,
correlation functions and properties, covariance functions.
UNIT III:
Random Process-Spectral Characteristics: Power density spectrum and its properties,
relationship between PSD and autocorrelation function, cross-power density spectrum and its
properties, relationship between cross-PSD and cross-correlation function.
UNIT IV:
Random signal response of linear systems: Linear system fundamentals, system response-
convolution, mean and mean squared value of system response, autocorrelation function of
response, cross correlation functions of input and output, spectral characteristics of system
response.
UNIT V:
Random Noise Processes: Noise definitions and classification, white and colored noise, Noise
bandwidth, Band pass, band-limited and Narrow band processes, properties of band-limited
processes, modelling of Noise sources, Incremental modelling of Noisy Networks, modelling
of practical noisy networks.
6. Books and Materials
Text Book(s)
1. Peyton Z. Peebles (2009), Probability, Random Variables and Random Signal Principles,
4th Edition, Tata McGraw Hill, New Delhi, India.
Reference Book(s)
1. Athanasios Papoulis, S. Unnikrishna Pillai (2002), Probability, Random Variables and
Stochastic Processes, 4th Edition, Tata McGraw Hill, New Delhi, India.
2. Steven Kay (2006), Intuitive probability and random processes using MATLAB, Springer
US.
HOD, EC
RANDOM SIGNAL PRINCIPLES
QUESTION BANK
UNIT-1
Concept of random variables
1.1 Define a random variable. What are the different types of random variables?
Explain with examples.
The new space is sometimes called the range sample space or two-dimensional
product space.
[Figure: the events A = {X ≤ x} and B = {Y ≤ y} in the sample space S, their
intersection A ∩ B = {X ≤ x, Y ≤ y}, and the corresponding region in the joint
sample space S_J.]
1.3 Define probability distribution and density functions and discuss their
properties.
(OR)
Differentiate Probability Distribution function and Probability Density function
1.4 Define joint distribution and joint density functions and discuss their properties.
(OR)
Define the following terms and list their properties:
(i) Joint cumulative distribution function
(ii) Joint probability density function
Similarly, for N random variables, the joint density function is given by
f_{X1,X2,…,XN}(x1, x2, …, xN) = ∂^N F_{X1,X2,…,XN}(x1, x2, …, xN) / (∂x1 ∂x2 … ∂xN)
1.5 Define Gaussian Random Variable. Derive the equation for its normalized case.
(OR)
Explain Gaussian random variable with necessary equations and graph.
(OR)
Write a note on PDF and CDF of Gaussian Random Variable.
[Figure: (a) the density f_X(x), with peak value 1/√(2πσ_X²) at x = a_X, and
(b) the distribution function F_X(x), which passes through 0.5 at x = a_X, of a
Gaussian random variable.]
1.6 Write the equation of the joint Gaussian density function and prove that, for
uncorrelated random variables, the joint Gaussian density function is equal to the
product of the marginal densities.
Its maximum value is located at the point (X̄, Ȳ) and is obtained from
f_{X,Y}(x, y) ≤ f_{X,Y}(X̄, Ȳ) = 1 / (2π σ_X σ_Y √(1 − ρ²))
f_Y(y) = (1/√(2πσ_Y²)) exp{−(y − Ȳ)² / (2σ_Y²)}
1.7 Write a note on the moments of a random variable. Derive expressions for the
variance and skew.
(OR)
Define the following:
a) Moments about the origin
b) Central moment
c) Variance
d) Skew
or
m_n = E[X^n] = Σ_{i=1}^{∞} x_i^n P(x_i),  X is discrete
Here,
if n = 0, m_0 = E[X⁰] = 1
if n = 1, m_1 = E[X¹] = X̄
if n = 2, m_2 = E[X²] (the mean-square value of X)
Moments about the mean of a random variable are also called "central moments".
They are denoted by 'μ', and the order of the moment is indicated by its suffix.
The nth moment about the mean of a random variable is given by:
μ_n = E[(X − X̄)^n] = ∫_{−∞}^{∞} (x − m_1)^n f_X(x) dx,  X is continuous
or
μ_n = E[(X − X̄)^n] = Σ_{i=1}^{∞} (x_i − m_1)^n P(x_i),  X is discrete
The second-order central moment μ_2 is also called the "variance" of the random
variable X.
The non-negative square root of the variance is called the "standard deviation"
of the random variable X. It is denoted by σ_X.
The third central moment μ_3 is called the "skew" of the density function.
It is a measure of the asymmetry of f_X(x) about x = X̄ = m_1.
The normalized third central moment μ_3/σ_X³ is known as the "coefficient of
skewness" of the density function.
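These definitions can be checked numerically. Below is a small sketch (our own example, not from the text): estimating the mean, variance, skew, and coefficient of skewness of an exponential random variable, for which the exact values are 1, 1, 2 and 2 respectively.

```python
import random

# Our own illustration: estimate m1, mu2 (variance), mu3 (skew) and the
# coefficient of skewness mu3/sigma^3 from samples of an Exp(1) variable.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(200_000)]
n = len(samples)

m1 = sum(samples) / n                                   # first moment (mean)
mu2 = sum((x - m1) ** 2 for x in samples) / n           # second central moment
mu3 = sum((x - m1) ** 3 for x in samples) / n           # third central moment
coeff = mu3 / mu2 ** 1.5                                # coefficient of skewness

print(round(m1, 2), round(mu2, 2), round(coeff, 2))
```

The positive skew reflects the long right tail of the exponential density.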
1.8 Write a note on Joint characteristic functions. Write the equation to compute joint
moments using the same.
Joint moments about the origin m_nk can be found from the joint characteristic
function as follows:
m_nk = (−j)^{n+k} ∂^{n+k} Φ_{X,Y}(ω1, ω2) / (∂ω1^n ∂ω2^k) |_{ω1=0, ω2=0}
1.12 Prove that the density function of the sum of two statistically independent
random variables is the convolution of their individual density functions.
(OR)
Derive an equation for the density function of the sum of two independent random
variables. Also extend the concept to N random variables.
(OR)
Prove that 𝒇𝑾 (𝒘) = 𝒇𝒀 (𝒚) ∗ 𝒇𝑿 (𝒙) where X and Y are two statistically
independent random variables and W=X+Y.
f_W(w) = ∫_{−∞}^{∞} f_Y(y) f_X(w − y) dy
This expression is recognized as a convolution integral. That is, the density function
of the sum of two statistically independent random variables is the convolution of
their individual density functions. i.e.,
𝒇𝑾 (𝒘) = 𝒇𝒀 (𝒚) ∗ 𝒇𝑿 (𝒙)
Extending the above concept to the sum of N independent random variables: let W
be a random variable equal to the sum of the independent random variables
W1, W2, …, WN. Then the density function of W is given by
f_W(w) = f_W1(w) * f_W2(w) * ⋯ * f_WN(w)
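The convolution result can be illustrated numerically. The sketch below (our own construction) discretizes the densities of two independent uniform(0,1) random variables and evaluates f_W(w) = ∫ f_Y(y) f_X(w − y) dy by a Riemann sum; the result should approximate the triangular density of W = X + Y.

```python
# Our own sketch: f_X and f_Y are both 1 on [0,1]; their convolution is the
# triangular density f_W(w) = w on [0,1] and 2 - w on [1,2].
dx = 0.001
grid = [i * dx for i in range(int(1 / dx) + 1)]
fy = [1.0 for _ in grid]  # f_Y on its support [0,1]

def conv_at(w):
    # Riemann-sum approximation of the convolution integral at the point w
    total = 0.0
    for i, y in enumerate(grid):
        x = w - y
        if 0.0 <= x <= 1.0:       # f_X(x) = 1 only on [0,1]
            total += fy[i]
    return total * dx

print(round(conv_at(0.5), 2), round(conv_at(1.0), 2), round(conv_at(1.5), 2))
```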
1.13 Discuss the central limit theorem for the sum of a large number of random variables.
(OR)
State the central limit theorem
(i) Uniform
(ii) Exponential
(iii) Rayleigh
(iv) Binomial
(v) Poisson
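The theorem's claim can be illustrated with a quick simulation (our own demo, not part of the question bank): the sum of 12 independent uniform(0,1) variables has mean 6 and variance 1, and roughly 68% of outcomes fall within one standard deviation of the mean, as a Gaussian predicts.

```python
import random

# Our own demo of the central limit theorem using uniform summands.
random.seed(1)
n, trials = 12, 100_000
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials
# fraction of outcomes within one standard deviation (about 0.68 for Gaussian)
frac = sum(1 for s in sums if abs(s - mean) <= var ** 0.5) / trials
print(round(mean, 2), round(var, 2), round(frac, 2))
```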
UNIT-2
Random Process-Temporal Characteristics
2.1 Define and classify random processes with suitable examples.
(OR)
What is a random process? Explain classification of random process.
As shown in Figure, the voltage available at one end of the switch because of
opening and closing of the switch is a discrete random process.
𝑇𝑠 is sampling interval and the sampling rate is 1/𝑇𝑠 samples per second.
These types of processes are important in the analysis of various digital signal
processing (DSP) systems.
2.5 Define the autocorrelation function of a random process and discuss its properties.
5. If X(t) has a periodic component, then R_XX(τ) will also have a periodic
component with the same period.
6. If X(t) is ergodic, zero mean, and has no periodic component, then
lim_{|τ|→∞} R_XX(τ) = 0
9. If there are two random processes X(t) and Y(t) such that Z(t) = X(t) + Y(t), then
R_ZZ(τ) = R_XX(τ) + R_YY(τ) + R_XY(τ) + R_YX(τ)
Ans: Covariance is a measure of how much two random processes change together.
It can be of two types, namely auto-covariance and cross-covariance.
AUTO-COVARIANCE FUNCTION
The auto-covariance is defined as a measure of variation of a random process with
its replica with a time difference 𝜏. It is denoted by 𝐶𝑋𝑋 (𝑡1 , 𝑡2 ). Mathematically it
is given by,
𝐶𝑋𝑋 (𝑡1 , 𝑡2 ) = 𝐸{ [𝑋(𝑡1 ) − 𝐸(𝑋(𝑡1 ))] [𝑋(𝑡2 ) − 𝐸(𝑋(𝑡2 ))] }
= E[X(t1) X(t2)] − E[X(t1)] E[X(t2)]
= R_XX(t1, t2) − E[X(t1)] E[X(t2)]
Let 𝑡1 = 𝑡 and 𝑡2 = 𝑡1 + 𝜏 = 𝑡 + 𝜏 with 𝜏 a real number, we get
𝐶𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) − 𝐸[𝑋(𝑡)] 𝐸[𝑋(𝑡 + 𝜏)]
If X(t) is a wide-sense stationary random process, then the autocorrelation
and auto-covariance functions depend only on the time difference
τ = t2 − t1. Thus, for a WSS process we can write,
𝐶𝑋𝑋 (𝜏) = 𝑅𝑋𝑋 (𝜏) − 𝐸[𝑋(𝑡)] 𝐸[𝑋(𝑡 + 𝜏)]
If t1 = t2 = t, i.e., C_XX is observed at the same time instant, then
C_XX(t, t) = R_XX(t, t) − E[X(t)] E[X(t)]
= E[X²(t)] − (E[X(t)])² = E[(X(t) − X̄(t))²] = Var[X(t)]
CROSS-COVARIANCE FUNCTION
The cross-covariance is defined as a measure of the variation of two random
processes with each other at two different time instants t1 and t2. It is denoted
by C_XY(t1, t2). Mathematically it is given by,
𝐶𝑋𝑌 (𝑡1 , 𝑡2 ) = 𝐸{ [𝑋(𝑡1 ) − 𝐸(𝑋(𝑡1 ))] [𝑌(𝑡2 ) − 𝐸(𝑌(𝑡2 ))] }
= E[X(t1) Y(t2)] − E[X(t1)] E[Y(t2)]
= R_XY(t1, t2) − E[X(t1)] E[Y(t2)]
Let 𝑡1 = 𝑡 and 𝑡2 = 𝑡1 + 𝜏 = 𝑡 + 𝜏 with 𝜏 a real number, we get
𝐶𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) − 𝐸[𝑋(𝑡)] 𝐸[𝑌(𝑡 + 𝜏)]
For two random processes, if C_XY(t, t + τ) = 0, they are called uncorrelated
processes. Hence,
R_XY(t, t + τ) = E[X(t)] E[Y(t + τ)]
If X(t) and Y(t) are jointly wide-sense stationary random processes, then the
cross-correlation and cross-covariance functions depend only on the time
difference τ = t2 − t1, and their mean values are constant. Thus, for jointly
WSS processes we can write,
𝐶𝑋𝑌 (𝜏) = 𝑅𝑋𝑌 (𝜏) − 𝐸[𝑋(𝑡)] 𝐸[𝑌(𝑡 + 𝜏)]
= 𝑅𝑋𝑌 (𝜏) − 𝑋̅ ̅𝑌
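As a numerical sanity check (our own example, not from the text), consider the classic random-phase process X(t) = A cos(ω0 t + Θ) with Θ uniform on [0, 2π). It is WSS with zero mean, so C_XX(τ) = R_XX(τ) = (A²/2) cos(ω0 τ); the ensemble estimate below should match this.

```python
import math
import random

# Our own example: ensemble auto-covariance of a random-phase cosine process.
random.seed(2)
A, w0 = 2.0, 3.0
trials = 100_000

def ensemble_cov(t, tau):
    # C_XX(t, t+tau) = E[X(t) X(t+tau)] - E[X(t)] E[X(t+tau)]
    acc_xy = acc_x = acc_y = 0.0
    for _ in range(trials):
        theta = random.uniform(0.0, 2.0 * math.pi)
        x = A * math.cos(w0 * t + theta)
        y = A * math.cos(w0 * (t + tau) + theta)
        acc_xy += x * y
        acc_x += x
        acc_y += y
    return acc_xy / trials - (acc_x / trials) * (acc_y / trials)

c0 = ensemble_cov(t=0.7, tau=0.0)   # theory: A**2/2 = 2.0
c1 = ensemble_cov(t=0.7, tau=0.5)   # theory: 2.0 * cos(w0 * 0.5)
print(round(c0, 2), round(c1, 2))
```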
UNIT III
Random Process-Spectral Characteristics
3.1 Derive the relation between autocorrelation function and power spectral density
and comment on it.
(OR)
State and prove the Wiener-Khinchin relations.
(OR)
Show that the autocorrelation function and power spectral density form a Fourier
transform pair.
Taking the inverse Fourier transform on both sides of the above equation, we get
(1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = (1/2π) ∫_{−∞}^{∞} lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t1, t2) e^{−jω(t2−t1)} e^{jωτ} dt2 dt1 dω
= (1/2π) ∫_{−∞}^{∞} lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t1, t2) e^{−jω(t2−t1−τ)} dt2 dt1 dω
= lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t1, t2) dt2 dt1 [(1/2π) ∫_{−∞}^{∞} e^{−jω(t2−t1−τ)} dω]
= lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t1, t2) dt2 dt1 [(1/2π) ∫_{−∞}^{∞} e^{jω(t1+τ−t2)} dω]
From the definition of the impulse function,
δ(x) = (1/2π) ∫_{−∞}^{∞} e^{jωx} dω
Therefore,
(1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t1, t2) δ(t1 + τ − t2) dt2 dt1
From the even symmetry property of the impulse function, δ(t1 + τ − t2) = δ(t2 − t1 − τ), thus,
(1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} [∫_{−T}^{T} R_XX(t1, t2) δ(t2 − (t1 + τ)) dt2] dt1
Now, from the sifting property of the impulse function, the inner integral evaluates to R_XX(t1, t1 + τ), so
(1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XX(t1, t1 + τ) dt1 = A[R_XX(t, t + τ)]
From the above two equations, it can be concluded that the time autocorrelation and
the power spectrum form a Fourier transform pair.
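A discrete-sequence analogue of this result can be verified directly (our own sketch, not from the text): for a finite sequence, the circular autocorrelation and the power spectrum |X[m]|² form a DFT pair.

```python
import cmath

# Our own sketch of the discrete Wiener-Khinchin relation:
# circular autocorrelation <-> power spectrum |X[m]|^2 under the DFT.
x = [1.0, 2.0, 0.5, -1.0, 0.0, 1.5, -0.5, 2.5]
N = len(x)

def dft(seq):
    return [sum(seq[n] * cmath.exp(-2j * cmath.pi * m * n / N)
                for n in range(N)) for m in range(N)]

power = [abs(Xm) ** 2 for Xm in dft(x)]

# circular autocorrelation computed directly in the "time" domain
r_time = [sum(x[n] * x[(n + k) % N] for n in range(N)) for k in range(N)]

# the same autocorrelation via the inverse DFT of the power spectrum
r_freq = [sum(power[m] * cmath.exp(2j * cmath.pi * m * k / N)
              for m in range(N)).real / N for k in range(N)]

ok = all(abs(a - b) < 1e-9 for a, b in zip(r_time, r_freq))
print(ok)
```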
3.2 Derive an expression for total power in terms of power spectral density.
(OR)
Derive an expression for the power spectral density.
3.4 Derive the relation between cross-correlation function and cross power spectral
density and comment on it.
(OR)
Show that the cross-correlation function and cross power spectral density form a
Fourier transform pair.
= lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XY(t1, t2) dt2 dt1 [(1/2π) ∫_{−∞}^{∞} e^{−jω(t2−t1−τ)} dω]
= lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XY(t1, t2) dt2 dt1 [(1/2π) ∫_{−∞}^{∞} e^{jω(t1+τ−t2)} dω]
From the definition of the impulse function,
δ(x) = (1/2π) ∫_{−∞}^{∞} e^{jωx} dω
Therefore,
(1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XY(t1, t2) δ(t1 + τ − t2) dt2 dt1
From the even symmetry property of the impulse function, δ(t1 + τ − t2) = δ(t2 − t1 − τ), thus,
(1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} [∫_{−T}^{T} R_XY(t1, t2) δ(t2 − (t1 + τ)) dt2] dt1
Now, from the sifting property of the impulse function, the inner integral evaluates to R_XY(t1, t1 + τ), so
(1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t1, t1 + τ) dt1 = A[R_XY(t, t + τ)]
From the above two equations, it can be concluded that the time cross-correlation
and the cross-power spectrum form a Fourier transform pair. If both random
processes are jointly wide-sense stationary, the cross-correlation depends only on
the time difference τ. Thus, the above equations reduce to
R_XY(τ) = (1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω
S_XY(ω) = ∫_{−∞}^{∞} R_XY(τ) e^{−jωτ} dτ
ω̄0 = ∫_0^∞ ω S_XX(ω) dω / ∫_0^∞ S_XX(ω) dω
P_XY = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t) dt = (1/2π) ∫_{−∞}^{∞} lim_{T→∞} E[X_T*(ω) Y_T(ω)]/(2T) dω
From the above equation, the following expressions can be deduced:
P_XY = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t) dt = A[R_XY(t, t)]
and
P_XY = (1/2π) ∫_{−∞}^{∞} lim_{T→∞} E[X_T*(ω) Y_T(ω)]/(2T) dω
Here, the term lim_{T→∞} E[X_T*(ω) Y_T(ω)]/(2T) indicates the cross-power spectral
density and is denoted by S_XY(ω). Thus,
S_XY(ω) = lim_{T→∞} E[X_T*(ω) Y_T(ω)] / (2T)
Therefore, the cross-power formula now becomes
P_XY = (1/2π) ∫_{−∞}^{∞} S_XY(ω) dω
By repeating the above procedure, we can also define another cross-power density
spectrum by
S_YX(ω) = lim_{T→∞} E[Y_T*(ω) X_T(ω)] / (2T)
and its cross-power formula is given by
P_YX = (1/2π) ∫_{−∞}^{∞} S_YX(ω) dω = P_XY*
If the random processes are wide-sense stationary, then the cross-correlation
functions are given by
R_XY(τ) = (1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω
R_YX(τ) = (1/2π) ∫_{−∞}^{∞} S_YX(ω) e^{jωτ} dω
UNIT IV
RANDOM SIGNAL RESPONSE OF LINEAR SYSTEMS
4.1 Derive an equation for mean square value of random signal response of linear
systems.
4.2 Derive an equation for mean value of random signal response of linear systems.
H(0) = ∫_{−∞}^{∞} h(t) dt
is called the zero-frequency response of the system. Substituting this, we get
E[Y(t)] = Ȳ = H(0) X̄
which is constant. Thus, the mean value of the output response Y(t) for a WSS
input process is equal to the product of the mean value of the input process and
the zero-frequency response of the system.
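A discrete sketch of this result (our own example, not from the text): for an FIR system, the zero-frequency response is H(0) = Σ h[k], and the output mean of a WSS input should equal H(0) times the input mean.

```python
import random

# Our own FIR illustration of E[Y(t)] = H(0) * X_bar in discrete time.
random.seed(3)
h = [0.5, 0.3, 0.2, 0.1]            # impulse response; H(0) = sum(h) = 1.1
x_mean = 2.0
x = [x_mean + random.gauss(0.0, 1.0) for _ in range(200_000)]

# convolve, discarding the start-up transient
y = [sum(h[k] * x[n - k] for k in range(len(h))) for n in range(len(h), len(x))]
y_mean = sum(y) / len(y)
print(round(y_mean, 2))
```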
4.3 Derive an equation for autocorrelation function of an LTI system having white
noise as its input.
4.4 Derive an equation for cross-correlation functions of an LTI system having white
noise as its input.
If the input X(t) is a WSS random process, then the cross-correlation function of
the input X(t) and the output Y(t) is
R_XY(t, t + τ) = E[X(t) Y(t + τ)] = E[X(t) (h(t + τ) * X(t + τ))]
On substitution we get,
S_YY(ω) = ∫_{−∞}^{∞} h(τ1) ∫_{−∞}^{∞} h(τ2) ∫_{−∞}^{∞} R_XX(α) e^{−jω(α−τ1+τ2)} dα dτ1 dτ2
Separating the three integrals,
S_YY(ω) = [∫_{−∞}^{∞} h(τ1) e^{jωτ1} dτ1] [∫_{−∞}^{∞} h(τ2) e^{−jωτ2} dτ2] [∫_{−∞}^{∞} R_XX(α) e^{−jωα} dα] = H*(ω) H(ω) S_XX(ω)
∴ S_YY(ω) = |H(ω)|² S_XX(ω)
where
H*(ω) = ∫_{−∞}^{∞} h(τ1) e^{jωτ1} dτ1
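One testable consequence of S_YY(ω) = |H(ω)|² S_XX(ω) (our own discrete sketch, not from the text): if the input is zero-mean white noise of variance σ², S_XX is flat, so the output power is E[Y²] = σ² Σ h[k]² for an FIR system.

```python
import random

# Our own check: output power of a white-noise-driven FIR system equals
# sigma^2 * sum(h[k]^2), the time-domain face of S_YY = |H|^2 * S_XX.
random.seed(4)
h = [0.6, -0.4, 0.2]
sigma = 1.5
x = [random.gauss(0.0, sigma) for _ in range(300_000)]
y = [sum(h[k] * x[n - k] for k in range(len(h))) for n in range(len(h), len(x))]

power = sum(v * v for v in y) / len(y)
predicted = sigma ** 2 * sum(c * c for c in h)
print(round(power, 3), round(predicted, 3))
```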
4.6 Derive an equation for the cross power spectral density of the input and output
of an LTI system.
4.7
Prove the following relations with respect to the random signal response of
linear systems:
R_YY(τ) = R_XY(τ) * h(−τ)
R_YY(τ) = R_YX(τ) * h(τ)
must hold, where α_n are arbitrary constants and N may be infinite. Here L is an
operator representing the action of the system on the inputs x_n(t).
A general linear system block diagram is shown in figure below.
x(t) = ∫_{−∞}^{∞} x(τ) δ(t − τ) dτ
On substitution, we get
Y(ω) = ∫_{−∞}^{∞} x(τ) H(ω) e^{−jωτ} dτ = H(ω) ∫_{−∞}^{∞} x(τ) e^{−jωτ} dτ
∴ Y(ω) = H(ω) X(ω)
UNIT V
Random Noise Processes
5.1 Define and classify Noise with respect to communication system.
(OR)
Classify the different types of Internal noise. Explain each.
Ans: Noise is defined in the electrical sense as unwanted energy tending to interfere
with the reception and reproduction of wanted signals. It represents a basic limitation
on the transmission and detection of signals in communication systems. No system can
be designed that is free from every kind of noise. Noise is random in nature.
Examples:
Noise generates a random fuzzy sound in a broadcast receiver.
In pulse communication systems, noise may produce unwanted pulses or cancel out
wanted pulses, causing errors in the receiver.
In RADAR systems, noise may cause a reduction in bandwidth.
In image processing systems, noise may cause errors in the image capturing system.
Noise in Communication Systems:
Noise is often described as the limiting factor in communication systems: indeed, if
there were no noise there would be virtually no problem in communications. Noise is
a general term used to describe an unwanted signal that affects a wanted signal.
For example, consider the general block diagram of a communication system shown
in the figure.
Noise, being an unwanted signal, arises from a variety of sources which may be
placed in one of two main categories:
a) Interference, usually from a human (man-made) source
b) Naturally occurring random noise.
TYPES OF NOISE
In an electronic communication system, shown in Figure 5.1, there exists a variety
of noise sources that affect the communication process. Figure 5.2 illustrates the
noise categories of an electronic communication system.
• Apart from man-made noise, it is the strongest component over the range of
about 20 to 120 MHz.
• Little cosmic noise below 20 MHz penetrates the ionosphere, and it eventually
disappears at frequencies in excess of 1.5 GHz.
• We also receive noise from our galaxy, hence it is also known as galactic noise.
(iv) Thermal Noise (Johnson Noise)
This type of noise is generated by all resistances (e.g. a resistor,
semiconductor, the resistance of a resonant circuit, i.e. the real part of the
impedance, cable etc.).
Free electrons are in continual random motion at any temperature above
absolute zero (0 K, ≈ −273 °C), as shown in the figure below.
As the temperature increases, the random motion increases, and hence the thermal
noise; since moving electrons constitute a current (although there is no net
current flow), the motion can be measured as a mean-square noise value
ī_n² = 2 (I_DC + 2 I_o) q_e B  (amps²)
where
I_DC is the direct current at the pn junction (amps),
5.2 With suitable mathematical equations, write a note on white noise. Also define
colored noise.
(OR)
Discuss in detail white and colored noise with appropriate equations and
sketches.
S_NN(ω) = 𝒩0/2
where 𝒩0 is a real positive constant.
The autocorrelation function of N(t) can be obtained by taking the inverse
Fourier transform of the above equation, and is given by
R_NN(τ) = (𝒩0/2) δ(τ)
The above two functions are illustrated in the figure below.
Noise having a nonzero and constant power spectrum over a finite frequency band
and zero everywhere else is called band-limited white noise.
Colored Noise:
Noise that is not white is called colored noise, by analogy with colored light,
which contains only a portion of the visible-light frequencies in its spectrum.
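A quick discrete-time sketch of whiteness (our own demo, not from the text): i.i.d. zero-mean samples have R_NN[0] = σ² and R_NN[k] ≈ 0 for k ≠ 0, the discrete counterpart of R_NN(τ) = (𝒩0/2) δ(τ).

```python
import random

# Our own demo: the estimated autocorrelation of i.i.d. samples is an
# (approximate) impulse, i.e. discrete-time white noise.
random.seed(5)
sigma = 1.0
n = [random.gauss(0.0, sigma) for _ in range(200_000)]

def autocorr(k):
    m = len(n) - k
    return sum(n[i] * n[i + k] for i in range(m)) / m

r0, r1, r5 = autocorr(0), autocorr(1), autocorr(5)
print(round(r0, 2), round(r1, 3), round(r5, 3))
```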
Ans:
Types of Random Processes
The spectral components outside the band W are very small and can be neglected.
For example,
• Modulated signals with carrier frequency ω0 and bandwidth W are
bandpass random processes.
• The noise transmitted over a communication channel can be modelled as a
bandpass process.
Band limited random processes:
• A bandpass random process is said to be band limited if its power spectrum
components are zero outside the frequency band of width W that does not
include 𝜔 = 0.
• The power density spectrum of band limited bandpass process is shown in figure
• It can be expressed as
N(t) = A(t) cos[ω0 t + θ(t)]
where A(t) is an amplitude random process and θ(t) is a phase random process.
Representation of narrowband random processes:
The PSD of a narrowband noise process is concentrated about a centre frequency ω0,
as shown in the figure below.
A typical sample function n(t) of a narrowband random process N(t) might look like:
Let 𝑁(𝑡) = 𝑋(𝑡) 𝑐𝑜𝑠(𝜔0 𝑡) − 𝑌(𝑡) 𝑠𝑖𝑛(𝜔0 𝑡) be any band limited WSS random
process with zero mean value and a power spectral density 𝑆𝑁𝑁 (𝜔), then some
important properties of 𝑋(𝑡) and 𝑌(𝑡) are given by:
1. If 𝑁(𝑡) is WSS, then 𝑋(𝑡) and 𝑌(𝑡) are jointly wide sense stationary
2. If 𝑁(𝑡) has zero mean, i.e. if 𝐸[𝑁(𝑡)] = 0, then
𝐸[𝑋(𝑡)] = 𝐸[𝑌(𝑡)] = 0
3. The mean square values of the processes are equal
i.e. 𝐸[𝑁 2 (𝑡)] = 𝐸[𝑋 2 (𝑡)] = 𝐸[𝑌 2 (𝑡)]
4. Both processes 𝑋(𝑡) and 𝑌(𝑡) have the same autocorrelation functions
𝑅𝑋𝑋 (𝜏) = 𝑅𝑌𝑌 (𝜏)
5. The cross-correlation functions of X(t) and Y(t) are given by
R_XY(τ) = −R_YX(τ)
If the processes are orthogonal, then
R_XY(τ) = R_YX(τ) = 0
6. Both 𝑋(𝑡) and 𝑌(𝑡) have same power spectral densities
S_YY(ω) = S_XX(ω) = { S_NN(ω − ω0) + S_NN(ω + ω0), |ω| ≤ ω0; 0, elsewhere }
7. The cross power spectrums are
𝑆𝑋𝑌 (𝜔) = −𝑆𝑌𝑋 (𝜔)
8. If 𝑁(𝑡) is a Gaussian random process, then 𝑋(𝑡) and 𝑌(𝑡) are jointly Gaussian
9. The relationships between the correlation functions and the power spectrum S_NN(ω) are
R_XX(τ) = (1/π) ∫_0^∞ S_NN(ω) cos[(ω − ω0)τ] dω
R_YX(τ) = (1/π) ∫_0^∞ S_NN(ω) sin[(ω − ω0)τ] dω
10. If N(t) is zero-mean Gaussian and its PSD S_NN(ω) is symmetric about ω0, then
X(t) and Y(t) are statistically independent.
5.5 Show that the available power across a noisy resistor is independent of the
source resistance and depends only on its physical temperature.
Ans:
RESISTIVE (THERMAL) NOISE:
Thermal noise generated by a resistor is proportional to its absolute
temperature and bandwidth.
Therefore the noise power P_n ∝ T B
or P_n = k T B watts
where
k = Boltzmann's constant = 1.38 × 10⁻²³ J/K
T = absolute temperature in kelvin
B = bandwidth in Hz
Now let us consider a noisy resistor R.
The noise voltage generated in the noisy resistor may be treated as a noise
voltage source and modelled by its Thevenin equivalent circuit, as shown in the
figure below.
From the circuit, the mean square value of the noise voltage can be written as
𝑣𝑛2 = 4𝑘𝑇𝐵𝑅𝑒𝑞
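Plugging representative numbers into P_n = kTB and v̄_n² = 4kTBR (the component values below are our own illustrative choices, not from the text):

```python
import math

# Our own example values: a 1 kOhm resistor at 290 K over a 1 MHz bandwidth.
k = 1.38e-23          # Boltzmann's constant, J/K
T = 290.0             # absolute temperature, K
B = 1.0e6             # bandwidth, Hz
R = 1.0e3             # resistance, Ohm

P_n = k * T * B                        # available noise power, W
v_rms = math.sqrt(4 * k * T * B * R)   # open-circuit RMS noise voltage, V
print(P_n, v_rms)
```

Note that P_n is independent of R, as question 5.5 asks you to show; only the open-circuit voltage depends on the resistance.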
5.6 Explain how the available power gain of a two-port network can be estimated.
(OR)
Define the available power gain for a two-port network and prove the following for
cascade connections:
G_a(ω) = ∏_{m=1}^{M} G_m(ω)
Ans:
AVAILABLE POWER GAIN
For a two-port network, the available power gain is defined as the ratio of the
maximum PSD of the signal at the output to that at the input:
G_a(ω) = (Max. PSD of the signal at the output of the network) / (Max. PSD of the signal at the input of the network)
∴ G_a(ω) = S_so(ω) / S_si(ω)
Therefore the available output power can be written as
P_ao = (1/2π) ∫_{−∞}^{∞} S_so(ω) dω = (1/2π) ∫_{−∞}^{∞} G_a(ω) S_si(ω) dω
∴ G_a(ω) = ∏_{m=1}^{M} G_m(ω)
i.e. the available power gain of the cascade network is equal to the product of
individual gains.
5.6 Discuss and derive the expression for the noise bandwidth of a low-pass filter
with white noise as input.
• Consider a system having a low pass transfer function 𝐻(𝜔) and assume
white noise is applied at the input as shown in figure below.
Now consider an idealized system that is equivalent to the actual system which
means that
• Both produce the same average output power when they are excited by same
white noise source.
• Both have the same value of power transfer function at mid band i.e. |𝑯(𝟎)|𝟐
is same in both systems.
• The transfer function of idealized system is defined as
⟹ W_N = [(1/2) ∫_{−∞}^{∞} |H(ω)|² dω] / |H(0)|²
Assuming the actual system's impulse response is real, |H(ω)|² is an even function. Then,
(1/2) ∫_{−∞}^{∞} |H(ω)|² dω = (1/2) · 2 ∫_0^∞ |H(ω)|² dω = ∫_0^∞ |H(ω)|² dω
Therefore
W_N = ∫_0^∞ |H(ω)|² dω / |H(0)|²
𝑊𝑁 is called the Noise bandwidth of the system.
If the centre-band frequency of a bandpass transfer function is assumed as ω0,
then the noise bandwidth is given by
W_N = ∫_0^∞ |H(ω)|² dω / |H(ω0)|²
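The formula can be checked numerically (our own example, not from the text): for a first-order RC low-pass filter, |H(ω)|² = 1/(1 + (ω/ωc)²), and carrying out the integral gives W_N = π ωc / 2, i.e. π/2 times the 3 dB bandwidth.

```python
import math

# Our own example: noise bandwidth of a first-order RC low-pass filter.
wc = 2.0 * math.pi * 1.0e3   # 3 dB angular frequency (1 kHz)

def h2(w):
    return 1.0 / (1.0 + (w / wc) ** 2)   # |H(w)|^2 for the RC filter

# trapezoidal integration of |H|^2 from 0 up to a cutoff far past wc
upper, steps = 1000.0 * wc, 1_000_000
dw = upper / steps
area = (sum(h2(i * dw) for i in range(1, steps))
        + 0.5 * (h2(0.0) + h2(upper))) * dw

W_N = area / h2(0.0)
print(W_N / wc)
```

The printed ratio should be close to π/2 ≈ 1.5708, with a small deficit from truncating the integral.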
5.7 For a cascade connection of M two-port networks, derive the expression for the
overall equivalent noise temperature.
(OR)
For M cascaded networks, show that
T_e = T_e1 + T_e2/G1 + T_e3/(G1 G2) + ⋯ + T_eM/(G1 G2 ⋯ G_{M−1})
Ans:
As shown in figure, the total available noise power is denoted as 𝑃𝑇𝐴 , the available
noise power due to source alone is denoted by 𝑃𝑆𝐴 , and the available noise power
due to network alone is denoted as 𝑃𝑁𝐴 .
From the definition of noise power due to temperature, the available power due to
source alone is given by
𝑃𝑆𝐴 = 𝐺𝑎 𝑘 𝑇𝑠 𝐵
And the noise power due to network alone is given by
𝑃𝑁𝐴 = 𝐺𝑎 𝑘 𝑇𝑒 𝐵
Therefore, the total available noise power in a network is given by
𝑃𝑇𝐴 = 𝑃𝑆𝐴 + 𝑃𝑁𝐴 = 𝐺𝑎 𝑘 𝑇𝑠 𝐵 + 𝐺𝑎 𝑘 𝑇𝑒 𝐵 = 𝐺𝑎 𝑘( 𝑇𝑠 + 𝑇𝑒 ) 𝐵
Now the network can be redrawn as shown in figure below
Now let us consider M cascaded networks with available power gains G1, G2, …, G_M
respectively, as shown in the figure below.
The total available power of network 1 is
P_TA1 = G1 k T_s B + G1 k T_e1 B
The total available power of network 2 is given by
P_TA2 = G2 (G1 k T_s B + G1 k T_e1 B) + G2 k T_e2 B
= G1 G2 k T_s B + G1 G2 k T_e1 B + G2 k T_e2 B
The total available power of network 3 is given by
P_TA3 = G3 (G1 G2 k T_s B + G1 G2 k T_e1 B + G2 k T_e2 B) + G3 k T_e3 B
= G1 G2 G3 k T_s B + G1 G2 G3 k T_e1 B + G2 G3 k T_e2 B + G3 k T_e3 B
Similarly, repeating the procedure for all networks, the total available power of
network M can be written as
P_TAM = G1 G2 ⋯ G_M k T_s B + G1 G2 ⋯ G_M k T_e1 B + G2 G3 ⋯ G_M k T_e2 B + ⋯ + G_M k T_eM B
= G k T_s B + G k T_e1 B + G2 G3 ⋯ G_M k T_e2 B + ⋯ + G_M k T_eM B
where G = G1 G2 ⋯ G_M.
Let us now consider the equivalent network of cascade connection, the above
cascade connection can be redrawn as follows.
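The resulting cascade formula for the equivalent noise temperature can be applied numerically; the gains and temperatures below are our own example values, not from the text.

```python
# Our own example values for a three-stage cascade.
G1, G2 = 10.0, 20.0                  # available power gains of stages 1 and 2
Te1, Te2, Te3 = 20.0, 150.0, 600.0   # effective input noise temperatures, K

# Te = Te1 + Te2/G1 + Te3/(G1*G2): later stages matter less when G1 is large
T_e = Te1 + Te2 / G1 + Te3 / (G1 * G2)
print(T_e)
```

This illustrates why a low-noise, high-gain first stage dominates the noise performance of a receiver chain.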
5.8 In a cascade of M network stages, where the mth stage has available power
gain G_m and noise figure F_m, show that the operating noise figure F_OP is:
F_OP = F1 + (F2 − 1)/G1 + (F3 − 1)/(G1 G2) + ⋯ + (F_M − 1)/(G1 G2 ⋯ G_{M−1})
denoted by 𝑃𝑆𝐴 , and the available noise power due to network alone is denoted as
𝑃𝑁𝐴 .
The spot noise figure of a two-port network is defined as the ratio of the total
available noise power to the available noise power due to the source alone:
F = P_TA / P_SA = (P_SA + P_NA) / P_SA = 1 + P_NA / P_SA
From the definitions of the noise powers, upon substitution we get
F = 1 + P_NA/P_SA = 1 + (G_a k T_e B)/(G_a k T_s B)
∴ F = 1 + T_e/T_s
5.9 Two resistors with resistances R1 and R2 are connected in parallel and have
physical temperatures T1 and T2 respectively.
i. Find the effective noise temperature Ts of an equivalent resistor with
resistance equal to the parallel combination of R1 and R2
(OR)
Derive an equation of effective noise temperature of resistors connected in
parallel.
Ans: Let us consider two resistors R1 and R2, operating at different temperatures,
connected in parallel as shown in the figure below.
The mean-square noise current of resistor R1 is given by ī²_n1 = 4kT1BG1
The mean-square noise current of resistor R2 is given by ī²_n2 = 4kT2BG2
ī²_n1 + ī²_n2 = 4kT1BG1 + 4kT2BG2 = 4k(T1G1 + T2G2)B ------(a)
Now, let us assume that the noise temperature across the output terminals is T_s
and the equivalent conductance is G_eq = G1 + G2; then the mean-square noise
current across the output terminals is
ī²_n = 4k T_s B G_eq ------(b)
Equating (a) and (b) gives T_s = (T1G1 + T2G2)/(G1 + G2).
Ans:
Let us consider two resistors R1 and R2, operating at different temperatures,
connected in series as shown in the figure below.
The mean-square noise voltage of resistor R1 is given by v̄²_n1 = 4kT1BR1
The mean-square noise voltage of resistor R2 is given by v̄²_n2 = 4kT2BR2
The mean-square noise voltage across A and B is
v̄²_n = v̄²_n1 + v̄²_n2 = 4kT1BR1 + 4kT2BR2
∴ v̄²_n = 4k(T1R1 + T2R2)B ------(a)
Now, let us assume that the noise temperature across the AB terminals is T_s and
the equivalent resistance is R_eq = R1 + R2; then the mean-square noise voltage
across AB is
v̄²_n = 4k T_s B R_eq ------(b)
Equating (a) and (b) gives T_s = (T1R1 + T2R2)/(R1 + R2).
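Worked numbers for the series-resistor result T_s = (T1R1 + T2R2)/(R1 + R2) (the values are our own illustrative choices, not from the text):

```python
# Our own example: two series resistors at different physical temperatures.
R1, T1 = 100.0, 300.0   # Ohm, K
R2, T2 = 300.0, 400.0   # Ohm, K

# resistance-weighted average of the two physical temperatures
Ts = (T1 * R1 + T2 * R2) / (R1 + R2)
print(Ts)
```

The effective temperature sits between T1 and T2, weighted toward the larger resistance.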
5.11 Derive an equation for Average Noise Figure of practical noisy networks.
P_SO = ∫_0^∞ P_SA dω = ∫_0^∞ G_a k T_s B dω
5.12 Derive an equation for Average Noise Temperature of practical noisy networks.
Ans: The average noise temperature of a practical system can be calculated in two
cases: the average source temperature and the average effective input noise
temperature.
From the definition of total available noise power of a network, we have
𝑃𝑇𝐴 = 𝐺𝑎 𝑘( 𝑇𝑠 + 𝑇𝑒 ) 𝐵
From the definition of the total output noise power of a network, we have
Now let us define T̄_S as the average effective source temperature and T̄_e as the
average effective input noise temperature, which are supposed to produce the same
output noise power; then
T̄_S ∫_0^∞ G_a k B dω = ∫_0^∞ G_a k B T_s dω
T̄_S = ∫_0^∞ G_a T_s dω / ∫_0^∞ G_a dω
Similarly, we get
T̄_e = ∫_0^∞ G_a T_e dω / ∫_0^∞ G_a dω