
Ajman University

College of Engineering
Probability and Random Variables

Assignment

Cauchy Random Variable

Done by: Raghad Alnablsi


The Cauchy distribution, named after Augustin Cauchy, is a continuous probability distribution. It is also known, especially among physicists, as the Lorentz distribution (after Hendrik Lorentz), Cauchy–Lorentz distribution, Lorentz(ian) function, or Breit–Wigner distribution. The Cauchy distribution is the distribution of the x-intercept of a ray issuing from a fixed point above the x-axis with a uniformly distributed angle. It is also the distribution of the ratio of two independent normally distributed random variables, each with mean zero.
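As a quick, hedged illustration of those two characterizations (a minimal MATLAB sketch, not part of the original assignment; variable names are illustrative), both constructions below should produce samples with median near 0 and interquartile range near 2, the values for a standard Cauchy variable:

% Two standard ways to generate standard Cauchy samples
rng(0);                          % fix the seed for reproducibility
n  = 1e5;

% 1) tangent of an angle uniformly distributed on (-pi/2, pi/2)
u  = pi*(rand(n,1) - 0.5);
c1 = tan(u);

% 2) ratio of two independent zero-mean normal random variables
c2 = randn(n,1) ./ randn(n,1);

% the mean is undefined, so compare robust summaries instead
s1 = sort(c1);  s2 = sort(c2);
fprintf('medians: %.3f  %.3f\n', median(c1), median(c2));
fprintf('IQRs   : %.3f  %.3f\n', ...
    s1(round(0.75*n)) - s1(round(0.25*n)), ...
    s2(round(0.75*n)) - s2(round(0.25*n)));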

The Cauchy distribution is often used in statistics as the canonical example of a "pathological" distribution, since both its expected value and its variance are undefined (the paradox this creates is discussed in the worked problem below). The Cauchy distribution does not have finite moments of order greater than or equal to one; only fractional absolute moments of order less than one exist.[1] The Cauchy distribution has no moment generating function.
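The undefined mean has a visible numerical consequence, sketched below under the assumption of a standard Cauchy variable (a = 0, b = 1 in the notation of the next section): the running sample mean never settles down, unlike the normal case.

% Running sample mean of Cauchy draws vs. normal draws (illustrative only)
rng(1);
n = 1e5;
cauchy = tan(pi*(rand(n,1) - 0.5));     % standard Cauchy samples
normal = randn(n,1);                    % standard normal samples for contrast

runMeanC = cumsum(cauchy) ./ (1:n)';    % running means
runMeanN = cumsum(normal) ./ (1:n)';

plot(1:n, runMeanC, 1:n, runMeanN);
legend('Cauchy running mean', 'normal running mean');
xlabel('number of samples'); ylabel('running mean');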

The Cauchy distribution has the probability density function (PDF)

f_X(x) = \frac{b/\pi}{b^2 + (x-a)^2}

where a is the location parameter, specifying the location of the peak of the distribution, and b is the scale parameter, which specifies the half-width at half-maximum (HWHM); 2b is the full width at half maximum (FWHM). The scale b is also equal to half the interquartile range and is sometimes called the probable error. Augustin-Louis Cauchy exploited such a density function in 1827 with an infinitesimal scale parameter, defining what would now be called a Dirac delta function.
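A short numerical check of the HWHM interpretation of b (the values a = 2 and b = 3 below are arbitrary choices for illustration): the density at a ± b is half the peak density.

a = 2;  b = 3;                                   % arbitrary example values
f = @(x) (b/pi) ./ (b^2 + (x - a).^2);           % Cauchy PDF

peak = f(a);                                     % maximum value, 1/(pi*b)
fprintf('f(a) = %.4f, f(a+b) = %.4f, f(a-b) = %.4f\n', peak, f(a+b), f(a-b));

x = linspace(a - 10*b, a + 10*b, 1000);
plot(x, f(x));  xlabel('x');  ylabel('f_X(x)');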

Cumulative distribution function

The cumulative distribution function is:

F_X(x) = \frac{1}{2} + \frac{1}{\pi}\tan^{-1}\left(\frac{x-a}{b}\right)

which is derived step by step below.

The Cauchy distribution also shows up when solving a problem in "Communications Engineering" (3rd Edition) by Proakis and Salehi:

Let X and Y be independent Gaussian random variables, each with distribution N(0, σ²). Find the pdf of the random variable R = X / Y. What is its mean and variance?

After a little calculation, I came up with

f(r) = 1 / (π · (1 + r²)),

which is a Cauchy pdf with scale parameter b = 1 and location parameter a = 0; note that it does not depend on σ. The substitution r = tan(u) lets you check that the integral of f(r) dr from −∞ to ∞ is unity, as expected.
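A hedged simulation of this result (σ = 2 below is an arbitrary choice, to emphasize that the answer should not depend on σ): the empirical density of X/Y should sit on top of 1/(π(1 + r²)).

rng(2);
sigma = 2;  n = 1e6;
R = (sigma*randn(n,1)) ./ (sigma*randn(n,1));   % R = X/Y

edges   = linspace(-10, 10, 201);
counts  = histcounts(R, edges, 'Normalization', 'pdf');
centers = (edges(1:end-1) + edges(2:end)) / 2;

plot(centers, counts, '.', centers, 1 ./ (pi*(1 + centers.^2)), '-');
legend('empirical pdf of X/Y', 'Cauchy pdf');
xlabel('r');  ylabel('f(r)');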

However, when I try to calculate E(R), I get zero. In fact, the n-th moment seems to evaluate to zero for all odd n, and to infinity for all even n. I also tried calculating E(R) by evaluating the integral of r·f(r) dr from −T to T, where T > 0, and then taking the limit as T → ∞; I still get zero.

The denominator is never zero for a nonzero scale parameter, and a plot of f(r) shows even symmetry about the location parameter, which in this case is zero. The resolution is that E(R) is in fact undefined: the integral of |r|·f(r) dr diverges, so the expectation does not exist, and the symmetric −T to T limit only computes the Cauchy principal value of the integral, which is zero by symmetry.
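That resolution can be checked symbolically; the sketch below assumes the Symbolic Math Toolbox and uses illustrative variable names that are not from the original assignment.

syms r T real
f = 1/(pi*(1 + r^2));                        % standard Cauchy pdf

pv   = limit(int(r*f, r, -T, T), T, inf);    % symmetric limit: principal value only
abs1 = 2*int(r*f, r, 0, inf);                % integral of |r|*f(r)

disp(pv);    % expected: 0
disp(abs1);  % expected: Inf (the expectation does not exist)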
Solving step by step:

f_X(x) = \frac{b/\pi}{b^2 + (x-a)^2}

F_X(x) = \int_{-\infty}^{x} f_X(u)\,du = \int_{-\infty}^{x} \frac{b/\pi}{b^2 + (u-a)^2}\,du

= \frac{b}{\pi}\left[\frac{1}{b}\tan^{-1}\left(\frac{u-a}{b}\right)\right]_{-\infty}^{x}

= \frac{1}{\pi}\left[\tan^{-1}\left(\frac{x-a}{b}\right) - \tan^{-1}(-\infty)\right]

= \frac{1}{\pi}\left[\tan^{-1}\left(\frac{x-a}{b}\right) + \frac{\pi}{2}\right]

F_X(x) = \frac{1}{2} + \frac{1}{\pi}\tan^{-1}\left(\frac{x-a}{b}\right)
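As a quick numerical sanity check of the closed form just derived (a = 1, b = 0.5, and the test point x0 = 2.3 are arbitrary choices for illustration):

a = 1;  b = 0.5;
f = @(u) (b/pi) ./ (b^2 + (u - a).^2);          % pdf
F = @(x) 0.5 + atan((x - a)/b)/pi;              % derived cdf

x0 = 2.3;                                       % arbitrary test point
numeric = integral(f, -Inf, x0);                % numerical integration of the pdf
fprintf('closed form: %.6f, numerical: %.6f\n', F(x0), numeric);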
Solving with MATLAB:
>> syms x b a;int((b/pi)/(b^2+(x-a)^2),x)

ans =

1/pi*atan(1/2*(2*x-2*a)/b)
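Note that MATLAB's indefinite integral omits the constant of integration: the answer simplifies to atan((x − a)/b)/π, and evaluating the definite integral from −∞ to x adds the 1/2 found above. A hedged follow-up (Symbolic Math Toolbox; the exact form of the output may vary with the MATLAB version):

>> syms u x a b; assume(b > 0)
>> simplify(int((b/pi)/(b^2 + (u - a)^2), u, -Inf, x))
% expected (up to equivalent rewriting): atan((x - a)/b)/pi + 1/2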
