
Chapter 2 Deterministic and Stochastic Representation

R. J. Chang
Department of Mechanical Engineering
NCKU

§ 2.1 Deterministic and Stochastic View

§ 2.2 Classification of Dynamic Data

§ 2.3 Stationary and Nonstationary Data

§ 2.4 Ergodic and Nonergodic Data


§ 2.1 Deterministic and Stochastic View(1)
1. General Concept
(1) Natural phenomena
Wind flow
Sea wave
Earthquake

Q: How random is true random?

Q: Can we predict future behavior based on the past history of evolution?
§ 2.1 Deterministic and Stochastic View(2)
(2) Scientific prediction
Natural disaster: mainly due to limited prediction of dynamic phenomena
Predictable: predict the right value (event) at the right place and the right time

Note: If future data can be predicted from the past history of the data, the dynamic data are deterministic; otherwise, they are stochastic.
§ 2.1 Deterministic and Stochastic View(3)
2. Dynamic Data
(1) Time series
[Figure: a time series x(t_j) plotted against the time index t_j, with past, present, and future regions marked]

$t_j$: time index

Data collection as a time series:
$A_{t_1}, A_{t_2}, A_{t_3}, \ldots$ or $\{A(j) \mid j = 1, 2, 3, \ldots, M\}$
§ 2.1 Deterministic and Stochastic View(4)
(2) Repeatability view
Measurement data: $\{x_1, x_2, x_3, \ldots, x_N\}\big|_{t_1}$

Measurement reset after a period $T$: $\{x_1, x_2, x_3, \ldots, x_N\}\big|_{t_1 + T}$

(a) Deterministic time series:
$\{x\}\big|_{t_1} = \{x\}\big|_{t_1 + T}$: 100% repeatable

(b) Stochastic time series:
$\{x\}\big|_{t_1} \neq \{x\}\big|_{t_1 + T}$: repeatability issue
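A minimal numerical sketch of the repeatability view, assuming an illustrative synthetic signal (the sinusoid and noise level below are not from the notes): a deterministic record repeats exactly after the experiment is reset, while a stochastic record does not.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 1.0, 0.01)                  # one measurement window of N samples

# Deterministic process: the record is identical after the experiment is reset
run1_det = np.sin(2 * np.pi * 5 * t)
run2_det = np.sin(2 * np.pi * 5 * t)

# Stochastic process: same setup, but the noise realization differs after the reset
run1_sto = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(t.size)
run2_sto = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(t.size)

print(np.allclose(run1_det, run2_det))   # True  -> 100% repeatable
print(np.allclose(run1_sto, run2_sto))   # False -> repeatability issue
```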
§ 2.1 Deterministic and Stochastic View(5)
(3) Sample record
A single finite collection of data representing a random behavior

[Figure: two sample records, marked "x" and "o", with values at t_1 through t_6]

Sample record of data "x": $\{x(t_1), x(t_2), x(t_3), \ldots, x(t_6)\}\big|_x$
Sample record of data "o": $\{x(t_1), x(t_2), x(t_3), \ldots, x(t_6)\}\big|_o$
...
§ 2.1 Deterministic and Stochastic View(6)
3. Prediction of Time Series
(1) Regression and prediction
(a) Prediction from two given data points
$\hat{x}(t_{j+1}) = f\big(x(t_j), x(t_{j-1})\big)$

Prediction error
$e(t_{j+1}) = x(t_{j+1}) - \hat{x}(t_{j+1})$

Note: $e(t_{j+1})$ depends on $x(t_{j+1})$, $x(t_j)$, and $x(t_{j-1})$.
§ 2.1 Deterministic and Stochastic View(7)
Ex1:

[Figure: constant data x(t) at t_{j-1}, t_j, t_{j+1}, with the predicted value $\hat{x}(t_{j+1})$ marked]

Regression is a constant.

Note: If a dynamic process is constant, then the prediction is 100% correct.
§ 2.1 Deterministic and Stochastic View(8)
Ex2:

[Figure: linearly increasing data x(t) at t_{j-1}, t_j, t_{j+1}, with the predicted value $\hat{x}(t_{j+1})$ marked]

Regression is linear.

Note: If a dynamic process is linear, then the regression line passes through $\hat{x}(t_{j+1})$.
§ 2.1 Deterministic and Stochastic View(9)
Ex3:

[Figure: a nonlinear dynamic process x(t) at t_{j-1}, t_j, t_{j+1}, with a linear regression line and the predicted value $\hat{x}(t_{j+1})$ marked]

Note: Regression is linear; however, the dynamic process is nonlinear.
§ 2.1 Deterministic and Stochastic View(10)
(b) Prediction of stochastic data
Ex: Ideal (math) stochastic data
No. of data points →∞
sampling time interval →0
prediction error (possible) →∞

Note: The white noise process serves as a model of an ideal random signal.

Predictability spectrum:
Deterministic process (100% predictable, e.g., a constant) -- Chaotic process -- Stochastic process (100% unpredictable, e.g., a white process/noise)
§ 2.1 Deterministic and Stochastic View(11)
(2) Formulation of data prediction
Ex: Given a data record of the time series $x_1, x_2, x_3, \ldots, x_8$
Find: $\hat{x}_9 = f(x_1, x_2, x_3, \ldots, x_8) = ?$

Ideal case
$\hat{x}_9 = x_9$: predictable
$\hat{x}_9 \neq x_9$: unpredictable

Practical case
Predictable if $e_9 = \|x_9 - \hat{x}_9\| \le \varepsilon$
($\|\cdot\|$ needs to be defined, e.g., as an absolute value)
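A minimal sketch of the practical criterion, assuming a simple persistence predictor ($\hat{x}_9 = x_8$) and an absolute-value norm; both choices are illustrative, not prescribed by the notes:

```python
import numpy as np

x = np.array([1.0, 1.2, 1.1, 1.3, 1.2, 1.4, 1.3, 1.5, 1.4])  # x_1 ... x_9

x_hat_9 = x[7]                # persistence predictor: x_hat_9 = x_8
e_9 = abs(x[8] - x_hat_9)     # absolute-value error norm
eps = 0.2                     # acceptable error, chosen by the analyst

print("predictable" if e_9 <= eps else "unpredictable")   # here: predictable
```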
§ 2.1 Deterministic and Stochastic View(12)
(3) Predictability based on geometrical view point

X(tk+1)

Data set [x1,x2,...,xk] form


a Vector Space

100% predictable

100% unpredictable
§ 2.1 Deterministic and Stochastic View(13)

(4) Prediction accuracy depends on:
Inherent dynamic process
Number of data points (data length)
Time interval between samples (sampling interval)
Acceptable error
§ 2.2 Data Classification and Analysis(1)
1. Data Space

Signal Space --(Discretization)--> Data Space --(Estimation)--> Information Space

Signal space: continuous dynamic process; reconstruction; analog
Data space: discrete dynamic process; prediction; discrete
Information space: amplitude, time, spatial, and frequency information; digital

Hardware chain: Signal source (ideal) -> Sensor / Transducer -> Sampling, data hold / Digitizer -> Discrete information display / Algorithm (S/N)
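A small sketch of the chain from signal space to information space, with an illustrative continuous signal, sampling interval, and amplitude estimate (all chosen here for illustration, not taken from the notes):

```python
import numpy as np

# Signal space: a continuous-time dynamic process (represented here by a function)
def signal(t):
    return np.sin(2 * np.pi * 1.0 * t)

# Discretization (sampling) -> data space: a discrete dynamic process x(t_j)
dt = 0.1                                  # sampling interval
t_j = np.arange(0.0, 2.0, dt)
data = signal(t_j)

# Estimation (algorithm) -> information space: e.g., amplitude information
amplitude = 0.5 * (data.max() - data.min())
print(round(amplitude, 2))                # close to the true amplitude of 1.0
```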


§ 2.2 Data Classification and Analysis(2)
2. Types of Data

Dynamic data
- Deterministic
  - Periodic: sinusoidal, complex
  - Transient
- Chaotic
  - Quasi-stationary
- Stochastic
  - Stationary: ergodic, nonergodic
  - Nonstationary: modulated nonstationary, nonstationary stochastic
§ 2.2 Data Classification and Analysis(3)
3. Data Classification Problem
The classification problem is a hypothesis-testing problem.

The data class is acceptable in a qualitative sense from the statistics of the data set.

The data class can be tested through statistical methods to give a result with a specified confidence interval.
§ 2.2 Data Classification and Analysis(4)
4. Fundamental Statistical Measure
(1) Sample average
$\mu_x(t_j) = \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} x_i(t_j)$

(2) Sample mean square
$\psi_x^2(t_j) = \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} x_i^2(t_j)$

(3) Root mean square
$\psi_x(t_j) = \sqrt{\psi_x^2(t_j)}$
§ 2.2 Data Classification and Analysis(5)
(4) Autocorrelation function
(a) Definition
$R_x(t_j, t_j + \tau) = \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} x_i(t_j)\, x_i(t_j + \tau)$

Introduced by G. I. Taylor in turbulence analysis

$\tau$: time lag (shift)

[Figure: a sample record x(t) with the values x(t_j) and x(t_j + τ) marked]
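A minimal sketch of these four ensemble estimators, assuming the N sample records are stored row-wise in a NumPy array and using a finite N in place of the limit (the ensemble itself is synthetic and only illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 1000, 100                         # N sample records, M time points each
x = 2.0 + rng.standard_normal((N, M))    # x[i, j] = x_i(t_j), illustrative ensemble

j, lag = 10, 5                           # a fixed time index t_j and a lag tau (in samples)

mu_x   = x[:, j].mean()                      # (1) sample average      mu_x(t_j)
psi2_x = (x[:, j] ** 2).mean()               # (2) sample mean square  psi_x^2(t_j)
rms_x  = np.sqrt(psi2_x)                     # (3) root mean square    psi_x(t_j)
R_x    = np.mean(x[:, j] * x[:, j + lag])    # (4) autocorrelation     R_x(t_j, t_j + tau)

print(mu_x, psi2_x, rms_x, R_x)              # roughly 2.0, 5.0, 2.24, 4.0 here
```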
§ 2.2 Data Classification and Analysis(6)
(b) Physical meaning

[Figure: $R_x(t, t + \tau)$ plotted against the lag τ; self-correlation near τ = 0, no correlation for |τ| > τ_0]

$\tau_0$: critical time constant

[Figure: a sample record showing the values x(t_1) and x(t_1 + τ)]
§ 2.2 Data Classification and Analysis(7)
• If τ = 0
The autocorrelation function becomes the mean square value: $R_x(t_1, t_1 + 0)$ is the maximum value of $R_x(t_1, t_1 + \tau)$.

• If τ > τ_0
There is no correlation between $x(t_1)$ and $x(t_1 + \tau)$; $x(t_1 + \tau)$ cannot be predicted from the previous time series.

• If τ < τ_0
There is correlation between $x(t_1)$ and $x(t_1 + \tau)$; $x(t_1 + \tau)$ is predictable.
§ 2.2 Data Classification and Analysis(8)
Ex1: White process

[Figure: $R_x(t_1, t_1 + \tau)$ is an impulse at τ = 0]

There is no way to predict, even as $\tau \to 0$.
§ 2.2 Data Classification and Analysis(9)
Ex2: Constant

[Figure: $R_x(t_1, t_1 + \tau)$ is flat for all τ]

Always predictable, even as $\tau \to \infty$.
§ 2.2 Data Classification and Analysis(10)
Ex3: Deterministic dynamic response

[Figure: a response x(t) starting from the initial value x_0 and settling to a steady state]

Note: There is no correlation between the steady state and the initial condition.
§ 2.3 Stationary and Nonstationary Data(1)
1. Two Classes of Random Data

[Figure: two ensembles of three sample records x(t) versus time t, illustrating the two classes of random data]
§ 2.3 Stationary and Nonstationary Data(2)
2. Definition and Sense of Stationarity
(1) Mathematical definition
Statistical information obtained by (average) estimators is invariant with respect to time.

(2) Strong and weak sense
Strong sense (mathematical sense): stationarity is checked for all possible moments.
Weak sense (physical sense): stationarity is usually checked only for the first two moments.

Note: The mathematical sense can be realized only in analytical treatment.
§ 2.3 Stationary and Nonstationary Data(3)
(3) Engineering sense
Trade off the required number of moments against performance in system design and analysis.

Stationarity check (see the sketch below):
$\mu_x(t_1) \approx \mu_x(t_2) \approx \mu_x(t_3) \approx \cdots \approx \mu_x(t_j)$
Almost a constant (independent of time)

$\psi_x^2(t_1) \approx \psi_x^2(t_2) \approx \psi_x^2(t_3) \approx \cdots \approx \psi_x^2(t_j)$
Almost a constant (independent of time)

$R_x(\tau, t_1) \approx R_x(\tau, t_2) \approx R_x(\tau, t_3) \approx \cdots \approx R_x(\tau, t_j)$
Almost a constant (independent of time)

Note: "$\approx$" is meant in the statistical sense.
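A minimal numerical sketch of this engineering check, reusing the row-wise ensemble layout assumed above; it simply verifies that the ensemble mean and mean square stay roughly constant across the time points (the data and tolerance are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
x = 2.0 + 0.5 * rng.standard_normal((5000, 50))   # illustrative stationary ensemble

mu   = x.mean(axis=0)           # mu_x(t_j) at every time point
psi2 = (x ** 2).mean(axis=0)    # psi_x^2(t_j) at every time point

# "Almost a constant": the spread over time should be small compared to the level
print(np.ptp(mu), np.ptp(psi2))   # small spreads -> approximately stationary
```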


§ 2.3 Stationary and Nonstationary Data(4)
3. Analogy

Deterministic sense  <->  Stochastic sense
Transient            <->  Nonstationary
Steady state         <->  Stationary
§ 2.3 Stationary and Nonstationary Data(5)
4. Decomposition of Nonstationary Data
(1) Additive decomposition
xNon t   xDet t   x t  , x t  is a stationary data.
4 3 0.5

3 0.4
2
0.3
2

1 0.2
1
0.1
0

-1
= 0
+ 0

-0.1
-1

-2
-0.2
-2
-3 -0.3

-4 -3 -0.4
1 2 3 4 5 6 7 8 9 10 1 2 3 4 5 6 7 8 9 10 1 2 3 4 5 6 7 8 9 10

Nonstationary Sinusoidal Stationary

3.6 3 0.3

3.4
0.2
2.8
3.2

3 0.1
2.6
2.8
0
2.6

2.4 = 2.4

2.2
+ -0.1

2.2 -0.2

2
2
-0.3
1.8

1.6 1.8 -0.4


1 2 3 4 5 6 7 8 9 10 1 2 3 4 5 6 7 8 9 10 1 2 3 4 5 6 7 8 9 10

Nonstationary Polynomial Stationary


§ 2.3 Stationary and Nonstationary Data(6)
(2) Multiplicative decomposition
xNon t   xDet t  x t  , x t  is a stationary data.

3 6 0.5

0.4
2 5
0.3

1 4 0.2

0.1
0

-
= 3

-
0

2 0.1
1
-
- 0.2
1
2 -
0.3
- -
0- 1 1
3- 0 5 10 15 0 5 0.4 1 2 3 4 5 6 7 8 9 10
5 5 0 5

Nonstationary Envelope Function Stationary

Note: Wavelet analysis can be used for analyzing


nonstationary data.
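A minimal sketch that synthesizes both kinds of nonstationary data from a stationary part; the sinusoid, linear trend, and decaying envelope used for $x_{Det}(t)$ are illustrative choices, not taken from the notes:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 1000)
x_stat = 0.3 * rng.standard_normal(t.size)            # stationary part x(t)

# Additive decomposition: x_Non(t) = x_Det(t) + x(t)
x_add_sin  = np.sin(2 * np.pi * 0.5 * t) + x_stat     # sinusoidal deterministic part
x_add_poly = (2.0 + 0.15 * t)            + x_stat     # polynomial (trend) part

# Multiplicative decomposition: x_Non(t) = x_Det(t) * x(t)
envelope = 5.0 * np.exp(-0.2 * t)                     # deterministic envelope function
x_mult   = envelope * x_stat
```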
§ 2.3 Stationary and Nonstationary Data(7)
5. Dynamic System and Stationary Process
(1) Dynamic process in system response

Input -> [Dynamic System] -> Output
(block diagram model)

Output = f(Input, System, I.C.)

f is a generalized function.
§ 2.3 Stationary and Nonstationary Data(8)
(a) Linear system

Output = f_1(Input, System) + f_2(I.C., System)

(b) Linear time-invariant system

Output = Input * System + I.C. * System

"*" is the convolution operator

Stochastic input -> [L.T.I. system (transfer function)] -> Stochastic output
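A minimal sketch of the L.T.I. relation for a stochastic input with zero initial conditions, using an illustrative first-order impulse response (the time constant and the input are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(5)
dt = 0.01
t = np.arange(0.0, 5.0, dt)

u = rng.standard_normal(t.size)        # stochastic (white-noise-like) input
h = np.exp(-2.0 * t) * dt              # illustrative first-order impulse response

y = np.convolve(u, h)[: t.size]        # Output = Input * System (zero I.C. assumed)
```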
§ 2.3 Stationary and Nonstationary Data(9)

(2) Strong conditions for a stationary process:

L.T.I. System

Response independent of I.C. (Sufficient settling time)

Input is stationary
§ 2.4 Ergodic and Non-ergodic Data(1)
1. Issues with Ensemble Average

[Figure: an ensemble of sample records (Sample 1, Sample 2, Sample 3) with ensemble averages taken at times t_1 and t_2]
§ 2.4 Ergodic and Non-ergodic Data(2)
Realization issue

Setup 1: Input -> [System] -> Sample 1
Setup 2: Input -> [System] -> Sample 2
...

Synchronized operation, identical macro system, same operational environment.
§ 2.4 Ergodic and Non-ergodic Data(3)
2. Definition of Ergodic Process
The time average is equal to the ensemble average.

(a) Nonstationary

[Figure: left, the time average over one record, {x(t_1), x(t_2), ...}; right, the ensemble average over all records at a fixed time, {x(t_1)|x, x(t_1)|o, x(t_1)|△, ...}]
§ 2.4 Ergodic and Non-ergodic Data(4)
(b) Stationary

[Figure: the same time average and ensemble average constructed for stationary records]

Weak-sense ergodicity is tested up to the 2nd moment.
§ 2.4 Ergodic and Non-ergodic Data(5)
3. Conceptual Realization

[Figure: one long stationary record divided into Segments #1 to #4 over the intervals (t_0, t_N), (t_N, t_2N), (t_2N, t_3N), and (t_3N, t_4N)]
§ 2.4 Ergodic and Non-ergodic Data(6)

[Figure: the four segments, #1 over (t_1, t_N), #2 over (t_N+1, t_2N), #3 over (t_2N+1, t_3N), and #4 over (t_3N+1, t_4N), stacked to form a conceptual ensemble of sample records]
§ 2.4 Ergodic and Non-ergodic Data(7)
4. Phase Plane Representation of Ergodic Process

[Figure: phase plane representation of an ergodic process x(t, ω) versus t]
§ 2.4 Ergodic and Non-ergodic Data(8)
5. Engineering Use of Ergodic Properties
(a) Qualification
Check stationarity
Ergodic hypothesis
Design and verification
Acceptance in performance

(b) Time average to replace the ensemble average (see the sketch below)
$\mu_x(k) = \lim_{M \to \infty} \frac{1}{M} \sum_{j=1}^{M} x_k(t_j)$

$R_x(\tau, k) = \lim_{M \to \infty} \frac{1}{M} \sum_{j=1}^{M} x_k(t_j)\, x_k(t_j + \tau)$

Note: The time index uses $j = 1, 2, \ldots, M$; the sample index uses $i = 1, 2, \ldots, N$.
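A minimal sketch of these time averages computed from a single sample record $x_k$, with a finite record length M standing in for the limit (the record itself is synthetic and illustrative):

```python
import numpy as np

def time_mean(xk):
    """Time-average estimate of the mean from one sample record x_k."""
    return xk.mean()

def time_autocorr(xk, lag):
    """Time-average estimate of R_x(tau) from one record, at an integer lag."""
    M = xk.size - lag
    return np.mean(xk[:M] * xk[lag:lag + M])

rng = np.random.default_rng(6)
xk = 1.0 + rng.standard_normal(10_000)      # illustrative stationary record

print(time_mean(xk))                        # ~ 1.0 (equals the ensemble mean if ergodic)
print(time_autocorr(xk, lag=0))             # ~ 2.0 (mean square: mean^2 + variance)
print(time_autocorr(xk, lag=5))             # ~ 1.0 (mean^2; the fluctuation is white)
```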
