Autoregressive Model
Autoregression:
An autoregressive (AR) model predicts future behaviour based on past
behaviour. It is used for forecasting when there is some correlation between values in a
time series and the values that precede and succeed them. Only past data from the series
are used to model its behaviour, hence the name autoregressive (auto means "self"). The
process is essentially a linear regression of the data in the current series against one or
more past values in the same series.
In an AR model, the value of the outcome variable (Y) at some point t in time is, as in
"regular" linear regression, directly related to the predictor variable (X). Where
simple linear regression and AR models differ is that in an AR model, Y depends on X
and on previous values of Y.
AR models are also called conditional models, Markov models, or transition models.
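To make the linear-regression view concrete, here is a minimal sketch in Python that fits an AR(2) model by ordinary least squares. The simulated series, the order p = 2, and the coefficient values 0.6 and -0.2 are illustrative choices, not taken from the text:

import numpy as np

rng = np.random.default_rng(0)

# Simulate an illustrative AR(2) series: y_t = 0.6*y_{t-1} - 0.2*y_{t-2} + noise
n = 500
eps = rng.normal(0.0, 1.0, n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + eps[t]

# Regress y_t on its own lagged values: each row of X holds [y_{t-1}, y_{t-2}]
p = 2
X = np.column_stack([y[p - 1 - j : n - 1 - j] for j in range(p)])
target = y[p:]

phi, *_ = np.linalg.lstsq(X, target, rcond=None)
print(phi)  # the estimates should be close to [0.6, -0.2]

The same fit can be done with any standard linear-regression routine; nothing beyond ordinary least squares on lagged copies of the series is required.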
AR(p) Model:
An AR(p) model is an autoregressive model where specific lagged values of
y_t are used as predictor variables. Lags are where results from one time period affect
following periods.
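In symbols, the AR(p) model is commonly written as (a standard form; c is an optional constant term)

X_t = c + φ₁X_{t−1} + φ₂X_{t−2} + … + φ_pX_{t−p} + ɛ_t

where ɛ_t is white noise and φ₁, …, φ_p are the model parameters.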
The simplest AR process is AR(0), which has no dependence between the terms. Only
the error/innovation/noise term contributes to the output of the process, so AR(0)
corresponds to white noise.
The value for "p" is called the order. For example, an AR(1) would be a "first-order
autoregressive process." The outcome variable in a first-order AR process at some
point in time t is related only to the time period one step earlier (i.e. the value of
the variable at t − 1). A second- or third-order AR process would be related to data two
or three periods apart.
For an AR(1) process with a positive φ, only the previous term in the process and the
noise term contribute to the output. If φ is close to 0, then the process still looks like
white noise, but as φ approaches 1, the output gets a larger contribution from the
previous term relative to the noise. This results in a "smoothing" or integration of the
output, similar to a low-pass filter.
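A quick simulation makes the smoothing effect visible. The values φ = 0.1 and φ = 0.95 below are arbitrary illustrative choices; the lag-1 autocorrelation of the output rises towards φ, so the nearly-white case sits near 0 while the smoothed case sits near 1:

import numpy as np

def simulate_ar1(phi, n=5000, seed=1):
    # x_t = phi * x_{t-1} + eps_t, with standard normal noise
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, 1.0, n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x

for phi in (0.1, 0.95):
    x = simulate_ar1(phi)
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]  # sample lag-1 autocorrelation
    print(f"phi = {phi}: lag-1 autocorrelation ~ {r1:.2f}")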
For an AR(2) process, the previous two terms and the noise term contribute to the
output. If both φ₁ and φ₂ are positive, the output will resemble a low-pass filter, with
the high-frequency part of the noise decreased. If φ₁ is positive while φ₂ is negative,
then the process favours changes in sign between terms of the process: the output
oscillates. This can be likened to edge detection or detection of change in direction.
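The oscillating case can also be checked by simulation. The coefficients φ₁ = 0.9 and φ₂ = −0.8 below are illustrative; they give complex characteristic roots, so the process is quasi-periodic, and the sample autocorrelation swings between positive and negative values over the first few lags:

import numpy as np

rng = np.random.default_rng(2)

# AR(2) with phi1 = 0.9 and phi2 = -0.8 (illustrative values)
n = 20000
eps = rng.normal(0.0, 1.0, n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.9 * x[t - 1] - 0.8 * x[t - 2] + eps[t]

# The autocorrelation oscillates in sign, reflecting the cycles in the output
for k in range(1, 7):
    r = np.corrcoef(x[:-k], x[k:])[0, 1]
    print(f"lag {k}: autocorrelation ~ {r:+.2f}")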
An AR(1) model is given by

X_t = φ₁X_{t−1} + ɛ_t

where ɛ_t is a white noise process with zero mean and constant variance σ²_ɛ. The process
is wide-sense stationary if |φ₁| < 1, since it is obtained as the output of a stable filter
whose input is white noise. The mean E(X_t) is identical for all values of t by the very
definition of wide-sense stationarity. If the mean is denoted by µ, taking expectations
of both sides of the model equation gives µ = φ₁µ, so µ = 0; more generally, for a model
with a constant term c, the stationary mean is µ = c/(1 − φ₁).
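A quick numerical check of the stationary mean (the constant c = 1.0 and the coefficient φ₁ = 0.5 below are illustrative choices): the sample mean of a long simulated path should be close to c/(1 − φ₁) = 2.0:

import numpy as np

rng = np.random.default_rng(3)

c, phi = 1.0, 0.5  # illustrative; stationary mean should be c / (1 - phi) = 2.0
n = 100_000
eps = rng.normal(0.0, 1.0, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = c + phi * x[t - 1] + eps[t]

print(x.mean())  # ~ 2.0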
Note that an autoregressive model only applies to forecasting phenomena that are
related to their own earlier values, i.e. phenomena influenced by their own historical
behaviour.
For example, an AR model fitted to financial stock prices in the run-up to the 2008
financial crisis would simply have extrapolated the recent trend in those prices.
However, once it became public knowledge that many financial institutions were at
risk of imminent collapse, investors suddenly became less concerned with those
stocks' recent prices and far more concerned with their underlying risk exposure.
The market therefore rapidly revalued financial stocks to a much lower level, a move
that would have utterly confounded an autoregressive model.