Unit 3 B Time Series Analysis


UNIT - 3 PREDICTIVE ANALYTICS – SUPERVISED
Forecasting
• Simple Linear Regression - Multiple Linear Regression - Logistic Regression
• Time series analysis: Variations in time series - trend analysis, cyclical variations, seasonal variations and irregular variations, forecasting errors.
• Using Regression Analysis for Forecasting - Linear Trend Projection - Holt's Model - Winter's Model - Using Regression Analysis as a Causal Forecasting Method - Combining Causal Variables with Trend and Seasonality
• Chapter 13, Dinesh Kumar

Forecasting is the process of predicting the future by analyzing past and present data.
Examples of Forecasting
• Predicting demand – Inventory Level
• Manpower Planning
• Economic Growth
• Stock Market Prediction
Time Series Analysis
• Time series analysis is the study of the characteristics of the response variable over time, with time as the independent variable.
• Time can be measured in years, months, weeks, days, hours, minutes, or seconds.
Components of Time Series Analysis
• Level of the series – the average value for a specific time period
Components of Time Series Analysis
• Trend Component (Tt): Trend is the consistent long-term upward or
downward movement of the data over a period of time.
Components of Time Series Analysis
• Seasonal Component (St): Seasonal component (measured using
seasonality index) is the repetitive upward or downward movement
(or fluctuations) from the trend that occurs within a calendar year
such as seasons, quarters, months, days of the week, etc. For
example, heating demands are high in the winter months, sales are
high in December.
Components of Time Series Analysis
• Cyclical Component (Ct): Cyclical component is fluctuation around the
trend line that happens due to macro-economic changes such as
recession, unemployment, etc. Cyclical fluctuations have repetitive
patterns with a time between repetitions of more than a year.
The cyclical component consists of the gradual ups and downs that do not repeat
each year and so are excluded from the seasonal component.
Components of Time Series Analysis
• Irregular Component (It): Irregular component is the white noise or
random uncorrelated changes that follow a normal distribution with
mean value of 0 and constant variance.
• A white noise time series is defined by a zero mean over time, constant variance over time, and zero correlation over time.
• To check for white noise, ask: Is the mean level zero? Is the variance constant? Is there no correlation with lagged values?
• White noise reduces our prediction accuracy!
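As a rough illustration of those three checks, the minimal sketch below tests a simulated residual series for zero mean, roughly constant variance, and near-zero lag-1 autocorrelation (the tolerance values are arbitrary assumptions, not standard thresholds):

```python
import numpy as np

rng = np.random.default_rng(0)
series = rng.normal(loc=0.0, scale=1.0, size=200)  # stand-in residual series

# Check 1: mean close to zero
print("Mean close to zero?", np.isclose(series.mean(), 0.0, atol=0.2))
# Check 2: variance roughly constant (compare first and second halves)
print("Variance roughly constant?",
      np.isclose(series[:100].var(), series[100:].var(), atol=0.5))
# Check 3: lag-1 autocorrelation near zero
lag1 = np.corrcoef(series[:-1], series[1:])[0, 1]
print("Lag-1 autocorrelation near zero?", abs(lag1) < 0.15)
```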
Components of Time Series Analysis
• The time-series data can be modelled as an addition of all the components.
• The additive time-series model is given by Yt = Tt + St + Ct + It.
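The additive decomposition can be illustrated with statsmodels' seasonal_decompose, which splits a series into trend, seasonal, and residual parts (the cyclical part is absorbed into the trend and residual). This is a minimal sketch on a synthetic monthly series; it assumes pandas and statsmodels are installed:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly data: trend + yearly seasonality + irregular noise
idx = pd.date_range("2018-01-01", periods=48, freq="MS")
y = pd.Series(
    50 + 0.8 * np.arange(48)                       # trend component Tt
    + 10 * np.sin(2 * np.pi * np.arange(48) / 12)  # seasonal component St
    + np.random.default_rng(1).normal(0, 2, 48),   # irregular component It
    index=idx,
)

result = seasonal_decompose(y, model="additive", period=12)
print(result.trend.dropna().head())   # estimated Tt
print(result.seasonal.head(12))       # estimated St (one full cycle)
print(result.resid.dropna().head())   # estimated It
```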
Choosing a technique by trend and seasonality:
• No trend, no seasonality: Moving Average, Single Exponential Smoothing
• No trend, with seasonality: Seasonal Additive and Multiplicative models (not covered)
• With trend, no seasonality: Double Exponential Smoothing (Holt's)
• With trend, with seasonality: Triple Exponential Smoothing (Holt-Winters)
FORECASTING TECHNIQUES
• Simple Moving Average
• Exponential smoothing
• Double Exponential Smoothing – Holt's Method
• Triple Exponential Smoothing – Holt-Winters Method
• Other Methods….
What Is Forecast Error / Bias?
Forecast Bias can be described as a tendency to either over-forecast (forecast is more than the
actual), or under-forecast (forecast is less than the actual), leading to a forecasting error.
FORECAST ERRORS
• Forecast error is the difference between the actual value and the forecast for a given period. Forecast error is a measure of forecast accuracy. There are many different ways to summarize forecast errors in order to provide meaningful information to the manager.
• Forecast errors can be separated into standard and relative error measures.
• Standard error measures typically express error in the same units as the data. Relative error measures are based on percentages and make it easier for managers to understand the quality of the forecast.
Forecasting accuracy measures (Errors)
• Mean absolute error: MAE = (1/n) Σ |Yt − Ft|
• Yt is the actual value of Y at time t and Ft is the corresponding forecasted value.
• Mean absolute percentage error: MAPE = (100/n) Σ |Yt − Ft| / Yt
• Mean squared error: MSE = (1/n) Σ (Yt − Ft)²
• Root mean square error: RMSE = √MSE
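A minimal sketch computing these four accuracy measures from actuals (Yt) and forecasts (Ft); the sample numbers are illustrative only:

```python
import numpy as np

actual = np.array([120.0, 135.0, 150.0, 160.0])    # Yt
forecast = np.array([115.0, 140.0, 145.0, 158.0])  # Ft

error = actual - forecast
mae = np.mean(np.abs(error))                  # mean absolute error
mape = np.mean(np.abs(error) / actual) * 100  # mean absolute percentage error
mse = np.mean(error ** 2)                     # mean squared error
rmse = np.sqrt(mse)                           # root mean square error

print(f"MAE={mae:.2f}  MAPE={mape:.2f}%  MSE={mse:.2f}  RMSE={rmse:.2f}")
```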
Forecasting Dilemma at Dstore (Case Study)
Bangalore-based John Paul established Dstore (P) Ltd. (DStore), a pen drive manufacturing
company, in 2011. John realized he was not able to manage the inventory properly;
Dstore was continuously facing the problem of overstock or shortage. To avoid this
problem, John wanted accurate forecasting and gave the assignment to the newly joined
trainee, Ram Charan. Ram wanted to do the forecasting as accurately as John expected.
Can Ram find a solution to the problem at Dstore? Will he be able to identify the most
appropriate forecasting method for Dstore?
Questions
1. What are the available forecasting methods? Which method would be most suitable for
Ram?
2. Based on your analysis prepare a report, based on the same criteria, to be presented to
John Paul.
3. Adding the factor of seasonality into your model, how does the analysis change?
MOVING AVERAGE METHOD
• Moving average is one of the simplest forecasting techniques; it forecasts the future value of a time series using the average (or weighted average) of the past 'N' observations.
• Because the moving average method assumes that future observations will be similar to the recent past, it is most useful as a short-range forecasting method.
[Chart: Actual vs. Forecast demand by month (January–December over three years), values ranging from 0 to 700]
Weighted Moving Average
• Weights are used to give more emphasis on the recent values. This
process makes forecasting techniques more responsive to changes
because more recent periods may be more heavily weighted.
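A minimal sketch of a 3-period weighted moving average; the weights (0.2, 0.3, 0.5 from oldest to most recent) are assumed for illustration:

```python
import numpy as np

demand = np.array([420, 450, 470, 510, 490, 530, 560, 540], dtype=float)
weights = np.array([0.2, 0.3, 0.5])  # oldest -> most recent, summing to 1

# Forecast for the next period = weighted average of the last 3 observations
print(np.dot(demand[-3:], weights))
```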
Exponential Smoothing Methods
• Simple or single exponential smoothing
• Double exponential smoothing
• Triple exponential smoothing

Exponential smoothing methods forecast using weighted averages of past observations, with the weights decaying exponentially as the observations get older. In other words, the more recent the observation, the higher the corresponding weight.
Simple Exponential Smoothing Models
• Simple or single exponential smoothing (SES) is the method
of time series forecasting used with univariate data with no
trend and no seasonal pattern.
• Alpha controls the rate at which the influence of past
observations decreases exponentially. The parameter is often
set to a value between 0 and 1.
• A value close to 1 means the most recent observations dominate the forecasts, whereas a value close to 0 means older observations retain a large influence on the forecasts.
Simple Exponential Smoothing Models
• SES is suitable for data with no trend or seasonal pattern.

• Now, the question is: if you want to forecast tomorrow's stock price, would you rely on yesterday's value, the price 10 days ago, or last year's price?
• Obviously, yesterday's price or last week's value gives a better idea of the forecast than values from a year ago. This implies that recency is an important factor in forecasting, and this is where exponential smoothing algorithms shine.
Simple Exponential Smoothing Models
• Damping factors (used in Excel) are used to smooth out the graph and take a value between 0 and 1. Technically, the damping factor is 1 minus the alpha level (1 – α).
• For example, if the damping factor is 0.9, the alpha value is 0.1, since the damping factor is 1 – α.
Simple Exponential Smoothing Models

• The simple exponential smoothing equation is Ft+1 = αYt + (1 – α)Ft, where the alpha parameter (α) is called the smoothing constant and its value lies between 0 and 1.
• A smaller value (closer to 0) creates a smoother (slowly changing) line, similar to a moving average with a large number of periods. A high value for alpha tracks the data more closely by giving more weight to recent data.
Alpha
• Considerations:
• Start with an alpha between 0.2 and 0.5 and see how it fits your
data. Set it higher to fit changes in the data more closely. Set it
lower to emphasize longer term trend.
• If you have lots of data points (for example, daily data for more than a year), consider a lower alpha (even as low as 0.01 or 0.02) to include more of your timeline in the calculation.
Double Exponential Smoothing – Holt's Method
• One of the drawbacks of single exponential smoothing is that the
model does not do well in the presence of trend.
• This can be improved by introducing an additional equation for
capturing the trend in the time-series data.
• Double exponential smoothing uses two equations to forecast the
future values of the time series, one for forecasting the level (short
term average value) and another for capturing the trend.
• Beta
• This numeric value, between 0 and 1, controls the trend
calculation. A smaller value considers the longer-term trend and a larger value focuses on the shorter-term trend.
• Considerations:
• Start with a beta between 0.2 and 0.5 and see how it fits your
data. Set it higher to reflect short-term trends.
• If you have lots of data points (for example, daily data for more than a year), consider a lower beta (even as low as 0.01 or 0.02).
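A minimal sketch of Holt's method using a level equation and a trend equation, L(t) = αY(t) + (1 – α)(L(t–1) + T(t–1)) and T(t) = β(L(t) – L(t–1)) + (1 – β)T(t–1), with forecast F(t+h) = L(t) + h·T(t); the data, alpha, and beta values are illustrative:

```python
def holt_forecast(y, alpha, beta, horizon=1):
    level, trend = y[0], y[1] - y[0]  # simple initialisation
    for t in range(1, len(y)):
        prev_level = level
        # Level equation: smooth the short-term average value
        level = alpha * y[t] + (1 - alpha) * (level + trend)
        # Trend equation: smooth the period-to-period change in level
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend

demand = [420, 450, 470, 510, 490, 530, 560, 540]
print(holt_forecast(demand, alpha=0.3, beta=0.2, horizon=1))
```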
Triple Exponential Smoothing – Holt-Winters Method
• The moving average, single exponential smoothing, and double exponential smoothing techniques discussed so far can handle data only as long as the data have no seasonal component. When there is seasonality in the time-series data, these techniques are no longer appropriate.
• Season Periods
• If the cycle is one year and the data is weekly, then the season
period is 52 because there are 52 weeks in the cycle. Another
example is if the “seasonality” is daily and the data is hourly, then
set the Season Periods to 24 because the data cycles every 24
hours.
• Gamma
• Like with Alpha and Beta, this parameter is a numeric value
between 0 and 1. A smaller gamma takes into account a longer
history whereas a bigger gamma gives more weight to the recent
history.
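A minimal sketch using the Holt-Winters implementation in statsmodels (assumed to be installed), with additive trend, multiplicative seasonality, and four season periods per cycle; the quarterly demand figures are illustrative:

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Three years of illustrative quarterly demand (seasonal period = 4)
demand = pd.Series([142, 58, 80, 120, 150, 62, 85, 128, 160, 66, 90, 134])

model = ExponentialSmoothing(
    demand, trend="add", seasonal="mul", seasonal_periods=4
).fit()
print(model.forecast(4))  # forecasts for the next four quarters
```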
Seasonality
Regression Model for Forecasting
• Linear Regression
• Multiple linear regression models with categorical variables can be
used for time series with seasonality.
• Since the demand is seasonal, the first step in forecasting is to
estimate the seasonality index.
• De-seasonalized data is calculated by dividing the value of Yt by the corresponding seasonality index.
• Regression output for the de-seasonalized demand
Seasonal Index
• The seasonal index (also called seasonal effect or seasonal component) is a
measure of how a particular season compares on average to the mean of
the cycle.
• Seasonal Index = Period average Demand / Average demand for all
periods
• We use seasonal indices for two purposes:
• They can be used to smooth data by a process called de-seasonalising.
• They can be used to help predict future values in time series data. Once an initial predicted value from a smoothed line is calculated, the seasonal index is used to adjust that value up or down depending on which season we are predicting for.
AC manufacturing company
• Total Demand for 6 years (24 Q) is (310+365+395+415+450+465) =
2400.
• Average Demand for all quarters = 2400/24 = 100.
Seasonal Index (average demand for each quarter divided by the overall average of 100):
1. For Q1 = 142/100 = 1.42
2. For Q2 = 58/100 = 0.58
3. For Q3 = 80/100 = 0.80
4. For Q4 = 120/100 = 1.20
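A minimal sketch reproducing this seasonal index calculation and the de-seasonalising step; the four quarterly average demands (142, 58, 80, 120) come from the example above, while the sample observation is hypothetical:

```python
# Average demand per quarter from the AC manufacturing example
quarter_avg = {"Q1": 142, "Q2": 58, "Q3": 80, "Q4": 120}
overall_avg = sum(quarter_avg.values()) / len(quarter_avg)  # = 100

# Seasonal index = period average demand / average demand for all periods
seasonal_index = {q: avg / overall_avg for q, avg in quarter_avg.items()}
print(seasonal_index)  # {'Q1': 1.42, 'Q2': 0.58, 'Q3': 0.8, 'Q4': 1.2}

# De-seasonalise an observation by dividing it by its quarter's index
y_q1 = 155  # hypothetical Q1 demand
print(y_q1 / seasonal_index["Q1"])
```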
Other Methods
• Auto-regressive (AR),
• Auto-regressive and moving average (ARMA)
• Auto-regressive integrated moving average (ARIMA)
Regression Forecasting with Causal Variables
• In many forecasting applications, other independent variables besides
time, such as economic indexes or demographic factors, may
influence the time series.
• For example, a manufacturer of hospital equipment might include
such variables as hospital capital spending and changes in the
proportion of people over the age of 65 in building models to forecast
future sales.
• Explanatory/causal models, often called econometric models,
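To illustrate how causal variables enter a regression forecast, the minimal sketch below fits sales against two explanatory variables by least squares; the variable names and all numbers are hypothetical, loosely echoing the hospital-equipment example above:

```python
import numpy as np

# Hypothetical causal variables and the series to be forecast
capital_spending = np.array([10.2, 11.0, 11.8, 12.5, 13.1])  # causal variable 1
pct_over_65 = np.array([12.5, 12.8, 13.1, 13.5, 13.9])       # causal variable 2
sales = np.array([200.0, 215.0, 228.0, 240.0, 252.0])        # dependent variable

# Design matrix with an intercept column, fitted by ordinary least squares
X = np.column_stack([np.ones(len(sales)), capital_spending, pct_over_65])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)

# Forecast sales for assumed future values of the causal variables
x_future = np.array([1.0, 13.8, 14.2])
print(x_future @ coef)
```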
