
Slide 14-1

Chapter 14
Time Series: Understanding Changes over Time


Slide 14-2

Time Series Analysis


Goals
- Understand the past
- Forecast the future

Different from Cross-Sectional Data
- Time-series data are not independent of each other
- They are not a random sample, so they do not satisfy the random-sample assumption behind confidence intervals (Chapter 9) and hypothesis testing (Chapter 10)
- New methods are needed to take account of the interdependence


Slide 14-3

Cross-Sectional and Time-Series


Cross-Sectional Data
- Expect the next observation to be about S away from the average X̄

Time-Series Data
- The next observation will probably not be about S away from X̄, because the data are not a random sample

[Figures: cross-sectional data and time-series data compared]

Slide 14-4

Forecasting
Use a model
- A system of equations that can produce data that look like your time-series data
- Estimate the model from your data
- Your forecast will be the expected (mean) value of the future behavior of the model
- The forecast limits are the confidence limits for your forecast (if your model can produce them)
  - Computed from the appropriate standard error
  - If the model is correct, the future observation has a 95% chance of being within the forecast limits (see the sketch below)
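As a rough illustration, forecast limits of this kind combine the point forecast with its standard error. A minimal sketch in Python, using hypothetical placeholder numbers rather than values from the text:

```python
# Rough sketch: approximate 95% forecast limits from a point forecast and its
# standard error. Both numbers are hypothetical placeholders.
forecast = 30.0       # expected (mean) future value from the model
std_error = 2.5       # standard error of the forecast, from the fitted model
z = 1.96              # two-sided 95% critical value of the normal distribution

lower, upper = forecast - z * std_error, forecast + z * std_error
print(f"95% forecast limits: {lower:.2f} to {upper:.2f}")
```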


Slide 14-5

Trend-Seasonal and Box-Jenkins


Trend-Seasonal Analysis
- Direct and intuitive, with four components:
  (1) long-term Trend, (2) repeating Seasonal, (3) medium-term wandering Cyclic, and (4) random Irregular
- Forecast comes from extending the Trend and Seasonal components

Box-Jenkins ARIMA Process
- Flexible, but complex, probability models for how the current value of the series depends upon past values, past randomness, and new randomness
- A better way to describe the Cyclic component
- Forecast is the expectation of random future behavior, given past data

Slide 14-6 Fig 14.1.4, 6

Example: Radio, TV, Computer Stores


Sales (billions)
- Steady growth, though not perfectly smooth
- Nonlinear (curved), which suggests a constant growth rate

Logarithm of revenues
- The log plot looks linear if the growth rate is constant
- Can use regression to model the relationship (see the sketch below)
- Points are not randomly distributed about the line, so serial correlation is present

[Figures: sales (billions) and logarithm of sales, 1980-2000]
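Because the log plot is roughly linear under constant growth, regressing the logarithm of sales on year estimates that growth rate. A minimal sketch with made-up numbers (not the actual revenue figures):

```python
# Minimal sketch: estimating a constant growth rate by regressing log(sales)
# on year. The series below is made up for illustration.
import numpy as np

years = np.arange(1980, 2001)                  # hypothetical time index
sales = 5.0 * 1.12 ** (years - 1980)           # made-up series growing about 12% per year

slope, intercept = np.polyfit(years, np.log(sales), 1)   # fit log(sales) = a + b*year
growth_rate = np.exp(slope) - 1                # back-transform the slope to a growth rate
print(f"Estimated annual growth rate: {growth_rate:.1%}")
```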

Slide 14-7 Fig 14.1.7, 8

Example: Retail Sales


U.S. Retail Sales (Monthly)
- Growth
- Repeating seasonal variation: high in December, low in January and February

Seasonally Adjusted Sales
- Growth, with the seasonal pattern removed
- Shows how sales went up (or down) relative to what you expect for the time of year

[Figures: U.S. monthly retail sales (billions) and seasonally adjusted sales, 1997-2001]

Slide 14-8 Fig 14.1.9

Example: Interest Rates


U.S. Treasury Bills, Yearly
- Generally rising, with substantial variation
- Cyclic pattern: rising and falling with increasing magnitude, not perfectly repeating
- Not expected to continue rising indefinitely!

[Figure: U.S. Treasury bill interest rate, yearly, 1960-2000]


Slide 14-9

Trend-Seasonal Analysis
Decompose a Time Series into Four Components
- Data = Trend × Seasonal × Cyclic × Irregular

Trend
- Long-term behavior (often a straight line or exponential growth)

Seasonal
- Repeating effects of the time of year

Cyclic
- Gradual ups and downs that do not repeat each year and are not purely random

Irregular
- Short-term, random, nonsystematic noise

Slide 14-10

Ratio-to-Moving-Average Method

Moving Average
- Eliminates Seasonal and Irregular by averaging a full year of data
- Represents Trend and Cyclic

Divide Data by Moving Average
- The ratio represents Seasonal and Irregular
- Group the ratios by season, then average, to obtain the Seasonal index

Seasonal Adjustment: Divide Data by Seasonal Index

Regress the Seasonally Adjusted Series on Time
- The regression line represents the Trend

Forecast by Seasonalizing the Trend
- Multiply (future predicted Trend) by (Seasonal index), as in the sketch below
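A minimal sketch of these steps in Python with pandas, using made-up quarterly numbers rather than the Ford data from the example that follows:

```python
# Minimal sketch of the ratio-to-moving-average method for quarterly data.
# The sales figures below are made up for illustration.
import numpy as np
import pandas as pd

sales = pd.Series([22, 27, 24, 23, 24, 29, 26, 25, 26, 31, 28, 27],
                  index=pd.period_range("1994Q1", periods=12, freq="Q"))

# Centered moving average over one year (2 quarters before to 2 quarters after),
# which smooths away Seasonal and Irregular, leaving Trend and Cyclic
ma4 = sales.rolling(4).mean()
moving_avg = ma4.rolling(2).mean().shift(-2)

# Ratio to moving average (Seasonal and Irregular), averaged by quarter
ratio = sales / moving_avg
seasonal_index = ratio.groupby(ratio.index.quarter).mean()

# Seasonal adjustment: divide each observation by its quarter's Seasonal index
adjusted = sales / seasonal_index.loc[sales.index.quarter].values

# Trend: regress the seasonally adjusted series on time
t = np.arange(len(sales))
slope, intercept = np.polyfit(t, adjusted, 1)

# Forecast the next quarter by seasonalizing the extended trend
next_period = sales.index[-1] + 1
trend_forecast = intercept + slope * len(sales)
forecast = trend_forecast * seasonal_index.loc[next_period.quarter]
print(f"Forecast for {next_period}: {forecast:.1f}")
```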

Slide 14-11 Fig 14.2.1

Example: Ford Motor Company


Time-Series Plot
- Quarterly data with a strong Seasonal pattern
- Sales are typically highest in the second quarter
- The pattern does not repeat perfectly (due to Cyclic and Irregular)

[Figure: Ford sales (billions), quarterly, 1994-2001]


Slide 14-12 Fig 14.2.4

Example: Moving Average

Averages one year of data
- From 2 quarters before to 2 quarters after each data value
- Smooths the data, eliminating Seasonal and Irregular
- Shows you Trend and Cyclic

[Figure: moving average overlaid on the original Ford sales data (billions), 1994-2001]

Slide 14-13 Fig 14.2.6

Example: Seasonal Index


Average the Ratio-to-Moving-Average by Quarter
- Gives a Seasonal index for each quarter, repeating each year
- Shows how much larger (or smaller) each quarter is compared to a typical period of the year

[Figure: Seasonal index by quarter, 1994-2001]

Slide 14-14 Fig 14.2.7

Example: Seasonal Adjustment


Divide Data by Seasonal Index
- Gives the seasonally adjusted value
- Eliminates the expected seasonal component
- Shows changes that are not due to expected seasonal effects

[Figure: original and seasonally adjusted Ford sales (billions), 1994-2001]


Slide 14-15 Fig 14.2.8

Example: Trend Line

Regress the Seasonally Adjusted Data on Time
- The resulting trend line can be extended into the future
- This gives a seasonally adjusted forecast

[Figure: trend line fitted to the seasonally adjusted series and extended as a seasonally adjusted forecast]

Slide 14-16 Fig 14.2.9

Example: Forecast

Seasonalize the Trend
- Multiply the Trend by the Seasonal index
- Can be extended into the future: use the future predicted Trend with the quarterly Seasonal index

[Figure: seasonalized trend overlaid on the original data and extended as the forecast]

Slide 14-17

Box-Jenkins ARIMA Processes


A Collection of Linear Statistical Models
- Can describe many different kinds of time series, including medium-term cyclic behavior

Compared to trend-seasonal analysis, Box-Jenkins
- Has a more solid statistical foundation
- Is more flexible
- Is somewhat less intuitive

Outline of the steps involved
- Choose a type of model and estimate it using your data
- Forecast using the average future random behavior of this model
- Find the standard error (the variability in this future behavior)
- Find the forecast limits, to include 95% of future behavior

Slide 14-18

Random Noise Process


A Random Sample, with No Memory
- Data = (mean value) + (random noise)
- Y_t = μ + ε_t
- The long-term mean of Y is μ

[Figure: random noise process fluctuating about its mean]


Slide 14-19

Autoregressive (AR) Process


Remembers the Past, Adds Random Noise
- Data = δ + φ·(previous value) + (random noise)
- Y_t = δ + φ·Y_{t−1} + ε_t
- The long-term mean value of Y is δ/(1 − φ)

[Figure: autoregressive process fluctuating about its mean]


Slide 14-20

Moving-Average (MA) Process


Remembers the Previous Noise, Adds New Noise
- Data = μ + (random noise) − θ·(previous noise)
- Y_t = μ + ε_t − θ·ε_{t−1}
- The long-term mean value of Y is μ

[Figure: moving-average process fluctuating about its mean]


Slide 14-21

ARMA Process
Autoregressive Moving Average Process

Remembers the Past and the Previous Noise, Adds New Noise
- Data = δ + φ·(previous value) + (noise) − θ·(previous noise)
- Y_t = δ + φ·Y_{t−1} + ε_t − θ·ε_{t−1}
- The long-term mean value of Y is δ/(1 − φ)

[Figure: ARMA process fluctuating about its mean]
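A minimal simulation sketch of an ARMA process, with made-up parameter values; setting θ = 0 reduces it to the AR process above, and φ = 0 to the MA process:

```python
# Minimal sketch: simulating an ARMA(1,1) process
#   Y_t = delta + phi*Y_{t-1} + eps_t - theta*eps_{t-1}
# Parameter values are made up for illustration.
import numpy as np

rng = np.random.default_rng(3)
delta, phi, theta, n = 2.0, 0.6, 0.3, 200      # hypothetical parameters

eps = rng.normal(size=n)                        # new random noise each period
y = np.empty(n)
y[0] = delta / (1 - phi)                        # start at the long-term mean
for t in range(1, n):
    y[t] = delta + phi * y[t - 1] + eps[t] - theta * eps[t - 1]

print("long-term mean:", delta / (1 - phi), " sample mean:", round(y.mean(), 2))
```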


Slide 14-22

Example: Unemployment
Estimated ARMA Process for this Time Series
- An ARMA process of the form Y_t = δ + φ·Y_{t−1} + ε_t − θ·ε_{t−1}, with estimated coefficients, where the random noise has standard deviation 0.907

[Figure: U.S. unemployment rate, yearly, 1960-2000]
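A sketch of how such a model could be estimated and used, assuming the statsmodels library and a simulated stand-in series (the actual unemployment data and estimated coefficients are not used here):

```python
# Minimal sketch: fitting an ARMA(1,1) model and producing a 10-step-ahead
# forecast with 95% forecast limits. The series is simulated as a stand-in,
# not the actual unemployment data.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
noise = rng.normal(scale=0.9, size=60)
y = np.empty(60)
y[0] = 6.0
for t in range(1, 60):                          # made-up series hovering near 6%
    y[t] = 6.0 + 0.8 * (y[t - 1] - 6.0) + noise[t]

result = ARIMA(y, order=(1, 0, 1)).fit()        # one AR term, no differencing, one MA term

future = result.get_forecast(steps=10)          # summary of random future behavior
print(future.predicted_mean)                    # point forecasts (expected future values)
print(future.conf_int(alpha=0.05))              # 95% forecast limits
```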

Slide 14-23

Example (continued)
Random Simulations from the Estimated Process
- Look similar to the actual unemployment rate history, because the process was estimated using the actual data
- Show what might have happened instead

[Figure: simulated unemployment rate series, 1960-2000]

Slide 14-24

Example (continued)
Forecast and 95% Forecast Limits (10 years ahead)
- The forecast uses the average of random future possibilities
- The forecast limits use their lower and upper 95% limits

[Figure: unemployment rate with forecast and 95% forecast limits, extended to 2010]


Slide 14-25

Example (continued)
Three Simulations of the Future
- Shown with the forecast and 95% forecast limits
- Illustrate how the forecast represents future possibilities

[Figure: three simulated future paths of the unemployment rate, with forecast and limits, through 2010]

Slide 14-26

Pure Integrated (I) Process


A Random Walk from the Previous Value
- Data = δ + (previous value) + (random noise)
- Y_t = δ + Y_{t−1} + ε_t
- Over time, Y is not expected to stay close to any long-term mean value
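A quick simulation sketch (made-up drift and noise values) showing how a random walk wanders rather than returning to a mean:

```python
# Minimal sketch: simulating a pure integrated (random walk) process
#   Y_t = delta + Y_{t-1} + eps_t
# with made-up values, to show that it drifts away from any fixed level.
import numpy as np

rng = np.random.default_rng(2)
delta, n = 0.1, 100                     # hypothetical drift and series length

y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    y[t] = delta + y[t - 1] + rng.normal()

print("first values:", np.round(y[:3], 2), " last values:", np.round(y[-3:], 2))
```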


Slide 14-27

ARIMA Process

Autoregressive Integrated Moving Average: Remembers its Changes
- The differences, Y_t − Y_{t−1}, follow an ARMA process
- Over time, Y is not expected to stay close to any long-term mean value
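A minimal sketch, assuming the statsmodels library: the middle term of `order` tells the ARIMA model to apply an ARMA model to the first differences.

```python
# Minimal sketch: an ARIMA(1,1,1) model fits an ARMA(1,1) process to the
# differences Y_t - Y_{t-1}. The series below is a made-up random walk.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200))        # made-up integrated (random-walk) series

result = ARIMA(y, order=(1, 1, 1)).fit()   # AR order 1, one difference, MA order 1
print(result.forecast(steps=5))            # 5-step-ahead forecasts of the series level
```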

