Time Series Analysis
Time series analysis is the mathematical or statistical analysis of past data arranged in a periodic sequence. Decision making and planning in an organization rely on forecasting, which is one of the main applications of time series analysis.
Moving Average
Periodic data, e.g. monthly sales, may show random fluctuations from month to month even though a general trend is evident. Moving averages help to smooth away these random changes.
A moving average forecast for a period is the average of the values of a fixed number of preceding periods.
Example:
The table below shows a company's monthly sales. Calculate the 3-monthly and 6-monthly moving averages for the data.
Months Sales
January 1200
February 1280
March 1310
April 1270
May 1190
June 1290
July 1410
August 1360
September 1430
October 1280
November 1410
December 1390
Solution
The 3-monthly moving averages are calculated as follows:
April's forecast = (1200 + 1280 + 1310) / 3 = 1263
May's forecast = (1280 + 1310 + 1270) / 3 = 1287
And so on.
The 6-monthly moving averages are calculated in the same way; the first is for July:
July's forecast = (1200 + 1280 + 1310 + 1270 + 1190 + 1290) / 6 = 1257
And so on.
Month        3-monthly moving average    6-monthly moving average
April        1263                        -
May          1287                        -
June         1257                        -
July         1250                        1257
August       1297                        1292
September    1353                        1305
October      1400                        1325
November     1357                        1327
December     1373                        1363
Note:
When plotting moving averages on a graph, each point is plotted at the midpoint of the period it averages; e.g. in our example the forecast for April (1263), which averages January to March, is plotted at mid-February.
Characteristics of moving average
1) The more periods included in the moving average, the greater the smoothing effect.
2) Moving averages of different lengths produce different forecasts.
3) The greater the randomness in the data (with the underlying trend constant), the more periods should be included in the moving average.
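The moving averages above can be reproduced with a short script. The following is a minimal sketch in Python; the function name and the rounding to whole units are my own choices, not part of the original example.

```python
# Minimal sketch: 3- and 6-monthly moving average forecasts for the sales data above.
sales = [1200, 1280, 1310, 1270, 1190, 1290,
         1410, 1360, 1430, 1280, 1410, 1390]
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def moving_average_forecasts(data, n):
    """The forecast for period i is the average of the previous n periods."""
    return {i: sum(data[i - n:i]) / n for i in range(n, len(data))}

ma3 = moving_average_forecasts(sales, 3)
ma6 = moving_average_forecasts(sales, 6)

for i in range(3, len(sales)):
    six = round(ma6[i]) if i in ma6 else "-"
    print(f"{months[i]}: 3-month MA = {round(ma3[i])}, 6-month MA = {six}")
```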
Exponential smoothing
This is a weighted moving average technique; it is given by:
New forecast = Old forecast + α (Latest observation – Old forecast)
where α = smoothing constant.
This method involves automatic weighting of past data, with weights that decrease exponentially with time.
Example
Using the previous example and a smoothing constant α = 0.3, generate monthly forecasts.
Month        Sales    Forecast (α = 0.3)
January 1200
February 1280 1200
March 1310 1224
April 1270 1250
May 1190 1256
June 1290 1233
July 1410 1250
August 1360 1283
September 1430 1327
October 1280 1358
November 1410 1335
December 1390 1357
Solution
Since there is no forecast before January, January's actual sales are taken as the forecast for February:
February forecast = 1200
For March:
March forecast = February forecast + 0.3 (February sales – February forecast)
= 1200 + 0.3 (1280 – 1200)
= 1224
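The chain of calculations can be sketched in Python as follows. This is a minimal illustration; because the worked example rounds each forecast before the next step, one or two of the printed figures may differ slightly from the table above.

```python
# Minimal sketch: exponential smoothing with smoothing constant alpha = 0.3.
sales = [1200, 1280, 1310, 1270, 1190, 1290,
         1410, 1360, 1430, 1280, 1410, 1390]
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
alpha = 0.3

# January has no earlier forecast, so its actual sales become February's forecast.
forecast = sales[0]
for i in range(1, len(sales)):
    print(f"{months[i]}: forecast = {round(forecast)}")
    # New forecast = old forecast + alpha * (latest observation - old forecast)
    forecast = forecast + alpha * (sales[i] - forecast)
```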
Note:
The α value lies between 0 and 1.
The higher the α value, the more sensitive the forecast is to the most recent observations.
For accurate forecasts the separate elements of a time series, i.e. the trend (T), cyclical variation (C), seasonal variation (S) and random variation (R), are quantified separately from the data. This is known as time series decomposition or time series analysis. The separate elements are then combined to produce a forecast.
Time series models:
Additive model: Y = T + S + C + R
Multiplicative model: Y = T × S × C × R
where Y is the observed value, T the trend, S the seasonal variation, C the cyclical variation and R the random variation.
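As a purely illustrative sketch (the numbers below are invented, not taken from the examples in this section), the two models combine a trend value and a seasonal component as follows:

```python
# Illustrative only: combining a trend value and a seasonal component under the
# additive and multiplicative models (cyclical and random elements omitted).
trend = 50.0          # trend value T for some period (invented figure)
seasonal_add = -8.0   # seasonal variation S expressed as an absolute amount
seasonal_mult = 0.84  # seasonal variation S expressed as a proportion of the trend

additive_forecast = trend + seasonal_add          # Y = T + S
multiplicative_forecast = trend * seasonal_mult   # Y = T * S

print(additive_forecast)                   # 42.0
print(round(multiplicative_forecast, 2))   # 42.0
```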
Of the four elements of a time series, the most important are the trend and the seasonal variation. The following illustration shows how the trend (T) and seasonal variation (S) are separated out from a time series and how the calculated T and S values are used to prepare forecasts. The process of separating out the trend and seasonal variation is known as deseasonalizing the data.
There are two approaches to this process: one is based on regression through the actual data
points and the other calculates the regression line through moving average trend points.
Time series analysis: trend and seasonal variation using regression on the data
The following data will be used to illustrate how the trend and seasonal variation are calculated.
Example 1
The quarterly sales are:

            Q1    Q2    Q3    Q4
Year 1      20    32    62    29
Year 2      21    42    75    31
Year 3      23    39    77    48
Year 4      27    39    92    53

It will be apparent that there is a strong seasonal element in the above data (low in quarter 1 and high in quarter 3) and there is a generally upward trend.
Step 1: Calculate the trend in the data using the least squares method.
Step 2: Estimate the sales for each quarter using the regression formula
established in step 1.
Step 3: Calculate the percentage variation of each quarter's actual sales from the
estimates obtained in step 2.
Step 4: Average the percentage variations from step 3. This establishes the
average seasonal variations.
Solution
Step 1
Calculate the trend in the data by fitting the linear regression line y = a + bx.
            x (quarter)   y (sales)   xy      x²
Year 1      1             20          20      1
            2             32          64      4
            3             62          186     9
            4             29          116     16
Year 2      5             21          105     25
            6             42          252     36
            7             75          525     49
            8             31          248     64
Year 3      9             23          207     81
            10            39          390     100
            11            77          847     121
            12            48          576     144
Year 4      13            27          351     169
            14            39          546     196
            15            92          1380    225
            16            53          848     256
Totals      Σx = 136      Σy = 710    Σxy = 6661   Σx² = 1496
The normal equations are:
Σy = an + bΣx, i.e. 710 = 16a + 136b
Σxy = aΣx + bΣx², i.e. 6661 = 136a + 1496b
Multiplying the first equation by 8.5 and subtracting it from the second gives:
626 = 340b
b = 1.84
Substituting back gives a = 28.74.
The trend line is therefore y = 28.74 + 1.84x.
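Step 1 can be checked with a short Python sketch that uses the summations from the table rather than a statistics library:

```python
# Minimal sketch: least squares trend line y = a + b*x for the quarterly sales data.
sales = [20, 32, 62, 29, 21, 42, 75, 31, 23, 39, 77, 48, 27, 39, 92, 53]
x = list(range(1, 17))

n = len(x)
sum_x = sum(x)                                       # 136
sum_y = sum(sales)                                   # 710
sum_xy = sum(xi * yi for xi, yi in zip(x, sales))    # 6661
sum_x2 = sum(xi ** 2 for xi in x)                    # 1496

# Solve the normal equations for the slope b and intercept a.
b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
a = (sum_y - b * sum_x) / n
# b = 1.841..., a = 28.725; rounding b to 1.84 before finding a, as the text does,
# gives a = 28.74.
print(f"b = {b:.3f}, a = {a:.3f}")
```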
Steps 2 and 3
Use the trend line to calculate the estimated sales for each quarter.
The actual value of sales is then expressed as a percentage of this estimate. For example, actual sales in the first quarter were 20, so the seasonal variation is 20/30.58 × 100 = 65%.
            x (quarter)   Actual sales (y)   Trend estimate (28.74 + 1.84x)   Actual as % of trend
Year 1      1             20                 30.58                            65
            2             32                 32.42                            99
            3             62                 34.26                            181
            4             29                 36.10                            80
Year 2      5             21                 37.94                            55
            6             42                 39.78                            106
            7             75                 41.62                            180
            8             31                 43.46                            71
Year 3      9             23                 45.30                            51
            10            39                 47.14                            83
            11            77                 48.98                            157
            12            48                 50.82                            94
Year 4      13            27                 52.66                            51
            14            39                 54.50                            72
            15            92                 56.34                            163
            16            53                 58.18                            91
Step 4
Average the percentage variations from step 3 for each quarter:
Q1: (65 + 55 + 51 + 51) / 4 ≈ 56%
Q2: (99 + 106 + 83 + 72) / 4 = 90%
Q3: (181 + 180 + 157 + 163) / 4 ≈ 170%
Q4: (80 + 71 + 94 + 91) / 4 = 84%
These then are the average variations expected from the trend for each of the quarters; for example, on average the first quarter of each year will be 56% of the value of the trend. Because the variations have been averaged, the amounts over 100% (Q3 in this example) are balanced by the amounts under 100%. This can be checked by adding the averages and verifying that they total 400%:
56% + 90% + 170% + 84% = 400%
On occasions, roundings in the calculations will make slight adjustments necessary to the average variations.
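Steps 2 to 4 can be reproduced with the sketch below. It uses the rounded trend line y = 28.74 + 1.84x quoted in the text, so the printed figures match the tables above.

```python
# Minimal sketch: percentage variations from the trend (steps 2 and 3) and the
# average seasonal variation for each quarter (step 4), multiplicative model.
sales = [20, 32, 62, 29, 21, 42, 75, 31, 23, 39, 77, 48, 27, 39, 92, 53]

a, b = 28.74, 1.84                                   # trend line from step 1
trend = [a + b * x for x in range(1, 17)]

# Steps 2 and 3: actual sales as a percentage of the trend estimate.
pct = [round(100 * y / t) for y, t in zip(sales, trend)]

# Step 4: average the percentages quarter by quarter (Q1 to Q4).
for q in range(4):
    variations = pct[q::4]                           # e.g. Q1 -> quarters 1, 5, 9, 13
    print(f"Q{q + 1}: {variations} -> average {round(sum(variations) / 4)}%")
```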
Step 5
Prepare the final forecasts based on the trend line estimates from the trend estimates and percentage variation table (i.e. 30.58, 32.42, etc.) and the averaged seasonal variations from step 4 (i.e. 56%, 90%, 170% and 84%).
            x (quarter)   y (sales)   Seasonally adjusted forecast
Year 1      1             20          17.12
            2             32          29.18
            3             62          58.24
            4             29          30.32
Year 2      5             21          21.24
            6             42          35.80
            7             75          70.75
            8             31          36.51
Year 3      9             23          25.37
            10            39          42.43
            11            77          83.27
            12            48          42.69
Year 4      13            27          29.49
            14            39          49.05
            15            92          95.78
            16            53          48.87
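Step 5 then follows directly. A minimal sketch, assuming the rounded trend line and the seasonal indices found above; one figure may differ from the table by 0.01 because of rounding.

```python
# Minimal sketch: seasonally adjusted forecast = trend estimate * seasonal index.
a, b = 28.74, 1.84                       # trend line from step 1
seasonal = [0.56, 0.90, 1.70, 0.84]      # average seasonal variations from step 4

for x in range(1, 17):
    trend_estimate = a + b * x
    forecast = trend_estimate * seasonal[(x - 1) % 4]
    print(f"Quarter {x}: seasonally adjusted forecast = {forecast:.2f}")
```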
Notes:
a) Time series decomposition is not an adaptive forecasting system like moving averages
and exponential smoothing.
b) Forecasts produced by such an analysis should always be treated with caution.
Changing conditions and changing seasonal factors make long term forecasting a
difficult task.
c) The above illustration has been an example of the multiplicative model; that is, the seasonal variations were expressed in percentage or proportionate terms. Similar steps would have been necessary if the additive model had been used, except that the variations from the trend would have been absolute values. For example, the first two variations would have been:
Q1: 20 - 30.58 = absolute variation = -10.58
Q2: 32 - 32.42 = absolute variation = -0.42
And so on.
The absolute variations would have been averaged in the normal way to find the average absolute variation for each quarter, whether + or -, and these values would have been used to make the final seasonally adjusted forecasts.
Time series analysis: trend and seasonal variation using regression on the moving average trend
Where there is considerable randomness in the data, calculating the regression line through the moving average trend points is more robust and stable than regression through the actual data points.
Example 1 is reworked below using this method and, because there are many similarities to the earlier method, only the key stages are shown.
The regression line y = a + bx of the moving average values is calculated in the normal manner
and results in the following:
y = 33.06 + 1.32x
The percentage variations are averaged as previously shown, resulting in the following values:
                               Q1    Q2    Q3    Q4
Average seasonal variation %   54    89    170   86
The trend line and the average seasonal variations are then used in a similar manner to that
previously described.
For example, future sales for the next year (i.e. quarters 17, 18, 19 and 20) are extrapolated as follows:
Quarter 17: forecast sales = (33.06 + 1.32(17)) × 0.54 = 29.97
Quarter 18: forecast sales = 50.57
Quarter 19: forecast sales = 98.84
Quarter 20: forecast sales = 51.13
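The same extrapolation can be sketched as follows. Because the code uses the rounded trend coefficients 33.06 and 1.32 throughout, the last figure comes out as 51.14 rather than the 51.13 quoted above.

```python
# Minimal sketch: forecasting quarters 17 to 20 from the moving-average trend
# line y = 33.06 + 1.32x and the average seasonal variations found above.
a, b = 33.06, 1.32
seasonal = {17: 0.54, 18: 0.89, 19: 1.70, 20: 0.86}   # Q1 to Q4 of the next year

for quarter, s in seasonal.items():
    forecast = (a + b * quarter) * s
    print(f"Quarter {quarter}: forecast sales = {forecast:.2f}")
```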
Forecast errors
Differences between actual results and predictions may arise for many reasons. They may arise from random influences, normal sampling errors, choice of the wrong forecasting system or alpha value, or simply because the future conditions turn out to be radically different from the past. Whatever the cause(s), management wish to know the extent of the forecast errors, and various methods exist to calculate these errors.
A commonly used technique, appropriate to time series, is to calculate the mean squared error of the deviations between forecast and actual values, and then to choose the forecasting system and/or parameters which give the lowest mean squared error, i.e. akin to the 'least squares' method of establishing a regression line.
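For illustration, a mean squared error comparison of two forecasting systems might look like the sketch below; the figures are invented, not taken from the examples in this section.

```python
# Minimal sketch: mean squared error between actual values and forecasts.
def mean_squared_error(actuals, forecasts):
    """Average of the squared deviations between actual and forecast values."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    return sum(e ** 2 for e in errors) / len(errors)

# Invented figures: compare two forecasting systems and prefer the one with
# the lower mean squared error.
actual   = [120, 135, 128, 142]
system_a = [118, 130, 131, 138]
system_b = [125, 128, 120, 150]
print(mean_squared_error(actual, system_a))   # 13.5
print(mean_squared_error(actual, system_b))   # 50.5
```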
Regression analysis, as described in the preceding chapters, could be used depending on the assumptions about linearity or non-linearity, the number of independent variables and so on. The least squares regression approach is often used for trend forecasting.
Example 2
Data have been kept of sales over the last seven years
Year                    1    2    3    4    5    6    7
Sales (in '000 units)   14   17   15   23   18   22   27
Solution
Year (x)   Sales (y)   xy    x²
1          14          14    1
2          17          34    4
3          15          45    9
4          23          92    16
5          18          90    25
6          22          132   36
7          27          189   49
Σx = 28    Σy = 136    Σxy = 596   Σx² = 140
The normal equations are:
Σy = an + bΣx, i.e. 136 = 7a + 28b
Σxy = aΣx + bΣx², i.e. 596 = 28a + 140b
Solving these gives b = 1.86 and a = 12, so the trend line is y = 12 + 1.86x.
We use this expression for forecasting; for the 8th year:
Sales = 12 + 1.86(8) = 26.88, i.e. approximately 26,880 units.
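A sketch of the same calculation in Python (note that the text rounds b to 1.86 before computing the forecast, which gives 26.88; the unrounded slope gives 26.86):

```python
# Minimal sketch: least squares trend line for Example 2 and the year-8 forecast.
years = [1, 2, 3, 4, 5, 6, 7]
sales = [14, 17, 15, 23, 18, 22, 27]                  # in '000 units

n = len(years)
sum_x, sum_y = sum(years), sum(sales)                 # 28, 136
sum_xy = sum(x * y for x, y in zip(years, sales))     # 596
sum_x2 = sum(x ** 2 for x in years)                   # 140

b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)   # 1.857...
a = (sum_y - b * sum_x) / n                                    # 12.0
print(f"Trend line: y = {a:.0f} + {b:.2f}x")
print(f"Year 8 forecast = {a + b * 8:.2f} ('000 units)")       # 26.86 with the unrounded slope
```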