
Data Analysis and Decision Making
Time Series Analysis
Definition: An ordered sequence of values of a variable at equally spaced time intervals.

Two Main Goals of Time Series Analysis

There are two main goals of time series analysis:
(a) Identifying the nature of the phenomenon represented by the sequence of observations, and
(b) Forecasting (predicting future values of the time series variable).

Both of these goals require that the pattern of observed time series data is identified and
more or less formally described. Once the pattern is established, we can interpret and
integrate it with other data (i.e., use it in our theory of the investigated phenomenon, e.g.,
seasonal commodity prices). Regardless of the depth of our understanding and the validity
of our interpretation (theory) of the phenomenon, we can extrapolate the identified pattern
to predict future events.

Two General Aspects of Time Series


Most time series patterns can be described in terms of two basic classes of components:
trend and seasonality. The former represents a general systematic linear or (most often)
nonlinear component that changes over time and does not repeat, or at least does not
repeat within the time range captured by our data (e.g., a plateau followed by a period of
exponential growth). The latter may have a formally similar nature (e.g., a plateau followed
by a period of exponential growth); however, it repeats itself in systematic intervals over
time. These two general classes of time series components may coexist in real-life data. For
example, sales of a company can grow rapidly over the years yet still follow consistent
seasonal patterns (e.g., as much as 25% of yearly sales are made in December, whereas only
4% in August).
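To make this concrete, here is a small Python sketch (this document does not prescribe any particular tool; the growth rate, monthly weights, and noise level below are invented for illustration) of a series in which a growing trend and a stable seasonal profile coexist:

```python
import numpy as np
import pandas as pd

# Invented monthly "sales" series combining a growth trend with a seasonal profile.
# The weights are chosen so that roughly 25% of yearly sales fall in December and
# only about 4% in August, echoing the example above.
months = pd.date_range("2015-01-01", periods=60, freq="MS")   # 5 years, monthly
trend = 100.0 * (1.15 ** (np.arange(60) / 12))                # ~15% growth per year
seasonal = np.tile([0.9, 0.85, 1.0, 0.95, 1.0, 1.05,
                    0.8, 0.55, 0.9, 1.1, 1.6, 3.5], 5)        # December peak, August dip
noise = np.random.default_rng(0).normal(0.0, 3.0, 60)         # nonsystematic component
sales = pd.Series(trend * seasonal + noise, index=months, name="sales")

print(sales.groupby(sales.index.year).sum())    # trend: yearly totals keep growing
print(sales.groupby(sales.index.month).mean())  # seasonality: a stable monthly profile
```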

Trend Analysis
There are no proven "automatic" techniques for identifying trend components in time series
data; however, as long as the trend is monotonic (consistently increasing or decreasing),
that part of the analysis is typically not very difficult. If the time series data contain
considerable error, then the first step in the process of trend identification is smoothing.

Smoothing: Smoothing always involves some form of local averaging of data such that the
nonsystematic components of individual observations cancel each other out. The most
common technique is moving average smoothing, which replaces each element of the series
by either the simple or weighted average of n surrounding elements, where n is the width of
the smoothing "window". Medians can be used instead of means. The main advantage of
median smoothing, compared to moving average smoothing, is that its results are less biased
by outliers (within the smoothing window). Thus, if there are outliers in the data (e.g., due to
measurement errors), median smoothing typically produces smoother or at least more
"reliable" curves than a moving average based on the same window width. The main
disadvantage of median smoothing is that, in the absence of clear outliers, it may produce
more "jagged" curves than a moving average, and it does not allow for weighting.

In the relatively less common cases (in time series data) when the measurement error is
very large, distance-weighted least squares smoothing or negative exponentially weighted
smoothing techniques can be used. All of these methods filter out the noise and convert the
data into a smooth curve that is relatively unbiased by outliers.
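As one concrete possibility in this family, statsmodels provides a LOWESS (locally weighted least squares) smoother; it is named here only as a readily available locally weighted method, not as the specific procedure the text refers to, and the example series and the `frac` window fraction are illustration values:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Very noisy linear trend; frac controls the share of the data used in each local fit.
rng = np.random.default_rng(2)
x = np.arange(200, dtype=float)
y = 0.05 * x + rng.normal(0.0, 2.0, 200)

smoothed = lowess(y, x, frac=0.2)   # returns (x, fitted value) pairs sorted by x
print(smoothed[:5])
```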

Fitting a function: Many monotonic time series can be adequately approximated by a linear
function; if there is a clear monotonic nonlinear component, the data first need to be
transformed to remove the nonlinearity. Usually a logarithmic, exponential, or (less often)
polynomial function is used.
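A minimal sketch of this approach on an invented exponential-growth series: take logarithms to remove the nonlinearity, then fit a straight line by least squares:

```python
import numpy as np

# Made-up series growing at roughly 8% per period, with small multiplicative noise.
t = np.arange(40, dtype=float)
rng = np.random.default_rng(3)
y = 50.0 * np.exp(0.08 * t) * np.exp(rng.normal(0.0, 0.05, 40))

slope, intercept = np.polyfit(t, np.log(y), deg=1)         # linear fit on the log scale
fitted = np.exp(intercept + slope * t)                      # back-transform to original scale
print(f"estimated growth rate per period: {slope:.3f}")     # should be close to 0.08
```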

Analysis of Seasonality
Seasonal dependency (seasonality) is another general component of the time series pattern.
It is formally defined as a correlational dependency of order k between each i'th element of
the series and the (i-k)'th element, and it is measured by autocorrelation (i.e., a correlation
between the two terms); k is usually called the lag. If the measurement error is not too
large, seasonality can be visually identified in the series as a pattern that repeats
every k elements.
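The definition can be illustrated with a short calculation; the 12-element series and the lag k = 4 below are made up for illustration, and the simple correlation between the shifted slices stands in for the textbook autocorrelation formula:

```python
import numpy as np

# A short series whose pattern repeats every 4 elements.
x = np.array([12, 8, 9, 20, 14, 9, 10, 23, 15, 11, 12, 26], dtype=float)

def lag_autocorrelation(series, k):
    """Correlation between each i'th element and the (i-k)'th element."""
    return np.corrcoef(series[k:], series[:-k])[0, 1]

print(lag_autocorrelation(x, 4))   # high: the pattern repeats every 4 elements
print(lag_autocorrelation(x, 2))   # much lower at a non-seasonal lag
```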

Autocorrelation correlogram: Seasonal patterns of time series can be examined via
correlograms. The correlogram (autocorrelogram) displays graphically and numerically the
autocorrelation function (ACF), that is, the serial correlation coefficients (and their standard
errors) for consecutive lags in a specified range (e.g., 1 through 30). Ranges of two standard
errors for each lag are usually marked in correlograms, but typically the size of the
autocorrelation is of more interest than its reliability, because we are usually interested only
in very strong (and thus highly significant) autocorrelations.
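As a sketch, the values behind such a correlogram can be computed with statsmodels (one common choice of library, not one prescribed by this document); the synthetic series with a 12-element seasonal cycle is invented for illustration:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

# Synthetic series: mild trend plus a cycle that repeats every 12 observations.
rng = np.random.default_rng(4)
t = np.arange(120)
series = 10 + 0.1 * t + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 120)

autocorr, confint = acf(series, nlags=30, alpha=0.05)    # ACF with confidence bounds
for lag in (1, 6, 12):
    print(f"lag {lag:2d}: r = {autocorr[lag]:+.2f}")      # lag 12 should stand out

# For the graphical correlogram itself:
# from statsmodels.graphics.tsaplots import plot_acf
# plot_acf(series, lags=30)
```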

Examining correlograms: While examining correlograms, you should keep in mind that
autocorrelations for consecutive lags are formally dependent. Consider the following
example: if the first element is closely related to the second, and the second to the third,
then the first element must also be somewhat related to the third one, and so on. This
implies that the pattern of serial dependencies can change considerably after removing the
first-order autocorrelation (i.e., after differencing the series with a lag of 1).

Partial autocorrelations: Another useful method for examining serial dependencies is to
examine the partial autocorrelation function (PACF), an extension of autocorrelation in
which the dependence on the intermediate elements (those within the lag) is removed. In
other words, the partial autocorrelation is similar to the autocorrelation, except that when
calculating it, the (auto)correlations with all the elements within the lag are partialled out.
If a lag of 1 is specified (i.e., there are no intermediate elements within the lag), then the
partial autocorrelation is equivalent to the autocorrelation. In a sense, the partial
autocorrelation provides a "cleaner" picture of serial dependencies for individual lags (not
confounded by other serial dependencies).
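A small sketch comparing the ACF and PACF on a synthetic series like the one above (again using statsmodels; the data are invented):

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

# Same kind of synthetic series as in the correlogram sketch above.
rng = np.random.default_rng(4)
t = np.arange(120)
series = 10 + 0.1 * t + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 120)

autocorr = acf(series, nlags=12)
partial = pacf(series, nlags=12)
print(f"lag  1: acf = {autocorr[1]:+.2f}  pacf = {partial[1]:+.2f}")    # essentially equal
print(f"lag 12: acf = {autocorr[12]:+.2f}  pacf = {partial[12]:+.2f}")  # typically differ
```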

Removing serial dependency: Serial dependency for a particular lag k can be removed by
differencing the series, that is, converting each i'th element of the series into its difference
from the (i-k)'th element. There are two major reasons for such transformations.

First, differencing can reveal the hidden nature of seasonal dependencies in the series.
Remember that, as mentioned in the previous paragraph, autocorrelations for consecutive
lags are interdependent; therefore, removing some of the autocorrelations will change other
autocorrelations, that is, it may eliminate them or it may make some other seasonalities
more apparent.

Second, removing the serial dependency makes the series stationary, which many time
series modeling techniques (such as ARIMA) require before a model can be fitted.
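A minimal sketch of differencing at a given lag, using pandas on an invented series with a 12-element seasonal cycle; the lag values are illustration choices:

```python
import numpy as np
import pandas as pd

# Invented series with trend plus a cycle repeating every 12 observations.
rng = np.random.default_rng(5)
t = np.arange(72)
y = pd.Series(10 + 0.2 * t + 4 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 72))

first_difference = y.diff(1).dropna()        # removes the lag-1 (first order) dependency
seasonal_difference = y.diff(12).dropna()    # each element minus the (i-12)'th element

print(y.autocorr(lag=12))                    # strong lag-12 dependency in the raw series
print(seasonal_difference.autocorr(lag=12))  # the pattern changes considerably afterwards
```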

Applications: The usage of time series models is twofold:
 To obtain an understanding of the underlying forces and structure that produced the observed data
 To fit a model and proceed to forecasting, monitoring, or even feedback and feedforward control
Time Series Analysis is used for many applications such as:
 Economic Forecasting
 Sales Forecasting
 Budgetary Analysis
 Stock Market Analysis
 Yield Projections
 Process and Quality Control
 Inventory Studies
 Workload Projections
 Utility Studies
 Census Analysis and many more

