AP SHAH ADS Notes Smote

The document discusses different time series forecasting methods including moving average, exponential smoothing, and ARIMA models. It provides examples of simple, weighted and holt's exponential smoothing. Moving average methods smooth data by calculating the average of previous data points. Exponential smoothing assigns weights that decrease exponentially as observations become older.


10/04/2023

APPLIED DATA SCIENCE


ARIMA Model

Prof. Ramya R. B., Assistant Professor, Computer Engineering, APST Thane


OBJECTIVE

“To demonstrate the ARIMA model.”


Three factors define an ARIMA model; it is written as ARIMA(p, d, q), where p, d, and q denote, respectively, the number of lagged (past) observations to consider for autoregression, the number of times the raw observations are differenced, and the size of the moving average window.

The equation below shows a typical autoregressive model. As the name suggests, the new values of this model depend purely on a weighted linear combination of its past values. Given that there are p past values, this is denoted AR(p), an autoregressive model of order p; εt indicates white noise:

y(t) = c + φ1·y(t−1) + φ2·y(t−2) + … + φp·y(t−p) + εt

Next, the moving average model is defined as follows:

y(t) = μ + εt + θ1·ε(t−1) + θ2·ε(t−2) + … + θq·ε(t−q)

Here, the value y(t) is computed from the current error εt and the errors made at previous steps. Each successive term looks one step further into the past to incorporate the mistake made at that step. The value of q is set by how far back we are willing to look, so the above model is denoted a moving average of order q, or simply MA(q).

Why does ARIMA need Stationary Time-Series Data?

Stationarity
A stationary time series is one whose properties do not depend on the time at which it is observed. That is why time series with trends or with seasonality are not stationary: the trend and seasonality affect the value of the series at different times.

A stationary time series is one whose statistical properties, such as mean, variance, and autocorrelation, are all constant over time.

In other words, for a stationary series it does not matter when you observe it; it should look much the same at any point in time. In general, a stationary time series has no predictable patterns in the long term.
Time series data must be made stationary to remove any obvious correlation and collinearity with the past data.

In stationary time-series data, the value of a sample observation does not depend on the timestamp at which it is observed. For example, given a hypothetical dataset of the year-wise population of an area, if the population increases two-fold each year, or increases by a fixed amount every year, then this data is non-stationary.

Any given observation is highly dependent on the year, since the population value depends on how far that year is from an arbitrary past year. This dependency can induce incorrect bias while training a model on time-series data.

To remove this correlation, ARIMA uses differencing to make the data stationary.

Differencing, at its simplest, involves taking the difference of two adjacent data points. For example, consider a graph of Google's stock price over 200 days alongside its differenced version, which shows the day-to-day change in the stock price over the same 200 days. A pattern is observable in the first graph, and such trends are a sign of non-stationary time-series data. In the differenced version, however, no trend, seasonality, or increasing variance is observed. Thus, we can say that the differenced series is stationary.
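The effect of differencing can be sketched on a synthetic trending series (the stock-price data from the figure is not reproduced here):

```python
# Sketch: first-order differencing removes a linear trend.
# The synthetic trending series below stands in for the stock-price figure.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(200)
y = 0.5 * t + rng.normal(size=200)   # upward trend + noise: non-stationary

diff = y[1:] - y[:-1]                # y'(t) = y(t) - y(t-1)

# The differenced series fluctuates around the constant slope (0.5),
# with no remaining trend.
print(round(float(diff.mean()), 2))
```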

This change can simply be modeled as

y′(t) = y(t) − y(t−1) = (1 − B)·y(t)

where B denotes the backshift operator, defined as B·y(t) = y(t−1). Differencing d times corresponds to applying (1 − B)^d.

Combining all three types of models above gives the resulting ARIMA(p,d,q) model:

(1 − φ1·B − … − φp·B^p)(1 − B)^d·y(t) = c + (1 + θ1·B + … + θq·B^q)·εt

In general, it is good practice to follow these steps when doing time-series forecasting:

• Step 1 — Check stationarity: if the time series has a trend or seasonality component, it must be made stationary.
• Step 2 — Determine the d value: if the time series is not stationary, it needs to be stationarized through differencing.
• Step 3 — Select the AR and MA terms: use the ACF and PACF to decide whether to include an AR term, an MA term, or both (ARMA).
• Step 4 — Build the model.

For a stationary time series,
the ACF will drop to zero
relatively quickly, while the
ACF of non-stationary data
decreases slowly.

The right order of differencing is the minimum differencing required to get a near-stationary series that roams around a defined mean, with an ACF plot that reaches zero fairly quickly.

If the autocorrelations are positive out to a large number of lags (10 or more), the series needs further differencing.

On the other hand, if the lag-1 autocorrelation is strongly negative, the series is probably over-differenced.

Check whether the series is stationary using the Augmented Dickey-Fuller test (adfuller()) from the statsmodels package.

Why? Because differencing is needed only if the series is non-stationary; otherwise no differencing is needed, that is, d = 0.

The null hypothesis of the ADF test is that the time series is non-stationary. So, if the p-value of the test is less than the significance level (0.05), you reject the null hypothesis and infer that the time series is indeed stationary.

So, in our case, if the p-value > 0.05, we go ahead with finding the order of differencing.
Thank you
29/03/2023

APPLIED DATA SCIENCE


Moving Average and Exponential Smoothing

Prof. Ramya R. B., Assistant Professor, Computer Engineering, APST Thane


OBJECTIVE



To demonstrate smoothing methods: moving average and exponential smoothing.
The various time series forecasting methods are:

• Simple Average
• Moving Average
• Weighted Moving Average
• Naïve Method
• Exponential smoothing
• Time Series Analysis using Linear Regression(Least Squares
Method)
• ARIMA
Simple Average:

The method is very simple: average the data by months, quarters, or years, and then calculate the average for the whole period (the grand average). Then find out what percentage each period's average is of the grand average.
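A small sketch of the simple-average calculation (the quarterly figures are made up for illustration):

```python
# Sketch of the simple-average method: average each quarter across years,
# then express each quarterly average as a percentage of the grand average.
# All figures are invented for illustration.
quarters = {
    "Q1": [20, 22, 24],
    "Q2": [30, 28, 32],
    "Q3": [40, 42, 38],
    "Q4": [25, 27, 26],
}

averages = {q: sum(v) / len(v) for q, v in quarters.items()}       # per-quarter mean
grand = sum(averages.values()) / len(averages)                     # grand average
percentages = {q: 100 * a / grand for q, a in averages.items()}    # % of grand average
print(percentages)
```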
Moving Average
3-Moving Average

Example 1:
3-Moving Average

Example 2:

4-Moving Average

Example 1

4-Moving Average

Example 2:

4-Moving Average

Example 3:

5-Moving Average

Example 1

Weighted Moving Average

When using a moving average method described before, each of the observations used to compute the
forecasted value is weighted equally. In certain cases, it might be beneficial to put more weight on the
observations that are closer to the time period being forecast. When this is done, this is known as a weighted
moving average technique. The weights in a weighted MA must sum to 1.
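A sketch of the weighted moving average (the weights below are illustrative; they sum to 1, with the heaviest weight on the most recent observation):

```python
# Sketch: weighted moving average; weights must sum to 1.
def weighted_moving_average(data, weights):
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    w = len(weights)
    return [sum(x * wt for x, wt in zip(data[i:i + w], weights))
            for i in range(len(data) - w + 1)]

sales = [10, 12, 14, 13, 15]
# 0.5 on the most recent point, 0.3 and 0.2 on older ones.
print(weighted_moving_average(sales, [0.2, 0.3, 0.5]))
```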
Naïve Method

In the naïve method, the forecast for the next period is simply the value observed in the current period: F(t+1) = y(t).

Exponential Smoothing

Exponential smoothing forecasts with weights that decrease exponentially as observations become older. The simple form is

F(t+1) = α·y(t) + (1 − α)·F(t)

where α (0 < α ≤ 1) is the smoothing constant. Starting from an initial forecast, this update is applied period by period; continue the process until the final period (week 10 in the slide example) to obtain the final answer.
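A sketch of simple exponential smoothing (the α value and demand figures are illustrative, not the slide example's data):

```python
# Sketch: simple exponential smoothing, F(t+1) = alpha*y(t) + (1-alpha)*F(t).
def exponential_smoothing(data, alpha, initial):
    """Return the sequence of forecasts, starting from the initial forecast."""
    forecast = initial
    forecasts = [initial]          # forecasts[t] is the forecast for period t
    for y in data:
        forecast = alpha * y + (1 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

demand = [100, 110, 105, 120]      # invented observations
print(exponential_smoothing(demand, alpha=0.3, initial=100))
```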
Thank you
