Week 11

The document discusses the application of Artificial Neural Networks (ANN) in financial time series modeling, particularly for stock market forecasting. It covers various data types used in econometrics, including cross-sectional, time series, and panel data, and outlines the components and classical models of time series analysis. Additionally, it details the modeling process, data preparation, network topology, and performance metrics for ANN in stock price forecasting.


BUSINESS INTELLIGENCE & ANALYTICS

Artificial Neural Networks (ANN) for financial time series modeling

Saji K Mathew, PhD
Professor, Department of Management Studies
INDIAN INSTITUTE OF TECHNOLOGY MADRAS
AI: Winter to Spring
} Google’s search
} Facebook’s automatic photo tagging
} Apple’s voice assistant
} Amazon’s recommendations
} Tesla’s self-driving cars
Machinery question

} Threatened by AI: Analyzing Users’ Responses to the Introduction of AI in a Crowd-Sourcing Platform
} Mikhail Lysyakov, Siva Viswanathan
Cognitive neuroscience
} Deals with mental abilities and brain functioning
} Invention of microscope, and neurons thereafter
} Neurons as fundamental elements for brain’s information
processing
} Neuron-neuron communication through release of
neurotransmitters (chemicals) into junctions between
neurons (synapse)
} An average brain has ~100bn neurons, each one
connected to as many as 15,000 other neurons
} When one neuron repeatedly helps fire another, the
connection between them strengthens, forming a Hebbian
circuit (neurons that fire together, wire together)
The Neuron
The Artificial Neuron
Perceptron (Rosenblatt, 1957)
Gradient descent algorithm
(the optimization step underlying Back Propagation: BP)
} Perceptron training may not converge if the training data are not
linearly separable
} Solution: multilayer networks trained with the back propagation
algorithm (Rumelhart, Hinton & Williams, 1986)

} The training error
  E(w) = 1/2 Σd (td − od)²
is half the squared difference between the target output td and the
linear unit output od, summed over all training tuples d.
} Here E is a function of the weight vector w, since it depends on the
linear unit output od.
} Gradient descent search determines the weight vector that minimizes E:
} It starts with an initial weight vector and modifies it in steps
} At each step the weight vector is altered in the direction of the
negative gradient (the partial derivatives ∂E/∂wi), which is the
direction of steepest descent along the error surface
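The update rule above can be sketched in a few lines of NumPy. This is a minimal illustration with made-up training tuples (X and t are invented for the example); the learning rate and iteration count are arbitrary choices.

```python
import numpy as np

# Hypothetical training tuples: two inputs per tuple, one target each
X = np.array([[1.0, 0.5], [0.2, 0.8], [0.9, 0.1], [0.4, 0.6]])
t = np.array([1.0, 0.6, 0.7, 0.5])

w = np.zeros(2)      # initial weight vector
eta = 0.1            # learning rate (step size)

for _ in range(500):
    o = X @ w                # linear unit output o_d for every tuple
    grad = -(t - o) @ X      # dE/dw for E = 1/2 * sum_d (t_d - o_d)^2
    w -= eta * grad          # step in the direction of steepest descent

E = 0.5 * np.sum((t - X @ w) ** 2)   # final error
```

Each pass moves w against the gradient of E, so the error shrinks toward the least-squares minimum of this convex surface.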
ANN Training
[Figure: the network's output is compared against the target, and the error drives weight updates]
Feed-Forward Neural Network topologies
Financial Time Series Modeling:
The case of stock markets
Econometric data
} In econometrics there are three main types of data (not
necessarily mutually exclusive)
} Cross-sectional data
} Time series data
} Panel (longitudinal) data

} All these different data types require specific econometric
and statistical techniques for data analysis
Cross-sectional data
} A type of one-dimensional data set collected by observing
many subjects (such as individuals, firms or
countries/regions) at the same point of time, or without
regarding the differences in time
} In general used to compare differences among the
subjects; the order of observations does not matter
} Example: explaining people’s wages by reference to their
education level
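The wages-and-education example can be made concrete with a small cross-sectional regression. The numbers below are invented for illustration; the point is that each row is a different person and row order carries no information.

```python
import numpy as np

# Hypothetical cross-sectional sample: years of education vs hourly wage
education = np.array([10, 12, 12, 14, 16, 16, 18, 20], dtype=float)
wage      = np.array([12, 15, 14, 18, 22, 21, 26, 30], dtype=float)

# Fit wage = a * education + b by ordinary least squares
A = np.column_stack([education, np.ones_like(education)])
(a, b), *_ = np.linalg.lstsq(A, wage, rcond=None)
```

A positive slope a is the cross-sectional finding: people with more education earn more, with no reference to time.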
Time series
} A sequence of data points, measured typically at
successive times spaced at uniform time intervals
} Natural one-way ordering of time so that values for a
given period will be expressed as deriving in some way
from past values, rather than from future values
} Have a natural temporal ordering: Annual inflation rates,
daily closing value of a certain stock
Panel data
} Data that involve repeated observations of the same
items over long periods of time
} Not necessarily a cohort study
} Panel (longitudinal) studies track the same subjects
(people, countries, the same set of stocks)
} Measurements are observed or taken on the same
subjects repeatedly
Chocolate Production in Australia (1958-1990)
[Figure: monthly production in tonnes, Jan 1958 to Dec 1990, plotted on a 0 to 10,000 tonne axis]

Components of time series
} Trend
} Seasonality
} Cycle
} Noise
[Figure: four panels of Y against time, one illustrating each component]
Time series classical models
} Multiplicative: Ŷi = Ti × Si × Ci × Ii
} Additive: Ŷi = Ti + Si + Ci + Ii
  Ti: Trend component
  Si: Seasonal component
  Ci: Cyclical component
  Ii: Irregular component
} Trend
} Ŷi = bXi + c
} Seasonality (quarterly dummy variables)
} Ŷi = b1Xi + b2Q1 + b3Q2 + b4Q3 + c
} Seasonality index
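The trend-plus-seasonal-dummies regression can be demonstrated on synthetic quarterly data. The coefficients, noise level, and six-year horizon below are invented for the sketch; Q4 serves as the baseline quarter, as is conventional when three dummies are used.

```python
import numpy as np

# Hypothetical quarterly series: Y_i = b1*X_i + b2*Q1 + b3*Q2 + b4*Q3 + c
n = 24                              # six years of quarterly observations
X = np.arange(n, dtype=float)       # time index X_i
quarter = X.astype(int) % 4         # 0..3 -> Q1..Q4
Q1 = (quarter == 0).astype(float)
Q2 = (quarter == 1).astype(float)
Q3 = (quarter == 2).astype(float)

rng = np.random.default_rng(0)
y = 2.0 * X + 5.0 * Q1 - 3.0 * Q2 + 1.0 * Q3 + 10.0 + rng.normal(0, 0.5, n)

# Recover trend and seasonal coefficients by ordinary least squares
A = np.column_stack([X, Q1, Q2, Q3, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b1, b2, b3, b4, c = coef
```

The fitted b1 recovers the trend slope and b2..b4 the quarterly offsets relative to Q4, which is exactly the additive decomposition in regression form.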
Related concepts
} Efficient Market Hypothesis
} Weak
} Semi-strong
} Strong
} Random Walk Hypothesis (yt - yt-1 = et, where et is white noise)
} Implication: Only luck???
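A random walk is easy to simulate, which makes the hypothesis concrete: the level yt wanders, but its first differences are nothing more than the unpredictable shocks. The seed and series length below are arbitrary.

```python
import numpy as np

# Random walk: y_t - y_{t-1} = e_t, with e_t white noise
rng = np.random.default_rng(42)
e = rng.normal(0, 1, 1000)   # the shocks
y = np.cumsum(e)             # the random-walk level series

# Differencing the walk recovers the shocks exactly
diff = np.diff(y)
```

Since diff equals the shocks themselves, no model of past values can forecast the next change better than chance, which is the "only luck" implication.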
Approaches to stock price forecasting
} Fundamental Analysis
} External factors: Gold price, exchange rate, oil price etc.

} Technical Analysis
} Simple Moving Averages (SMA)

} Relative Strength Indicator (RSI)

} Momentum (M): M = CCP - OCP (current closing price minus an older closing price)

} Stochastic Oscillator

} Price Rate Of Change Indicator
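Three of the indicators above can be sketched directly in NumPy. The price series is invented for illustration, and the RSI shown is a simplified single-window variant (real implementations usually smooth gains and losses over successive windows).

```python
import numpy as np

def sma(prices, window):
    """Simple Moving Average over a fixed window."""
    kernel = np.ones(window) / window
    return np.convolve(prices, kernel, mode="valid")

def momentum(prices, lag):
    """Momentum M = CCP - OCP: current close minus the close `lag` periods ago."""
    return prices[lag:] - prices[:-lag]

def rsi(prices, period=14):
    """Simplified Relative Strength Index over the last `period` price changes."""
    delta = np.diff(prices)[-period:]
    gain = delta[delta > 0].sum()
    loss = -delta[delta < 0].sum()
    if loss == 0:
        return 100.0
    rs = gain / loss
    return 100.0 - 100.0 / (1.0 + rs)

# Hypothetical daily closing prices
prices = np.array([100, 102, 101, 105, 107, 106, 110, 108,
                   111, 115, 114, 118, 117, 120, 123], dtype=float)
```

For example, `sma(prices, 3)` smooths out day-to-day noise, while `momentum(prices, 1)` is simply the daily price change.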


ANN for stock price forecasting

Modeling process
} Data preparation
} Network selection
} Training
} Testing and validation
} Apply and re-train

Data sources
} Financial Times
} Bloomberg
} Wall Street Journal
} Reuters
} Yahoo! Finance
Data preparation
} Input, output selection
} Data transformation
} Range scaling (say to [-1, 1])
} First differences
} Logarithmic
} Data sets for training, testing & validation
} Training: oldest, ~70%, inclusive of major patterns in the data
} Testing: 20-30% of the training data
} Validation: remaining, most recent data
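Because the data are a time series, the split must be chronological rather than random. The sketch below uses one common reading of the guideline (70% oldest for training, next 20% for testing, most recent 10% for validation); the exact fractions are a modeling choice.

```python
import numpy as np

def chrono_split(series, train_frac=0.70, test_frac=0.20):
    """Split a time series chronologically: oldest block for training,
    next block for testing, most recent block for validation."""
    n = len(series)
    i = round(n * train_frac)
    j = i + round(n * test_frac)
    return series[:i], series[i:j], series[j:]

data = np.arange(100)   # stand-in for 100 daily observations
train, test, val = chrono_split(data)
```

Shuffling before splitting would leak future information into training, so the order of the blocks is preserved.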
Network topology (architecture/structure)
} Number of hidden layers
} Determines non-linearity
} Problem of over-fitting
} Thumb rule: 1-2 hidden layers
} Number of neurons in each layer
} Input layer: number of neurons = number of inputs
} Hidden layer
} No magic formula: follow experimentation
} Thumb rule: sqrt(number of input neurons × number of output neurons)
} Output layer: based on the application
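The hidden-layer thumb rule is a one-liner; rounding to the nearest integer is my own convention here, since the rule itself only gives a starting point for experimentation.

```python
import math

def hidden_neurons(n_inputs, n_outputs):
    """Thumb rule: geometric mean of input and output layer sizes."""
    return round(math.sqrt(n_inputs * n_outputs))
```

For instance, a network with 9 inputs and 1 output would start with 3 hidden neurons, to be tuned from there.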
Training
[Figure: during training, the network output is compared against the target and the error is fed back]

} Transfer functions
} Subject to experimentation
} Thumb rule for time series:
} output layer: linear
} all other layers: sigmoid
} Algorithm
} Back Propagation (BP) most widely used
} Gradient Descent (GD)
} Gradient Descent with Momentum (GDM)
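The recipe above (sigmoid hidden layer, linear output, back propagation with momentum) fits in a short NumPy sketch. Everything here is illustrative: the noisy sine wave is a stand-in for a price series, and the network size, learning rate, and momentum term are arbitrary choices, not prescriptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for a price series: a noisy sine wave
series = np.sin(np.linspace(0, 8 * np.pi, 400)) + rng.normal(0, 0.05, 400)
lags = 4                                      # inputs: 4 lagged values
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:].reshape(-1, 1)              # target: the next value

n_in, n_hid, n_out = lags, 2, 1               # sqrt(4*1) = 2 hidden neurons
W1 = rng.normal(0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

mse0 = float(np.mean((y - y.mean()) ** 2))    # naive "predict the mean" error

eta, mu = 0.01, 0.9                           # learning rate, momentum term
vW1, vb1 = np.zeros_like(W1), np.zeros_like(b1)
vW2, vb2 = np.zeros_like(W2), np.zeros_like(b2)

for _ in range(2000):
    h = sigmoid(X @ W1 + b1)                  # sigmoid hidden layer
    o = h @ W2 + b2                           # linear output layer
    err = o - y
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)   # output-layer gradients
    dh = (err @ W2.T) * h * (1 - h)               # back-propagated error
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)     # hidden-layer gradients
    vW2 = mu * vW2 - eta * gW2; W2 = W2 + vW2     # GDM: momentum-smoothed
    vb2 = mu * vb2 - eta * gb2; b2 = b2 + vb2     # gradient descent steps
    vW1 = mu * vW1 - eta * gW1; W1 = W1 + vW1
    vb1 = mu * vb1 - eta * gb1; b1 = b1 + vb1

mse = float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

The momentum term mu reuses a fraction of the previous update, which smooths the descent across flat or oscillating regions of the error surface; after training, mse should fall well below the naive baseline mse0.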
Model performance & prediction errors
} Mean Square Error (MSE)
} Root Mean Square Error (RMSE)
} Mean Absolute Error (MAE)
} Mean Absolute Percentage Error (MAPE)
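All four metrics are one-line definitions; the actual and forecast vectors below are invented to show them on concrete numbers.

```python
import numpy as np

def mse(y, yhat):  return float(np.mean((y - yhat) ** 2))
def rmse(y, yhat): return float(np.sqrt(mse(y, yhat)))
def mae(y, yhat):  return float(np.mean(np.abs(y - yhat)))
def mape(y, yhat): return float(np.mean(np.abs((y - yhat) / y)) * 100)

# Hypothetical actual prices and one-step forecasts
actual   = np.array([100.0, 102.0, 105.0, 103.0])
forecast = np.array([101.0, 101.0, 106.0, 101.0])
```

MSE and RMSE penalize large errors more heavily, MAE treats all errors alike, and MAPE expresses the error relative to the price level, which makes it comparable across stocks.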
