Course Credit Structure (L T P S C): 3 - 2 - 4
Semester: B. Tech VI Semester
COURSE OUTCOMES: After successful completion of the course, students will be able to:
CO1: Understand the fundamentals of neural learning and the concepts of probability density estimation
Course Outcomes to Program Outcomes (PO) and Program Specific Outcomes (PSO) mapping:

Course Outcomes            | PO1 PO2 PO3 PO4 PO5 PO6 PO7 PO8 PO9 PO10 PO11 PO12 | PSO1 PSO2 PSO3 PSO4
CO1                        |  3   1   -   -   -   -   -   -   -   -    -    -   |  -    -    3    -
CO2                        |  3   3   2   -   -   -   -   -   -   -    -    -   |  -    -    3    -
CO3                        |  3   3   2   -   -   -   -   -   -   -    -    -   |  -    -    3    -
CO4                        |  3   3   2   -   -   -   -   -   -   -    -    -   |  -    -    3    -
CO5                        |  -   -   -   -   -   3   -   -   -   -    -    -   |  3    -    -    -
Course Correlation Mapping |  3   3   2   -   -   3   -   -   -   -    -    -   |  3    -    3    -

Correlation Level: 3 - High; 2 - Medium; 1 - Low
COURSE CONTENT
Total Periods: 45
EXPERIENTIAL LEARNING
1. Classify spectral remote sensing data using Ordinary Least Squares and estimate the
confusion matrix.
2. Fit, evaluate, and make predictions with the Linear Discriminant Analysis model, and
tune the hyperparameters of the model.
3. Implement the Perceptron algorithm for a binary classification task and plot the
decision boundary.
4. Build an employee churn prediction model using a multi-layer perceptron and
evaluate the model's performance with different learning rates.
5. Analyze the efficiency of a Radial Basis Function network for a classification task
and assess its convergence ability.
6. Evaluate the cross-entropy error function and compare its performance with the log
loss.
7. Analyze the efficiency of L1 and L2 regularization techniques in resolving
overfitting issues in a multi-layer neural network model.
8. Investigate the efficiency of SGD in terms of convergence speed and accuracy
with different learning rate and momentum values.
9. Using stacked generalization, an ensemble method, build a new model to best
combine the predictions from multiple existing models.
10. Using a pruning algorithm, reduce the complexity of a trained neural network.
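The perceptron exercise above can be sketched in a few lines of NumPy. The toy dataset, learning rate, and epoch count below are illustrative assumptions, not part of the syllabus; the learned weights define the decision boundary the exercise asks students to plot.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Rosenblatt perceptron for labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # update only on misclassified (or on-boundary) points
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# toy linearly separable data (illustrative assumption)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])

w, b = train_perceptron(X, y)
preds = np.sign(X @ w + b)
```

For the plotting part of the task, the decision boundary is the line w[0]*x1 + w[1]*x2 + b = 0, which can be drawn over a scatter of the two classes with matplotlib.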
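For the SGD experiment, a minimal deterministic sketch is enough to compare learning-rate and momentum settings. The quadratic objective, step count, and hyperparameter values here are assumptions chosen only to make the comparison concrete; in the actual experiment the gradient would come from a network's loss on mini-batches.

```python
import numpy as np

def sgd(grad, x0, lr=0.05, momentum=0.0, steps=300):
    """Gradient descent with classical (heavy-ball) momentum."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = momentum * v - lr * grad(x)  # velocity accumulates past gradients
        x = x + v
    return x

# minimize f(x) = 0.5 * x^T A x, an ill-conditioned quadratic (assumed objective)
A = np.diag([10.0, 1.0])
grad = lambda x: A @ x

x_plain = sgd(grad, [1.0, 1.0], momentum=0.0)  # vanilla gradient descent
x_mom = sgd(grad, [1.0, 1.0], momentum=0.9)    # with momentum
```

Rerunning with different `lr` and `momentum` values and recording the distance to the optimum per step gives the convergence-speed comparison the exercise asks for.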
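The regularization experiment targets multi-layer networks, but the shrinkage effect of an L2 penalty is easiest to verify in closed form on linear least squares first. The synthetic data and penalty strengths below are assumptions for illustration only.

```python
import numpy as np

def ridge(X, y, lam):
    """L2-regularized least squares: w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
n, d = 30, 10
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.0, 0.5]              # only 3 informative features (assumed)
y = X @ w_true + 0.1 * rng.normal(size=n)  # small additive noise

w_ols = ridge(X, y, 0.0)    # unregularized least squares
w_reg = ridge(X, y, 10.0)   # L2 penalty shrinks the weight norm
```

In a neural network the same penalty appears as weight decay added to the loss; L1 differs in that it drives individual weights exactly to zero, which is the contrast the exercise asks students to analyze.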
RESOURCES
TEXT BOOKS:
REFERENCE BOOKS:
3. Simon Haykin, Neural Networks and Learning Machines, Pearson, 2016
SOFTWARE/TOOLS:
3. Python
4. MATLAB
VIDEO LECTURES:
4. https://www.youtube.com/watch?v=Zxs-f4HsTDk
5. https://www.youtube.com/watch?v=VQ1O-pSPX20
6. https://www.youtube.com/watch?v=zCwEdKy2OJI
WEB RESOURCES:
3. https://www.ai.rug.nl/minds/uploads/LN_NN_RUG.pdf
4. https://ocw.mit.edu/courses/9-641j-introduction-to-neural-networks-spring-2005/pages/lecture-notes/
COURSE DELIVERY SCHEDULE:

S.No | Topic                                              | Contact Hours | CO Mapping | Pedagogy                        | Resources
MODULE-1: INTRODUCTION TO NEURAL LEARNING
1.   | Introduction to Fundamentals of Neural Learning    | 01            | CO1        | PPT, Open Discussion Session    | TB2, RB1
2.   | Classification and regression                      | 01            | CO1        | Blended Learning                | TB2, RB1
3.   | Pre-processing and feature extraction              | 01            | CO1        | Comparative study               | TB2, RB1
4.   | The curse of dimensionality                        | 01            | CO1        | PPT and Open Discussion Session | TB2, RB1
5.   | Polynomial curve fitting, Model complexity         | 01            | CO1        | PPT                             | TB2, RB1
6.   | Multivariate non-linear functions                  | 01            | CO1        | PPT and Open Discussion Session | TB2, RB1
7.   | Bayes' theorem, Decision boundaries                | 01            | CO1        | PPT and Open Discussion Session | TB2, RB1
8.   | Probability Density Estimation - Parametric methods| 01            | CO1        | PPT                             | TB2, RB1
9.   | Maximum likelihood, Bayesian inference             | 01            | CO1        | PPT and Open Discussion Session | TB2, RB1
MODULE-2: SINGLE LAYER NETWORKS
10.  | Linear discriminant functions, Linear separability | 01            | CO2        | PPT and Live demo               | TB2, RB1, WR-1, VL-1
12.  | Least-squares techniques                           | 01            | CO2        | PPT and Live demo               | TB2, WR2
13.  | The perceptron, Fisher's linear discriminant       | 01            | CO2        | PPT and Live demo               | TB2, WR3
* For a total of 30 marks, 80% of the better of the two CIATs and 20% of the other are added
and finalized; any fraction shall be rounded up to the next higher integer.
K ARUN KUMAR
Signature of the Course Instructor Signature of the Chairperson BOS