DL Syllabus
Course Code: 2039601   Category: PCC
L: 3   T: 0   P: 0   C: 3
Continuous Internal Assessment: 40   End Exams: 60   Total: 100
Mid Exam Duration: 90 Minutes   End Exam Duration: 3 Hrs
Course Objectives:
• To introduce the fundamentals of deep learning and the main research activities in this
field.
• To learn architectures and optimization methods for deep neural network training.
• To study neural networks and convolutional networks and their architectures.
• To gain knowledge of recurrent neural networks and deep supervised learning methods.
Course Outcomes: On successful completion of this course, the students will be able to
CO 1 Understand the fundamentals of deep learning.
CO 2 Compare various deep neural network architectures.
CO 3 Apply various deep learning algorithms based on real-world applications.
CO 4 Understand convolutional neural networks (convnets).
CO 5 Understand recurrent neural networks.
UNIT – I
Linear Algebra Review and Optimization: Brief review of concepts from Linear Algebra, Types
of errors, bias-variance trade-off, overfitting and underfitting, brief review of concepts from Vector
Calculus and optimization, variants of gradient descent, momentum.
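
An illustrative sketch of the optimization topics in this unit: gradient descent with momentum minimizing a least-squares objective in NumPy. The objective, learning rate, and momentum coefficient are illustrative assumptions, not prescribed by the syllabus.

import numpy as np

# Gradient descent with momentum on the quadratic loss f(w) = ||Aw - b||^2.
# A, b, the learning rate, and the momentum coefficient are illustrative.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad(w):
    # Gradient of the least-squares loss ||Aw - b||^2.
    return 2 * A.T @ (A @ w - b)

w = np.zeros(5)
velocity = np.zeros(5)
lr, beta = 0.01, 0.9   # step size and momentum coefficient

for step in range(200):
    velocity = beta * velocity - lr * grad(w)   # accumulate a running direction
    w = w + velocity                            # take the momentum step

print("final loss:", np.sum((A @ w - b) ** 2))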
UNIT – II
Logistic Regression: Basic concepts of regression and classification problems, linear models
addressing regression and classification, maximum likelihood, logistic regression classifiers.
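
A minimal sketch of the classifier this unit builds toward: binary logistic regression fit by gradient ascent on the average log-likelihood. The synthetic data, step size, and iteration count are illustrative assumptions.

import numpy as np

# Binary logistic regression trained by maximum likelihood.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2))
true_w = np.array([2.0, -1.0])
y = (X @ true_w + 0.1 * rng.standard_normal(100) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
lr = 0.1
for step in range(500):
    p = sigmoid(X @ w)                  # predicted P(y = 1 | x)
    w += lr * X.T @ (y - p) / len(y)    # gradient of the mean log-likelihood

print("training accuracy:", np.mean((sigmoid(X @ w) > 0.5) == y))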
UNIT – III
Neural Networks: Basic concepts of artificial neurons, single and multi-layer perceptron,
perceptron learning algorithm and its convergence proof, different activation functions, softmax
cross-entropy loss function.
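
Two sketches of topics named in this unit: the perceptron learning algorithm on linearly separable toy data, and a numerically stable softmax cross-entropy loss. The data and all constants are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # separable labels in {-1, +1}

w = np.zeros(2)
for epoch in range(20):
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:   # misclassified: nudge w toward yi * xi
            w += yi * xi

print("perceptron training errors:", np.sum(np.sign(X @ w) != y))

def softmax_cross_entropy(logits, label):
    # Subtract the max before exponentiating for numerical stability.
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.sum(np.exp(shifted)))
    return -log_probs[label]

print("example loss:", softmax_cross_entropy(np.array([2.0, 0.5, -1.0]), 0))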
UNIT – IV
Convnets: Basic concepts of Convolutional Neural Networks starting from filtering; convolution
and pooling operations and their arithmetic; discussion of well-known convnet architectures -
AlexNet, ZFNet, VGG, GoogLeNet, ResNet, MobileNet-v1.
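
A sketch of the convolution and pooling arithmetic covered here: a naive 2D convolution (implemented as cross-correlation, as in most deep learning libraries) and 2x2 max pooling, using the output-size rule out = (in - kernel + 2*pad) / stride + 1. The image, kernel, and sizes are illustrative assumptions.

import numpy as np

def conv2d(image, kernel, stride=1):
    # Slide the kernel over the image and take dot products (no padding).
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1
    ow = (image.shape[1] - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)
    return out

def max_pool(x, size=2):
    # Non-overlapping max pooling over size x size windows.
    oh, ow = x.shape[0] // size, x.shape[1] // size
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = x[i*size:(i+1)*size, j*size:(j+1)*size].max()
    return out

image = np.arange(36, dtype=float).reshape(6, 6)
edge_filter = np.array([[1.0, 0.0, -1.0]] * 3)  # simple vertical-edge kernel
feat = conv2d(image, edge_filter)               # (6 - 3)/1 + 1 = 4 -> 4x4 map
print(feat.shape, max_pool(feat).shape)         # (4, 4) then (2, 2)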
Regularization, Batchnorm: Discussion of regularization, dropout, and batch normalization
(batchnorm); discussion of detection as classification, region proposals, and R-CNN architectures.
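
Illustrative sketches of two techniques named above: inverted dropout and batch normalization (training-time statistics only). The shapes, drop probability, and epsilon are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5):
    # Inverted dropout: zero activations with probability p and rescale the
    # rest so the expected activation is unchanged at test time.
    mask = (rng.random(x.shape) >= p).astype(x.dtype)
    return mask * x / (1.0 - p)

def batchnorm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature over the batch, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

acts = rng.standard_normal((4, 3))   # a batch of 4 activations, 3 features
print(dropout(acts).shape, batchnorm(acts).mean(axis=0))  # means near 0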
UNIT – V
Recurrent Neural Networks: Basic concepts of Recurrent Neural Networks (RNNs),
backpropagation through time, Long Short-Term Memory (LSTM) architectures, the problem of
exploding and vanishing gradients, and basics of word embedding.
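
A sketch of the vanilla RNN recurrence and a rough numeric illustration of why gradients backpropagated through time can vanish (or, with larger weights, explode). The sizes and the recurrent weight scale are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
hidden, steps = 8, 50
W_h = 0.1 * rng.standard_normal((hidden, hidden))   # small recurrent weights
W_x = rng.standard_normal((hidden, 3))              # input-to-hidden weights
xs = rng.standard_normal((steps, 3))                # a toy input sequence

h = np.zeros(hidden)
for x in xs:
    # The basic recurrence h_t = tanh(W_h h_{t-1} + W_x x_t).
    h = np.tanh(W_h @ h + W_x @ x)

# The gradient through k steps is a product of k Jacobians, each bounded in
# norm by ||W_h|| (since tanh' <= 1), so it shrinks geometrically when
# ||W_h|| < 1 and can grow geometrically when ||W_h|| > 1.
norm = np.linalg.norm(W_h, 2)   # spectral norm of the recurrent matrix
print("||W_h|| =", norm, " bound after 50 steps:", norm ** steps)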
Autoencoders: Autoencoders, denoising autoencoders, sparse autoencoders, and contractive
autoencoders.
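
A minimal sketch of a one-hidden-layer autoencoder trained to reconstruct its input from noise-corrupted copies (the denoising variant). The architecture sizes, noise level, and learning rate are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))          # toy data
W1 = 0.1 * rng.standard_normal((10, 4))     # encoder: 10 -> 4 bottleneck
W2 = 0.1 * rng.standard_normal((4, 10))     # decoder: 4 -> 10
lr, noise = 0.01, 0.1

for epoch in range(500):
    X_in = X + noise * rng.standard_normal(X.shape)  # denoising corruption
    H = np.tanh(X_in @ W1)                  # encode
    X_hat = H @ W2                          # decode (linear output)
    err = X_hat - X                         # reconstruct the *clean* input
    # Backpropagation for the squared reconstruction error.
    grad_W2 = H.T @ err / len(X)
    grad_H = err @ W2.T * (1 - H ** 2)      # tanh derivative
    grad_W1 = X_in.T @ grad_H / len(X)
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

print("final reconstruction MSE:", np.mean(err ** 2))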
Text Books:
1. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, The MIT Press, 2016.
2. Yoshua Bengio, "Learning Deep Architectures for AI," Foundations and Trends in Machine
Learning, vol. 2, no. 1, Now Publishers, 2009.
Reference Books:
1. B. Yegnanarayana, Artificial Neural Networks, Prentice Hall of India, 2005.
2. Simon Haykin, Neural Networks: A Comprehensive Foundation, PHI, 2005.
3. Chao Pan, Deep Learning Fundamentals: An Introduction for Beginners, AI Sciences
Publisher.