DL Syllabus

The document outlines a course on Deep Learning for B.Tech. VI Sem (R20UG) AI&ML, detailing course objectives, outcomes, and a structured syllabus covering topics such as linear algebra, logistic regression, neural networks, convolutional networks, and recurrent neural networks. It includes information on assessments, textbooks, and reference materials. The course aims to equip students with foundational knowledge and practical skills in deep learning applications.

Uploaded by Ramprakash Reddy

Course Title: DEEP LEARNING (B.Tech. VI Sem (R20UG) AI&ML)

Course Code: 2039601          Category: PCC
Hours / Week: L 3, T 0, P 0   Credits (C): 3
Maximum Marks: Continuous Internal Assessment 40, End Exams 60, Total 100
Mid Exam Duration: 90 Minutes   End Exam Duration: 3 Hrs
Course Objectives:
• To introduce the fundamentals of deep learning and the main research activities in the field.
• To learn architectures and optimization methods for training deep neural networks.
• To study neural networks and convolutional networks and their architectures.
• To gain knowledge of recurrent neural networks and deep supervised learning methods.
Course Outcomes: On successful completion of this course, the students will be able to
CO 1 Understand the fundamentals of deep learning.
CO 2 Compare various deep neural network architectures.
CO 3 Apply various deep learning algorithms to real-world applications.
CO 4 Understand convolutional networks (convnets).
CO 5 Understand recurrent neural networks.

UNIT – I
Linear Algebra Review and Optimization: Brief review of concepts from linear algebra, types
of errors, the bias-variance trade-off, overfitting and underfitting, brief review of concepts from
vector calculus and optimization, variants of gradient descent, momentum.
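As an illustration of this unit's optimization topics, the following is a minimal sketch of gradient descent with momentum on a toy quadratic objective. The objective, step size, and momentum coefficient are illustrative assumptions, not part of the syllabus:

```python
# Gradient descent with momentum on f(w) = (w - 3)^2 (a toy objective).
def grad(w):
    return 2.0 * (w - 3.0)  # derivative of (w - 3)^2

w, v = 0.0, 0.0      # parameter and velocity
lr, beta = 0.1, 0.9  # step size and momentum coefficient (assumed values)
for _ in range(400):
    v = beta * v - lr * grad(w)  # velocity: decaying sum of past gradients
    w = w + v                    # move along the velocity
# w converges toward the minimizer w = 3
```

The velocity term accumulates past gradients, which dampens oscillations and speeds progress along consistent descent directions compared with plain gradient descent.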

UNIT – II
Logistic Regression: Basic concepts of regression and classification problems, linear models
addressing regression and classification, maximum likelihood, logistic regression classifiers.
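A small sketch of the unit's core idea, fitting a logistic regression classifier by gradient ascent on the log-likelihood. The one-dimensional toy data, step size, and iteration count are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])  # single feature
y = np.array([0, 0, 0, 1, 1, 1])                 # binary labels
w, b = 0.0, 0.0
for _ in range(500):
    p = sigmoid(w * x + b)          # predicted P(y = 1 | x)
    w += 0.1 * np.sum((y - p) * x)  # gradient of the log-likelihood w.r.t. w
    b += 0.1 * np.sum(y - p)        # gradient of the log-likelihood w.r.t. b
preds = (sigmoid(w * x + b) > 0.5).astype(int)   # classify at threshold 0.5
```

Maximizing the likelihood here is equivalent to minimizing the cross-entropy loss, which connects this unit to the loss functions used for neural networks later in the course.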

UNIT – III
Neural Networks: Basic concepts of artificial neurons, single- and multi-layer perceptrons, the
perceptron learning algorithm and its convergence proof, different activation functions, the
softmax cross-entropy loss function.
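A minimal sketch of the perceptron learning rule named in this unit, run on the linearly separable logical-AND problem. The dataset, epoch count, and unit learning rate are illustrative assumptions:

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])  # AND of the two inputs (linearly separable)
w = np.zeros(2)
b = 0.0
for _ in range(20):  # convergence is guaranteed for separable data
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += (yi - pred) * xi  # update weights only on mistakes
        b += (yi - pred)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]
```

The mistake-driven update is the key property used in the convergence proof covered in this unit: each error moves the weight vector toward a separating hyperplane.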

UNIT – IV
Convnets: Basic concepts of Convolutional Neural Networks, starting from filtering; convolution
and pooling operations and their arithmetic; discussion of well-known convnet architectures -
AlexNet, ZFNet, VGG, GoogLeNet, ResNet, MobileNet-v1.
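The convolution and pooling arithmetic mentioned above follows one formula, sketched below. The AlexNet first-layer numbers are a standard worked example; the helper function name is an illustrative choice:

```python
# Output-size arithmetic for convolution and pooling:
# out = floor((in + 2*pad - kernel) / stride) + 1
def conv_out(size, kernel, stride=1, pad=0):
    return (size + 2 * pad - kernel) // stride + 1

# AlexNet's first layer: 227x227 input, 11x11 kernels, stride 4, no padding.
first = conv_out(227, 11, stride=4)  # 55x55 feature maps
# Followed by 3x3 max-pooling with stride 2.
pooled = conv_out(first, 3, stride=2)  # 27x27 feature maps
```

The same formula applies per spatial dimension to every convolution and pooling layer, which is how the feature-map sizes of the architectures listed above are derived.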
Regularization, Batchnorm: Discussion of regularization, dropout, and batch normalization;
discussion of detection as classification, region proposals, and R-CNN architectures.

UNIT – V
Recurrent Neural Networks: Basic concepts of Recurrent Neural Networks (RNNs),
backpropagation through time, Long Short-Term Memory (LSTM) architectures, the problem of
exploding and vanishing gradients, and basics of word embeddings.
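The recurrence at the heart of this unit can be sketched as a vanilla RNN forward pass, h_t = tanh(W_h h_{t-1} + W_x x_t). The dimensions and random weights below are illustrative assumptions; training via backpropagation through time is not shown:

```python
import numpy as np

rng = np.random.default_rng(0)
W_h = rng.normal(scale=0.1, size=(4, 4))  # hidden-to-hidden weights
W_x = rng.normal(scale=0.1, size=(4, 3))  # input-to-hidden weights
h = np.zeros(4)                           # initial hidden state
for x_t in rng.normal(size=(5, 3)):       # a sequence of 5 input vectors
    h = np.tanh(W_h @ h + W_x @ x_t)      # same weights reused at every step
```

Because the same W_h is multiplied in at every time step, gradients through long sequences involve repeated products of that matrix, which is the source of the exploding and vanishing gradient problems covered in this unit.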
Autoencoders: Autoencoders, denoising autoencoders, sparse autoencoders, and contractive
autoencoders.

Text Books:
1. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, The MIT Press, 2016.
2. Yoshua Bengio, "Learning Deep Architectures for AI," Foundations and Trends in Machine
Learning 2(1), Now Publishers, 2009.

Reference Books:
1. B. Yegnanarayana, Artificial Neural Networks, Prentice Hall of India, 2005.
2. Simon Haykin, Neural Networks: A Comprehensive Foundation, PHI Edition, 2005.
3. Chao Pan, Deep Learning Fundamentals: An Introduction for Beginners, AI Sciences
Publisher.
