
This document outlines the course outcomes and units of a deep learning course. The course aims to help students understand regularization techniques, efficient convolution algorithms, recurrent and recursive neural networks, autoencoders, and deep generative models. It is divided into five units covering regularization, convolutional networks, sequence modeling, autoencoders, and deep generative models such as Boltzmann machines. Students learn both theoretical concepts and practical applications of deep learning techniques.


DEEP LEARNING
Course code: 20CT1153                                                L T P C
                                                                     3 0 0 3
 Course Outcomes: At the end of the course, the student will be able to:
CO1: Understand Regularization for Deep Learning. (L2)
CO2: Examine efficient Convolution Algorithms. (L2)
CO3: Explain various Recurrent and Recursive Neural Networks. (L2)
CO4: Develop Applications of Autoencoders. (L3)
CO5: Implement Deep Generative Models. (L3)

UNIT-I (10 Lectures)


Regularization for Deep Learning: Parameter Norm Penalties, Norm Penalties as Constrained
Optimization, Regularization and Under-Constrained Problems, Dataset Augmentation, Noise
Robustness, Semi-Supervised Learning, Multi-Task Learning, Early Stopping, Parameter Tying
and Parameter Sharing, Sparse Representations, Bagging and Other Ensemble Methods,
Dropout, Adversarial Training, Tangent Distance, Tangent Prop and Manifold Tangent
Classifier. 
Learning Outcomes: At the end of the unit, students will be able to:
1. Explain techniques for reducing the complexity of neural networks. (L2)
2. Understand Semi-Supervised Learning. (L2)
3. Understand Ensemble Methods for improving performance. (L2)
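Several of these regularizers can be made concrete in a few lines. As an illustrative sketch (plain Python, not part of the syllabus; function names are my own), the L2 parameter norm penalty adds (lam/2)·||w||² to the loss, and its gradient lam·w shrinks each weight toward zero during training:

```python
def l2_penalty(weights, lam):
    # L2 (weight-decay) penalty added to the training loss:
    # (lam / 2) * sum of squared weights.
    return 0.5 * lam * sum(w * w for w in weights)

def l2_penalty_gradient(weights, lam):
    # The gradient of the penalty w.r.t. each weight is lam * w_i,
    # so each gradient-descent step pulls every weight toward zero.
    return [lam * w for w in weights]

w = [3.0, -4.0]
print(l2_penalty(w, 0.1))           # ~1.25
print(l2_penalty_gradient(w, 0.1))  # ~[0.3, -0.4]
```

In practice this term is simply summed with the data loss; note biases are usually left unpenalized.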

UNIT-II (8 Lectures)


Convolutional Networks: The Convolution Operation, Pooling, Convolution and Pooling as an
Infinitely Strong Prior, Variants of the Basic Convolution Function, Structured Outputs, Data
Types, Efficient Convolution Algorithms, Random or Unsupervised Features, Basis for
Convolutional Networks.
Learning Outcomes: At the end of the module, students will be able to:
1. Summarize different Convolution Algorithms. (L2)
2. Explain the Convolution Operation. (L2)
3. Understand various Data Types. (L2)
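The convolution operation itself can be sketched in plain Python. This illustrative snippet (names are my own) computes a "valid" 1-D cross-correlation, which is the operation deep learning libraries implement under the name convolution:

```python
def conv1d_valid(x, k):
    # "Valid" 1-D cross-correlation: slide kernel k across input x with
    # no padding, taking a dot product at each position. The output has
    # len(x) - len(k) + 1 entries.
    n = len(x) - len(k) + 1
    return [sum(x[i + j] * k[j] for j in range(len(k))) for i in range(n)]

# A [1, 0, -1] kernel acts as a simple difference (edge) detector.
print(conv1d_valid([1, 2, 3, 4], [1, 0, -1]))  # [-2, -2]
```

The efficient algorithms covered in this unit (e.g. FFT-based convolution) compute the same result with fewer operations for large inputs.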
UNIT-III (12 Lectures)
Sequence Modeling: Recurrent and Recursive Nets: Unfolding Computational Graphs,
Recurrent Neural Networks, Bidirectional RNNs, Encoder-Decoder Sequence-to-Sequence
Architectures, Deep Recurrent Networks, Recursive Neural Networks, Echo State Networks,
LSTM, Gated RNNs, Optimization for Long-Term Dependencies.
Learning Outcomes: At the end of the module, students will be able to:
1. Understand the concepts of Recurrent and Recursive Nets. (L2)
2. Describe Optimization for Long-Term Dependencies. (L2)
3. Understand Encoder-Decoder Sequence-to-Sequence Architectures. (L2)
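A minimal sketch of the forward pass of a vanilla RNN (scalar states and illustrative, hand-picked parameter values, not from the syllabus) shows the unfolding over time and the parameter sharing discussed above:

```python
import math

def rnn_forward(xs, w_h=0.5, w_x=1.0, b=0.0):
    # Scalar vanilla RNN unrolled over a sequence:
    #   h_t = tanh(w_h * h_{t-1} + w_x * x_t + b),  with h_0 = 0.
    # The same three parameters are shared by every time step, which is
    # exactly what "unfolding the computational graph" makes explicit.
    h = 0.0
    states = []
    for x in xs:
        h = math.tanh(w_h * h + w_x * x + b)
        states.append(h)
    return states

states = rnn_forward([1.0, 0.0, 0.0])
```

Because |w_h| < 1 here, the influence of the first input decays at each step, a small-scale view of the vanishing-gradient issue that LSTMs and gated RNNs are designed to mitigate.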

UNIT-IV (10 Lectures)


Autoencoders:
Undercomplete Autoencoders, Regularized Autoencoders, Representational Power, Layer Size
and Depth, Stochastic Encoders and Decoders, Denoising Autoencoders, Learning Manifolds
with Autoencoders, Contractive Autoencoders, Predictive Sparse Decomposition, Applications
of Autoencoders, Generative Adversarial Networks.
Learning Outcomes: At the end of the module, students will be able to:
1. Understand the basics of Autoencoders. (L2)
2. Explain the applications of Autoencoders. (L2)
3. Implement Generative Adversarial Networks. (L3)
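As a toy illustration of an undercomplete autoencoder (hand-picked weights, untrained, purely for intuition; not part of the syllabus): a linear encoder maps a 2-D input to a 1-D code and the decoder maps it back, so only inputs lying on the one-dimensional manifold the weights pick out are reconstructed exactly:

```python
def encode(x):
    # Encoder: project a 2-D input onto a 1-D code (here, the mean of
    # the two coordinates, i.e. the direction [1, 1]).
    return 0.5 * (x[0] + x[1])

def decode(c):
    # Decoder: map the 1-D code back to 2-D along the same direction.
    return [c, c]

# An input on the manifold is reconstructed exactly; one off the
# manifold is projected onto it.
print(decode(encode([3.0, 3.0])))  # [3.0, 3.0]
print(decode(encode([2.0, 4.0])))  # [3.0, 3.0]
```

A trained undercomplete autoencoder does the same thing with learned weights: the bottleneck forces it to keep only the directions of variation that matter for reconstructing the data.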

UNIT-V (10 Lectures)


Deep Generative Models:
Boltzmann Machines, Restricted Boltzmann Machines, Deep Belief Networks, Deep Boltzmann
Machines, Boltzmann Machines for Real-Valued Data, Convolutional Boltzmann Machines,
Boltzmann Machines for Structured or Sequential Outputs.
Learning Outcomes: At the end of the module, students will be able to:
1. Implement Deep Generative Models. (L3)
2. Explain different Boltzmann Machines. (L2)
3. Understand Boltzmann Machines for Structured Outputs. (L2)
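The energy function that defines a binary restricted Boltzmann machine, E(v, h) = −bᵀv − cᵀh − vᵀWh, can be written directly as a sketch (plain Python with illustrative values, not from the syllabus):

```python
def rbm_energy(v, h, W, b, c):
    # Energy of a binary RBM configuration:
    #   E(v, h) = -sum_i b_i v_i - sum_j c_j h_j - sum_{i,j} v_i W_ij h_j
    # Lower energy means the model assigns the configuration higher
    # probability, via p(v, h) proportional to exp(-E(v, h)).
    e = -sum(bi * vi for bi, vi in zip(b, v))
    e -= sum(cj * hj for cj, hj in zip(c, h))
    e -= sum(v[i] * W[i][j] * h[j]
             for i in range(len(v)) for j in range(len(h)))
    return e

# Two visible units, one hidden unit, hand-picked parameters.
print(rbm_energy([1, 0], [1], W=[[2.0], [1.0]], b=[0.5, 0.5], c=[0.1]))
```

Training an RBM (e.g. by contrastive divergence) adjusts W, b, and c so that configurations resembling the data get low energy; the "restricted" structure means no visible-visible or hidden-hidden terms appear in E.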

Text Books:
1) Ian Goodfellow, Yoshua Bengio, and Aaron Courville, "Deep Learning", MIT Press, 2016.
2) Josh Patterson and Adam Gibson, "Deep Learning: A Practitioner's Approach", O'Reilly Media,
First Edition, 2017.

Reference Books:
1) Nikhil Buduma, "Fundamentals of Deep Learning: Designing Next-Generation Machine
Intelligence Algorithms", O'Reilly/Shroff Publishers, 2019.
2) Douwe Osinga, "Deep Learning Cookbook: Practical Recipes to Get Started Quickly",
O'Reilly/Shroff Publishers, 2019.
