ML002 Syllabus

This document outlines the modules and chapters for a deep learning course. Module 1 covers getting started with deep learning and includes chapters on linear algebra, calculus, probability and information theory, and machine learning basics. Module 2 covers deep learning fundamentals with chapters on the history and basics of neural networks, backpropagation, and designing neural networks with Keras and TensorFlow. Module 3 is on regularization and optimization techniques. Module 4 focuses on convolutional neural networks with chapters on CNN architecture and applications. Module 5 is on recurrent neural networks with chapters on RNNs, LSTMs, and building applications using RNNs.


Antern ML002

Ayush Singh

October 16, 2021

Module 1: Getting Ready for Deep Learning

Motivation:- Why learn Deep Learning?

Chapter 1:- Linear Algebra


● Basic properties of matrices and vectors: scalar multiplication, linear
transformation, transpose, conjugate, rank, and determinant.
● Inner and outer products, matrix multiplication rules and algorithms, and the
matrix inverse.
● Matrix factorization concepts, LU decomposition, Gaussian/Gauss-Jordan
elimination, and solving the linear system Ax = b.
● Eigenvalues, eigenvectors, diagonalization, and singular value decomposition.
● Special matrices: square matrix, identity matrix, triangular matrix, the idea about
sparse and dense matrix, unit vectors, symmetric matrix, Hermitian,
skew-Hermitian and unitary matrices.
● Vector space, basis, span, orthogonality, orthonormality, and linear least square.
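
A minimal NumPy sketch of several Chapter 1 topics, solving Ax = b and computing rank, determinant, inverse, eigendecomposition, SVD, and a linear least-squares fit; the matrices and vectors are illustrative values, not course data.

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
b = np.array([10.0, 5.0])

# Solve the linear system Ax = b (NumPy uses an LU factorization internally).
x = np.linalg.solve(A, b)

# Rank, determinant, transpose, and inverse of A.
rank = np.linalg.matrix_rank(A)
det = np.linalg.det(A)
A_T = A.T
A_inv = np.linalg.inv(A)

# Eigenvalues/eigenvectors and the singular value decomposition.
eigvals, eigvecs = np.linalg.eig(A)
U, S, Vt = np.linalg.svd(A)

# Linear least squares for an overdetermined system M c ≈ t.
M = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
t = np.array([1.1, 1.9, 3.2])
coeffs, *_ = np.linalg.lstsq(M, t, rcond=None)

print(x, rank, det, eigvals, S, coeffs)
```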

Chapter 2:- Calculus


● Limits: Introduction, Properties of Limits, Solving Limits, L'Hôpital's Rule
● Continuity: Introduction, Solving problems, Discontinuities
● Differentiability: Introduction, How It Works, Formal Definition, Mean Value
Theorem, Minima and Maxima, Gradient Descent, Derivatives, Partial Derivatives,
Jacobian and Hessian Matrices
● Integration: Introduction, Integration by Substitution, Integration by Partial
Fractions, Properties of the Definite Integral
● Optimization: Convex Optimization, First- and Second-Order Optimization
Algorithms, Constrained Optimization, the Karush–Kuhn–Tucker Approach, and the
Generalized Lagrangian
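
A minimal sketch of gradient descent on the convex function f(x, y) = x² + 2y², using its partial derivatives as the gradient; the learning rate and iteration count are illustrative choices, not course-mandated values.

```python
import numpy as np

def f(p):
    x, y = p
    return x**2 + 2 * y**2

def grad_f(p):
    # Partial derivatives: df/dx = 2x, df/dy = 4y (the gradient vector).
    x, y = p
    return np.array([2 * x, 4 * y])

p = np.array([3.0, -2.0])   # starting point
lr = 0.1                    # step size (learning rate)
for _ in range(100):
    p = p - lr * grad_f(p)  # move against the gradient

print(p, f(p))  # converges toward the minimum at (0, 0)
```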

Chapter 3:- Probability and Information Theory


● Probability:
○ Bayesian Probability
○ Random Variables
○ Discrete Variables and Probability Mass Functions
○ Continuous Variables and Probability Density Functions
○ Marginal Probability
○ Conditional Probability
○ Expectation, Variance, and Covariance
○ Bayes' Theorem
○ Probability Distributions
■ Bernoulli Distribution
■ Multinoulli Distribution
■ Gaussian Distribution
■ Exponential and Laplace Distributions
■ The Dirac Distribution and the Empirical Distribution
■ Properties
● Information Theory
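
A minimal worked sketch of Bayes' theorem and of entropy (an information-theory basic); all probabilities are made-up illustrative values.

```python
import math

# P(disease), P(positive | disease), P(positive | no disease): illustrative numbers only.
p_d = 0.01
p_pos_given_d = 0.95
p_pos_given_not_d = 0.05

# Marginal probability of a positive test via the law of total probability.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' theorem: P(disease | positive).
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))  # about 0.161

# Entropy of a Bernoulli(p) variable, in bits.
p = 0.3
entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(round(entropy, 3))  # about 0.881
```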

Chapter 4:- Machine Learning basics


● Learning Algorithms: Linear Regression, Logistic Regression
● Bias and Variance
● Hyperparameters and Validation Sets
● Bayesian Statistics
● Supervised Learning Algorithms
● Unsupervised Learning
● Challenges Motivating Deep Learning
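
A minimal scikit-learn sketch of linear and logistic regression evaluated on a held-out validation set; the synthetic data and split sizes are illustrative assumptions, not course material.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

# Linear regression on a noisy linear target.
y_reg = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)
X_tr, X_val, y_tr, y_val = train_test_split(X, y_reg, test_size=0.25, random_state=0)
lin = LinearRegression().fit(X_tr, y_tr)
print("linear R^2 on validation:", lin.score(X_val, y_val))

# Logistic regression on a binary target derived from the same features.
y_clf = (X[:, 0] + X[:, 1] > 0).astype(int)
X_tr, X_val, y_tr, y_val = train_test_split(X, y_clf, test_size=0.25, random_state=0)
log = LogisticRegression().fit(X_tr, y_tr)
print("logistic accuracy on validation:", log.score(X_val, y_val))
```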

Module 2: Deep Learning Fundamentals

Chapter 5:- Deep Learning Fundamentals Part - 1

● What is Deep Learning?
○ Evolution of deep learning
○ Representation learning
○ History of deep learning
○ Formal Definition of deep learning
● An Overview of a Neuron in the Brain
● The Perceptron and How It Relates to Neurons
● Logistic Regression as a Neural Network
● Multi-layer Perceptron
● Notations
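
A minimal sketch of logistic regression viewed as a single neuron (a weighted sum plus a sigmoid activation), in the spirit of the items above; the weights, bias, and input are illustrative values.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.2, 3.0])   # one input example with 3 features
w = np.array([0.8, 0.1, -0.4])   # weights (one per feature)
b = 0.2                          # bias

z = w @ x + b        # linear part: the perceptron-style weighted sum
y_hat = sigmoid(z)   # non-linear activation turns it into a probability
print(y_hat)
```
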
Chapter 6:- Deep Learning Fundamentals Part - 2

● Perceptron Training
● Multi-layer Perceptron Training
● Backpropagation Training
● Activation Functions and Their Derivatives
● Building a Simple Neural Network from Scratch in Python
● Building a Simple Neural Network Using TensorFlow and PyTorch
● Designing a Neural Network Using Keras-Tuner
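
A minimal "from scratch" sketch in NumPy of a one-hidden-layer network trained with backpropagation on XOR; the layer sizes, learning rate, and epoch count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))
sigmoid = lambda z: 1 / (1 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass (gradients of the squared error through the sigmoids).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(3))  # should move toward [0, 1, 1, 0] after training
```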

Module 3: Regularization & Optimization

Chapter 7:- Regularization

● Parameter Norm Penalties
● Norm Penalties as Constrained Optimization
● Noise Robustness
● Parameter Tying and Parameter Sharing
● Multi-Task Learning
● Semi-Supervised Learning
● Bagging & Boosting
● Dataset Augmentation
● Early Stopping
● Dropout
● Implementation in PyTorch/TensorFlow
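
A minimal Keras sketch combining three of the techniques above (an L2 parameter norm penalty, dropout, and early stopping); the layer sizes and hyperparameters are illustrative, and the training call is left commented out since no dataset is assumed.

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),                             # 20 input features (illustrative)
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),  # parameter norm penalty (L2)
    layers.Dropout(0.5),                                     # dropout
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Early stopping halts training once validation loss stops improving.
early_stop = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=5,
                                              restore_best_weights=True)
# model.fit(x_train, y_train, validation_split=0.2, epochs=100, callbacks=[early_stop])
```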

Chapter 8:- Optimization


● How does learning differ from pure optimization?
● Challenges
● Batch Gradient Descent
● Stochastic Gradient Descent
● Adam
● RMSProp
● Batch Normalization
● Weight Initialization Techniques
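
A minimal Keras sketch of a few items above: a He weight-initialization scheme, batch normalization, and swapping between SGD, RMSProp, and Adam; all settings are illustrative.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    layers.Dense(128, use_bias=False, kernel_initializer="he_normal"),  # weight initialization
    layers.BatchNormalization(),                                        # batch normalization
    layers.Activation("relu"),
    layers.Dense(10, activation="softmax"),
])

# Any of these optimizers can be passed to compile(); Adam and RMSProp adapt per-parameter
# step sizes, while plain (stochastic/mini-batch) gradient descent uses one global step size.
sgd = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
rmsprop = tf.keras.optimizers.RMSprop(learning_rate=0.001)
adam = tf.keras.optimizers.Adam(learning_rate=0.001)

model.compile(optimizer=adam, loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```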

Module 4: Convolutional Neural Networks

Chapter 9:- Convolutional Neural Networks (CNNs)

● Why do we need CNNs over ANNs?
● CNN
○ Filters, Feature Maps
○ Convolutions
○ Strides
○ Pooling
○ Max-Pooling
○ Average Pooling
○ Padding with different techniques
○ Getting to know the dimensions
● CNN Architectures
○ AlexNet
○ VGG-16
○ VGG-19
○ ResNet
● Designing CNN
● Transfer Learning
● Object Detection
○ Basic object detection model
○ YOLO (You Only Look Once)
○ R-CNN
○ Fast R-CNN
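
A minimal Keras CNN sketch covering filters, strides, padding, and both pooling types; the input shape and layer sizes are illustrative and do not correspond to any of the named architectures.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),                       # 64x64 RGB images (illustrative)
    layers.Conv2D(32, kernel_size=3, strides=1,
                  padding="same", activation="relu"),         # 32 filters -> 32 feature maps
    layers.MaxPooling2D(pool_size=2),                         # max pooling
    layers.Conv2D(64, kernel_size=3, padding="valid", activation="relu"),
    layers.AveragePooling2D(pool_size=2),                     # average pooling
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.summary()  # prints the feature-map dimensions after each layer
```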

Chapter 10:- Building Applications Using ConvNets and Deploying Them

● Building a Cat and Dog Classifier
● Building a Pneumonia Detector Using Transfer Learning
● Building a Multi-class Classifier
● Building an Object Detector Using the COCO Dataset
● Approaching and Completing a Kaggle Contest
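
A minimal transfer-learning sketch for a binary cat vs. dog classifier using a frozen pretrained VGG16 base; the dataset path in the commented lines is a hypothetical placeholder.

```python
import tensorflow as tf
from tensorflow.keras import layers

base = tf.keras.applications.VGG16(include_top=False, weights="imagenet",
                                   input_shape=(160, 160, 3))
base.trainable = False  # freeze the pretrained convolutional base

model = tf.keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),  # cat vs. dog
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "data/cats_vs_dogs", image_size=(160, 160), batch_size=32)  # hypothetical path
# model.fit(train_ds, epochs=5)
```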

Module 5: Recurrent Neural Networks


Chapter 11:- Recurrent Neural Networks

● RNN Applications
● RNN Overview
● Backpropagation In RNN
● LSTMs (Long Short-Term Memory)
● Bidirectional RNNs
● Gated Recurrent Units
● Problems with RNNs
● Seq2Seq Architectures
● Transformers
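
A minimal Keras sketch of an embedding layer feeding a bidirectional LSTM; the vocabulary size and layer dimensions are illustrative, and a SimpleRNN or GRU could be swapped in for the LSTM.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Embedding(input_dim=10000, output_dim=64),  # token ids -> dense vectors
    layers.Bidirectional(layers.LSTM(32)),              # reads the sequence in both directions
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# Variants: layers.SimpleRNN(32) or layers.GRU(32) in place of the LSTM.
```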

Chapter 12:- Building Applications Using RNNs

● Basic RNN Implementation
● Sentiment Analysis Using RNNs
● Fake News Detection
● Speech Recognition
● Approaching and Completing a Kaggle Contest
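
A minimal sentiment-analysis sketch on the Keras IMDB dataset; the vocabulary size, sequence length, and training settings are illustrative choices.

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, max_len = 10000, 200
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = tf.keras.utils.pad_sequences(x_train, maxlen=max_len)
x_test = tf.keras.utils.pad_sequences(x_test, maxlen=max_len)

model = tf.keras.Sequential([
    layers.Embedding(vocab_size, 32),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),  # positive vs. negative review
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, validation_split=0.2, epochs=2, batch_size=128)
print(model.evaluate(x_test, y_test))
```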
