1152ec - Optimization Technique-Draft Copy - 20th July

The document outlines a course titled 'Optimization Technique' which is a program elective worth 3 credits, focusing on optimization methods and their applications in fields like machine learning. It includes course outcomes related to understanding optimization concepts, convex and non-convex optimization, and special topics in the field. The course has no prerequisites and covers various units over 45 hours, supported by key references and learning resources.

Uploaded by Arun

Course Code Course Title L T P C

10212EC152 OPTIMIZATION TECHNIQUE 2 2 0 3

a) Course Category

Program Elective

b) Preamble
Optimization plays a critical role in contemporary areas such as decision and
control, signal processing, and machine learning. This course presents a thorough
treatment of optimization techniques with specific emphasis on modern applications. It will
provide students with a sound background in the area and benefit those who wish to pursue
doctoral- or master-level theses in this subject, or to apply these techniques in their own
fields of work.

c) Prerequisite
Nil

d) Related Courses
Machine Learning
Deep Learning

e) Course Outcomes
Upon the successful completion of the course, students will be able to

CO Nos. | Course Outcomes                                      | Knowledge Level (Based on Revised Bloom's Taxonomy)
CO1     | Understand the basic concepts of Optimization        | K2
CO2     | Understand the concepts of Convex Optimization       | K2
CO3     | Understand optimization for Machine Learning         | K3
CO4     | Understand the concepts of Non-Convex Optimization   | K3
CO5     | Describe special topics in Optimization techniques   | K3


f) Correlation of COs with POs (Program Outcomes defined by National Board of Accreditation,
India)
      PO1  PO2  PO3  PO4  PO5  PO6  PO7  PO8  PO9  PO10  PO11  PO12  PSO1  PSO2
CO1   M    L    L    L    L    L    -    -    L    -     L     L     -     -
CO2   M    H    L    L    M    M    -    -    L    -     L     L     -     -
CO3   M    H    M    M    M    H    -    -    L    -     L     L     -     -
CO4   M    M    L    M    M    M    -    -    L    -     L     L     -     -
CO5   M    H    H    H    H    M    -    -    L    -     L     L     -     -

g) Course Content

UNIT-I
Introduction
Introduction to Optimization, sequences and limits, derivative matrix, level sets and gradients, Taylor series;
unconstrained optimization: necessary and sufficient conditions for optima. (9 hours)
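The first- and second-order optimality conditions listed above can be checked by hand on a one-variable example; the function below is an illustrative choice, not one prescribed by the syllabus:

```python
# f(x) = (x - 3)^2 + 1 has its unique minimizer at x = 3:
# f'(3) = 0 (first-order necessary condition) and
# f''(3) = 2 > 0 (second-order sufficient condition).

def f(x):
    return (x - 3.0) ** 2 + 1.0

def f_prime(x):
    return 2.0 * (x - 3.0)

def f_double_prime(x):
    return 2.0

print(f_prime(3.0))         # 0.0
print(f_double_prime(3.0))  # 2.0
print(f(3.0))               # 1.0 (optimal value)
```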

UNIT-II
Convex Optimization
Convex sets, convex functions, optima of convex functions, convex optimization problems; steepest
descent, Newton and quasi-Newton methods, conjugate direction methods. (9 hours)
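To make the Unit-II iterations concrete, a minimal Python sketch of steepest descent and a single Newton step on a toy convex quadratic f(x, y) = x^2 + 4y^2; the function, step size, and iteration count are illustrative choices:

```python
# Steepest descent on the convex quadratic f(x, y) = x^2 + 4y^2,
# whose unique minimizer is the origin.

def gradient(x, y):
    return 2.0 * x, 8.0 * y

def steepest_descent(x, y, step=0.1, iters=200):
    for _ in range(iters):
        gx, gy = gradient(x, y)
        x, y = x - step * gx, y - step * gy
    return x, y

x, y = steepest_descent(5.0, -3.0)
print(abs(x) < 1e-6 and abs(y) < 1e-6)  # True: converged to (0, 0)

# Newton's method minimizes a quadratic in a single step:
# (x, y) <- (x, y) - H^{-1} grad f, with Hessian H = diag(2, 8) here.
x, y = 5.0, -3.0
gx, gy = gradient(x, y)
x, y = x - gx / 2.0, y - gy / 8.0
print((x, y))  # (0.0, 0.0)
```

The contrast shows why Newton-type methods are covered alongside steepest descent: curvature information collapses many gradient steps into one on well-conditioned problems.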
UNIT-III
Optimization for Machine Learning
Introduction to convex optimization; incremental gradient, subgradient and proximal methods; nonsmooth
convex optimization; DC (Difference of Convex functions) programming. (9 hours)
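The proximal machinery behind the nonsmooth methods in this unit can be illustrated with the proximal operator of the l1 norm, which has the well-known soft-thresholding closed form; the numeric inputs below are illustrative:

```python
# prox of lam * |.| is soft-thresholding: shrink v toward zero by lam.
# This is the key subroutine of proximal methods for nonsmooth
# problems such as the lasso.

def soft_threshold(v, lam):
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

print(soft_threshold(3.0, 1.0))   # 2.0
print(soft_threshold(-0.5, 1.0))  # 0.0  (shrunk exactly to zero)
print(soft_threshold(-2.5, 1.0))  # -1.5
```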
UNIT-IV
Non-Convex Optimization
Sparse recovery, affine rank minimization, low-rank matrix completion; non-convex approaches: projected
gradient descent, alternating minimization. (9 hours)
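Projected gradient descent, one of the non-convex approaches listed above, alternates a gradient step with projection onto the feasible set. A minimal one-dimensional sketch, with an illustrative objective and constraint set:

```python
# Minimize f(x) = (x - 2)^2 subject to x in [0, 1].
# The unconstrained minimizer (x = 2) is infeasible, so projected
# gradient descent converges to the boundary point x = 1.

def project(x, lo=0.0, hi=1.0):
    return max(lo, min(hi, x))

x = 0.0
for _ in range(100):
    grad = 2.0 * (x - 2.0)          # gradient step ...
    x = project(x - 0.1 * grad)     # ... then project back onto [0, 1]
print(x)  # 1.0
```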

UNIT-V
Special Topics
Accelerated first-order methods, Bayesian methods, coordinate methods, cutting plane methods, interior
point methods, optimization methods for deep learning, parallel and distributed methods. (9 hours)
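Of the special topics, accelerated first-order methods are the easiest to sketch. Below is Polyak's heavy-ball iteration, one classical form of acceleration, on a toy quadratic; the step size and momentum coefficient are illustrative choices:

```python
# Heavy-ball (momentum) iteration on f(x) = x^2:
#   x_{k+1} = x_k - step * f'(x_k) + momentum * (x_k - x_{k-1})

def f_prime(x):
    return 2.0 * x

x_prev = x = 10.0
step, momentum = 0.1, 0.5
for _ in range(200):
    x, x_prev = x - step * f_prime(x) + momentum * (x - x_prev), x
print(abs(x) < 1e-6)  # True: converged to the minimizer x = 0
```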

Total: 45 Hours
Learning Resources
References:

 S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004. (Units I & II)
 Suvrit Sra, Sebastian Nowozin and Stephen Wright (Editors), Optimization for Machine Learning, The MIT Press, 2011. (Unit III)
 T. Hastie, R. Tibshirani and M. J. Wainwright, Statistical Learning with Sparsity: The Lasso and Generalizations, Chapman and Hall/CRC Press, 2015. http://web.stanford.edu/~hastie/StatLearnSparsity/
 Papers from leading conferences and journals in optimization, as well as applied areas such as signal processing, information theory, and machine learning.
