
Course Syllabus for CS 391L: Machine Learning

Chapter numbers refer to the text: Machine Learning by Tom Mitchell.

1. Introduction
Chapter 1. Definition of learning systems. Goals and applications of machine learning. Aspects of
developing a learning system: training data, concept representation, function approximation.
2. Inductive Classification
Chapter 2. The concept learning task. Concept learning as search through a hypothesis space. General-to-
specific ordering of hypotheses. Finding maximally specific hypotheses. Version spaces and the candidate
elimination algorithm. Learning conjunctive concepts. The importance of inductive bias.
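
For illustration, here is a minimal sketch of the Find-S approach to computing a maximally
specific conjunctive hypothesis; the attribute tuples and labels below are invented toy data,
not from the course materials.

    # Find-S: a hypothesis is a list of attribute values; '?' matches anything.
    def find_s(examples):
        """Return the maximally specific hypothesis consistent with the
        positive (attributes, label) examples."""
        hypothesis = None
        for attributes, label in examples:
            if not label:           # Find-S ignores negative examples
                continue
            if hypothesis is None:  # first positive example: copy it exactly
                hypothesis = list(attributes)
            else:                   # generalize each mismatched attribute to '?'
                for i, value in enumerate(attributes):
                    if hypothesis[i] != value:
                        hypothesis[i] = '?'
        return hypothesis

    examples = [(('sunny', 'warm', 'normal'), True),
                (('sunny', 'warm', 'high'), True),
                (('rainy', 'cold', 'high'), False)]
    print(find_s(examples))  # ['sunny', 'warm', '?']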
3. Decision Tree Learning
Chapter 3. Representing concepts as decision trees. Recursive induction of decision trees. Picking the best
splitting attribute: entropy and information gain. Searching for simple trees and computational complexity.
Occam's razor. Overfitting, noisy data, and pruning.
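
As a quick illustration of the splitting heuristic, the sketch below computes entropy and
information gain for one attribute; the four-row dataset is hypothetical.

    from collections import Counter
    from math import log2

    def entropy(labels):
        """H(S) = -sum_i p_i * log2(p_i) over the class distribution."""
        n = len(labels)
        return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

    def information_gain(rows, labels, attr):
        """Gain(S, A) = H(S) - sum_v (|S_v|/|S|) * H(S_v)."""
        n = len(labels)
        remainder = 0.0
        for value in set(row[attr] for row in rows):
            subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
            remainder += len(subset) / n * entropy(subset)
        return entropy(labels) - remainder

    rows = [{'outlook': 'sunny'}, {'outlook': 'sunny'},
            {'outlook': 'rain'}, {'outlook': 'rain'}]
    labels = ['no', 'no', 'yes', 'yes']
    print(information_gain(rows, labels, 'outlook'))  # 1.0: a perfect split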
4. Ensemble Learning
(read this paper) Using committees of multiple hypotheses. Bagging, boosting, and DECORATE. Active
learning with ensembles.
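
A rough sketch of the bagging idea follows: each committee member is trained on a bootstrap
resample, and predictions are combined by majority vote. The base learner here (a trivial
majority-class predictor) and the data are stand-ins, not the paper's setup.

    import random
    from collections import Counter

    def bagging_train(learner, data, n_members=11):
        """Train n_members hypotheses, each on a bootstrap resample of data."""
        return [learner([random.choice(data) for _ in data])
                for _ in range(n_members)]

    def bagging_predict(ensemble, x):
        """Majority vote over the committee's predictions for x."""
        return Counter(h(x) for h in ensemble).most_common(1)[0][0]

    def majority_learner(sample):
        """Toy base learner: always predict the sample's majority class."""
        majority = Counter(label for _, label in sample).most_common(1)[0][0]
        return lambda x: majority

    data = [((0,), 'a'), ((1,), 'a'), ((2,), 'b')]
    ensemble = bagging_train(majority_learner, data)
    print(bagging_predict(ensemble, (0,)))  # 'a' with high probability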
5. Experimental Evaluation of Learning Algorithms
Chapter 5. Measuring the accuracy of learned hypotheses. Comparing learning algorithms: cross-
validation, learning curves, and statistical hypothesis testing.
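
For concreteness, a minimal k-fold cross-validation sketch; train and accuracy are assumed
stand-ins for an actual learner and evaluation function, and data is assumed to have at least
k examples.

    def k_fold_cross_validation(data, train, accuracy, k=10):
        """Average held-out accuracy over k folds of data."""
        folds = [data[i::k] for i in range(k)]  # simple interleaved split
        scores = []
        for i in range(k):
            held_out = folds[i]
            training = [x for j, fold in enumerate(folds) if j != i for x in fold]
            scores.append(accuracy(train(training), held_out))
        return sum(scores) / k

Comparing two algorithms then amounts to a statistical test (e.g. a paired t-test) over their
per-fold scores.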
6. Computational Learning Theory
Chapter 7. Models of learnability: learning in the limit; probably approximately correct (PAC) learning.
Sample complexity: quantifying the number of examples needed to PAC learn. Computational complexity
of training. Sample complexity for finite hypothesis spaces. PAC results for learning conjunctions, kDNF,
and kCNF. Sample complexity for infinite hypothesis spaces, Vapnik-Chervonenkis dimension.
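
As a worked instance of the finite-hypothesis-space result, the sketch below evaluates the
standard bound m >= (1/epsilon)(ln|H| + ln(1/delta)); the numbers plugged in are illustrative.

    from math import ceil, log

    def pac_sample_bound(h_size, epsilon, delta):
        """Examples sufficient for a consistent learner over a finite space of
        h_size hypotheses to be PAC: error <= epsilon with prob. >= 1 - delta."""
        return ceil((log(h_size) + log(1 / delta)) / epsilon)

    # Conjunctions over n = 10 boolean attributes: each attribute appears
    # positively, negated, or not at all, so |H| = 3**10 + 1 with the empty concept.
    print(pac_sample_bound(3**10 + 1, epsilon=0.1, delta=0.05))  # 140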
7. Rule Learning: Propositional and First-Order
Chapter 10. Translating decision trees into rules. Heuristic rule induction using separate and conquer and
information gain. First-order Horn-clause induction (Inductive Logic Programming) and Foil. Learning
recursive rules. Inverse resolution, Golem, and Progol.
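
The sketch below gives a much-simplified flavor of separate-and-conquer rule growing in the
propositional case: add attribute tests while they strictly improve precision on the covered
examples. Real systems such as Foil score candidate tests with an information-gain measure
instead, and the dataset and dict-based representation here are invented.

    def precision_of(subset):
        """Fraction of positive examples in a covered subset."""
        return sum(lab for _, lab in subset) / len(subset) if subset else 0.0

    def learn_one_rule(examples, attributes):
        """Greedily grow attr=value tests until the covered set is pure."""
        rule, covered = {}, list(examples)
        while any(not lab for _, lab in covered):
            best_test, best_prec = None, precision_of(covered)
            for a in attributes:
                if a in rule:
                    continue
                for v in {ex[a] for ex, _ in covered}:
                    sub = [(ex, lab) for ex, lab in covered if ex[a] == v]
                    if precision_of(sub) > best_prec:
                        best_test, best_prec = (a, v), precision_of(sub)
            if best_test is None:  # no remaining test improves purity
                break
            rule[best_test[0]] = best_test[1]
            covered = [(ex, lab) for ex, lab in covered
                       if ex[best_test[0]] == best_test[1]]
        return rule

    data = [({'outlook': 'sunny', 'humidity': 'high'}, False),
            ({'outlook': 'sunny', 'humidity': 'normal'}, True),
            ({'outlook': 'rain', 'humidity': 'high'}, False),
            ({'outlook': 'overcast', 'humidity': 'high'}, True)]
    print(learn_one_rule(data, ['outlook', 'humidity']))  # {'outlook': 'overcast'}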
8. Artificial Neural Networks
Chapter 4. Neurons and biological motivation. Linear threshold units. Perceptrons: representational
limitation and gradient descent training. Multilayer networks and backpropagation. Hidden layers and
constructing intermediate, distributed representations. Overfitting, learning network structure, recurrent
networks.
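
A minimal perceptron training sketch using the classic update rule w <- w + eta * (t - o) * x;
the logical-AND dataset is just a convenient linearly separable toy problem.

    def train_perceptron(data, n_inputs, eta=0.1, epochs=50):
        """Learn weights for a linear threshold unit; weights[0] is the bias."""
        weights = [0.0] * (n_inputs + 1)
        for _ in range(epochs):
            for x, target in data:
                x = [1.0] + list(x)  # prepend the constant bias input
                output = 1 if sum(w * xi for w, xi in zip(weights, x)) > 0 else 0
                for i in range(len(weights)):  # no-op when output == target
                    weights[i] += eta * (target - output) * x[i]
        return weights

    and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w = train_perceptron(and_data, n_inputs=2)
    print([1 if w[0] + w[1] * a + w[2] * b > 0 else 0
           for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 0, 0, 1]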
9. Support Vector Machines
(Paper handouts) Maximum margin linear separators. Quadratic programming solution to finding
maximum margin separators. Kernels for learning non-linear functions.
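
To make the kernel idea concrete: a kernel K(x, z) returns a dot product in an implicit
feature space, so a maximum-margin linear separator there yields a non-linear boundary in the
input space. Two standard kernels, sketched with toy vectors:

    from math import exp

    def polynomial_kernel(x, z, degree=2):
        """K(x, z) = (x . z + 1)**d, the inhomogeneous polynomial kernel."""
        return (sum(a * b for a, b in zip(x, z)) + 1) ** degree

    def rbf_kernel(x, z, gamma=1.0):
        """K(x, z) = exp(-gamma * ||x - z||**2), the Gaussian (RBF) kernel."""
        return exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

    # The Gram matrix over the training sample is what a QP solver consumes.
    sample = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
    gram = [[rbf_kernel(x, z) for z in sample] for x in sample]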
10. Bayesian Learning
Chapter 6 and new on-line chapter. Probability theory and Bayes rule. Naive Bayes learning algorithm.
Parameter smoothing. Generative vs. discriminative training. Logistic regression. Bayes nets and Markov
nets for representing dependencies.
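
A compact sketch of Naive Bayes with add-one (Laplace) parameter smoothing over discrete
attributes; the smoothing denominator uses the number of observed values per attribute, and
the weather-style data is invented.

    from collections import Counter, defaultdict
    from math import log

    def train_naive_bayes(examples):
        """Collect counts for P(class) and smoothed P(attr = value | class)."""
        class_counts = Counter(label for _, label in examples)
        value_counts = defaultdict(Counter)  # (class, attr index) -> value counts
        values = defaultdict(set)            # attr index -> values seen
        for attrs, label in examples:
            for i, v in enumerate(attrs):
                value_counts[(label, i)][v] += 1
                values[i].add(v)
        return class_counts, value_counts, values, len(examples)

    def predict(model, attrs):
        class_counts, value_counts, values, n = model
        best, best_score = None, float('-inf')
        for label, count in class_counts.items():
            score = log(count / n)  # log prior
            for i, v in enumerate(attrs):
                num = value_counts[(label, i)][v] + 1  # add-one smoothing
                score += log(num / (count + len(values[i])))
            if score > best_score:
                best, best_score = label, score
        return best

    data = [(('sunny', 'hot'), 'no'), (('rain', 'mild'), 'yes'),
            (('rain', 'hot'), 'yes'), (('sunny', 'mild'), 'no')]
    print(predict(train_naive_bayes(data), ('rain', 'hot')))  # 'yes'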
11. Instance-Based Learning
Chapter 8. Constructing explicit generalizations versus comparing to past specific examples. k-Nearest-
neighbor algorithm. Case-based learning.
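
A minimal k-nearest-neighbor sketch: nothing is generalized at training time; a query is
classified by a vote of the k stored examples closest to it. The points and k are illustrative.

    from collections import Counter

    def knn_classify(examples, query, k=3):
        """Majority label among the k training points nearest to query."""
        def sq_distance(x):
            return sum((a - b) ** 2 for a, b in zip(x, query))
        nearest = sorted(examples, key=lambda ex: sq_distance(ex[0]))[:k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]

    examples = [((0.0, 0.0), 'a'), ((0.1, 0.2), 'a'), ((1.0, 1.0), 'b'),
                ((0.9, 1.1), 'b'), ((1.2, 0.8), 'b')]
    print(knn_classify(examples, (0.2, 0.1)))  # 'a'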
12. Text Classification
Bag of words representation. Vector space model and cosine similarity. Relevance feedback and Rocchio
algorithm. Versions of nearest neighbor and Naive Bayes for text.
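
For instance, cosine similarity over bag-of-words vectors can be sketched as below; raw term
counts stand in for the TF-IDF weights a real vector-space system would use.

    from collections import Counter
    from math import sqrt

    def cosine_similarity(doc_a, doc_b):
        """cos(a, b) = (a . b) / (|a| * |b|) over word-count vectors."""
        a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
        dot = sum(a[t] * b[t] for t in a)
        norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    print(cosine_similarity("machine learning syllabus",
                            "machine learning course"))  # ~0.67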

13. Clustering and Unsupervised Learning
(Chapter 14 from Manning and Schutze text) Learning from unclassified data. Clustering. Hierarchical
Agglomerative Clustering. k-means partitional clustering. Expectation maximization (EM) for soft
clustering. Semi-supervised learning with EM using labeled and unlabeled data.
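
A bare-bones k-means sketch: alternate between assigning points to the nearest centroid and
recomputing each centroid as its cluster's mean. The naive initialization, iteration cap, and
data are all illustrative.

    def k_means(points, k, iterations=20):
        centroids = list(points[:k])  # naive initialization: first k points
        for _ in range(iterations):
            clusters = [[] for _ in range(k)]
            for p in points:  # assignment step: nearest centroid by squared distance
                i = min(range(k), key=lambda c: sum(
                    (a - b) ** 2 for a, b in zip(p, centroids[c])))
                clusters[i].append(p)
            for i, cluster in enumerate(clusters):  # update step: cluster means
                if cluster:
                    centroids[i] = tuple(sum(dim) / len(cluster)
                                         for dim in zip(*cluster))
        return centroids

    points = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3),
              (5.0, 5.0), (5.2, 4.9), (4.9, 5.1)]
    print(k_means(points, k=2))  # approximately [(0.1, 0.13), (5.03, 5.0)]

EM for soft clustering replaces the hard assignment step with fractional, probability-weighted
assignments and the update step with weighted means.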
14. Language Learning
(paper handouts) Classification problems in language: word-sense disambiguation, sequence labeling.
Hidden Markov models (HMM's). Viterbi algorithm for determining most-probable state sequences.
Forward-backward EM algorithm for training the parameters of HMM's. Use of HMM's for speech
recognition, part-of-speech tagging, and information extraction. Conditional random fields (CRF's).
Probabilistic context-free grammars (PCFG). Parsing and learning with PCFGs. Lexicalized PCFGs.
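
To ground the HMM material, a compact Viterbi sketch; the two-state weather model and its
probabilities are a made-up example, not course data.

    def viterbi(observations, states, start_p, trans_p, emit_p):
        """Most probable state sequence by dynamic programming over a trellis."""
        trellis = [{s: (start_p[s] * emit_p[s][observations[0]], [s])
                    for s in states}]
        for obs in observations[1:]:
            column = {}
            for s in states:  # pick the best predecessor for each state
                prob, prev = max((trellis[-1][p][0] * trans_p[p][s], p)
                                 for p in states)
                column[s] = (prob * emit_p[s][obs], trellis[-1][prev][1] + [s])
            trellis.append(column)
        return max(trellis[-1].values())  # (probability, state sequence)

    states = ('rainy', 'sunny')
    start_p = {'rainy': 0.5, 'sunny': 0.5}
    trans_p = {'rainy': {'rainy': 0.7, 'sunny': 0.3},
               'sunny': {'rainy': 0.4, 'sunny': 0.6}}
    emit_p = {'rainy': {'walk': 0.1, 'shop': 0.9},
              'sunny': {'walk': 0.8, 'shop': 0.2}}
    print(viterbi(('walk', 'shop', 'shop'), states, start_p, trans_p, emit_p))
    # ~(0.0907, ['sunny', 'rainy', 'rainy'])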
