This document contains a question bank for the Machine Learning Techniques course offered in the first semester of the academic year 2019–2020 by the Department of Computer Science and Engineering at SRM Valliammai Engineering College. It includes questions on topics from Units I–V of the syllabus, including machine learning, supervised and unsupervised learning, concept learning, decision trees, multilayer perceptrons, and support vector machines. The questions are divided into Parts A, B, and C with competency levels ranging from basic recall to complex problem solving.


SRM VALLIAMMAI ENGINEERING COLLEGE

(An Autonomous Institution)


SRM Nagar, Kattankulathur – 603 203

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

QUESTION BANK

I SEMESTER

1912105-MACHINE LEARNING TECHNIQUES

Academic Year 2019 – 20 ODD

Prepared by

Dr. B. Muthusenthil, Associate Professor/CSE


Unit -I
SYLLABUS
Learning – Types of Machine Learning – Supervised Learning – The Brain and the Neuron – Design a
Learning System – Perspectives and Issues in Machine Learning – Concept Learning Task – Concept Learning
as Search –Finding a Maximally Specific Hypothesis – Version Spaces and the Candidate Elimination
Algorithm – Linear Discriminants – Perceptron – Linear Separability – Linear Regression.

PART – A
Q.No Questions Competence BT Level

1 What is machine learning? Remember BTL1


2 Point out a few examples of machine learning applications. Analyze BTL 4
3 Distinguish between supervised and unsupervised learning. Understand BTL 2
4 Define the types of machine learning. Remember BTL 1
5 Describe the steps to be followed for designing a learning Understand BTL 2
system for playing checkers.
6 Discover one useful perspective on machine learning. Apply BTL3
7 Explain the issues in machine learning. Analyze BTL 4
8 Explain concept learning. Evaluate BTL 5
9 Define Inductive Learning Hypothesis. Remember BTL 1
10 Assess the EnjoySport learning task. Evaluate BTL 5
11 Define concept learning as search. Remember BTL 1
12 Develop candidate-elimination algorithm using version spaces. Create BTL 6
13 Define General-to-Specific Ordering of Hypotheses. Remember BTL 1
14 Identify the ways to represent the LIST-THEN-ELIMINATE Remember BTL1
algorithm.
15 Examine FIND-S: finding a maximally specific hypothesis. Apply BTL 3
16 Discuss candidate-elimination algorithm. Understand BTL 2
17 Discover perceptron. Apply BTL 3
18 Develop the linear discriminants. Create BTL 6
19 Analyze version space. Analyze BTL 4
20 Describe the three levels of analysis in information processing by Understand BTL 2
the brain.
PART-B (13 MARK )
1 (i)What is machine learning? Discuss about learning and machine Analyze BTL 4
learning. (7)
(ii)Discuss the various types of machine learning. (6)
2 (i)Discuss in detail about Supervised learning. (7) Understand BTL 2
(ii)Discuss about the Classification problem.(6)
3 (i)Summarize the Issues in Machine Learning.(7) Evaluate BTL 5
(ii)Summarize the concept learning task.(6)
4 (i)Describe the concept learning as search. (7) Remember BTL 1
(ii)Describe General-to-Specific Ordering of Hypothesis for concept
learning as a search. (6)
5 Explain FIND-S: finding a maximally specific hypothesis. (13) Apply BTL 3
6 (i)Explain CANDIDATE-Elimination algorithm. (7) Apply BTL 3
(ii)Explain in detail about the List-Then-Eliminate algorithm. (6)

7 With a neat diagram, explain the More Compact Representation for Create BTL 6
Version Spaces. (13)
8 (i)Describe in detail about Linear Discriminants. (7) Remember BTL 1
(ii)Discuss: Generalizing the Linear Model and Geometry of the Linear
Discriminant. (6)
9 (i)Discuss about perceptron. (7) Understand BTL 2
(ii)Discuss about the perceptron learning Algorithm with an
example.(6)
10 (i)Describe about the working of Brain and the Neuron (7) Remember BTL 1
(ii)Explain about the Limitations of the McCulloch and Pitts Neuronal
Model (6)

11 (i)Describe Find-S and explain the algorithm. (7) Remember BTL 1


(ii)Discuss about Inductive Bias. (6)

12 Summarize in detail about Discrimination by Regression with an Understand BTL 2


example. (13)
13 (i)Explain in detail about linear separability. (7) Analyze BTL 4
(ii)Discuss Limitations of the McCulloch and Pitts Neuronal Model. (6)

14 Explain about Linear Regression with an example. (13) Analyze BTL 4


PART - C (15 MARK )
1 Develop a program to learn to play checkers, with the goal of Create BTL 6
entering it in the world checkers tournament.
(i)Choosing the Training Experience
(ii)Choosing the Target Function
(iii)Choosing a Representation for the Target Function. (15)
2 Explain the following for designing a learning system Analyze BTL 4
(i)choosing a function approximation algorithm
(ii)estimating training values
(iii)adjusting the weights
(iv)the final design. (15)
3 (i)Describe in detail about Pairwise Separation of classes. Remember BTL 1
(ii)Explain Parametric Discrimination. (15)

4 Describe about Vapnik-Chervonenkis (VC) Dimension. (15) Remember BTL 1
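Several Unit I questions above concern the perceptron, its learning algorithm and linear separability. The following is a minimal illustrative sketch only; the AND training data, learning rate and epoch count are assumptions chosen for the demonstration and are not part of the question bank.

```python
# Illustrative sketch: a perceptron trained on the (linearly separable)
# AND function with the standard weight update rule.
# Data, learning rate and epochs are hypothetical choices for this demo.

def train_perceptron(samples, eta=0.25, epochs=10):
    """samples: list of (inputs, target) pairs with targets in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n          # one weight per input
    b = 0.0                # bias weight
    for _ in range(epochs):
        for x, t in samples:
            # threshold activation
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            # perceptron weight update rule: w <- w + eta * (t - y) * x
            w = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
            b += eta * (t - y)
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# AND is linearly separable, so the perceptron converges on it
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # [0, 0, 0, 1]
```

The same loop fails to converge on a non-separable target such as XOR, which is the usual classroom illustration of the linear separability limitation asked about in Part B.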


Unit -II
SYLLABUS
Learning with Trees - Decision Trees - Constructing Decision Trees - Classification and Regression Trees -
Ensemble Learning - Boosting - Bagging - Different Ways to Combine Classifiers - Probability and Learning -
Data into Probabilities - Basic Statistics - Gaussian Mixture Models - Nearest Neighbor Methods -
Unsupervised Learning - K means Algorithms - Vector Quantization - Self Organizing Feature Map.

PART - A
Q.No Questions Competence BT Level
1 Define Decision trees. Remember BTL1
2 What is decision node? Remember BTL1
3 What is regression tree? Remember BTL1
4 Illustrate an example for decision tree with diagram. Apply BTL3
5 Describe classification tree construction. Understand BTL2
6 List out the difference between pre and post pruning. Remember BTL1
7 Show the difference between Diversity and Accuracy. Apply BTL3
8 Define pruning. Remember BTL1
9 What is boosting? Understand BTL2
10 Classify the different ways to combine classifiers. Apply BTL3
11 Name the difference between regression tree and classification tree. Understand BTL2
12 Write the error function used for the perceptron. Remember BTL1
13 Explain what is Bayes optimal classification. Evaluate BTL5
14 Define bagging. Remember BTL1
15 What is the No Free Lunch Theorem? Understand BTL2
16 Can you give a few basic statistics concepts? Evaluate BTL5
17 Perform the k-means algorithm. Create BTL6
18 Compose kernel smoothers. Create BTL6
19 What is vector quantization? Understand BTL2
20 Differentiate between k-means and online k-means algorithms. Analyze BTL4
PART-B (13 MARK )
1 (i)What is decision tree? Construct a decision tree. (7) Create BTL6
(ii)Write about univariate trees. (6)
2 (i)Describe the Classification Trees. (7) Remember BTL1
(ii)Draw and explain the Entropy function for a class problem. (6)
3 (i)Explain in detail about regression trees. (7) Analyze BTL4
(ii)How the regression tree smoothes for various values of θr. (6)
4 Discuss the classification example for the decision tree. (13) Understand BTL2
5 (i)Use decision tree to classify the students in a class based on Apply BTL3
their academics. (7)
(ii)Discuss about the Mixture of Experts algorithm. (6)
6 (i)What is the Naïve Bayes Classifier? (7) Remember BTL1
(ii)Explain the following basic statistics: Averages, Variance and
Covariance. (6)
7 (i)Illustrate turning data into probabilities. (7) Apply BTL3
(ii)Draw the histogram of feature values. (6)
8 (i)Write about Classification and Regression Trees (CART) Remember BTL1
with examples. (7)
(ii)Describe the Expectation-Maximization (EM) Algorithm. (6)
9 Explain the following statistics: Understand BTL2
(i)Gaussian. (6)
(ii)Bias-Variance Trade-off. (7)
10 (i)Describe the Nearest Neighbor Algorithm. (7) Understand BTL2
(ii)Write in detail about Nearest Neighbor Smoothing. (6)
11 (i)Discuss the basic idea of ensemble learning. (7) Understand BTL2
(ii)Write in detail about the various ensemble methods. (6)
12 (i)Write about unsupervised learning. (7) Evaluate BTL5
(ii)Discuss the k-means algorithm in detail. (6)
13 Explain vector quantization in detail. (13) Analyze BTL4
14 (i)Illustrate the self-organizing feature map. (7) Create BTL6
(ii)Illustrate the SOM algorithm. (6)
PART - C (15 MARK )
1 (i)Create classification tree construction. (8) Create BTL6
(ii)Describe the regression tree algorithm. (7)
2 (i)Write in detail about Boosting and AdaBoost. (7) Evaluate BTL5
(ii)Discuss what is Bagging and Subagging. (8)
3 (i)Explain in detail about the Gaussian Mixture Model. (7) Evaluate BTL5
(ii)Write the general Expectation-Maximization (EM) Algorithm. (8)
4 (i)Design the K-means algorithm and group the points (1, 0, 1), Create BTL6
(1, 1, 0), (0, 0, 1) and (1, 1, 1) using K-means. (7)
(ii)Perform normalization for the Neural network. (8)

Unit -III
SYLLABUS
Multi-layer Perceptron - Going Forwards - Going Backwards: Back Propagation Error - Multi-layer
Perceptron in Practice - Example of using the MLP - Overview - Deriving Back-Propagation - Radial Basis
Functions and Splines - Concepts - RBF Network - Curse of Dimensionality - Interpolations and Basis
Functions - Support Vector Machines.

PART - A
Q.No Questions Competence BT Level
1 Analyze the Back propagation Algorithm. Analyze BTL4
2 Write about Multilayer Perceptrons. Remember BTL1
3 List out different output activation functions. Remember BTL1
4 Discuss Local minima. Understand BTL2
5 What is testing, training and validation? Remember BTL1
6 Give the difference between local and distributed representation. Understand BTL2
7 Write about the receptive field. Remember BTL1
8 Point out the two parts of the MLP. Understand BTL2
9 Explain hybrid learning. Analyze BTL4
10 How to write the weight update rule? Analyze BTL4
11 Illustrate the curse of dimensionality. Apply BTL3
12 Describe the various steps of deriving Back Propagation. Apply BTL3
13 What is a support vector machine? Apply BTL3
14 Explain the principles of gradient descent. Evaluate BTL5
15 Define Biases. Remember BTL1
16 Differentiate optimal separating hyperplane and soft margin Analyze BTL4
hyperplane.
17 Analyze the RBF Network. Analyze BTL4
18 Summarize Back propagation of error. Evaluate BTL5
19 Justify radial basis function. Remember BTL1
20 How do smoothing splines work? Create BTL6
PART-B (13 MARK )
1 (i)Write about the multilayer perceptron. (7) Remember BTL1
(ii)Draw the structure of the MLP network with input and output
layers. (6)
2 (i)Define the Back propagation Rule. (7) Remember BTL1
(ii)Describe the training rule for output unit weights. (6)
3 (i)Write about the Multilayer perceptron algorithm. (7) Remember BTL1
(ii)Describe how the MLP is designed to be a batch algorithm. (6)
4 (i)What is Local Minima? (7) Remember BTL1
(ii)Discuss in detail about picking up Momentum. (6)
5 Discuss the following: Understand BTL2
(i)Data preparation in MLP. (5)
(ii)Amount of training data. (4)
(iii)Number of hidden layers. (4)
6 (i)Explain the working behavior of the support vector machine. (7) Understand BTL2
(ii)How to train, test and validate the MLP. (6)
7 (i)Discuss the regression problem of using the MLP. (7) Analyze BTL4
(ii)How data compression is performed. (6)
8 (i)Illustrate generalization of multivariate data. (7) Apply BTL3
(ii)Describe in detail about the Curse of Dimensionality. (6)
9 Explain in detail about the RBF network. (13) Apply BTL3
10 (i)Create a two class problem and describe the support vector Analyze BTL4
machine. (7)
(ii)Investigate how to choose the smoothing parameter. (6)
11 (i)Explain the optimal separating hyperplane. (7) Analyze BTL4
(ii)Examine the soft margin hyperplane. (6)
12 (i)Generate a recurrent network and its equivalent unfolded Analyze BTL4
network. (7)
(ii)Discuss about back propagation through time. (6)
13 (i)Summarize about back propagation of error. (7) Evaluate BTL5
(ii)Explain how the weights of the network are trained. (6)
14 (i)Justify your answer why we use Interpolation and Basis Create BTL6
functions. (7)
(ii)Discuss in detail about Smoothing splines. (6)
PART - C (15 MARK )
1 Write about the MLP as a Universal Approximator. (15) Evaluate BTL5
2 (i)Create an example and explain Principal Component Create BTL6
Analysis. (8)
(ii)Describe Multiclass Discrimination. (7)
3 Illustrate some examples of using the MLP and the four types of Evaluate BTL5
problems that are generally solved using the MLP. (15)
4 Write a case study for the following (15) Create BTL6
(i)Illustrate an example of using the MLP
(ii)Task involved
(iii)Input choice
(iv)Input encoding and output encoding.
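One of the Part C questions above asks to group the points (1, 0, 1), (1, 1, 0), (0, 0, 1) and (1, 1, 1) using the K-means algorithm. A minimal worked sketch follows; choosing k = 2 and taking the first two points as initial centroids are assumptions made for the illustration, since the question itself does not fix them.

```python
# Illustrative K-means sketch for the Part C question above.
# k = 2 and the initial centroids (the first two points) are assumptions.

def kmeans(points, centroids, iterations=10):
    for _ in range(iterations):
        # assignment step: each point goes to its nearest centroid
        clusters = [[] for _ in centroids]
        for p in points:
            d = [sum((pi - ci) ** 2 for pi, ci in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # update step: move each centroid to the mean of its cluster
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids, clusters

points = [(1, 0, 1), (1, 1, 0), (0, 0, 1), (1, 1, 1)]
centroids, clusters = kmeans(points, centroids=[points[0], points[1]])
print(clusters)  # [[(1, 0, 1), (0, 0, 1), (1, 1, 1)], [(1, 1, 0)]]
```

With these starting centroids the algorithm converges after two passes: (1, 0, 1), (0, 0, 1) and (1, 1, 1) form one cluster and (1, 1, 0) the other; a different initialization could give a different grouping, which is worth noting in an answer.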

Unit -IV
SYLLABUS
Dimensionality Reduction - Linear Discriminant Analysis - Principal Component Analysis - Factor Analysis -
Independent Component Analysis - Locally Linear Embedding - Isomap - Least Squares Optimization -
Evolutionary Learning - Genetic Algorithms - Generating Offspring: Genetic Operators - Using Genetic
Algorithms - Reinforcement Learning - Overview - Getting Lost - Example - Markov Decision Process.

PART - A
Q.No Questions Competence BT Level

1 Point out why dimensionality reduction is useful? Analyze BTL4


2 Define Factor Analysis or latent variables. Remember BTL1
3 Distinguish between within-class scatter and between-classes scatter. Understand BTL2
4 Define PCA. Remember BTL1
5 Describe what isomap is. Understand BTL2
6 Discover Locally Linear Embedding algorithm with k=12. Apply BTL3
7 Explain the three different ways to do dimensionality reduction. Analyze BTL4
8 Explain what Least Squares Optimization is. Evaluate BTL5
9 Difference between action and state space. Remember BTL1
10 Write what is Punctuated Equilibrium? Evaluate BTL5
11 How does a reinforcement learner map experience to the corresponding action? Remember BTL1
12 Express the basic tasks that need to be performed for GA. Create BTL6
13 Define TD-Gammon Remember BTL1
14 Identify how reinforcement learning maps states to action. Remember BTL1
15 Examine Genetic Programming. Apply BTL3
16 Discuss about reward function. Understand BTL2
17 Discover Markov decision processes. Apply BTL3
18 Differentiate Sarsa and Q-learning. Create BTL6
19 Analyze POMDPs. Analyze BTL4
20 Describe about values. Understand BTL2
PART-B (13 MARK )
(i)Write the three methods of dimensionality reduction. (7)
1 (ii)Discuss how to choose the right features with an example and Analyze BTL4
perform the various analysis. (6)
(i)Discuss in detail LDA. (7)
2 Understand BTL2
(ii)Discuss how to measure the dataset by various methods. (6)
(i)Write in detail factor analysis. (7)
3 Evaluate BTL5
(ii)Write the output of using factor analysis on the iris data set. (6)
(i) Describe Locally Linear Embedding algorithm. (7)
4 Remember BTL1
(ii) Describe LLE algorithm. (6)
(i) Explain in detail about isomap (7)
5 Apply BTL3
(ii) Discover the result of applying isomap to the iris dataset. (6)
6 Explain Least Squares Optimization. (13) Create BTL6
(i)Describe Evolutionary Learning. (7)
7 Remember BTL1
(ii) Describe about population and Parent selection (6)
(i)Describe in detail about Generating Offspring Genetic Operators. (7)
8 Understand BTL2
(ii)Discuss the Basic Genetic Algorithm. (6)
(i)Identify the difference between Sarsa and Q-learning. (7)
9 Remember BTL1
(ii)Discuss an example for the reinforcement learning. (6)
(i)Describe Knapsack problem for GA. (7)
10 Remember BTL1
(ii)Describe the four peaks problem for GA. (6)
(i)Write in detail about reinforcement learning. (7)
11 Create BTL6
(ii)Illustrate the reinforcement learning cycle. (6)
(i)Write about the Markov Property. (7)
12 (ii) Generalize how principal component analysis is carried out to Create BTL 6
reduce dimensionality of data sets. (6)
(i)Write in detail about values (7)
13 Remember BTL 1
(ii)Show an example of environment (6)
Describe in detail about
14 (i)Getting Lost (7) Create BTL 6
(ii) Discounting(6)
PART - C (15 MARK )
1 Generalize PCA algorithm and discuss Kernel PCA. (15) Create BTL 6
Write in detail about the NP-complete problem and show the string
2 Create BTL 6
representation. (15)
Choose two destinations with different routes connecting them. Apply a
3 Analyze BTL 4
genetic algorithm to find the optimal path based on distance. (15)
(i)Explain about reinforcement learning. (8)
4 (ii) Explain about isomap ? Give its significance in machine Evaluate BTL 5
learning.(7)
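The genetic-algorithm questions above (the basic GA, generating offspring, genetic operators) can be illustrated with a minimal sketch on the toy "one-max" problem, which maximizes the number of 1s in a bit string. The fitness function, population size, rates and generation count here are assumptions chosen for the demonstration, not material from the question bank.

```python
# Illustrative basic GA sketch: one-max problem, showing parent selection
# (tournament), single-point crossover and mutation, with elitism.
# All parameters are hypothetical choices for this demo.
import random

random.seed(0)
LENGTH, POP, GENS = 12, 30, 60

def fitness(bits):
    return sum(bits)  # one-max: count the 1s

def tournament(pop):
    # parent selection: the better of two randomly chosen individuals
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # single-point crossover producing one offspring
    point = random.randrange(1, LENGTH)
    return p1[:point] + p2[point:]

def mutate(bits, rate=1 / LENGTH):
    # flip each bit independently with a small probability
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENS):
    elite = max(pop, key=fitness)  # elitism: keep the current best
    pop = [elite] + [mutate(crossover(tournament(pop), tournament(pop)))
                     for _ in range(POP - 1)]

print(fitness(max(pop, key=fitness)))  # typically reaches the optimum of 12
```

The same skeleton answers the shortest-path Part C question by swapping in a route encoding and a distance-based fitness, and the knapsack question by using a value-minus-penalty fitness.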
Unit -V
SYLLABUS
Markov Chain Monte Carlo Methods-Sampling –Proposal Distribution-Markov Chain Monte Carlo-Graphical
Models –Bayesian Networks-Markov Random Fields-Hidden Markov Models-Tracking Methods.
PART - A

Q.No Questions Competence BT Level

1 Point out the algorithm that produces pseudo-random numbers Analyze BTL4


2 Define MCMC Remember BTL1
3 Distinguish random numbers and Gaussian Random numbers Understand BTL2
4 Define Sampling Remember BTL1
5 Describe Markov Chains Understand BTL2
6 Discover the proposal distribution method Apply BTL3
7 Explain the various MCMC methods Analyze BTL4
8 Explain graphical models Evaluate BTL5
9 Differentiate between importance sampling and the sampling-importance-resampling algorithm Remember BTL1
10 Write Gibbs sampler Evaluate BTL5
11 How does the Gibbs sampler form the basis for a software package? Remember BTL1
12 Express Bayesian Belief network Create BTL6
13 Differentiate between directed and undirected graphs Remember BTL1
14 Identify polytrees in a graph Remember BTL1
15 Examine approximate inference. Apply BTL3
16 Discuss Making Bayesian networks Understand BTL2
17 Discover HMM Apply BTL3
18 Differentiate top-down and bottom-up inference Create BTL6
19 Analyze the HMM Baum-Welch (forward-backward) algorithm Analyze BTL4
20 Describe the two Tracking methods Understand BTL2
PART-B (13 MARK )
(i)Write about sampling. (7)
1 Analyze BTL4
(ii)Discuss the sampling methods in generation of random numbers. (6)
(i)Discuss Gaussian Random numbers. (7)
2 Understand BTL2
(ii)Describe Box-Muller scheme. (6)
(i) Write the Rejection Sampling Algorithm. (7)
3 Evaluate BTL5
(ii)Show the histogram of a mixture of two Gaussians. (6)
(i) Describe the Sampling-Importance-Resampling algorithm. (7)
4 Remember BTL1
(ii) Describe Gibbs Sampling. (6)
(i) Explain in detail about MCMC (7)
5 Apply BTL3
(ii) Discover Simulated Annealing by a distribution. (6)
(i)Structure two graphical models and show the various relationships
6 between the nodes. (7) Create BTL6
(ii)Explain about the conditional probability table. (6)
(i)Describe the variable elimination algorithm. (7)
7 Remember BTL1
(ii)Describe the Approximate Inference. (6)
(i)Describe in detail about Bayesian Network. (7)
8 Understand BTL2
(ii)Explain with an example. (6)
9 Identify the structure and conditional probability. (13) Remember BTL1
10 (i)Describe in detail Markov Random fields. (7) Remember BTL1
(ii)Write the Markov Random Field Image Denoising Algorithm. (6)
(i)Write in detail forward algorithm. (7)
11 Create BTL6
(ii)Discuss HMM forward algorithm. (6)
12 Write the HMM Baum-Welch (forward-backward) algorithm. (13) Create BTL 6
(i)Write about Tracking methods. (7)
13 Remember BTL 1
(ii) Identify hidden Markov models in detail. (6)
(i)Describe in detail about Kalman Filter Algorithm. (7)
14 Create BTL 6
(ii)Discuss about the Particle Filter. (6)
PART - C (15 MARK )
1 Generalize the proposal Distribution. (15) Create BTL 6
(i)Write about Markov chains. (7)
2 Create BTL 6
(ii)How the Metropolis-Hastings algorithm accepts or rejects a sample. (8)
3 Explain about Markov Chain Monte Carlo Methods. (15) Analyze BTL 4
4 Measure the decoding problem by the Viterbi algorithm. (15) Evaluate BTL 5
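Part B above asks to describe the Box-Muller scheme for turning uniform random numbers into Gaussian ones. A minimal sketch follows, assuming standard-normal output; the sample count and seed are arbitrary choices for the demonstration.

```python
# Illustrative Box-Muller sketch: each pair of uniform deviates (u1, u2)
# yields a radius and an angle, giving two independent N(0, 1) samples.
# Sample count and seed are arbitrary demo choices.
import math
import random

def box_muller(n, rng=random.random):
    """Generate n standard-normal samples from uniform deviates."""
    samples = []
    while len(samples) < n:
        u1 = 1.0 - rng()  # shift [0, 1) to (0, 1] so log(u1) is defined
        u2 = rng()
        r = math.sqrt(-2.0 * math.log(u1))  # radius from one uniform
        theta = 2.0 * math.pi * u2          # angle from the other
        samples.append(r * math.cos(theta))
        samples.append(r * math.sin(theta))
    return samples[:n]

random.seed(1)
xs = box_muller(100_000)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(round(mean, 2), round(var, 2))  # close to 0 and 1 respectively
```

Checking that the sample mean is near 0 and the sample variance near 1, as above, is the standard quick sanity test for such a generator.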
