NNDL Question Bank MID 1

The document covers three units on Artificial Neural Networks (ANNs), Unsupervised Learning Networks, and Deep Learning, detailing both subjective and objective questions related to each topic. Key concepts include the structure and functioning of various types of ANNs, the principles of unsupervised learning, and the characteristics of deep learning. It emphasizes the importance of activation functions, backpropagation, and the differences between supervised and unsupervised learning methods.

Unit 1: Artificial Neural Networks (ANNs)

Subjective Questions

1. Describe the basic structure of an ANN and explain the roles of the input, hidden, and
output layers.

2. What is forward propagation in ANNs? Explain with the equation used for weighted
sum.

3. Describe the various types of activation functions in ANNs, and explain where each is
commonly used.

4. Compare and contrast different types of ANNs, such as Feedforward Neural Networks,
Recurrent Neural Networks, and Convolutional Neural Networks.

5. Explain the architecture and working of a Perceptron, including the Perceptron model
equation.

6. Describe the Adaptive Linear Neuron (ADALINE) and how it differs from the
Perceptron.

7. Explain the concept and algorithm of Back-Propagation Networks and its importance
in training neural networks.

8. What is an Associative Memory Network, and how does it differ from other types of
neural networks?

9. Describe the structure and applications of Hopfield Networks as a type of associative memory network.

10. Explain the Bidirectional Associative Memory (BAM) network and its significance.
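Question 2's weighted-sum equation, z = Σᵢ wᵢxᵢ + b, and Question 5's Perceptron model can be sketched together in a few lines of Python; the weights, bias, and input below are toy values chosen purely for illustration:

```python
import math

def weighted_sum(x, w, b):
    # Forward-propagation core: z = sum_i(w_i * x_i) + b
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def sigmoid(z):
    # Squashes z into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def perceptron_output(x, w, b):
    # Perceptron model: y = step(w . x + b)
    return 1 if weighted_sum(x, w, b) >= 0 else 0

x = [1.0, 0.5]
w = [0.4, -0.2]
b = 0.1
z = weighted_sum(x, w, b)          # 0.4*1.0 + (-0.2)*0.5 + 0.1 = 0.4
print(round(z, 2))                 # 0.4
print(round(sigmoid(z), 3))        # 0.599
print(perceptron_output(x, w, b))  # 1, since z >= 0
```

The same weighted sum feeds both models; only the activation applied to z differs (step for the Perceptron, sigmoid for a probabilistic output).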

Objective Questions

1. In a neural network, the connections between neurons are called ________. (weights)
2. A single layer perceptron can solve only ________ problems. (linearly separable)
3. ________ is the process of passing data through the network to make predictions.
(Forward propagation)
4. The ________ function in a neuron decides whether the neuron should activate or not.
(activation)
5. Backpropagation uses ________ to calculate the gradient of the loss function. (chain
rule)
6. The ________ function is commonly used as an activation function in binary
classification tasks. (sigmoid)
7. ________ is a basic model of ANN that can classify input data into two categories.
(Perceptron)
8. The ________ function maps input values to a range between 0 and 1. (sigmoid)
9. The process of adjusting weights in a neural network based on the error is known as
________. (backpropagation)
10. The ________ activation function is commonly used in hidden layers for non-linearity.
(ReLU)
11. The ________ function is used in the output layer for multi-class classification.
(softmax)
12. The Adaptive Linear Neuron (ADALINE) minimizes error using the ________ cost
function. (mean squared error)
13. In ADALINE, the activation function is ________, which makes it different from
perceptrons. (linear)
14. A ________ network stores patterns and retrieves them based on input patterns, similar
to human memory. (associative memory)
15. The perceptron training algorithm uses ________ to update weights based on
classification error. (gradient descent)
16. In supervised learning, the ________ layer is where predictions are compared against the
target labels during training. (output)
17. In a Hopfield network, stored patterns are known as ________. (attractors)
18. The backpropagation algorithm adjusts weights in the direction of the ________
gradient to reduce error. (negative)
19. ________ is a type of associative memory network that stores pattern pairs and retrieves
them in both directions. (Bidirectional Associative Memory or BAM)
20. The technique used to prevent a neural network from learning irrelevant details and
overfitting is called ________. (regularization)
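The chain-rule question (objective Q5) and the negative-gradient question (Q18) can be made concrete with a single sigmoid neuron and a squared-error loss; all numeric values below are illustrative assumptions, not from the question bank:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One sigmoid neuron, squared-error loss L = 0.5 * (a - t)^2
x, w, b, t = 1.0, 0.5, 0.0, 1.0
z = w * x + b          # weighted sum
a = sigmoid(z)         # activation
L = 0.5 * (a - t) ** 2

# Chain rule: dL/dw = (dL/da) * (da/dz) * (dz/dw)
dL_da = a - t
da_dz = a * (1 - a)    # derivative of the sigmoid
dz_dw = x
grad_w = dL_da * da_dz * dz_dw

# Weights move in the direction of the NEGATIVE gradient:
lr = 0.1
w_new = w - lr * grad_w
print(grad_w < 0, w_new > w)  # True True: w grows, pushing a toward the target
```

Since a < t here, the gradient is negative, so stepping against it increases w and reduces the error, which is exactly the behaviour Q18 asks about.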

Unit 2: Unsupervised Learning Networks

Subjective Questions
1. Define unsupervised learning and discuss its importance in neural networks.

2. Describe Fixed Weight Competitive Nets and their working principles.

3. Explain Maxnet, its architecture, and how it applies the winner-takes-all strategy.

4. What is a Hamming Network, and how is it used for binary pattern recognition?

5. Describe the Kohonen Self-Organizing Maps (SOMs) and explain how they preserve
topological properties.

6. Describe the types of ART Networks, including ART1 and ART2, and their
applications.

7. Explain the correlation layer and competitive layer in Hamming Networks with an
example.

8. Describe the role of inhibitory weights in Maxnet and their importance in network
functionality.

9. Explain the neighborhood function in SOM and its significance in dimensionality reduction.

10. What are the key differences between competitive learning and supervised learning?
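Maxnet's winner-takes-all strategy (Q3) and the role of its inhibitory weights (Q8) can be sketched as an iterative suppression loop; the epsilon and the initial activations below are toy choices (epsilon must stay below 1/m for m competing units):

```python
def maxnet(activations, epsilon=0.15, max_iters=100):
    # Winner-takes-all: each unit inhibits every other unit by -epsilon,
    # iterating until only one unit stays positive.
    a = list(activations)
    for _ in range(max_iters):
        total = sum(a)
        a = [max(0.0, ai - epsilon * (total - ai)) for ai in a]
        if sum(1 for ai in a if ai > 0) <= 1:
            break
    return a

result = maxnet([0.2, 0.4, 0.9, 0.6])
# Only the initially largest unit (index 2) remains positive;
# the inhibitory weights drive every other activation to zero.
print(result)
```

Without the inhibitory (negative) cross-connections, no unit would ever be suppressed and the competition could not resolve to a single winner.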

Objective Questions

1. In competitive learning, neurons compete among themselves to be the one to ________.
(activate)
2. Fixed Weight Competitive Nets use a ________-takes-all strategy, where only one
neuron wins. (winner)
3. Maxnet uses ________ weights to suppress the activations of non-winning neurons.
(inhibitory)
4. The network that stores binary patterns and retrieves the closest match is the ________
network. (Hamming)
5. The process of finding the neuron closest to the input in a SOM is known as identifying
the ________. (Best Matching Unit or BMU)
6. The function used to control the influence of neighboring neurons in SOM is called the
________ function. (neighborhood)
7. ________ is a supervised extension of the Self-Organizing Map that is used for
classification tasks. (Learning Vector Quantization or LVQ)
8. The ________ network combines both supervised and unsupervised learning in its two
layers. (Counter Propagation)
9. In Counter Propagation Networks, the ________ layer performs unsupervised learning.
(Kohonen)
10. The parameter that determines the similarity threshold in ART networks is known as
the ________ parameter. (vigilance)
11. ART1 networks are designed to work with ________ input patterns. (binary)
12. ART2 networks are an extension of ART1 and can handle ________ input data.
(continuous)
13. In Self-Organizing Maps, ________ learning involves adjusting weights based on the
proximity to the BMU. (competitive)
14. The primary task of a Hamming Network is to recognize and classify ________
patterns. (binary)
15. The ________ layer in a Hamming Network is responsible for finding the closest match
to the input pattern. (correlation)
16. The competitive learning process in SOMs is often used for ________ tasks.
(clustering)
17. In Kohonen SOM, the BMU and its ________ are updated to adjust to the input vector.
(neighbors)
18. Counter Propagation Networks are effective for ________ tasks where initial clustering
aids in mapping inputs to outputs. (function approximation)
19. ________ learning in neural networks is based on competitive adaptation, where
neurons update their weights independently. (Self-organizing)
20. The ART network that handles chemical neurotransmitter concepts for more complex
dynamics is known as ________. (ART3)
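The BMU and neighborhood-function ideas from the SOM questions above (objective Q5, Q6, Q17) can be sketched on a tiny 1-D map; the Gaussian neighborhood, learning rate, and all weight values are illustrative assumptions:

```python
import math

def bmu_index(x, weights):
    # Best Matching Unit: the neuron whose weight vector is closest to x.
    dists = [sum((wi - xi) ** 2 for wi, xi in zip(w, x)) for w in weights]
    return dists.index(min(dists))

def som_update(x, weights, lr=0.5, sigma=1.0):
    # Move the BMU and its neighbours toward x; influence decays with
    # grid distance via a Gaussian neighbourhood function.
    b = bmu_index(x, weights)
    for i, w in enumerate(weights):
        h = math.exp(-((i - b) ** 2) / (2 * sigma ** 2))  # neighbourhood
        weights[i] = [wi + lr * h * (xi - wi) for wi, xi in zip(w, x)]
    return b

# 1-D map of three neurons with 2-D weight vectors (toy values):
weights = [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]
x = [0.9, 0.9]
b = som_update(x, weights)
print(b)  # 2: the closest neuron wins and is pulled further toward x
```

Because nearby neurons are dragged along with the BMU, neighbouring units end up responding to similar inputs, which is how the map preserves topology while reducing dimensionality.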

Unit 3: Introduction to Deep Learning

Subjective Questions

1. What is deep learning, and how does it differ from traditional machine learning?

2. Explain the structure and working of Deep Feed-Forward Networks (FFNNs).


3. What are the main applications of deep learning, and why is it preferred in complex
tasks?

4. Explain the importance of end-to-end learning in deep neural networks.

5. What are the key characteristics of deep learning that contribute to its performance on
large datasets?

Objective Questions

1. A deep neural network with no feedback loops is known as a ________ network.
(feedforward)
2. The ________ activation function is commonly used in deep neural networks to avoid
vanishing gradients. (ReLU)
3. In deep learning, the ________ function measures the discrepancy between the
predicted and actual values. (loss)
4. The concept of ________ learning involves leveraging models trained on large datasets
to perform well on specific tasks. (transfer)
5. In a deep feed-forward network, each neuron computes a weighted sum of inputs, adds
a bias, and applies an ________ function. (activation)
6. The self-attention mechanism in Transformers helps capture ________ between
different parts of the input sequence. (dependencies)
7. ________ allows deep learning models to learn from sequences of data, making them
suitable for language and time-series tasks. (Recurrent Neural Networks or RNNs)
8. ________ is an optimization technique commonly used in deep learning to prevent
overfitting by randomly dropping neurons during training. (Dropout)
9. In deep learning, ________ functions introduce non-linearity, enabling models to learn
complex patterns. (activation)
10. The ________ learning rate allows the model to adjust weights in smaller or larger steps
during training, impacting convergence speed. (adaptive)
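Objective Q2's claim that ReLU helps avoid vanishing gradients can be checked numerically: a sigmoid's derivative never exceeds 0.25 and is tiny away from zero, so a product of many layer-wise sigmoid gradients collapses toward zero, while ReLU's derivative stays 1 on positive inputs. The 10-layer chain and the pre-activation value below are toy choices:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1 - s)            # at most 0.25; small for large |z|

def relu_grad(z):
    return 1.0 if z > 0 else 0.0  # exactly 1 for any positive input

# Product of 10 layer-wise gradients, each evaluated at z = 2.0:
sig_chain = sigmoid_grad(2.0) ** 10
relu_chain = relu_grad(2.0) ** 10
print(sig_chain < 1e-9, relu_chain == 1.0)  # True True: the sigmoid chain vanishes
```

This is why ReLU is the default choice in deep hidden layers (Q2 and Unit 1 objective Q10), while sigmoid is reserved for shallow outputs such as binary classification.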
