Unit 1 and Unit 2
Part A
1. Define the learning process in a neural network.
2. Define the term knowledge in the context of neural networks.
3. List the properties of neural networks.
4. Explain Learning Vector Quantization (LVQ).
5. What is content addressable memory?
6. Write the mathematical notation of the threshold activation function.
7. Define the sigmoid activation function. (Reference forms of both functions are given after this list.)
8. What is a signal flow graph?
9. What is a single-layer feedforward network?
10. What is a multilayer feedforward network?
11. What is meant by associative memory?
12. Define autoassociative memory.
13. What is Hebbian learning? (The update rule is given after this list.)
14. What is Bidirectional Associative Memory (BAM)?
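For questions 6 and 7, commonly used forms of the two activation functions are given below as a quick reference; the threshold \theta and steepness \lambda are generic symbols, not values fixed by this question bank.

Threshold (step) function: f(x) = \begin{cases} 1, & x \ge \theta \\ 0, & x < \theta \end{cases}

Binary sigmoid function: f(x) = \dfrac{1}{1 + e^{-\lambda x}}, with derivative f'(x) = \lambda\, f(x)\,(1 - f(x)).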
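For question 13, the Hebb weight-update rule in its simplest form is (a learning-rate factor \eta is sometimes included):

w_{ij}(\text{new}) = w_{ij}(\text{old}) + x_i\, y_j

where x_i is the activation of input unit i and y_j is the activation of output unit j.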
Part B and C
1. Elaborate on the basic models of an Artificial Neural Network.
2. List the various applications and the scope of neural networks.
3. Discuss the phases involved in training a Back Propagation Neural Network (BPN) and their respective functions.
4. How do the Hebbian learning rule and the outer products rule facilitate pattern association in neural networks, and what are their limitations? (The outer-product weight matrix is sketched after this list.)
5. Discuss the architecture, training, and testing process of an autoassociative memory
network, highlighting how it stores and retrieves patterns.
6. Explain the architecture of a heteroassociative memory network and discuss the algorithm
used to test its pattern retrieval accuracy.
7. Elaborate on the key architectural components of a Bidirectional Associative Memory (BAM) network. How is its testing algorithm designed to ensure bidirectional pattern association?
8. Compare and contrast the discrete and continuous variants of Bidirectional Associative
Memory (BAM) in terms of their architecture and functional dynamics.
9. Outline the architecture, training, and testing algorithms of a discrete Hopfield network, focusing on its capability to perform as an associative memory. (The energy function is given after this list.)
10. Compare and contrast the perceptron training algorithms for a single output unit and for multiple output units.
11. Explain the significance of the feedback loop in Adaline's architecture and its role in training.
12. Explain the significance of the feedback loop in the Madaline architecture and its role in training.
13. How does the Radial Basis Function (RBF) Network perform classification, and what are its advantages over other architectures? (The RBF output equation is given after this list.)
14. Design a training algorithm for a single-output-unit perceptron. (A sketch in code follows this list.)
15. Explain the training algorithm for a perceptron with multiple output units.
16. Construct a decision tree and explain how it could be utilized within a Tree Neural Network (TNN) for pattern recognition.
17. Implement a Wavelet Neural Network (WNN) for approximating a nonlinear function and evaluate its performance. (A minimal sketch in code follows this list.)
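For question 4, the outer-products (Hebbian) rule stores P pattern pairs (s(p), t(p)) in a single weight matrix; a standard statement, assuming s(p) and t(p) are row vectors of input and target activations, is:

W = \sum_{p=1}^{P} s(p)^{T}\, t(p), \qquad w_{ij} = \sum_{p=1}^{P} s_i(p)\, t_j(p)

Cross-talk between stored patterns that are not mutually orthogonal is the usual source of the rule's capacity limitations.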
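For question 9, the energy function of the discrete Hopfield network is commonly written as:

E = -\tfrac{1}{2}\sum_{i}\sum_{j \ne i} w_{ij}\, y_i\, y_j - \sum_{i} x_i\, y_i + \sum_{i} \theta_i\, y_i

where y_i are the unit activations, x_i the external inputs, and \theta_i the thresholds; each asynchronous update leaves E unchanged or lowers it, which is what guarantees convergence to a stable state.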
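For question 13, a typical RBF formulation maps the input through localized (usually Gaussian) basis functions and takes a weighted sum at the output:

\phi_j(\mathbf{x}) = \exp\!\left(-\dfrac{\lVert \mathbf{x} - \mathbf{c}_j \rVert^{2}}{2\sigma_j^{2}}\right), \qquad y_k(\mathbf{x}) = \sum_{j=1}^{M} w_{kj}\, \phi_j(\mathbf{x}) + b_k

where \mathbf{c}_j and \sigma_j are the centre and width of hidden unit j; the class with the largest output y_k is chosen.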
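For question 14, a minimal sketch of single-output perceptron training in Python, assuming bipolar inputs and targets, a fixed learning rate alpha, and a step activation with a dead band theta; the function and variable names are illustrative, not taken from any prescribed listing.

```python
import numpy as np

def train_perceptron(X, t, alpha=1.0, theta=0.0, max_epochs=100):
    """Single-output perceptron training (sketch).

    X : (n_samples, n_features) bipolar input patterns
    t : (n_samples,) bipolar targets in {-1, +1}
    """
    n_features = X.shape[1]
    w = np.zeros(n_features)          # weights, initialised to zero
    b = 0.0                           # bias

    def activation(net):
        # Bipolar step with a dead band of width 2*theta around zero
        if net > theta:
            return 1
        if net < -theta:
            return -1
        return 0

    for _ in range(max_epochs):
        changed = False
        for x, target in zip(X, t):
            y = activation(np.dot(w, x) + b)
            if y != target:           # update weights only when the output is wrong
                w = w + alpha * target * x
                b = b + alpha * target
                changed = True
        if not changed:               # stop when a full epoch produces no weight change
            break
    return w, b

# Example: the AND function with bipolar inputs and targets
X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
t = np.array([1, -1, -1, -1])
print(train_perceptron(X, t))
```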
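For question 17, a minimal Wavelet Neural Network sketch in Python under two simplifying assumptions: the wavelons' translations and dilations are fixed on a grid (only the linear output weights are fitted, by least squares), and both the Mexican-hat mother wavelet and the target function are illustrative choices rather than anything mandated by the question.

```python
import numpy as np

def mexican_hat(u):
    """Mexican-hat (Ricker) mother wavelet."""
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def wavelon_outputs(x, centers, scales):
    """Hidden-layer matrix: one dilated/translated wavelet per (center, scale) pair."""
    return np.column_stack([mexican_hat((x - b) / a) for b, a in zip(centers, scales)])

# Nonlinear target function to approximate (illustrative choice)
def target(x):
    return np.sin(2 * x) * np.exp(-0.1 * x**2)

x_train = np.linspace(-3, 3, 200)
y_train = target(x_train)

# Fixed wavelon parameters: translations on a grid, one common dilation
centers = np.linspace(-3, 3, 15)
scales = np.full_like(centers, 0.5)

# Fit only the output-layer weights by linear least squares
H = wavelon_outputs(x_train, centers, scales)
w, *_ = np.linalg.lstsq(H, y_train, rcond=None)

# Evaluate approximation quality on a denser grid
x_test = np.linspace(-3, 3, 501)
y_pred = wavelon_outputs(x_test, centers, scales) @ w
rmse = np.sqrt(np.mean((y_pred - target(x_test)) ** 2))
print("test RMSE:", rmse)
```

With the dilation and translation parameters also trained (e.g., by gradient descent), the same structure becomes the WNN usually described in textbooks; the least-squares variant above is only meant to show the architecture and one way to evaluate its accuracy.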