The document contains a question bank for the subject BTCOE801(A) Deep Learning. It includes questions grouped into 12 weekly topics covering concepts such as deep learning fundamentals, linear classifiers, optimization techniques, neural networks, convolutional neural networks, and effective training methods in deep learning.
QUESTION BANK
VIII SEMESTER
BTCOE801(A) DEEP LEARNING
Academic Year 2022-23
Prepared by
Dr. Rahila Sheikh
Prof. Madhavi Sadu
Department of Computer Science & Engineering
SUBJECT: BTCOE801(A) DEEP LEARNING
SEMESTER/YEAR: VIII/IV
Week 1: Introduction to Deep Learning, Bayesian Learning, Decision Surfaces
1. What is deep learning and how is it different from traditional machine learning methods? (CO410.1/L2)
2. Can you explain the difference between supervised and unsupervised deep learning algorithms? (CO410.1/L2)
3. What are the advantages of using deep learning over other machine learning methods? (CO410.1/L2)
4. What is a neural network and how is it used in deep learning? (CO410.1/L2)
5. How do you measure the performance of a deep learning model? (CO410.1/L2)
6. What is Bayesian learning and how is it different from other machine learning approaches? (CO410.1/L2)
7. What is the role of prior knowledge in Bayesian learning? (CO410.1/L2)
8. How is a decision surface related to classification in machine learning? (CO410.1/L2)
9. Can you give an example of a real-world application of decision surfaces in machine learning? (CO410.1/L2)
10. How can deep learning and Bayesian learning be combined to improve the performance of a model? (CO410.1/L2)

Week 2: Linear Classifiers, Linear Machines with Hinge Loss
1. What is a linear classifier and how does it work? (CO410.1/L2)
2. Can you explain the difference between linear and non-linear classifiers? (CO410.1/L2)
3. What is a decision boundary and how is it used in linear classification? (CO410.1/L2)
4. How do you train a linear classifier? (CO410.1/L2)
5. What is the hinge loss function and how is it used in linear machines? (CO410.1/L2)
6. What is the role of regularization in linear machines? (CO410.1/L2)
7. Can you explain the difference between L1 and L2 regularization in linear machines? (CO410.1/L2)
8. How do you handle multi-class classification using linear machines? (CO410.1/L2)
9. Can you give an example of a real-world application of linear machines with hinge loss? (CO410.1/L2)
10. How do you optimize the hinge loss function to train a linear machine? (CO410.1/L2)

Week 3: Optimization Techniques, Gradient Descent, Batch Optimization
1. How would you modify the learning rate of a neural network to improve its accuracy? (CO410.2/L3)
2. What are some common optimization algorithms used in deep learning, and how do they differ from each other? (CO410.2/L3)
3. How do you determine the optimal number of layers for a deep neural network? (CO410.2/L3)
4. What are some techniques for preventing overfitting in deep learning models, and how do they work? (CO410.2/L3)
5. Compare and contrast the pros and cons of the gradient descent and stochastic gradient descent optimization algorithms. (CO410.2/L3)
6. Why are feedforward neural networks called "networks"? Justify. (CO410.2/L3)
7. What is batch normalization and how does it help optimize neural networks? (CO410.2/L3)
8. What is the difference between L1 and L2 regularization, and when would you use each? (CO410.2/L3)
9. What is the vanishing gradient problem, and how can it be addressed? (CO410.2/L3)
10. What is the role of momentum in optimization algorithms, and how does it help optimize neural networks? (CO410.2/L3)

(An illustrative sketch combining the Week 2 and Week 3 ideas follows this list.)
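The sketch below is not part of the original question bank; it is a minimal NumPy illustration of a binary linear classifier trained with the hinge loss by batch (sub)gradient descent, touching the Week 2 and Week 3 topics above. The toy dataset, learning rate, regularization strength, and epoch count are arbitrary values chosen only for the example.

import numpy as np

# Toy dataset: 2-D points with labels in {-1, +1}
X = np.array([[2.0, 3.0], [1.0, 1.5], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

w, b = np.zeros(2), 0.0   # parameters of the linear machine
lr, lam = 0.1, 0.01       # learning rate and L2 regularization strength

for epoch in range(200):
    margins = y * (X @ w + b)          # y_i * (w . x_i + b)
    viol = margins < 1.0               # samples inside the margin (nonzero hinge loss)
    # Sub-gradient of  mean(max(0, 1 - y(w.x + b)))  +  (lam/2) * ||w||^2
    grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(X)
    grad_b = -y[viol].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

print("learned weights:", w, "bias:", b)
print("decision values:", X @ w + b)   # the sign gives the predicted class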
Week 4: Introduction to Neural Network, Multilayer Perceptron, Back Propagation Learning
1. What is backpropagation and how is it used to optimize neural networks? (CO410.2/L3)
2. How does a multilayer perceptron (MLP) differ from a single-layer perceptron, and what are the advantages of using an MLP? (CO410.2/L3)
3. What is backpropagation learning, and how is it used to train neural networks? (CO410.2/L3)
4. What is the role of activation functions in neural networks, and how do they affect the output of the network? (CO410.2/L3)
5. How do you choose the number of neurons and layers in a neural network, and what are some guidelines for avoiding overfitting? (CO410.2/L3)
6. What are some common optimization algorithms used in neural networks, and how do they differ from each other? (CO410.2/L3)
7. Explain the XOR operation. (CO410.2/L3)
8. Analyze the cost function for backpropagation learning. (CO410.2/L3)
9. Assess the difference between linear models and neural networks. (CO410.2/L3)
10. What are the advantages and disadvantages of using a deep neural network over a shallow neural network? (CO410.2/L3)

Week 5: Unsupervised Learning with Deep Network, Autoencoders
1. What is the difference between supervised and unsupervised learning, and what are some examples of problems that can be solved using each approach? (CO410.3/L4)
2. What are the advantages and disadvantages of using an autoencoder for unsupervised learning compared to other methods? (CO410.3/L4)
3. How does the denoising autoencoder differ from a standard autoencoder, and what are some applications where it might be useful? (CO410.3/L4)
4. What are variational autoencoders (VAEs), and how do they differ from standard autoencoders? What are some potential applications of VAEs? (CO410.3/L4)
5. What are adversarial autoencoders (AAEs), and how do they differ from standard autoencoders? What are some potential applications of AAEs? (CO410.3/L4)
6. How does the contractive autoencoder differ from a standard autoencoder, and what are some applications where it might be useful? (CO410.3/L4)
7. Explain Principal Components Analysis. (CO410.3/L4)

Week 6: Convolutional Neural Network, Building blocks of CNN, Transfer Learning
1. What are some of the key building blocks of a Convolutional Neural Network, and how do they contribute to the overall architecture of the model? (CO410.3/L4)
2. What are some advantages of using Convolutional Neural Networks over other machine learning models for image classification tasks? (CO410.3/L4)
3. What is transfer learning, and how can it be used to improve the performance of Convolutional Neural Networks? (CO410.3/L4)
4. The output layer of a convolutional network is usually relatively inexpensive compared to the feature-learning layers. Justify. (CO410.3/L4)
5. Create a chart that demonstrates convolution with a stride. (CO410.3/L4)
6. What is meant by convolution? (CO410.3/L4)
7. How can the cost of training a convolutional network be reduced? (CO410.3/L4)
8. Simulate the idea behind reverse correlation. (CO410.3/L4)
9. Explain how a convolutional layer has the property of equivariance to translation. (CO410.3/L4)

Week 7: Revisiting Gradient Descent, Momentum Optimizer, RMSProp, Adam
1. What is Gradient Descent? (CO410.3/L4)
2. What is Momentum Optimizer? (CO410.3/L4)
3. What is RMSProp? (CO410.3/L4)
4. What is Adam? (CO410.3/L4)
5. What are the advantages of using the Momentum optimizer over traditional gradient descent? (CO410.3/L4)
6. What are the advantages of using RMSProp over traditional gradient descent? (CO410.3/L4)
7. What are the advantages of using Adam over traditional gradient descent? (CO410.3/L4)
8. What are the limitations of using the Momentum optimizer? (CO410.3/L4)
9. What are the limitations of using RMSProp? (CO410.3/L4)
10. What are the limitations of using Adam? (CO410.3/L4)

(An illustrative sketch of the momentum, RMSProp, and Adam update rules follows this list.)
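The sketch below is not part of the original question bank; it is a minimal NumPy comparison of the three update rules named in the Week 7 questions above, applied to a toy quadratic objective. The matrix A, step counts, and hyperparameter values are common illustrative defaults, not prescriptions.

import numpy as np

# Toy objective f(x) = 0.5 * x^T A x with gradient A x; A is deliberately
# ill-conditioned so the different update rules behave visibly differently.
A = np.diag([10.0, 1.0])
grad = lambda x: A @ x

def momentum(x, steps=100, lr=0.05, beta=0.9):
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v + grad(x)               # accumulate a velocity term
        x = x - lr * v
    return x

def rmsprop(x, steps=100, lr=0.05, rho=0.9, eps=1e-8):
    s = np.zeros_like(x)
    for _ in range(steps):
        g = grad(x)
        s = rho * s + (1 - rho) * g * g      # running average of squared gradients
        x = x - lr * g / (np.sqrt(s) + eps)  # per-coordinate scaled step
    return x

def adam(x, steps=100, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    m, v = np.zeros_like(x), np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g            # first-moment estimate (momentum-like)
        v = b2 * v + (1 - b2) * g * g        # second-moment estimate (RMSProp-like)
        m_hat = m / (1 - b1 ** t)            # bias correction
        v_hat = v / (1 - b2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

x0 = np.array([3.0, 3.0])
for name, opt in [("momentum", momentum), ("rmsprop", rmsprop), ("adam", adam)]:
    print(name, opt(x0.copy()))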
Week 8: Effective Training in Deep Nets - Early Stopping, Dropout, Batch Normalization, Instance Normalization, Group Normalization
1. What is Dropout, and how does it help prevent overfitting in a neural network? (CO410.4/L3)
2. What is Batch Normalization, and how does it help improve the training of neural networks? (CO410.4/L3)
3. How does Instance Normalization differ from Batch Normalization, and in what types of applications might it be more appropriate? (CO410.4/L3)
4. What is Group Normalization, and how does it differ from Batch Normalization and Instance Normalization? (CO410.4/L3)
5. What are some potential drawbacks of using Dropout, Batch Normalization, Instance Normalization, and Group Normalization, and how can they be mitigated? (CO410.4/L3)

(An illustrative dropout and batch-normalization sketch follows this list.)
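The sketch below is not part of the original question bank; it is a minimal NumPy forward-pass illustration of inverted dropout and batch normalization over a mini-batch of activations, relating to the Week 8 questions above. The batch shape, keep probability, and epsilon are assumed toy values.

import numpy as np

rng = np.random.default_rng(0)
h = rng.standard_normal((8, 4))     # toy mini-batch of activations: 8 samples, 4 units

def dropout(h, p_keep=0.8, train=True):
    # Inverted dropout: scale at training time so no rescaling is needed at test time.
    if not train:
        return h
    mask = (rng.random(h.shape) < p_keep).astype(h.dtype)
    return h * mask / p_keep

def batch_norm(h, gamma, beta, eps=1e-5):
    # Normalize each feature over the mini-batch, then rescale and shift.
    mu = h.mean(axis=0)
    var = h.var(axis=0)
    h_hat = (h - mu) / np.sqrt(var + eps)
    return gamma * h_hat + beta

gamma = np.ones(4)    # learnable scale
beta = np.zeros(4)    # learnable shift

out = batch_norm(dropout(h), gamma, beta)
print("per-feature mean (should be close to 0):", out.mean(axis=0))
print("per-feature std  (should be close to 1):", out.std(axis=0))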
Week 9: Recent Trends in Deep Learning Architectures, Residual Network, Skip Connection Network, Fully Connected CNN etc.
1. What is a Residual Network (ResNet), and how does it differ from traditional deep neural networks? (CO410.4/L3)
2. What is a Skip Connection Network, and how does it differ from a Residual Network? (CO410.4/L3)
3. What are some recent trends in deep learning architectures, and how do they differ from traditional deep neural networks? (CO410.4/L3)
4. How can Residual Networks and Skip Connection Networks be used to improve the performance of deep neural networks in computer vision tasks? (CO410.4/L3)
5. What are some potential drawbacks of using Residual Networks and Skip Connection Networks, and how can they be mitigated? (CO410.4/L3)
6. Illustrate unshared convolution with suitable examples. (CO410.4/L3)
7. i. Write short notes on max pooling. ii. Explain pooling with downsampling. (CO410.4/L3)
8. Evaluate variants of the basic convolution function. (CO410.4/L3)

(An illustrative residual-block sketch follows this list.)
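The sketch below is not part of the original question bank; it is a minimal NumPy illustration of the identity skip connection at the heart of a residual block, relating to the Week 9 questions above. The two-layer fully connected residual branch, its random weights, and the feature sizes are illustrative stand-ins for the convolutional layers of a real ResNet.

import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, b1, W2, b2):
    # F(x): a small two-layer transformation (the "residual branch")
    f = relu(x @ W1 + b1) @ W2 + b2
    # Skip connection: the block outputs F(x) + x, so the layers only need
    # to learn a residual correction to the identity mapping.
    return relu(f + x)

d = 16
x = rng.standard_normal((4, d))                    # toy batch of 4 feature vectors
W1 = rng.standard_normal((d, d)) * 0.1
W2 = rng.standard_normal((d, d)) * 0.1
b1, b2 = np.zeros(d), np.zeros(d)

out = residual_block(x, W1, b1, W2, b2)
print(out.shape)   # (4, 16): the skip connection requires matching input/output shapes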
Week 10: Classical Supervised Tasks with Deep Learning, Image Denoising, Semantic Segmentation, Object Detection etc.
1. What is a classical supervised task in deep learning? (CO410.5/L2)
2. What is image denoising? (CO410.5/L2)
3. What is semantic segmentation? (CO410.5/L2)
4. What is object detection? (CO410.5/L2)
5. How is deep learning used for image denoising? (CO410.5/L2)
6. How is deep learning used for semantic segmentation? (CO410.5/L2)
7. How is deep learning used for object detection? (CO410.5/L2)
8. What are some challenges in image denoising with deep learning? (CO410.5/L2)
9. What are some challenges in semantic segmentation with deep learning? (CO410.5/L2)
10. What are some challenges in object detection with deep learning? (CO410.5/L2)

Week 11: LSTM Networks
1. What is an LSTM network? (CO410.5/L2)
2. What makes LSTMs different from traditional RNNs? (CO410.5/L2)
3. What is a memory cell in an LSTM network? (CO410.5/L2)
4. What is an input gate in an LSTM network? (CO410.5/L2)
5. What is a forget gate in an LSTM network? (CO410.5/L2)
6. What is an output gate in an LSTM network? (CO410.5/L2)
7. What are some applications of LSTM networks? (CO410.5/L2)
8. What are some advantages of using LSTM networks? (CO410.5/L2)
9. What are some limitations of using LSTM networks? (CO410.5/L2)
10. How can the performance of an LSTM network be improved? (CO410.5/L2)

Week 12: Generative Modeling with DL, Variational Autoencoder, Generative Adversarial Network; Revisiting Gradient Descent, Momentum Optimizer, RMSProp, Adam
1. What is generative modeling with deep learning? (CO410.5/L2)
2. What is a variational autoencoder? (CO410.5/L2)
3. What is a generative adversarial network (GAN)? (CO410.5/L2)
4. How does the generator in a GAN learn to generate realistic data? (CO410.5/L2)
5. What is gradient descent? (CO410.5/L2)
6. What is the momentum optimizer? (CO410.5/L2)
7. What is Adam? (CO410.5/L2)
8. What are the advantages of using generative models in deep learning? (CO410.5/L2)
9. What are some of the limitations of generative models in deep learning? (CO410.5/L2)
10. What is RMSProp? (CO410.5/L2)

(An illustrative sketch of the VAE reparameterization trick follows this list.)
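The sketch below is not part of the original question bank; it is a minimal NumPy illustration of the reparameterization trick and the KL regularization term used in variational autoencoders, relating to the Week 12 questions above. The values of mu and log_var are arbitrary placeholders standing in for an encoder's output for one input.

import numpy as np

rng = np.random.default_rng(3)

# Toy "encoder" outputs for one input: mean and log-variance of q(z|x).
mu = np.array([0.5, -1.0])
log_var = np.array([-0.2, 0.3])

# Reparameterization trick: write the random latent sample as a deterministic,
# differentiable function of (mu, log_var) plus independent standard-normal noise.
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps

# KL divergence between q(z|x) = N(mu, diag(sigma^2)) and the prior p(z) = N(0, I),
# the regularization term in the VAE objective (the ELBO).
kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

print("sampled latent z:", z)
print("KL(q || p):", kl)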