Deep Learning - IIT Ropar - Unit 6 - Week 3

The document provides a 10-question assignment on feedforward neural networks. It includes the network structure, weights, input, and target value, along with questions on computing outputs, losses, gradients, and entropy. The questions test understanding of concepts such as the number of parameters, the forward pass, loss computation, backpropagation, and cross entropy.


Week 3 : Assignment

The due date for submitting this assignment has passed.
Due on 2024-02-14, 23:59 IST.
Assignment submitted on 2024-02-14, 21:14 IST.
Use the following data to answer the questions below. The following diagram represents a neural network containing two hidden layers and one output layer. The input to the network is a column vector x ∈ R³. The activation function used in the hidden layers is sigmoid. The output layer doesn't contain any activation function, and the loss used is the squared error loss (y_pred − y_true)².

The following network doesn't contain any biases, and the weights of the network are given below:

W1 = [[1, 1, 2],
      [3, 1, 1],
      [1, 2, 3]]

W2 = [[1, 1, 2],
      [3, 1, 1]]

W3 = [[2, 5]]

The input to the network is: x = [1, 1, 1]ᵀ

The target value y is: y = 10
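As a quick check of the numbers used in the questions below, here is a minimal NumPy sketch of the forward pass and the squared-error loss for this network. It is not part of the original assignment; it only uses the weights, sigmoid hidden layers, linear output, and target value stated above.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W1 = np.array([[1, 1, 2], [3, 1, 1], [1, 2, 3]], dtype=float)
    W2 = np.array([[1, 1, 2], [3, 1, 1]], dtype=float)
    W3 = np.array([[2, 5]], dtype=float)
    x = np.array([1.0, 1.0, 1.0])
    y = 10.0

    h1 = sigmoid(W1 @ x)      # first hidden layer, sigmoid activation
    h2 = sigmoid(W2 @ h1)     # second hidden layer, sigmoid activation
    y_hat = float(W3 @ h2)    # linear output layer (no activation, no bias)
    loss = (y_hat - y) ** 2   # squared error loss

    print(y_hat)              # ≈ 6.93 -> closest option in question 2 is 6.92
    print(loss)               # ≈ 9.44 -> inside the accepted range for question 3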
1) What is the total number of parameters in the following network? 1 point
15
7
9
17
Yes, the answer is correct.
Score: 1
Accepted Answers:
17
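The count comes from the entries of the three weight matrices alone, since the network has no biases: 3×3 + 2×3 + 1×2 = 17. A one-line check (illustrative, not part of the original page):

    # parameters = entries of W1 (3x3) + W2 (2x3) + W3 (1x2); there are no biases
    n_params = 3 * 3 + 2 * 3 + 1 * 2
    print(n_params)  # 17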
2) What is the predicted output for the given input x after doing the forward pass? 1 point
(Choose the option closest to your answer)
7.33
6.92
6.31
8
Yes, the answer is correct.
Score: 1
Accepted Answers:
6.92
3) Compute and enter the loss between the output generated by input x and the true output y. (NAT) 1 point
9.44
Yes, the answer is correct.
Score: 1
Accepted Answers:
(Type: Range) 9.38,9.58
4) If we call the predicted y as ŷ, then what is the gradient dL/dŷ? (L is the loss function) 1 point
-5.17
-7.52
-6.15
-7.15
Yes, the answer is correct.
Score: 1
Accepted Answers:
-6.15
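For the squared error loss L = (ŷ − y)², the gradient is dL/dŷ = 2(ŷ − y). A quick numeric check, plugging in ŷ ≈ 6.9273 from the forward-pass sketch above (an assumed intermediate value, not printed on the original page):

    y_hat, y = 6.9273, 10.0
    dL_dyhat = 2 * (y_hat - y)   # derivative of (y_hat - y)**2 with respect to y_hat
    print(dL_dyhat)              # ≈ -6.15, matching the accepted answer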
5) What is the sum of elements of ∇ω3? (Choose the closest value to your answer) 1 point
-12.9
-11.6
-10.07
-12.14
Yes, the answer is correct.
Score: 1
Accepted Answers:
-12.14
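Because the output layer is linear, the gradient with respect to W3 is ∇W3 = (dL/dŷ) · h2ᵀ, where h2 holds the second hidden layer's activations. A short check, reusing h2 ≈ [0.9815, 0.9929] and dL/dŷ ≈ −6.145 from the sketches above (assumed intermediate values):

    import numpy as np
    h2 = np.array([0.9815, 0.9929])   # second hidden layer activations (from the forward pass)
    dL_dyhat = -6.145                 # 2 * (y_hat - y)
    grad_W3 = dL_dyhat * h2           # gradient of the loss with respect to W3 (one row, two entries)
    print(grad_W3.sum())              # ≈ -12.13 -> closest option is -12.14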
6) What is the sum of elements of ∇ω2?
-36.15
No, the answer is incorrect.
Score: 0
Accepted Answers:
(Type: Range) -1.4,-1.2
0 points
7) What is the sum of elements of ∇ω1?
-108.45
No, the answer is incorrect.
Score: 0
Accepted Answers:
(Type: Range) -0.08,-0.04
0 points
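Questions 6 and 7 can both be checked with a full backward pass. The sketch below is a reconstruction (not part of the original page) that propagates the squared-error gradient through the two sigmoid hidden layers; the printed sums land inside the accepted ranges for questions 6 and 7, which is why question 7 is read here as asking about ∇ω1 rather than repeating ∇ω2.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W1 = np.array([[1, 1, 2], [3, 1, 1], [1, 2, 3]], dtype=float)
    W2 = np.array([[1, 1, 2], [3, 1, 1]], dtype=float)
    W3 = np.array([[2, 5]], dtype=float)
    x = np.array([1.0, 1.0, 1.0])
    y = 10.0

    # forward pass
    h1 = sigmoid(W1 @ x)
    h2 = sigmoid(W2 @ h1)
    y_hat = float(W3 @ h2)

    # backward pass (squared error loss; sigmoid derivative is h * (1 - h))
    dL_dyhat = 2 * (y_hat - y)           # ≈ -6.145
    dh2 = W3.flatten() * dL_dyhat        # dL/dh2, since y_hat = W3 @ h2
    da2 = dh2 * h2 * (1 - h2)            # gradient at the second layer's pre-activations
    dh1 = W2.T @ da2                     # dL/dh1
    da1 = dh1 * h1 * (1 - h1)            # gradient at the first layer's pre-activations

    grad_W2 = np.outer(da2, h1)          # dL/dW2
    grad_W1 = np.outer(da1, x)           # dL/dW1

    print(grad_W2.sum())                 # ≈ -1.31 -> inside the accepted range for question 6
    print(grad_W1.sum())                 # ≈ -0.06 -> inside the accepted range for question 7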
8) The probability of all the events x1, x2, …, xn in a system is equal (n > 1). 1 point
What can you say about the entropy H(X) of that system? (base of log is 2)
H(X) ≤ 1
H(X) = 1
H(X) ≥ 1
We can't say anything conclusive with the provided information
No, the answer is incorrect.
Score: 0
Accepted Answers:
H(X) ≥ 1
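For n equally likely events, each with probability 1/n, the entropy is H(X) = −Σ (1/n) log2(1/n) = log2(n), which is at least 1 whenever n ≥ 2. A small numeric check (illustrative snippet, not from the page):

    import math
    for n in [2, 3, 8, 100]:
        H = -sum((1 / n) * math.log2(1 / n) for _ in range(n))
        print(n, H)   # H = log2(n): 1.0, 1.585, 3.0, ~6.64 -> always >= 1 for n > 1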

9) Let p and q be two probability distributions. Under what conditions will the cross entropy between p and q be minimized? 1 point
p = q
All the values in p are lower than corresponding values in q
All the values in p are higher than corresponding values in q
p = 0 [0 is a vector]
Yes, the answer is correct.
Score: 1
Accepted Answers:
p = q
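Cross entropy H(p, q) = −Σ p_i log2(q_i) satisfies H(p, q) ≥ H(p), with equality exactly when q = p (Gibbs' inequality). A small numeric illustration with a made-up distribution p (example values, not from the assignment):

    import numpy as np

    def cross_entropy(p, q):
        return -np.sum(p * np.log2(q))

    p = np.array([0.7, 0.2, 0.1])                                        # "true" distribution
    for q in [p, np.array([0.5, 0.3, 0.2]), np.array([1/3, 1/3, 1/3])]:
        print(cross_entropy(p, q))   # smallest value (= entropy of p) is attained at q = p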

10) Suppose we have a problem where data x and label y are related by y = x² + 1. Which of the following is not a good choice for the activation function in the hidden layer if the activation function at the output layer is linear? 1 point
Linear
ReLU
Sigmoid
Tan⁻¹(x)
Yes, the answer is correct.
Score: 1
Accepted Answers:
Linear
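The reason "Linear" is the poor choice: a stack of linear (identity-activation) layers collapses into a single linear map, so with a linear output layer the network can only represent linear functions of x and can never fit y = x² + 1, whereas ReLU, sigmoid, or Tan⁻¹ introduce the needed nonlinearity. A minimal illustration with arbitrary example weights (not from the assignment):

    import numpy as np

    # two "hidden layers" with linear (identity) activations, example weights
    A = np.array([[1.0, 2.0], [0.5, -1.0]])
    B = np.array([[2.0, 0.0], [1.0, 3.0]])
    c = np.array([[1.0, -2.0]])        # linear output layer

    x = np.array([0.3, -0.7])
    deep_linear = c @ (B @ (A @ x))    # layer-by-layer computation
    collapsed = (c @ B @ A) @ x        # the same single linear map
    print(deep_linear, collapsed)      # identical: stacking linear layers adds no expressive power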
