
What is Gradient Descent in Machine Learning?
Gradient Descent is an optimization algorithm that is commonly used to train machine learning models and neural networks. Training data helps these models learn over time, and the cost function within gradient descent acts as a barometer, gauging the model's accuracy with each iteration of parameter updates.
What are you trying to say?

Brother Sumit, explain it in simple language, please.
Let's say Sumit is standing on top of a hill, and he is stuck there. Sadly, he can't see, because he lost his eyesight after watching "Big Boss season 121". There is no one around to help him.
What's the best way you can suggest to Sumit so that he can safely come down the hill?
Think and mention your
answers in the
Chat Section
Let's understand this.
The best way for Sumit to come down the hill is:

1. Look for the slope.

2. Take small steps in the direction of the downhill slope.

3. Repeat 1 and 2 until he is off the hill.

[Figure: Sumit taking Step 1, Step 2, Step 3, Step 4 down the hill. "Saved!"]
Let's take a mathematical example.
Let's say we have a function y = x², and our objective is to find the value of x at which y is minimum. In other words, we have to find the input x for which the output of this equation is minimum.

[Diagram: x (input) → f(x) = x² → y (output)]

At what value of x would the output of this function be minimum?
To answer this, we need an optimization algorithm, a technique that can help us with this task.
The optimization algorithm that can solve this problem is known as Gradient Descent. In simpler words, Gradient means "inclination" and Descent means "a way down". So by using this algorithm, we are trying to bring something down, and here that something is the "error".
Let's understand with an example. Let's say we have the function y = x². Also, let's assume that right now we are standing at the point x = 6 (the initial point). Our objective is to find the value of x for which the output of this function is minimum.
From the figure, we can simply say that at x = 0, y is also 0, which is the minimum value of the output y.

But how to do this using Gradient Descent?


[Figure: plot of y = x², with the initial point marked at x = 6]
Now we will apply the same logic and approach we used earlier to save Sumit (a rough loop sketch follows below):

1. Look for the slope.

2. Take small steps in the direction of the slope.
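Here is what that recipe looks like as a minimal Python sketch. The slope is whatever tells us how steep the function is at a given x; we work it out for y = x² in the very next step.

```python
# A rough sketch of Sumit's recipe as a loop (illustration only).
# 'slope' is a function that returns the steepness at x --
# for y = x^2 we derive it just below.

def descend(x, slope, step_size, n_steps):
    for _ in range(n_steps):
        x = x - step_size * slope(x)  # small step downhill, against the slope
    return x
```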
What is the slope of the function y = x²? The slope of any function is the derivative of that function.

For y = x², let's take the derivative of y with respect to x, using the power rule d/dx(xⁿ) = n·xⁿ⁻¹:

dy/dx = d/dx(x²) = 2x

So we have the slope of the function y = x²: it is 2x.
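If you want to double-check that the slope of y = x² really is 2x, here is a tiny numerical sanity check; nothing is assumed beyond the definition of the derivative.

```python
# Numerical sanity check: the derivative of y = x^2 should be 2x.
def f(x):
    return x ** 2

def numerical_slope(x, h=1e-6):
    # Central difference: (f(x + h) - f(x - h)) / (2h) approximates dy/dx.
    return (f(x + h) - f(x - h)) / (2 * h)

print(numerical_slope(6.0))  # ~12.0, matching 2 * 6
print(numerical_slope(3.0))  # ~6.0,  matching 2 * 3
```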
Now let's go to the second step. We will reduce the value of x, moving downhill along the slope, from the current value x = 6.

What is the step size? The step size controls the speed at which we want to descend. Here I don't want to be very fast or very slow, so a tried and tested value is 0.01, but you can always play with this value and tune it. The step size is also known as the learning rate, and it is a hyperparameter.
The update formula is x_new = x_old − (step size) × (slope at x_old), i.e. x_new = x_old − 0.01 × 2·x_old.

After applying the formula, we get: from x = 6, the first update gives x = 6 − 0.01 × 12 = 5.88, then 5.7624, then 5.6472, and so on. Repeating the update moves x closer and closer to 0, where y is at its minimum.
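To see the whole loop in action, here is a minimal sketch of gradient descent on y = x², starting at x = 6 with step size 0.01. The 1,000 iterations are just my own choice for illustration.

```python
# Gradient descent on y = x^2, starting at x = 6 with step size 0.01.
x = 6.0
step_size = 0.01  # the learning rate -- a hyperparameter

for i in range(1000):
    gradient = 2 * x              # dy/dx for y = x^2
    x = x - step_size * gradient  # update: x_new = x_old - step_size * slope
    if i < 3:
        print(f"iteration {i + 1}: x = {x:.4f}, y = {x * x:.4f}")

print(f"after 1000 iterations: x = {x:.8f}")  # very close to 0, where y is minimum
```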
This is what Gradient Descent is all about.

The best use of Gradient Descent can be seen in any Machine Learning algorithm that has a cost function to optimize, for example:

1. Linear Regression (see the sketch after this list)

2. Logistic Regression

3. ANN (Artificial Neural Networks)
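To show how the same idea plugs into a real model, here is a minimal sketch of simple linear regression trained with gradient descent on a tiny made-up dataset. The data, learning rate, and iteration count are all just illustrative choices.

```python
# Simple linear regression y = w*x + b, fitted by gradient descent.
# Cost function: mean squared error over a tiny toy dataset.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # generated from y = 2x + 1

w, b = 0.0, 0.0
lr = 0.01
n = len(xs)

for _ in range(5000):
    # Gradients of the mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 3), round(b, 3))  # converges close to w = 2, b = 1
```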
I hope you are now clear about "What is Gradient Descent".

If you enjoyed this, please give this post a big thumbs up, and share it with your friends.

SHARING IS CARING

And do comment with your questions if you have any.
Don’t Forget To

SAVE
&
Follow For More
