Week 4
Optimization is the process of iteratively adjusting a model's parameters to minimize (or maximize) an objective function. In each step we update the parameters, and tune hyperparameters, so that the model becomes more accurate, i.e. its error rate decreases.
1. Gradient Descent : A minimization algorithm that finds a local minimum of a differentiable function by repeatedly stepping in the direction opposite to the gradient.
2. Stochastic Gradient Descent : Instead of using the whole dataset, the gradient is estimated from randomly selected data points (or mini-batches) at each step. "Stochastic" basically means probabilistic.
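The difference between the two methods can be sketched on a toy least-squares problem (fitting a single weight w so that w*x ≈ y). The data, learning rates, and step counts below are illustrative assumptions, not part of the notes.

```python
import random

# Toy dataset: the true relationship is y = 2 * x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def batch_gradient(w):
    # Full-batch gradient of mean squared error: d/dw mean((w*x - y)^2)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

def sgd_gradient(w):
    # Stochastic variant: gradient on one randomly chosen data point
    x, y = random.choice(list(zip(xs, ys)))
    return 2 * (w * x - y) * x

w = 0.0
for _ in range(100):
    w -= 0.05 * batch_gradient(w)   # gradient descent step

w2 = 0.0
for _ in range(2000):
    w2 -= 0.01 * sgd_gradient(w2)   # stochastic gradient descent step

print(round(w, 3), round(w2, 3))    # both converge toward the true weight 2.0
```

SGD uses a noisier gradient estimate per step, which is why it typically needs a smaller learning rate and more iterations, but each step is much cheaper on large datasets.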
Non-linear activation functions perform the mappings between the inputs and response variables. Their main purpose is to convert a node's input signal in a neural network into an output signal, which is then used as an input by the next layer of the model.
1. Linear function : The output is simply proportional to the input (the identity mapping). It applies no non-linear transformation, so a network built only from linear layers cannot learn complex tasks.
2. Sigmoid function : A function whose graph is S-shaped. In deep learning it is used to add non-linearity to a model, e.g. for regression of bounded quantities such as probabilities between 0 and 1.
3. Tanh function : Also known as the hyperbolic tangent function. It is mathematically a shifted and scaled version of the sigmoid; the two are similar and can be derived from each other. Its zero-centred output often makes it work better than the sigmoid.
4. ReLU function : Stands for Rectified Linear Unit. It is the most widely used activation function, chiefly implemented in the hidden layers of neural networks.
5. Softmax function : A generalization of the sigmoid to multiple outputs; it turns a vector of scores into a probability distribution, which makes it handy when we are trying to handle classification problems.
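The five activation functions above can be sketched in a few lines of NumPy; the test values at the end are illustrative.

```python
import numpy as np

def linear(x):
    return x                          # identity: output proportional to input

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # S-shaped, maps to (0, 1)

def tanh(x):
    return np.tanh(x)                 # shifted/scaled sigmoid, maps to (-1, 1)

def relu(x):
    return np.maximum(0.0, x)         # zero for negatives, identity otherwise

def softmax(x):
    e = np.exp(x - np.max(x))         # subtract max for numerical stability
    return e / e.sum()                # outputs form a probability distribution

z = np.array([-1.0, 0.0, 2.0])
print(relu(z))                        # [0. 0. 2.]
print(softmax(z))                     # three probabilities summing to 1
```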
Q5. Define Neural Network in deep learning. How does it work? Explain its types
with a suitable diagram. (12 Marks)
A neural network is a computational model whose network architecture acts like a human brain. It has many layers: each layer performs a specific function, and the more complex the network, the more layers it has. That is why a neural network of this layered kind is also called a multi-layer perceptron.
Working :
1. The input layer : The input layer gathers the data from the outside world; it picks up the input signals and transfers them to the next layer.
2. The hidden layer : One or more hidden layers perform all the back-end tasks of calculation. A network can even have zero hidden layers.
3.The output layer : The output layer transmits the final result of the hidden layer’s
calculation.
The input layer receives the data and passes it on to the hidden layer. Random weights are assigned to the connections between the two layers. Each input is multiplied by its weight, and a bias is added. The weighted sum is then passed to the activation function, which decides which nodes should fire, performing feature extraction. The model applies an activation function at the output layer to deliver the final output. To reduce error, the error is back-propagated and the weights are modified.
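The forward-pass steps described above can be sketched in NumPy. The layer sizes (3 inputs, 4 hidden units, 2 outputs) and the random weights are illustrative assumptions; back-propagation of the error is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=3)          # input layer: data from the outside world
W1 = rng.normal(size=(4, 3))    # weights assigned at random to the linkages
b1 = np.zeros(4)                # bias applied to each hidden unit
W2 = rng.normal(size=(2, 4))
b2 = np.zeros(2)

def relu(z):
    return np.maximum(0.0, z)   # activation decides which nodes fire

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

h = relu(W1 @ x + b1)           # weighted sum -> activation (feature extraction)
y = softmax(W2 @ h + b2)        # output layer delivers the final result
print(y.shape, float(y.sum()))  # two class probabilities summing to 1
```

Training would then compare y to the target, back-propagate the error, and update W1, b1, W2, b2.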
1. Recurrent Neural Network (RNN) : Recurrent neural networks (RNNs) are a class of
neural networks that are helpful in modeling sequence data. They produce predictive
results on sequential data that other algorithms can't.
2. Convolutional Neural Network (CNN) : This network consists of one or more
convolutional layers. Each convolutional layer applies a convolution
operation to its input before transferring the result to the next layer. CNNs are used
in image recognition and natural language processing.
3. Radial Basis Function Neural Network (RBFNN) : A type of artificial
neural network used for function approximation problems. Radial basis function networks
are distinguished from other neural networks by their universal approximation ability and
faster learning speed.
4. Feedforward Neural Network (FNN) : In this network, the information moves in
only one direction—forward—from the input nodes, through the hidden nodes (if
any) and to the output nodes. There are no cycles or loops in the network.
5. Modular Neural Network : A neural network characterized by a series of
independent neural networks moderated by some intermediary. Each independent
neural network serves as a module and operates on separate inputs to accomplish
some subtask of the overall task the network is meant to perform.
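The recurrent update behind type 1 (RNN) can be sketched as a single cell that is reused at every time step; the hidden state h is what carries information from earlier steps of the sequence. The sizes and random weights below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
W_x = rng.normal(size=(4, 3)) * 0.1   # input-to-hidden weights
W_h = rng.normal(size=(4, 4)) * 0.1   # hidden-to-hidden (recurrent) weights
b = np.zeros(4)

def rnn_step(x, h):
    # Elman-style update: new state mixes the current input and the old state
    return np.tanh(W_x @ x + W_h @ h + b)

h = np.zeros(4)                           # initial hidden state
sequence = [rng.normal(size=3) for _ in range(5)]
for x in sequence:                        # the same cell is applied at each step
    h = rnn_step(x, h)
print(h.shape)                            # final state summarizes the sequence
```

A feedforward network (type 4) has no such loop: each input is processed independently, with no state carried between examples.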