DL - FNN - RNN
• Key Features:
o Uses neural networks with many stacked layers (hence "deep") to extract features and learn
representations.
o Handles tasks like image recognition, speech processing, and natural language
understanding.
• Basic Structure: an input layer, one or more hidden layers, and an output layer, with
information flowing from one layer to the next.
• Key Components:
o Weights and Biases: Adjusted during training to optimize the performance of the
network.
o Activation Functions: Non-linear functions like ReLU, Sigmoid, or Tanh, which allow
neural networks to learn complex patterns (a minimal sketch of these follows this list).
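The three activation functions named above are simple element-wise operations; here is a minimal NumPy sketch (the input values are illustrative):

```python
import numpy as np

# Common activation functions; each is non-linear, which is what
# lets stacked layers model patterns a single linear map cannot.
def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(z))     # negatives clipped to 0
print(sigmoid(z))  # values squashed into (0, 1)
print(tanh(z))     # values squashed into (-1, 1)
```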
3. Forward Propagation
• Explanation: Information flows forward from the input layer, through hidden layers, to the
output layer. Each neuron takes the weighted sum of inputs, applies an activation function,
and passes the output to the next layer.
• Key Process: at each layer, compute the weighted sum z = Wx + b, then apply the
activation a = f(z); the activations become the inputs to the next layer (a worked
sketch follows this list).
• Backpropagation:
o A technique used to update the weights of the network based on the error.
o The gradient of the error is calculated with respect to each weight (via the chain rule),
and the weights are updated using Gradient Descent.
• Loss Function: A mathematical function used to compute the error (e.g., Mean Squared
Error, Cross-Entropy).
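The bullets above describe one training step. Below is a minimal NumPy sketch of that step repeated in a loop for a single-hidden-layer network: forward propagation, a Mean Squared Error loss, gradients via the chain rule, and a plain gradient-descent update. The data, layer sizes, and learning rate are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, 1 target each (illustrative values).
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Weights and biases: one ReLU hidden layer, one linear output layer.
W1, b1 = rng.normal(size=(3, 5)) * 0.1, np.zeros(5)
W2, b2 = rng.normal(size=(5, 1)) * 0.1, np.zeros(1)
lr = 0.1  # gradient-descent step size

for step in range(100):
    # --- Forward propagation: weighted sums + activations, layer by layer.
    z1 = X @ W1 + b1          # weighted sum at the hidden layer
    a1 = np.maximum(0.0, z1)  # ReLU activation
    y_hat = a1 @ W2 + b2      # linear output layer

    # --- Loss function: Mean Squared Error.
    loss = np.mean((y_hat - y) ** 2)

    # --- Backpropagation: chain rule, from the output back to the input.
    d_yhat = 2.0 * (y_hat - y) / len(X)   # dLoss / dy_hat
    dW2 = a1.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_a1 = d_yhat @ W2.T
    d_z1 = d_a1 * (z1 > 0)                # ReLU derivative is 0 or 1
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)

    # --- Gradient Descent: move every weight against its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final MSE: {loss:.4f}")  # decreases as training proceeds
```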
• Example Applications: image recognition, speech processing, and natural language
understanding.
• Recurrent Neural Networks (RNNs):
o Key Feature: The output (hidden state) at a given time step is fed back as input to the
network at the next time step, enabling memory of previous inputs (a minimal cell is
sketched after this list).
o Applications: sequential data such as text, speech, and time series.
o Challenges:
▪ Vanishing Gradient Problem: In long sequences, the gradient diminishes as it is
propagated back through time, making learning difficult for distant time steps.
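A minimal sketch of the recurrence described above, assuming an Elman-style cell with a tanh activation (all sizes and weight values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal RNN cell: the hidden state produced at one time step is
# fed back in at the next step, which is the network's "memory".
W_x = rng.normal(size=(4, 8)) * 0.1   # input -> hidden weights
W_h = rng.normal(size=(8, 8)) * 0.1   # hidden -> hidden (the feedback loop)
b = np.zeros(8)

sequence = rng.normal(size=(10, 4))   # 10 time steps, 4 features each
h = np.zeros(8)                       # initial hidden state

for x_t in sequence:
    # Same weights at every step; h carries information forward in time.
    h = np.tanh(x_t @ W_x + h @ W_h + b)

print(h)  # final state summarizes the whole sequence
```

This loop also shows why gradients vanish: backpropagating through T time steps multiplies T per-step Jacobians together, and with tanh derivatives at most 1 and small recurrent weights that product shrinks geometrically, so distant time steps contribute almost nothing to the update.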
7. Advanced Architectures
• Transformer Networks: