ASC Summarized Unit-1
Activation Functions
1. Sigmoid Function:
o Outputs values between 0 and 1.
o Used in binary classification.
2. ReLU (Rectified Linear Unit):
o Outputs max(0, x); computationally efficient and widely used in the hidden layers of deep networks.
3. Tanh Function:
o Outputs values between -1 and 1; zero-centered, unlike sigmoid.
4. Softmax Function:
o Converts a vector of scores into a probability distribution; used for multi-class classification.
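The four activation functions above can be sketched as follows; this is a minimal NumPy implementation for illustration (the function names are our own, not from any particular framework):

```python
import numpy as np

def sigmoid(x):
    # squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # keeps positive values, zeroes out negatives: max(0, x)
    return np.maximum(0.0, x)

def tanh(x):
    # squashes input into the range (-1, 1); zero-centered
    return np.tanh(x)

def softmax(x):
    # turns a score vector into probabilities that sum to 1
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

scores = np.array([-2.0, 0.0, 3.0])
print(sigmoid(scores))        # each value lies in (0, 1)
print(relu(scores))           # negatives become 0
print(softmax(scores))        # probabilities summing to 1
```

Note that sigmoid and tanh saturate for large |x|, while ReLU does not for positive inputs, which is one reason ReLU trains faster in deep networks.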
Learning Techniques
1. Supervised Learning: Uses labeled data. Example: Classification,
regression.
2. Unsupervised Learning: Uses unlabeled data. Example: Clustering,
dimensionality reduction.
3. Reinforcement Learning: Involves an agent interacting with an
environment to maximize rewards. Example: Game-playing AI.
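The reinforcement-learning idea above (an agent maximizing reward through interaction) can be sketched with a hypothetical two-armed bandit; the payout probabilities and the epsilon-greedy strategy here are illustrative assumptions, not from the notes:

```python
import random

random.seed(0)

# Hypothetical environment: two slot-machine arms with unknown payouts.
# Arm 1 truly pays off more often than arm 0 (the agent does not know this).
TRUE_PAYOUT = [0.3, 0.7]

def pull(arm):
    # environment returns reward 1 with that arm's payout probability
    return 1 if random.random() < TRUE_PAYOUT[arm] else 0

# Agent state: running value estimates and pull counts per arm
estimates = [0.0, 0.0]
counts = [0, 0]
epsilon = 0.1  # fraction of the time the agent explores at random

for step in range(2000):
    if random.random() < epsilon:
        arm = random.randrange(2)               # explore: random arm
    else:
        arm = estimates.index(max(estimates))   # exploit: best-looking arm
    reward = pull(arm)
    counts[arm] += 1
    # incremental mean: nudge the estimate toward the observed reward
    estimates[arm] += (reward - estimates[arm]) / counts[arm]

print(counts)  # the agent ends up pulling the better arm far more often
```

This mirrors the supervised/unsupervised contrast: no labels are given, only a reward signal, and the agent learns which action maximizes it through trial and error.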