121A1098  D2  Exp-6  ML
EXPERIMENT – 6: Ensemble and AdaBoost
Theory:
Ensemble and AdaBoost – Concept:
Ensemble methods combine multiple weak learners to improve overall model
performance, and AdaBoost (Adaptive Boosting) is one such technique.
Unlike Random Forest, which builds trees independently, AdaBoost
sequentially builds a series of models, with each new model focusing on the
mistakes made by the previous ones. It assigns higher weights to incorrectly
classified instances, guiding the next learner to focus more on these errors.
The final prediction is made by combining the weighted votes of all models.
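In the standard (discrete) AdaBoost formulation with binary labels y_i ∈ {−1, +1}, this scheme is usually summarized as follows (a textbook formulation, included here for reference):

\[
\varepsilon_t = \sum_i w_i\,\mathbf{1}[h_t(x_i) \neq y_i], \qquad
\alpha_t = \tfrac{1}{2}\ln\frac{1 - \varepsilon_t}{\varepsilon_t},
\]
\[
w_i \leftarrow \frac{w_i\, e^{-\alpha_t\, y_i\, h_t(x_i)}}{Z_t}, \qquad
H(x) = \operatorname{sign}\Big(\sum_t \alpha_t\, h_t(x)\Big),
\]

where h_t is the weak learner trained in round t, ε_t its weighted error, α_t its weight in the final vote, Z_t a normalizing constant, and H the final ensemble.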
Algorithm – AdaBoost:
1. Initialize equal weights for all data points.
2. Train a weak learner (e.g., a shallow decision tree) on the weighted dataset.
3. Calculate the weighted error and increase the weights of the misclassified points.
4. Train the next learner with the updated weights and repeat for a predefined number of iterations.
5. Combine predictions using weighted voting for classification or weighted averaging for regression, as sketched below.
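A minimal from-scratch sketch of these steps (for intuition only, not the submitted program): it assumes binary labels relabeled to {−1, +1}, uses a depth-1 decision tree from scikit-learn as the weak learner, and the dataset, n_rounds, and variable names are illustrative choices.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
y = np.where(y == 0, -1, 1)              # relabel to {-1, +1} for the update rule

n_rounds = 25
w = np.full(len(y), 1 / len(y))          # step 1: equal initial weights
stumps, alphas = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1)   # step 2: shallow weak learner
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    # step 3: weighted error (clipped to avoid division by zero)
    err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)         # this learner's vote weight
    w *= np.exp(-alpha * y * pred)                # up-weight misclassified points
    w /= w.sum()                                  # step 4: renormalize and repeat
    stumps.append(stump)
    alphas.append(alpha)

# step 5: combine all weak learners by weighted vote
scores = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print("Training accuracy:", np.mean(np.sign(scores) == y))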
Python Implementation:
Import the required libraries, including sklearn.ensemble.AdaBoostClassifier (or AdaBoostRegressor for regression). Load the dataset and split it into training and test sets. Train the AdaBoost model with a chosen number of estimators (weak learners). Use the trained model to predict on the test set and evaluate performance with accuracy or error metrics.
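A minimal sketch of this scikit-learn workflow might look as follows; the Iris dataset, the 70/30 split, and n_estimators=50 are illustrative assumptions rather than details from the original program.

from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Illustrative dataset; any classification dataset works here
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# 50 weak learners is an assumed choice; the default base
# estimator is a depth-1 decision tree (a stump)
model = AdaBoostClassifier(n_estimators=50, random_state=42)
model.fit(X_train, y_train)

# Predict on the held-out test set and report accuracy
y_pred = model.predict(X_test)
print("Test accuracy:", accuracy_score(y_test, y_pred))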
Program:
Output:
Conclusion:
⚫ AdaBoost is an effective ensemble method that improves prediction
accuracy by focusing on hard-to-classify instances through adaptive
weighting.
⚫ It works well with simple base learners and is suitable for both
classification and regression tasks, though it may be sensitive to noisy
data.