Learning Rate (or eta)
Definition: Controls the step size shrinkage used in updating the weights of the
model during each boosting iteration.
Where to use: Central to controlling the step size during boosting and
preventing overfitting.
When to use: Lower values make the boosting process more conservative and
require more boosting rounds to converge; higher values speed up training but
may lead to overfitting.
XGBoost Hyperparameter: learning_rate
Recommended values: Typically in the range [0.01, 0.3].
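The shrinkage effect can be sketched with a toy boosting loop (not XGBoost itself): the target is a constant, and each "tree" simply predicts the mean residual. Every update is multiplied by eta, so a smaller eta needs more rounds to reach the same training error. The function name and tolerance are illustrative assumptions, not XGBoost API.

```python
# Toy sketch of step-size shrinkage in boosting (not XGBoost itself).
# Each round fits the residual exactly, but the update is scaled by eta,
# so each round multiplies the remaining residual by (1 - eta).

def rounds_to_converge(eta, target=10.0, tol=0.01, max_rounds=10_000):
    """Count boosting rounds until the prediction is within tol of target."""
    pred = 0.0
    for n in range(1, max_rounds + 1):
        residual = target - pred   # pseudo-residual of squared error
        pred += eta * residual     # shrunken update: eta * "tree" output
        if abs(target - pred) < tol:
            return n
    return max_rounds

fast = rounds_to_converge(eta=0.3)
slow = rounds_to_converge(eta=0.05)
print(fast, slow)  # the smaller eta needs many more rounds
```

This is why lowering learning_rate is usually paired with raising n_estimators: the per-round step is smaller, so more rounds are needed to cover the same distance.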
Number of Estimators (n_estimators)
Definition: Number of boosting rounds or trees to build.
Where to use: Dictates the number of boosting rounds and the overall
complexity of the model.
When to use: Higher values can improve model performance, but increasing the
number of estimators also increases computation time.
XGBoost Hyperparameter: n_estimators
Recommended values: Depends on the size of the dataset and computational
resources, but typically in the range [100, 1000].