AdaBoost Learning Rate in MATLAB Documentation

Dario Walter on 13 Aug 2020
Commented: Dario Walter on 23 Aug 2020
Hey,
the description of AdaBoost allows you to set a learning rate. However, a learning rate typically refers to gradient boosting. Could anyone explain to me what MATLAB is doing when AdaBoostM1 is applied?
Thanks for your help!

Accepted Answer

Raunak Gupta on 15 Aug 2020
Hi,
The LearnRate option in AdaBoostM1 is the shrinkage learning rate: it scales down (or up) the contribution that each newly trained weak learner adds to the existing ensemble. If LearnRate is small, the ensemble typically needs more learning cycles to train, but it often ends up more accurate. Note that AdaBoostM1 is used for binary classification problems only. A minimal call sketch is below.
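A rough sketch of how the option is passed, assuming you train with fitcensemble (fitensemble accepts the same name-value pair); the data set and parameter values here are just placeholders:
load ionosphere                          % built-in binary-label data (X, Y)
t = templateTree('MaxNumSplits',1);      % decision stumps as weak learners
mdl = fitcensemble(X, Y, ...
    'Method','AdaBoostM1', ...
    'Learners',t, ...
    'NumLearningCycles',200, ...
    'LearnRate',0.1);                    % shrink each learner's contribution
resubLoss(mdl)                           % quick training-error check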
3 Comments
Raunak Gupta on 21 Aug 2020
Edited: Raunak Gupta on 21 Aug 2020
Hi Dario,
The LearnRate parameter comes in when the weight of each weak hypothesis in the ensemble is calculated, that is α_t in the algorithm mentioned here. It is not part of the original AdaBoost algorithm, but it is used widely in almost all applications of AdaBoost because it provides a way to tune the actual contribution of subsequent weak learners. So whenever α_t is calculated for a learner, it is multiplied by LearnRate to diminish or enhance that learner's contribution (depending on whether the value is < 1 or > 1).
You can see the influence of different LearnRate values in this example; a rough comparison sketch follows below.
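Assuming the learner weight is the standard AdaBoost quantity α_t = (1/2)·ln((1−ε_t)/ε_t), with ε_t the weighted classification error, the shrunken contribution that enters the ensemble would be LearnRate·α_t. A sketch along those lines, comparing a few LearnRate values on placeholder data via the cumulative cross-validated loss:
load ionosphere                          % placeholder binary-label data
rates = [0.1 0.5 1.0];                   % LearnRate values to compare
figure; hold on
for k = 1:numel(rates)
    cv = fitcensemble(X, Y, ...
        'Method','AdaBoostM1', ...
        'Learners',templateTree('MaxNumSplits',1), ...
        'NumLearningCycles',200, ...
        'LearnRate',rates(k), ...
        'CrossVal','on', 'KFold',5);
    % cumulative k-fold loss as weak learners are added one by one
    plot(kfoldLoss(cv,'Mode','cumulative'), ...
        'DisplayName', sprintf('LearnRate = %.1f', rates(k)))
end
xlabel('Number of weak learners'); ylabel('5-fold classification error')
legend show; hold off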
Dario Walter on 23 Aug 2020
Thank you Raunak. This helps me a lot.


