Bagging and Boosting
Both are ensemble techniques: they combine several decision tree classifiers
to achieve better performance.
Ensemble techniques:
- reduce variance and bias;
- increase robustness;
- are especially useful for unstable classifiers (e.g. decision trees).
Bagging
What is Bootstrapping?
Bootstrapping
● Random sampling with replacement;
● Used to estimate the bias and variance of a statistic from the dataset itself.
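A minimal sketch of the idea, using made-up numbers: resample the data with replacement many times and watch how a statistic (here, the mean) varies across resamples.

```python
import numpy as np

rng = np.random.default_rng(42)
data = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # toy dataset for illustration

# Draw 1000 bootstrap samples (sampling with replacement) and
# record the mean of each one.
boot_means = np.array([
    rng.choice(data, size=len(data), replace=True).mean()
    for _ in range(1000)
])

# The spread of the bootstrap means estimates the variance of the
# sample mean without any distributional assumptions.
print(boot_means.mean(), boot_means.std())
```

The standard deviation of the bootstrap means approximates the standard error of the sample mean, which is exactly the kind of variance information bagging exploits.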
Bagging
Given a weak learner, train it multiple times on bootstrap-resampled training
data, then let the learned classifiers vote.
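The procedure above can be sketched with scikit-learn, on synthetic data for illustration; by default `BaggingClassifier` bags decision trees.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Toy classification data (synthetic, for illustration only).
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 50 decision trees, each fit on a bootstrap sample of the
# training set; predictions are combined by majority vote.
bag = BaggingClassifier(n_estimators=50, random_state=0)
bag.fit(X_tr, y_tr)
acc = bag.score(X_te, y_te)
```

Because each tree sees a different bootstrap sample, their errors are partly decorrelated, and the vote averages away much of the individual trees' variance.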
Boosting - visual example
Extension: Gradient Boosting
Gradient boosting = gradient descent + boosting.
Gradient descent (GD) is a first-order iterative
optimization algorithm for finding
a local minimum of a differentiable function.
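The GD update rule is easy to show in a few lines; the function and step size below are arbitrary choices for illustration.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to approach a minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # move opposite to the gradient direction
    return x

# Minimise f(x) = (x - 3)^2, whose gradient is 2 * (x - 3);
# GD should converge to x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Gradient boosting applies this same idea in function space: instead of updating a parameter vector, it adds a new weak learner that points in the negative-gradient direction of the loss.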
Extension: Gradient Boosting
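A from-scratch sketch of gradient boosting for regression with squared loss, on synthetic data: each new tree is fit to the residuals of the current ensemble, which are the negative gradient of the loss.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data (synthetic, for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

pred = np.zeros_like(y)
lr = 0.1  # learning rate (shrinkage)
for _ in range(100):
    # Residuals = negative gradient of 1/2 * (y - pred)^2 w.r.t. pred.
    residuals = y - pred
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    pred += lr * tree.predict(X)  # take a small step in function space

mse = np.mean((y - pred) ** 2)
```

In practice one would use `sklearn.ensemble.GradientBoostingRegressor` (or XGBoost/LightGBM), which implement the same residual-fitting loop with many refinements.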
In Practice: AdaBoost
Stands for "Adaptive Boosting": weak learners are trained sequentially, and
misclassified examples receive higher weights in the next round.
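A short usage sketch with scikit-learn, again on synthetic data; by default `AdaBoostClassifier` boosts decision stumps (depth-1 trees).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Toy classification data (synthetic, for illustration only).
X, y = make_classification(n_samples=500, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Stumps are fit sequentially; after each round the weights of
# misclassified examples are increased, so later stumps focus
# on the hard cases.
ada = AdaBoostClassifier(n_estimators=100, random_state=1)
ada.fit(X_tr, y_tr)
acc = ada.score(X_te, y_te)
```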