AdaBoost

AdaBoost is a machine learning meta-algorithm that was introduced in the 1990s to perform supervised learning tasks like classification. It works by combining many weak learners into a single strong learner in an iterative fashion. Each weak learner is trained on a weighted version of the training data, with incorrectly classified examples receiving higher weights. This process forces subsequent weak learners to focus on examples missed by previous ones, resulting in a diverse set of weak learners whose individual decisions are combined into a single strong learner.
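The iterative reweighting described above is available off the shelf; as a minimal usage sketch (assuming scikit-learn is installed, with its default decision-stump weak learner and a synthetic dataset chosen here purely for illustration):

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (illustrative only).
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 50 boosting rounds; each round fits a weak learner on reweighted data.
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```

By default scikit-learn's weak learner is a depth-1 decision tree (a decision stump), which matches the "simple base classifier" idea used throughout these slides.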

AdaBoost

Huong.H.T Dang

Agenda
- Boosting
- AdaBoost introduction
- Algorithm

Overview
- Introduced in the 1990s
- Originally designed for classification problems
- A procedure that combines the outputs of many weak classifiers to produce a powerful committee

Boosting Approach
- Select a small subset of examples
- Derive a rough rule of thumb
- Examine a second set of examples
- Derive a second rule of thumb
- Repeat T times
- Boosting = a general method for converting rough rules of thumb into a highly accurate prediction rule

Boosting - Definition
- A machine learning meta-algorithm
- Performs supervised learning
- Incrementally improves the learned function
- Forces the weak learner to generate new hypotheses that make fewer mistakes on the harder parts of the data

AdaBoost Introduction
AdaBoost is short for Adaptive Boosting, a machine learning meta-algorithm.
AdaBoost trains several layers of weak learners; each layer focuses on the examples misclassified by the previous learner.
These weak learners are called base classifiers. They need only be simple classifiers whose error rate is slightly better than random guessing (below 0.5).

Introduction
Weak Learner
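A classic weak learner for AdaBoost is the decision stump: a one-level decision tree that thresholds a single feature. A minimal sketch (the threshold value here is an arbitrary example):

```python
import numpy as np

def stump_predict(x, threshold):
    """Decision stump: predict +1 below the threshold, -1 at or above it."""
    return np.where(x < threshold, 1, -1)

x = np.array([0.2, 0.8, 1.5, 2.7])
print(stump_predict(x, 1.0))  # -> [ 1  1 -1 -1]
```

On its own a stump is barely better than guessing on hard data, which is exactly the error-rate requirement for a base classifier stated above.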

Algorithm
Step 1: Initialize the weights of the N training examples uniformly:

    w_n^(1) = 1/N,  for n = 1, ..., N

Targets and weak-learner outputs take values in {+1, -1}: t_n, y_m(x_n) ∈ {+1, -1}.

Step 2:
Find the weak learner y_m with the minimum weighted error rate

    ε_m = Σ_n w_n^(m) · I(y_m(x_n) ≠ t_n),  with ε_m ∈ (0, 1)

and calculate the weight of the classifier:

    α_m = (1/2) · ln((1 − ε_m) / ε_m)
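The standard classifier-weight formula, α_m = ½ · ln((1 − ε_m)/ε_m), can be checked numerically; this small sketch (helper name is mine, not from the slides) shows how the weight behaves at a few error rates:

```python
import math

def classifier_weight(error):
    """alpha_m = 0.5 * ln((1 - eps) / eps): larger for more accurate learners."""
    return 0.5 * math.log((1 - error) / error)

print(classifier_weight(0.3))  # accurate learner -> positive weight
print(classifier_weight(0.5))  # random guessing  -> weight 0
print(classifier_weight(0.7))  # worse than random -> negative weight
```

A learner at exactly 0.5 (pure guessing) contributes nothing to the ensemble, and a learner worse than random gets a negative vote, which inverts its predictions.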

Algorithm
Step 2 (continued): Update the weights of the examples:

    w_n^(m+1) = w_n^(m) · exp(−α_m t_n y_m(x_n))

then renormalize so that Σ_n w_n^(m+1) = 1. Misclassified examples (t_n y_m(x_n) = −1) gain weight; correctly classified examples lose weight.
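One round of this weight update can be traced numerically; a minimal sketch using the slides' notation (labels and predictions here are made-up illustration values):

```python
import numpy as np

t = np.array([1, 1, -1, -1])      # true labels t_n in {+1, -1}
y_m = np.array([1, -1, -1, 1])    # weak learner predictions: wrong on examples 1 and 3
alpha_m = 0.5                     # classifier weight for this round (illustrative)

w = np.full(4, 0.25)              # uniform initial weights, w_n^(1) = 1/N
w = w * np.exp(-alpha_m * t * y_m)  # misclassified weights grow, correct ones shrink
w = w / w.sum()                     # renormalize so the weights sum to 1

print(w)  # misclassified examples now carry more weight
```

After normalization the misclassified examples carry a weight of roughly 0.37 each versus 0.13 for the correct ones, so the next weak learner is pushed toward the hard cases.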

Algorithm
Example of the Step 2 loop, repeated for m = 1, ..., M.

Algorithm
Step 3: Decision - combine the weak learners and apply the result to a new element x:

    Y(x) = sign( Σ_{m=1}^{M} α_m · y_m(x) )
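Steps 1-3 can be put together in a compact from-scratch sketch using decision stumps as base classifiers (function names and the exhaustive threshold search are my own choices, not the slides' implementation):

```python
import numpy as np

def adaboost_fit(X, t, M=20):
    """Train AdaBoost with decision stumps. Labels t must be in {-1, +1}."""
    N = len(t)
    w = np.full(N, 1.0 / N)                      # Step 1: uniform weights
    ensemble = []
    for _ in range(M):
        # Step 2a: pick the stump (feature j, threshold, sign) with minimum weighted error.
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = np.where(X[:, j] < thr, sign, -sign)
                    err = w[pred != t].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-12)                    # guard against log(0)
        # Step 2b: classifier weight alpha_m = 0.5 * ln((1 - eps) / eps)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, j] < thr, sign, -sign)
        # Step 2c: reweight examples; misclassified points gain weight.
        w = w * np.exp(-alpha * t * pred)
        w = w / w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def adaboost_predict(ensemble, X):
    # Step 3: sign of the alpha-weighted vote of all weak learners.
    score = np.zeros(len(X))
    for alpha, j, thr, sign in ensemble:
        score += alpha * np.where(X[:, j] < thr, sign, -sign)
    return np.sign(score)

# Toy 1-D dataset: positives below 3, negatives at 3 and above.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
t = np.array([1, 1, 1, -1, -1, -1])
ens = adaboost_fit(X, t, M=5)
print(adaboost_predict(ens, X))
```

The brute-force stump search is O(features × thresholds × examples) per round, fine for a demo; real implementations sort each feature once and scan thresholds in a single pass.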
