Classic AdaBoost Classifier
This is a classic AdaBoost implementation in a single file with easily understandable code.
The function consists of two parts: a simple weak classifier and a boosting part.
The weak classifier tries to find the best threshold in one of the data dimensions to separate the data into the two classes -1 and 1.
The boosting part calls the weak classifier iteratively; after every classification step it increases the weights of the misclassified examples. This creates a cascade of "weak classifiers" which behaves like a single "strong classifier".
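The following is a minimal sketch of these two parts in MATLAB, written for illustration only; the function, variable, and field names are assumptions and do not reproduce the exact code of adaboost.m in this submission.

function model = adaboost_sketch(X, y, itt)
% X   : number_samples x number_features data matrix
% y   : column vector with class labels -1 or 1
% itt : number of boosting iterations
n = size(X,1);
w = ones(n,1) / n;                              % start with uniform sample weights
for t = 1:itt
    [dim, thr, pol, err] = beststump(X, y, w);  % train one weak classifier
    err = max(err, eps);                        % guard against division by zero
    alpha = 0.5 * log((1 - err) / err);         % weight of this weak classifier
    pred = pol * ((X(:,dim) >= thr) * 2 - 1);   % its -1/1 predictions
    w = w .* exp(-alpha * y .* pred);           % increase weights of misclassified samples
    w = w / sum(w);                             % renormalize
    model(t) = struct('dim',dim,'thr',thr,'pol',pol,'alpha',alpha); %#ok<AGROW>
end
end

function [bestdim, bestthr, bestpol, besterr] = beststump(X, y, w)
% Decision stump: find the feature dimension, threshold and polarity with
% the lowest weighted classification error.
besterr = inf; bestdim = 1; bestthr = 0; bestpol = 1;
for d = 1:size(X,2)
    for thr = unique(X(:,d))'                   % candidate thresholds
        for pol = [-1 1]
            pred = pol * ((X(:,d) >= thr) * 2 - 1);
            e = sum(w(pred ~= y));              % weighted error of this stump
            if e < besterr
                besterr = e; bestdim = d; bestthr = thr; bestpol = pol;
            end
        end
    end
end
end

The combined ("strong") decision is then the sign of the alpha-weighted sum of the individual stump outputs, which is what the apply mode computes from the stored model.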
Training mode:
[estimateclass,model]=adaboost('train',datafeatures,dataclass,itt)
Apply mode:
estimateclass=adaboost('apply',datafeatures,model)
Inputs/outputs:
datafeatures : An array of size number_samples x number_features
dataclass : An array with the class of each example; the class can be -1 or 1
itt : The number of training iterations
model : A struct with the cascade of weak classifiers
estimateclass : The data as classified by the AdaBoost model
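For example, assuming adaboost.m from this submission is on the MATLAB path, training and applying the classifier on a small synthetic two-class data set could look like this (the data set itself is only for illustration):

% Two Gaussian clouds as a toy two-class problem
datafeatures = [randn(50,2) + 1; randn(50,2) - 1];   % 100 samples x 2 features
dataclass    = [ones(50,1); -ones(50,1)];            % class labels 1 and -1
itt          = 25;                                   % number of training iterations

% Training mode: learn the cascade of weak classifiers
[estimateclass, model] = adaboost('train', datafeatures, dataclass, itt);

% Apply mode: classify new data with the trained model
testfeatures = [randn(10,2) + 1; randn(10,2) - 1];
testestimate = adaboost('apply', testfeatures, model);

% Fraction of correctly classified training samples
disp(mean(estimateclass(:) == dataclass(:)));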
Please leave a comment if you like the code, find a bug, or have a suggestion.
Cite As
Dirk-Jan Kroon (2024). Classic AdaBoost Classifier (https://www.mathworks.com/matlabcentral/fileexchange/27813-classic-adaboost-classifier), MATLAB Central File Exchange. Retrieved .
Version | Release Notes
---|---
1.5.0.0 | Fixed boundary bug
1.4.0.0 | Speed improvement (replaced loops by 1D indexing and bsxfun operations)
1.3.0.0 | Bug fix: ndims(datafeatures) changed to size(datafeatures,2)
1.2.0.0 | Solved division by zero causing NaN
1.1.0.0 | Changed screenshot and example figure
1.0.0.0 |