Support Vector Regression
On-line regression

On-line learning algorithms are not restricted to classification problems. The update rule of the kernel adatron algorithm also suggests a general methodology for deriving on-line versions of the underlying optimisation problems.
The first update of the kernel adatron algorithm is equivalent to

αᵢ ← αᵢ + ∂W(α)/∂αᵢ,

making it a simple gradient ascent algorithm, augmented with corrections to ensure that the additional constraints are satisfied. The same approach can, for example, be applied to the linear ε-insensitive loss version of the support vector regression algorithm to obtain an on-line regression algorithm.
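A minimal sketch of such an on-line regression update, written in Python/NumPy purely for illustration (the submission itself is MATLAB). It performs per-example (sub)gradient ascent on the SVR dual with the linear ε-insensitive loss, working with βᵢ = αᵢ − αᵢ*; the bias term and its equality constraint are dropped for simplicity, and η, C, ε are illustrative values:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel over the rows of X
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def online_svr(K, y, eps=0.1, C=10.0, eta=0.1, epochs=500):
    """On-line gradient ascent on the SVR dual with linear eps-insensitive loss.

    Each visit to example i takes a (sub)gradient step on W(beta) and clips
    beta_i back into the box [-C, C]. The bias term (and the associated
    sum-to-zero constraint) is omitted to keep the sketch short.
    """
    beta = np.zeros(len(y))
    for _ in range(epochs):
        for i in range(len(y)):
            f_i = K[i] @ beta                           # current output on x_i
            grad = y[i] - eps * np.sign(beta[i]) - f_i  # dW/dbeta_i (subgradient)
            beta[i] = np.clip(beta[i] + eta * grad, -C, C)
    return beta
```

After training, predictions take the form f(x) = Σᵢ βᵢ K(xᵢ, x); examples whose residual stays inside the ε-tube end up with βᵢ = 0 and drop out of the expansion.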
One of the advantages of Support Vector Machines, and of Support Vector Regression as part of that family, is that the difficulties of working with linear functions in a high-dimensional feature space are avoided: the optimisation problem is transformed into a dual convex quadratic programme. In the regression case, the loss function penalises only errors greater than a threshold ε. Such loss functions usually lead to a sparse representation of the decision rule, giving significant algorithmic and representational advantages.
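To make the role of the threshold concrete, here is a minimal Python/NumPy sketch of the linear ε-insensitive loss (again illustrative only, not the MATLAB submission): residuals inside the ε-tube cost nothing, which is what produces the sparsity mentioned above.

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    # Linear eps-insensitive loss: max(0, |y - f(x)| - eps).
    # Errors smaller than the threshold eps are ignored entirely.
    resid = np.abs(np.asarray(y_true) - np.asarray(y_pred))
    return np.maximum(0.0, resid - eps)
```

Examples whose residual never exceeds ε contribute zero loss, receive zero dual coefficients, and so vanish from the final decision rule.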
Reference:
Kernel Methods for Pattern Analysis by John Shawe-Taylor & Nello Cristianini
http://kernelsvm.tripod.com/
Cite As
Bhartendu (2024). Support Vector Regression (https://www.mathworks.com/matlabcentral/fileexchange/63060-support-vector-regression), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Platform Compatibility
Windows macOS Linux

Categories
- AI and Statistics > Statistics and Machine Learning Toolbox > Dimensionality Reduction and Feature Extraction
Version | Published | Release Notes
---|---|---
1.0.0.0 | |