Discard support vectors
mdlOut = discardSupportVectors(mdl)
returns the trained, linear support vector machine (SVM) regression model mdlOut, which is similar to the trained, linear SVM regression model mdl, except:
The Alpha and SupportVectors properties are empty ([]).
If you display mdlOut, the software lists the Beta property instead of the Alpha property.
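For example, the following minimal sketch (using the carsmall sample data set; the choice of predictors is illustrative) trains a linear SVM regression model, discards its support vectors, and confirms that the Alpha and SupportVectors properties of the result are empty:

load carsmall
X = [Horsepower Weight];            % predictor data
Y = MPG;                            % response
mdl = fitrsvm(X,Y);                 % linear kernel by default
mdlOut = discardSupportVectors(mdl);
isempty(mdlOut.Alpha)               % returns logical 1
isempty(mdlOut.SupportVectors)      % returns logical 1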
For a trained, linear SVM regression model, the SupportVectors property is an nsv-by-p matrix, where nsv is the number of support vectors (at most the training sample size) and p is the number of predictor variables. If any of the predictors are categorical, then p includes the number of dummy variables necessary to account for all of the categorical predictor levels. The Alpha property is a vector with nsv elements. The SupportVectors and Alpha properties can be large for complex data sets that contain many observations. However, the Beta property is a vector with only p elements, which can be considerably smaller.
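Continuing the sketch above, you can inspect these dimensions directly; the exact values depend on the trained model:

size(mdl.SupportVectors)   % nsv-by-p
size(mdl.Alpha)            % nsv-by-1
size(mdl.Beta)             % p-by-1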
You can use a trained SVM regression model to predict response values even if you discard the support vectors, because the predict and resubPredict methods use Beta to compute the predicted responses.
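For instance, assuming mdl and mdlOut from the earlier sketch, the predictions before and after discarding the support vectors should agree to within floating-point round-off:

yfitFull = predict(mdl,X);
yfitDiscarded = predict(mdlOut,X);
max(abs(yfitFull - yfitDiscarded))   % effectively zero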
If the trained, linear SVM regression model has many support vectors, use discardSupportVectors to reduce the amount of disk space that the model consumes. You can display the size of the support vector matrix by entering size(mdl.SupportVectors).
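As a rough check on the savings (again assuming the variables from the sketch above), you can compare the in-memory footprint of the two models:

vars = whos('mdl','mdlOut');
[vars.bytes]                 % mdlOut is typically the smaller of the two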
The predict and resubPredict methods estimate response values using the formula

f(X) = (X/S)β + β0

where:
β is the Beta value, stored as mdl.Beta.
β0 is the bias value, stored as mdl.Bias.
X is the training data.
S is the kernel scale value, stored as mdl.KernelParameters.Scale.

In this way, the software can use the value of mdl.Beta to make predictions even after discarding the support vectors.
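The following sketch evaluates this formula directly (assuming mdl, mdlOut, and X from the earlier example) and compares the result with predict; the two should match up to round-off:

S = mdl.KernelParameters.Scale;
f = (X/S)*mdl.Beta + mdl.Bias;   % linear prediction from Beta and Bias
yfit = predict(mdlOut,X);
max(abs(f - yfit))               % effectively zero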
CompactRegressionSVM | fitrsvm | predict | RegressionSVM | resubPredict