cvloss
Classification error by cross-validation for classification tree model
Description
E = cvloss(tree) returns the cross-validated classification error (loss) for the trained
classification tree model tree. The cvloss function uses
stratified partitioning to create cross-validated sets. That is, for each fold, each
partition of the data has roughly the same class proportions as in the data used to train
tree.
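A minimal sketch of the basic syntax, assuming the fisheriris sample data set that ships with Statistics and Machine Learning Toolbox:

load fisheriris                        % predictor measurements and species labels
tree = fitctree(meas,species);         % train a classification tree
rng(1)                                 % for a reproducible cross-validation partition
E = cvloss(tree)                       % cross-validated classification error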
E = cvloss(tree,Name=Value) specifies additional options using one or more name-value arguments. For example, you can
specify the pruning level, tree size, and number of cross-validation samples.
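As a hedged sketch of the name-value syntax, the Subtrees, TreeSize, and KFold names used below are assumed to correspond to the pruning level, tree size, and number of cross-validation samples; see the Name-Value Arguments section for the exact names and defaults.

load fisheriris
tree = fitctree(meas,species);
rng(1)                                 % reproducible partitions
% Cross-validate over all pruning levels, pick the tree size by the
% one-standard-error rule, and use 5 folds instead of the default.
[E,SE,Nleaf,BestLevel] = cvloss(tree,Subtrees="all",TreeSize="se",KFold=5)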
Examples
Input Arguments
Name-Value Arguments
Output Arguments
Alternatives
You can construct a cross-validated tree model with crossval, and call
kfoldLoss instead of cvloss. If you are going to
examine the cross-validated tree more than once, then the alternative can save time.
However, unlike cvloss, kfoldLoss does not return SE, Nleaf, or
BestLevel. kfoldLoss also does not
allow you to examine any error other than the classification error.
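A minimal sketch of that alternative workflow, again assuming the fisheriris sample data set:

load fisheriris
tree = fitctree(meas,species);
rng(1)                                 % reproducible partition
cvtree = crossval(tree,KFold=10);      % cross-validated model with 10 folds
E = kfoldLoss(cvtree)                  % classification error, reusable without repartitioning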
Extended Capabilities
Version History
Introduced in R2011a
See Also
fitctree | crossval | loss | kfoldLoss | ClassificationTree

