Resume training ensemble
ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)
ens1 = resume(ens,nlearn) trains the ensemble ens for nlearn more cycles. resume uses the same training options that fitcensemble used to create ens, except for parallel training options. If you want to resume training in parallel, pass the 'Options' name-value pair. You cannot resume training when ens is a Subspace ensemble created with 'AllPredictorCombinations' number of learners.
ens1 = resume(ens,nlearn,Name,Value) trains ens with additional options specified by one or more Name,Value pair arguments.
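For example, the following sketch (assuming the ionosphere data set that ships with Statistics and Machine Learning Toolbox; the method and cycle counts are arbitrary choices for illustration) shows that the resumed ensemble keeps the original training options and simply accumulates more weak learners:

load ionosphere
ens = fitcensemble(X,Y,'Method','AdaBoostM1','NumLearningCycles',10);
ens.NumTrained          % 10 weak learners after the initial training
ens1 = resume(ens,5);   % train 5 more cycles with the same options
ens1.NumTrained         % 15 weak learners in the resumed ensemble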
ens is a classification ensemble created with fitcensemble.
nlearn is a positive integer, the number of cycles for additional training of ens.
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is
the argument name and Value is the corresponding value.
Name-value arguments must appear after other arguments, but the order of the
pairs does not matter.
Before R2021a, use commas to separate each name and value, and enclose
Name in quotes.
NPrint: Printout frequency, a positive integer scalar or 'off' (no printouts).
For fastest training of some boosted decision trees, set NPrint to the default value 'off'.
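For example, a sketch of requesting a printout every iteration while resuming an existing ensemble ens (the NPrint=1 form requires R2021a or later; the comma-separated form works in all releases):

ens1 = resume(ens,10,NPrint=1);     % display a progress message after each added learner
ens1 = resume(ens,10,'NPrint',1);   % equivalent pre-R2021a syntax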
Options: Options for computing in parallel and setting random numbers, specified as a structure. Create
the Options structure using statset.
You need Parallel Computing Toolbox™ to compute in parallel.
You can use the same parallel options for resume as for fitcensemble.
For dual-core systems and above, the software parallelizes training using Intel Threading Building Blocks (TBB), so specifying 'UseParallel' as true might not provide a significant speedup on a single computer.
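A minimal sketch (assuming ens is a bagged tree ensemble and a parallel pool is available) of an options structure that requests parallel, reproducible computation:

s = RandStream('mlfg6331_64');   % stream type that supports substreams
opts = statset('UseParallel',true,'UseSubstreams',true,'Streams',s);
ens1 = resume(ens,20,'Options',opts);   % resume 20 cycles in parallel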
ens1 is the classification ensemble ens, augmented with additional training.
Train Classification Ensemble for Additional Cycles
Train a classification ensemble for three cycles, and compare the resubstitution error obtained after training the ensemble for more cycles.
Load the ionosphere data set.
load ionosphere
Train a classification ensemble for three cycles and examine the resubstitution error.
ens = fitcensemble(X,Y,'Method','GentleBoost','NumLearningCycles',3);
L = resubLoss(ens)
L = 0.0085
Train for three more cycles and examine the new resubstitution error.
ens1 = resume(ens,3);
L = resubLoss(ens1)
L = 0
The resubstitution error is much lower in the new ensemble than in the original.
Automatic Parallel Support
Accelerate code by automatically running computation in parallel using Parallel Computing Toolbox™.
resume supports parallel training using the
'Options' name-value argument. Create options using
statset, such as
options = statset('UseParallel',true).
Parallel ensemble training requires you to set the 'Method' name-value argument to
'Bag' when you create the ensemble. Parallel training is available only for tree learners, the
default type for 'Bag'.
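Putting these requirements together, a sketch of resuming a bagged tree ensemble in parallel (assuming Parallel Computing Toolbox and the ionosphere data set are available; the cycle counts are arbitrary):

load ionosphere
ens = fitcensemble(X,Y,'Method','Bag','NumLearningCycles',50);
opts = statset('UseParallel',true);
ens1 = resume(ens,50,'Options',opts);   % grow 50 more bagged trees in parallel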
GPU Arrays
Accelerate code by running on a graphics processing unit (GPU) using Parallel Computing Toolbox™.
This function fully supports GPU arrays. For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).
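A minimal sketch (assuming a supported GPU and that your ensemble method and learners accept gpuArray inputs):

load ionosphere
gX = gpuArray(X);       % move the predictor data to the GPU
ens = fitcensemble(gX,Y,'Method','GentleBoost','NumLearningCycles',3);
ens1 = resume(ens,3);   % resumed training also runs on the GPU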