
resume

Resume training of generalized additive model (GAM)

Since R2021a

    Description

    UpdatedMdl = resume(Mdl,numTrees) returns an updated generalized additive model UpdatedMdl by training Mdl for numTrees more iterations with the same options used to train Mdl.

    For each iteration, resume trains one predictor tree per linear term or one interaction tree per interaction term.

    • If Mdl contains only linear terms for predictors (predictor trees), then resume trains numTrees additional trees per predictor.

    • If Mdl contains both linear and interaction terms for predictors (predictor trees and interaction trees), then resume trains numTrees additional trees per interaction term.

    resume does not add new terms to the model. If you want to add interaction terms to a model that contains only linear terms, use the addInteractions function and then call resume again; see the sketch after the syntax descriptions.

    UpdatedMdl = resume(Mdl,numTrees,Name,Value) specifies additional options using one or more name-value arguments. For example, 'Verbose',2 specifies the verbosity level as 2 to display diagnostic messages at every iteration.
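
    The following lines are a minimal sketch of the addInteractions-then-resume workflow mentioned above, not an example from this page. They assume a ClassificationGAM or RegressionGAM object Mdl that currently contains only linear terms, and that requesting 5 interaction terms (the numeric syntax of addInteractions) suits your data.

    MdlWithInteractions = addInteractions(Mdl,5);           % add 5 interaction terms (assumed numeric syntax)
    MdlWithInteractions = resume(MdlWithInteractions,100);  % grow 100 more trees per interaction term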

    Examples

    Train a univariate classification GAM (which contains only linear terms) for a small number of iterations. After training the model for more iterations, compare the resubstitution loss.

    Load the ionosphere data set. This data set has 34 predictors and 351 binary responses for radar returns, either bad ('b') or good ('g').

    load ionosphere

    Train a univariate GAM that identifies whether the radar return is bad ('b') or good ('g'). Specify the number of trees per linear term as 2. fitcgam iterates the boosting algorithm for the specified number of iterations. For each boosting iteration, the function adds one tree per linear term. Specify 'Verbose' as 2 to display diagnostic messages at every iteration.

    Mdl = fitcgam(X,Y,'NumTreesPerPredictor',2,'Verbose',2);
    |========================================================|
    | Type | NumTrees |  Deviance  |   RelTol   | LearnRate  |
    |========================================================|
    |    1D|         0|      486.59|      -     |      -     |
    |    1D|         1|      166.71|         Inf|           1|
    |    1D|         2|      78.336|     0.58205|           1|
    

    To check whether fitcgam trains the specified number of trees, display the ReasonForTermination property of the trained model and view the displayed message.

    Mdl.ReasonForTermination
    ans = struct with fields:
          PredictorTrees: 'Terminated after training the requested number of trees.'
        InteractionTrees: ''
    
    

    Compute the classification loss for the training data.

    resubLoss(Mdl)
    ans = 0.0142
    

    Resume training the model for another 100 iterations. Because Mdl contains only linear terms, the resume function resumes training for the linear terms and adds more trees for them (predictor trees). Specify 'Verbose' and 'NumPrint' to display diagnostic messages every 10 iterations.

    UpdatedMdl = resume(Mdl,100,'Verbose',1,'NumPrint',10);
    |========================================================|
    | Type | NumTrees |  Deviance  |   RelTol   | LearnRate  |
    |========================================================|
    |    1D|         0|      78.336|      -     |      -     |
    |    1D|         1|      38.364|     0.17429|           1|
    |    1D|        10|     0.16311|    0.011894|           1|
    |    1D|        20|  0.00035693|   0.0025178|           1|
    |    1D|        30|  8.1191e-07|   0.0011006|           1|
    |    1D|        40|  1.7978e-09|  0.00074607|           1|
    |    1D|        50|  3.6113e-12|  0.00034404|           1|
    |    1D|        60|  1.7497e-13|  0.00016541|           1|
    
    UpdatedMdl.ReasonForTermination
    ans = struct with fields:
          PredictorTrees: 'Unable to improve the model fit.'
        InteractionTrees: ''
    
    

    resume terminates training when adding more trees does not improve the deviance of the model fit.

    Compute the classification loss using the updated model.

    resubLoss(UpdatedMdl)
    ans = 0
    

    The classification loss decreases after resume updates the model with more iterations.
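
    Resubstitution loss is computed on the training data and can be optimistic. As a hedged variation on this example (not part of the original workflow), you can hold out a test set and compare the loss before and after resuming; cvpartition, training, test, and loss are standard Statistics and Machine Learning Toolbox functions.

    rng(0)                                    % for reproducibility
    c = cvpartition(Y,'Holdout',0.3);         % stratified 70%/30% training/test split
    MdlPart = fitcgam(X(training(c),:),Y(training(c)),'NumTreesPerPredictor',2);
    MdlPartUpdated = resume(MdlPart,100);     % resume training on the same training subset
    lossBefore = loss(MdlPart,X(test(c),:),Y(test(c)))
    lossAfter = loss(MdlPartUpdated,X(test(c),:),Y(test(c)))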

    Train a regression GAM that contains both linear and interaction terms, and train the interaction terms for only a small number of iterations. After training the interaction terms for more iterations, compare the resubstitution loss.

    Load the carbig data set, which contains measurements of cars made in the 1970s and early 1980s.

    load carbig

    Specify Acceleration, Displacement, Horsepower, and Weight as the predictor variables (X) and MPG as the response variable (Y).

    X = [Acceleration,Displacement,Horsepower,Weight];
    Y = MPG;

    Train a GAM that includes all available linear and interaction terms in X. Specify the number of trees per interaction term as 2. fitrgam iterates the boosting algorithm 300 times (the default) for linear terms, and iterates the algorithm the specified number of times for interaction terms. For each boosting iteration, the function adds one tree per linear term or one tree per interaction term. Specify 'Verbose' as 1 to display diagnostic messages every 10 iterations.

    Mdl = fitrgam(X,Y,'Interactions','all','NumTreesPerInteraction',2,'Verbose',1);
    |========================================================|
    | Type | NumTrees |  Deviance  |   RelTol   | LearnRate  |
    |========================================================|
    |    1D|         0|  2.4432e+05|      -     |      -     |
    |    1D|         1|      9507.4|         Inf|           1|
    |    1D|        10|      4470.6|  0.00025206|           1|
    |    1D|        20|      3895.3|  0.00011448|           1|
    |    1D|        30|      3617.7|  3.5365e-05|           1|
    |    1D|        40|      3402.5|  3.7992e-05|           1|
    |    1D|        50|      3257.1|  2.4983e-05|           1|
    |    1D|        60|      3131.8|  2.3873e-05|           1|
    |    1D|        70|      3019.8|  2.2967e-05|           1|
    |    1D|        80|      2925.9|  2.8071e-05|           1|
    |    1D|        90|      2845.3|  1.6811e-05|           1|
    |    1D|       100|      2772.7|   1.852e-05|           1|
    |    1D|       110|      2707.8|  1.6754e-05|           1|
    |    1D|       120|      2649.8|   1.651e-05|           1|
    |    1D|       130|      2596.6|  1.1723e-05|           1|
    |    1D|       140|      2547.4|   1.813e-05|           1|
    |    1D|       150|      2501.1|  1.8659e-05|           1|
    |    1D|       160|      2455.7|   1.386e-05|           1|
    |    1D|       170|      2416.9|  1.0615e-05|           1|
    |    1D|       180|      2377.2|   8.534e-06|           1|
    |    1D|       190|        2339|  7.6771e-06|           1|
    |    1D|       200|      2303.3|  9.5866e-06|           1|
    |    1D|       210|      2270.7|  8.4276e-06|           1|
    |    1D|       220|      2240.1|  8.5778e-06|           1|
    |    1D|       230|      2209.2|  9.6761e-06|           1|
    |    1D|       240|      2178.7|  7.0622e-06|           1|
    |    1D|       250|      2150.3|  8.3082e-06|           1|
    |    1D|       260|      2122.3|  7.9542e-06|           1|
    |    1D|       270|      2097.7|  7.6328e-06|           1|
    |    1D|       280|      2070.4|  9.4322e-06|           1|
    |    1D|       290|      2044.3|  7.5722e-06|           1|
    |    1D|       300|      2019.7|  6.6719e-06|           1|
    |========================================================|
    | Type | NumTrees |  Deviance  |   RelTol   | LearnRate  |
    |========================================================|
    |    2D|         0|      2019.7|      -     |      -     |
    |    2D|         1|      1795.5|   0.0005975|           1|
    |    2D|         2|      1523.4|   0.0010079|           1|
    

    To check whether fitrgam trains the specified number of trees, display the ReasonForTermination property of the trained model and view the displayed messages.

    Mdl.ReasonForTermination
    ans = struct with fields:
          PredictorTrees: 'Terminated after training the requested number of trees.'
        InteractionTrees: 'Terminated after training the requested number of trees.'
    
    

    Compute the regression loss for the training data.

    resubLoss(Mdl)
    ans = 3.8277
    

    Resume training the model for another 100 iterations. Because Mdl contains both linear and interaction terms, the resume function resumes training for the interaction terms and adds more trees for them (interaction trees).

    UpdatedMdl = resume(Mdl,100);
    |========================================================|
    | Type | NumTrees |  Deviance  |   RelTol   | LearnRate  |
    |========================================================|
    |    2D|         0|      1523.4|      -     |      -     |
    |    2D|         1|      1363.9|  0.00039695|           1|
    |    2D|        10|      594.04|  8.0295e-05|           1|
    |    2D|        20|      359.44|  4.3201e-05|           1|
    |    2D|        30|      238.51|  2.6869e-05|           1|
    |    2D|        40|      153.98|  2.6271e-05|           1|
    |    2D|        50|      91.464|  8.0936e-06|           1|
    |    2D|        60|      61.882|  3.8528e-06|           1|
    |    2D|        70|      43.206|  5.9888e-06|           1|
    
    UpdatedMdl.ReasonForTermination
    ans = struct with fields:
          PredictorTrees: 'Terminated after training the requested number of trees.'
        InteractionTrees: 'Unable to improve the model fit.'
    
    

    resume terminates training when adding more trees does not improve the deviance of the model fit.

    Compute the regression loss using the updated model.

    resubLoss(UpdatedMdl)
    ans = 0.0944
    

    The regression loss decreases after resume updates the model with more iterations.
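
    As an optional follow-up (not part of the original example), you can visualize how closely the updated model fits the training data by plotting predicted MPG against observed MPG. A tight diagonal reflects the near-zero resubstitution loss, which can also be a sign of overfitting to the training data.

    yhat = predict(UpdatedMdl,X);      % predicted MPG from the updated GAM
    plot(Y,yhat,'.')
    hold on
    plot([0 50],[0 50],'k--')          % reference line: predicted = observed
    hold off
    xlabel('Observed MPG')
    ylabel('Predicted MPG')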

    Input Arguments

    Mdl — Generalized additive model, specified as a ClassificationGAM or RegressionGAM model object.

    numTrees — Number of trees to add, specified as a positive integer scalar.

    Data Types: single | double

    Name-Value Arguments

    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

    Example: 'Verbose',1,'NumPrint',100 specifies to print diagnostic messages in the Command Window every 100 iterations.

    NumPrint — Number of iterations between diagnostic message printouts, specified as a nonnegative integer scalar. This argument is valid only when you specify 'Verbose' as 1.

    If you specify 'Verbose',1 and 'NumPrint',numPrint, then the software displays diagnostic messages every numPrint iterations in the Command Window.

    The default value is Mdl.ModelParameters.NumPrint, which is the NumPrint value that you specify when creating the GAM object Mdl.

    Example: 'NumPrint',500

    Data Types: single | double

    Verbose — Verbosity level, specified as 0, 1, or 2. The Verbose value controls the amount of information that the software displays in the Command Window.

    This table summarizes the available verbosity level options.

    Value | Description
    ------|------------------------------------------------------------------
    0     | The software displays no information.
    1     | The software displays diagnostic messages every numPrint iterations, where numPrint is the 'NumPrint' value.
    2     | The software displays diagnostic messages at every iteration.

    Each line of the diagnostic messages shows the information about each boosting iteration and includes the following columns:

    • Type — Type of trained trees, 1D (predictor trees, or boosted trees for linear terms for predictors) or 2D (interaction trees, or boosted trees for interaction terms for predictors)

    • NumTrees — Number of trees per linear term or interaction term that resume added to the model so far

    • Deviance — Deviance of the model

    • RelTol — Relative change in model predictions: sqrt((ŷ_k − ŷ_{k−1})'(ŷ_k − ŷ_{k−1}) / (ŷ_k'ŷ_k)), where ŷ_k is a column vector of model predictions at iteration k (see the sketch after this list)

    • LearnRate — Learning rate used for the current iteration
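
    The following lines are a minimal sketch of the RelTol computation, under the assumption that it follows the square-root formula in the RelTol item above. The two prediction vectors are illustrative values only, not output from a trained model.

    yhatPrev = [1.0; 2.0; 3.0];                   % model predictions at iteration k-1 (made up)
    yhatCurr = [1.1; 1.9; 3.2];                   % model predictions at iteration k (made up)
    d = yhatCurr - yhatPrev;
    relTol = sqrt((d'*d)/(yhatCurr'*yhatCurr))    % relative change in predictions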

    The default value is Mdl.ModelParameters.VerbosityLevel, which is the Verbose value that you specify when creating the GAM object Mdl.

    Example: 'Verbose',1

    Data Types: single | double

    Output Arguments

    UpdatedMdl — Updated generalized additive model, returned as a ClassificationGAM or RegressionGAM model object. UpdatedMdl has the same object type as the input model Mdl.

    To overwrite the input argument Mdl, assign the output of resume to Mdl:

    Mdl = resume(Mdl,numTrees);

    More About

    Deviance

    Deviance is a generalization of the residual sum of squares. It measures the goodness of fit compared to the saturated model.

    The deviance of a fitted model is twice the difference between the loglikelihoods of the model and the saturated model:

    -2(logL - logLs),

    where L and Ls are the likelihoods of the fitted model and the saturated model, respectively. The saturated model is the model with the maximum number of parameters that you can estimate.

    resume uses the deviance to measure the goodness of model fit and finds a learning rate that reduces the deviance at each iteration. Specify 'Verbose' as 1 or 2 to display the deviance and learning rate in the Command Window.
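
    As a hedged illustration (an assumption, not a statement from this page): for a regression GAM with a Gaussian error model, the deviance reduces to the residual sum of squares, so the final Deviance value printed during training should be close to the resubstitution mean squared error scaled by the number of training observations.

    % Assumes Mdl is the RegressionGAM from the carbig example, and that
    % resubLoss returns mean squared error by default for regression models.
    approxDeviance = resubLoss(Mdl)*Mdl.NumObservations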

    Version History

    Introduced in R2021a