
findopOptions

Set options for finding operating points from specifications

Description

options = findopOptions returns the default operating point search options.


options = findopOptions(Name,Value) returns an option set with additional options specified by one or more Name,Value pair arguments. Use this option set to specify options for the findop command.

Examples


Create an option set for operating point search that sets the optimizer type to gradient descent and suppresses the display output of findop.

options = findopOptions('OptimizerType','graddescent','DisplayReport','off');

Alternatively, use dot notation to set the values of options.

options = findopOptions;
options.OptimizerType = 'graddescent';
options.DisplayReport = 'off';
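
You can then pass the option set to findop when trimming a model. The following sketch assumes a Simulink model and an operating point specification created with operspec; the model name 'myModel' is a placeholder, not a shipped example model.

opspec = operspec('myModel');                     % operating point specification ('myModel' is a placeholder)
options = findopOptions('OptimizerType','graddescent','DisplayReport','off');
op = findop('myModel',opspec,options);            % trim the model using the custom options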

Input Arguments


Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: 'DisplayReport','off' suppresses the display of the operating point search report to the Command Window.

Optimizer type used by the optimization algorithm, specified as the comma-separated pair consisting of 'OptimizerType' and one of the following:

  • 'graddescent-elim' — Enforce an equality constraint to force the time derivatives of states to be zero (dx/dt = 0, x(k+1) = x(k)) and output signals to be equal to their specified known values. The optimizer fixes the states, x, and inputs, u, that are marked as Known in an operating point specification, and optimizes the remaining variables.

  • 'graddescent' — Enforce an equality constraint to force the time derivatives of states to be zero (dx/dt = 0, x(k+1) = x(k)) and the output signals to be equal to their specified known values. The optimizer also minimizes the error between the states, x, and inputs, u, and their respective known values from an operating point specification. If no inputs or states are marked as Known, findop attempts to minimize the deviation between the initial guesses for x and u and their trimmed values.

  • 'graddescent-proj' — In addition to 'graddescent', enforce consistency of model initial conditions at each function evaluation. To specify whether constraints are hard or soft, use the ConstraintType option. This optimization method does not support analytical Jacobians.

  • 'lsqnonlin' — Fix the states, x, and inputs, u, marked as Known in an operating point specification, and optimize the remaining variables. The algorithm tries to minimize both the error in the time derivatives of the states (dx/dt = 0, x(k+1) = x(k)) and the error between the outputs and their specified known values.

  • 'lsqnonlin-proj' — In addition to 'lsqnonlin', enforce consistency of model initial conditions at each function evaluation. This optimization method does not support analytical Jacobians.

  • 'simplex' — Use the same cost function as 'lsqnonlin' with the direct search optimization routine found in fminsearch.

For more information about these optimization algorithms, see fmincon (Optimization Toolbox), lsqnonlin (Optimization Toolbox), and fminsearch.
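
For example, to trim with the nonlinear least-squares optimizer, you might create the option set as follows (a minimal sketch):

options = findopOptions('OptimizerType','lsqnonlin');   % use the lsqnonlin-based search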

Options for the optimization algorithm, specified as the comma-separated pair consisting of 'OptimizationOptions' and a structure created using the optimset (Optimization Toolbox) function.
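
For example, the following sketch uses optimset to raise the iteration limit and tighten the function tolerance of the underlying optimizer; the specific values are illustrative only.

opt = optimset('MaxIter',2000,'TolFun',1e-8);            % illustrative optimizer settings
options = findopOptions('OptimizationOptions',opt);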

Flag indicating whether to display the operating point summary report, specified as the comma-separated pair consisting of 'DisplayReport' and one of the following:

  • 'on' — Display the operating point summary report in the MATLAB® Command Window when running findop.

  • 'off' — Suppress display of the summary report.

  • 'iter' — Display an iterative update of the optimization progress.
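
For example, to follow the optimization progress at each iteration (a minimal sketch):

options = findopOptions('DisplayReport','iter');         % print an iterative progress report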

Flag indicating whether to recompile the model when varying parameter values for trimming, specified as the comma-separated pair consisting of 'AreParamsTunable' and one of the following:

  • true — Do not recompile the model when all varying parameters are tunable. If any varying parameters are not tunable, recompile the model for each parameter grid point, and issue a warning message.

  • false — Recompile the model for each parameter grid point. Use this option when you vary the values of nontunable parameters.
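
For example, if you trim the model over a grid of nontunable parameter values, you might force recompilation at each grid point (a minimal sketch):

options = findopOptions('AreParamsTunable',false);       % recompile for every parameter grid point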

Constraint types for the 'graddescent-proj' optimizer, specified as the comma-separated pair consisting of 'ConstraintType' and a structure with the following fields:

  • dx — Type for constraints on state derivatives

  • x — Type for constraints on state values

  • y — Type for constraints on output values

Specify each constraint as one of the following:

  • 'hard' — Enforce the constraints to be zero.

  • 'soft' — Minimize the constraints.

All constraint types are 'hard' by default.
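
For example, the following sketch keeps the state and derivative constraints hard but relaxes the output constraints to soft for use with the 'graddescent-proj' optimizer:

ctype = struct('dx','hard','x','hard','y','soft');       % constraint-type structure
options = findopOptions('OptimizerType','graddescent-proj','ConstraintType',ctype);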

Output Arguments


Trimming options, returned as a findopOptions option set.

Version History

Introduced in R2013b


See Also

findop