How do I help quadprog converge?

Quadprog quits with exitFlag = -2, and the message below. How do I help it converge to a feasible point?
"quadprog optimization failed: Converged to an infeasible point.
quadprog stopped because the size of the current step is less than the default value of the step size tolerance but constraints are not satisfied to within the selected value of the constraint tolerance.
Stopping criteria details:
Optimization stopped because the relative changes in all elements of x are less than options.StepTolerance = 1.000000e-12, but the relative maximum constraint violation, 1.902013e-14, exceeds options.ConstraintTolerance = 1.000000e-06.
Optimization Metric                              Options
max(abs(delta_x./x)) = 1.82e-13                  StepTolerance = 1e-12 (default)
relative max(constraint violation) = 1.90e-14    ConstraintTolerance = 1e-06 (selected)"

9 Comments

What kind of constraints do you have? If you have only bound constraints, then use 'trust-region-reflective' and supply x0. In all other cases x0 is ignored, and you might need to switch to a solver that accepts x0.
I have both equality constraints and bounds and so cannot use trust-region-reflective. I am using the default, 'interior-point-convex'.
The message says "...the relative maximum constraint violation, 1.902013e-14, exceeds options.ConstraintTolerance = 1.000000e-06" How can 1.9e-14 exceed 1e-6?
Thanks.
Which release are you using? I seem to recall that one release had a bug about that. My memory claims it was R2017b, but I would not trust that.
Are you sure that the solver failed (even if the exit message makes no sense)? Does the result satisfy the constraints and give a low optimality measure?
Also, what are the dimensions of the problem and constraint matrices? Could you possibly eliminate the equality constraints and rewrite the problem with inequalities only?
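To check whether a result with a negative exit flag is actually usable, you can verify the constraints and first-order optimality measure yourself. A minimal sketch (the matrix names H, f, Aeq, beq, lb, ub stand in for your own problem data, and the 1e-6 tolerance is illustrative, chosen to match the default ConstraintTolerance):

```matlab
% Sketch: manually verify feasibility and optimality of a quadprog result,
% assuming your own H, f, Aeq, beq, lb, ub are in the workspace.
[x,fval,exitflag,output] = quadprog(H,f,[],[],Aeq,beq,lb,ub);

tol = 1e-6;  % same order as the default ConstraintTolerance
eqViolation    = norm(Aeq*x - beq, inf);    % equality-constraint residual
boundViolation = max([lb - x; x - ub; 0]);  % positive if any bound is crossed

fprintf('equality violation: %g, bound violation: %g\n', ...
        eqViolation, boundViolation);
fprintf('first-order optimality measure: %g\n', output.firstorderopt);

% A negative-flag solution may still be acceptable if both are tiny:
isAcceptable = eqViolation <= tol && boundViolation <= tol;
```

A "low" optimality measure is relative to the scale of your gradient; comparing output.firstorderopt against the default OptimalityTolerance (1e-8) is one reasonable yardstick.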
Thank you. I was treating a negative exit flag as a failure, but that is too stringent.
Eliminating equality constraints is possible but would add considerable complexity to the code.
Your first comment may just solve the problem for me.
Thank you
Leigh Sneddon's comment moved here:
Interesting: I am using R2017b, release 9. Is there a way to find out which optimization bugs are in that release?
Sometimes a solution with a negative exit flag needs to be rejected. The question then is how to decide whether to keep or reject a negative flag solution. Is checking that the constraints are satisfied and the optimality measure is low a good rule of thumb for making this decision?
And what constitutes a "low" optimality measure?
Thank you!
Leigh
I cannot find the information on the problem that came to mind; unfortunately the bug reports are now difficult to search :(
OK. Thanks.


Answers (1)

Matt J on 15 Aug 2019
Edited: Matt J on 15 Aug 2019
Is checking that the constraints are satisfied and the optimality measure is low a good rule of thumb for making this decision?
Checking the first order KKT conditions would be the best test, assuming your quadratic is convex. The final output argument of quadprog gives the solver's idea of the optimal Lagrange multipliers,
[x,fval,exitflag,output,lambda] = quadprog(___)
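The KKT check described above can be sketched as follows. This is an illustrative outline, not production code: it assumes the problem has the form min 0.5*x'*H*x + f'*x subject to Aeq*x = beq and lb <= x <= ub, with those matrices supplied by you, and it relies on quadprog's documented sign convention that lambda.lower and lambda.upper are nonnegative.

```matlab
% Sketch of a first-order KKT check for a convex QP
% min 0.5*x'*H*x + f'*x  s.t.  Aeq*x = beq,  lb <= x <= ub
% (H, f, Aeq, beq, lb, ub are your own problem data)
[x,fval,exitflag,output,lambda] = quadprog(H,f,[],[],Aeq,beq,lb,ub);

% Stationarity: the gradient of the Lagrangian should be ~0
gradL = H*x + f + Aeq.'*lambda.eqlin - lambda.lower + lambda.upper;

% Complementary slackness for the bound constraints
compLo = lambda.lower .* (x - lb);   % ~0 where lower bound inactive
compUp = lambda.upper .* (ub - x);   % ~0 where upper bound inactive

fprintf('stationarity residual:      %g\n', norm(gradL, inf));
fprintf('complementarity residual:   %g\n', ...
        max(norm(compLo, inf), norm(compUp, inf)));
```

If both residuals are small (comparable to the tolerances you set), the point satisfies the first-order conditions and, since the QP is convex, is a global minimum regardless of the exit flag.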
But I would first recommend upgrading to a MATLAB version that doesn't have this bug.

2 Comments

Optimizer precision limitations will mean that none of the conditions is satisfied exactly. Is there a way of knowing, given the tolerances used, how much mismatch is consistent with a correct solution?
Matt J on 16 Aug 2019
Edited: Matt J on 16 Aug 2019
What does "consistent with a correct solution" mean to you? Even if quadprog's exit message had been a proper one, what deviation from the true optimum can your application tolerate, and how would you have known that the result was within that distance if quadprog had behaved normally?


Asked: 15 Aug 2019
Edited: 16 Aug 2019
