The optimization algorithms
Version 04.2023.01 (143 KB) by
Kenouche Samir
Algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for a set of one- and two-dimensional objective functions.
The optimization algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for objective functions in one and two dimensions. The Newton and quasi-Newton methods can run into difficulties, for example when the Hessian is too complex to evaluate or does not exist. They also require a matrix inversion at each iteration, which can be prohibitive for optimization problems involving many variables. These methods can therefore become impractical. An alternative is the family of gradient descent algorithms, which require neither explicit computation nor approximation of the Hessian. A gradient descent algorithm is implemented by choosing successive descent directions and the amplitude of the descent step along each chosen direction. This family of algorithms is widely used for optimizing problems of varying complexity. The term descent arises because these algorithms search for extrema by moving in the direction opposite to the gradient of the objective function.
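To illustrate the idea, here is a minimal fixed-step gradient descent sketch in MATLAB, not the code shipped in this submission: the quadratic objective, the step amplitude alpha, and the stopping tolerance are all illustrative choices.

```matlab
% Minimal gradient descent sketch (illustrative example):
% minimize f(x,y) = (x - 1)^2 + 4*(y + 2)^2, whose minimizer is (1, -2).
f     = @(p) (p(1) - 1)^2 + 4*(p(2) + 2)^2;   % 2D objective function
gradf = @(p) [2*(p(1) - 1); 8*(p(2) + 2)];    % its analytic gradient

p     = [5; 5];    % starting point (assumed)
alpha = 0.05;      % fixed descent-step amplitude (assumed)
tol   = 1e-8;      % stopping tolerance on the gradient norm

for k = 1:1000
    g = gradf(p);
    if norm(g) < tol, break; end
    p = p - alpha*g;   % step opposite to the gradient: the descent direction
end
fprintf('minimizer ~ (%.4f, %.4f) after %d iterations\n', p(1), p(2), k);
```

Note that no Hessian appears anywhere in the loop; only first-order (gradient) information is needed, which is what makes this family attractive when Newton-type updates are too costly.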
Explanatory algorithmic schemes are available in the user guide.
Cite As
Kenouche Samir (2025). The optimization algorithms (https://www.mathworks.com/matlabcentral/fileexchange/128008-the-optimization-algorithms), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with
R2023a
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
optimization_algorithms
Version | Published | Release Notes
---|---|---
04.2023.01 | |