Very fast code for solving lasso and non-negative least-squares problems
Proximal gradient algorithm for convex optimization, using a diagonal ± rank-1 norm. Special tricks keep the proximity operator tractable in this norm, allowing the use of quasi-Newton methods.
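As background for the description above, a plain proximal gradient (ISTA) iteration for the lasso problem min_x 0.5‖Ax − b‖² + λ‖x‖₁ can be sketched as follows. This is an illustrative pure-Python sketch of the standard scalar-step scheme, not the zeroSR1 code itself; zeroSR1's contribution is replacing the scalar step with a diagonal ± rank-1 metric while keeping the prox step cheap.

```python
# Minimal ISTA (proximal gradient) sketch for the lasso:
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# Pure Python, small dense problems only; illustrative, not the zeroSR1 solver.

def matvec(A, x):
    """Matrix-vector product for A stored as a list of rows."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def soft_threshold(v, t):
    """Elementwise proximity operator of t*||.||_1 (soft-thresholding)."""
    return [max(abs(vi) - t, 0.0) * (1.0 if vi > 0 else -1.0) for vi in v]

def ista(A, b, lam, step, iters=500):
    """Scalar-step proximal gradient; step should be <= 1/||A^T A||."""
    n = len(A[0])
    x = [0.0] * n
    At = [list(col) for col in zip(*A)]  # transpose of A
    for _ in range(iters):
        r = [ri - bi for ri, bi in zip(matvec(A, x), b)]  # residual A x - b
        g = matvec(At, r)                                  # gradient A^T (A x - b)
        x = soft_threshold([xi - step * gi for xi, gi in zip(x, g)],
                           step * lam)                     # prox step
    return x

# Tiny example with A = I, b = [1, 0.2], lam = 0.5: the solution is
# soft_threshold(b, 0.5) = [0.5, 0.0].
x = ista([[1.0, 0.0], [0.0, 1.0]], [1.0, 0.2], lam=0.5, step=1.0)
```

For non-negative least squares, the soft-thresholding step would be replaced by a projection onto the non-negative orthant (`max(xi, 0)`); the rest of the iteration is unchanged.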
Cite As
Stephen Becker (2026). zeroSR1 (https://github.com/stephenbeckr/zeroSR1), GitHub. Retrieved .
Acknowledgements
Inspired by: NNLS and constrained regression, predictor-corrector algorithm, nnls, active set algorithm, newton's algorithm for nnls, MTRON, LARS algorithm, LBFGSB (L-BFGS-B) mex wrapper, mex interface for bound constrained optimization via ASA, nnls - Non negative least squares, Simple MATLAB example code and generic function to perform LASSO
Inspired: mex interface for bound constrained optimization via ASA
General Information
- Version 1.0.0.0 (196 KB)
View License on GitHub
MATLAB Release Compatibility
- Compatible with any release
Platform Compatibility
- Windows
- macOS
- Linux
Versions that use the GitHub default branch cannot be downloaded
| Version | Published | Release Notes | Action |
|---|---|---|---|
| 1.0.0.0 | | | |
