Global convergence guarantees for adaptive gradient algorithms with Barzilai–Borwein and alternative step-length strategies


Alyaqdhan Ammar Abed

Abstract

Motivated by recent progress in adaptive schemes for convex optimization, this work develops a proximal-gradient framework that guarantees global convergence without resorting to line-search procedures. The proposed approach accommodates widely used step-length rules, including Barzilai–Borwein updates and one-dimensional Anderson-type acceleration. Notably, the analysis covers problems whose smooth component has only a locally Hölder continuous gradient. The resulting theory unifies and strengthens several existing results, and numerical experiments confirm the practical benefit of coupling aggressive step-length selection with adaptive safeguarding mechanisms.
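
Since only the abstract is reproduced here, a concrete illustration of the two ingredients it names, Barzilai–Borwein step lengths and adaptive safeguarding, may be useful. The following is a minimal Python sketch, not the paper's algorithm: a proximal-gradient iteration for the lasso problem min_x 0.5*||Ax - b||^2 + lam*||x||_1, using the classical BB1 step alpha_k = (s_k^T s_k) / (s_k^T y_k), where s_k = x_{k+1} - x_k and y_k is the corresponding gradient difference, clipped to a safeguarding interval. The problem instance, the names prox_grad_bb and soft_threshold, and the interval [a_min, a_max] are illustrative assumptions, not details from the paper.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: componentwise soft thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_bb(A, b, lam, x0, iters=500, a_min=1e-10, a_max=1e10):
    # Proximal gradient for 0.5*||Ax - b||^2 + lam*||x||_1 with a
    # safeguarded BB1 step length and no line search (illustrative only).
    x = np.asarray(x0, dtype=float).copy()
    grad = A.T @ (A @ x - b)                  # gradient of the smooth part
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative first step, 1/L
    for _ in range(iters):
        x_new = soft_threshold(x - alpha * grad, alpha * lam)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        sy = float(s @ y)
        if sy > 0:
            # BB1 step, clipped to [a_min, a_max] as a safeguard.
            alpha = min(max(float(s @ s) / sy, a_min), a_max)
        # If s^T y <= 0, keep the previous step length as a fallback.
        x, grad = x_new, grad_new
    return x

# Tiny usage example on random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
x_hat = prox_grad_bb(A, b, lam=0.1, x0=np.zeros(100))

The clipping interval stands in for the safeguarding mechanism the abstract refers to: the raw BB step may be arbitrarily large or undefined when s_k^T y_k <= 0, and projecting it onto [a_min, a_max], or retaining the previous step, keeps every iterate well defined without any line search.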

Article Details

Section: Articles

How to Cite

Abed, A. A. (2026). Global convergence guarantees for adaptive gradient algorithms with Barzilai–Borwein and alternative step-length strategies. Babylonian Journal of Mathematics, 2026, 19–23. https://doi.org/10.58496/BJM/2026/003