Single-loop Algorithms for Stochastic Non-Convex Optimization with Weakly-Convex Constraints
Journal article   Open access   Peer reviewed

Ming Yang, Gang Li, Quanqi Hu, Qihang Lin and Tianbao Yang
Transactions on Machine Learning Research, Vol.2026, pp.1-29
02/16/2026
URL: https://openreview.net/pdf?id=aCgOR2KvAI
Published (Version of record) Open Access

Abstract

Constrained optimization with multiple functional inequality constraints has significant applications in machine learning. This paper examines a crucial subset of such problems where both the objective and constraint functions are weakly convex. Existing methods often face limitations, including slow convergence rates or reliance on double-loop algorithmic designs. To overcome these challenges, we introduce a novel single-loop penalty-based stochastic algorithm. Following the classical exact penalty method, our approach employs a hinge-based penalty, which permits the use of a constant penalty parameter, enabling us to achieve a state-of-the-art complexity for finding an approximate Karush-Kuhn-Tucker (KKT) solution. We further extend our algorithm to address finite-sum coupled compositional objectives, which are prevalent in artificial intelligence applications, establishing improved complexity over existing approaches. Finally, we validate our method through experiments on fair learning with receiver operating characteristic (ROC) fairness constraints and continual learning with non-forgetting constraints.
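For readers unfamiliar with the classical exact penalty method the abstract builds on, the following is a minimal illustrative sketch (not the paper's algorithm): the constrained problem min f(x) s.t. g_i(x) <= 0 is replaced by the unconstrained objective F(x) = f(x) + rho * sum_i max(0, g_i(x)), where the hinge max(0, .) is what allows a constant penalty parameter rho to suffice. The functions and step sizes below are hypothetical toy choices.

```python
import numpy as np

def hinge_penalty_objective(f, gs, rho):
    """Exact-penalty reformulation: F(x) = f(x) + rho * sum_i max(0, g_i(x)).

    The hinge penalizes only constraint violations (g_i(x) > 0), so for a
    sufficiently large but *constant* rho, minimizers of F satisfy the
    constraints exactly -- the classical exact penalty property.
    """
    def F(x):
        return f(x) + rho * sum(max(0.0, g(x)) for g in gs)
    return F

# Toy example: minimize f(x) = ||x||^2 subject to g(x) = 1 - x[0] <= 0
# (i.e., x[0] >= 1), solved by plain subgradient descent on F.
rho = 5.0
f = lambda x: float(np.dot(x, x))
g = lambda x: 1.0 - x[0]
F = hinge_penalty_objective(f, [g], rho)

x = np.zeros(2)
for _ in range(2000):
    grad = 2.0 * x                       # gradient of f
    if g(x) > 0:                         # hinge is active: add rho * grad g
        grad = grad + rho * np.array([-1.0, 0.0])
    x = x - 0.01 * grad                  # subgradient step
# x converges to a neighborhood of the constrained optimum (1, 0).
```

Note that rho stays fixed throughout, in contrast to quadratic penalty schemes that must drive the penalty parameter to infinity; this constant-parameter property is the feature of hinge-based penalties that the paper exploits in its single-loop design.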
