Deterministic and Stochastic Accelerated Gradient Method for Convex Semi-Infinite Optimization

Yao Yao, Qihang Lin and Tianbao Yang
arXiv.org
Cornell University
10/17/2023
DOI: 10.48550/arxiv.2310.10993
URL: https://doi.org/10.48550/arxiv.2310.10993
Preprint (author's original), open access. This preprint has not been evaluated by subject experts through peer review; preprints may undergo extensive changes and/or become peer-reviewed journal articles.

Abstract

This paper explores numerical methods for solving a convex differentiable semi-infinite program. We introduce a primal-dual gradient method that performs three updates per iteration: a momentum gradient ascent step to update the constraint parameters, a momentum gradient ascent step to update the dual variables, and a gradient descent step to update the primal variables. Our approach also extends to scenarios where gradients and function values are accessible only through stochastic oracles. This method extends recent primal-dual methods, for example, Hamedani and Aybat (2021) and Boob et al. (2022), for optimization with a finite number of constraints. We establish the iteration complexity of the proposed method for finding an ϵ-optimal solution under different convexity and concavity assumptions on the functions.
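To make the three-update structure in the abstract concrete, here is a minimal, non-accelerated sketch (momentum/extrapolation terms omitted) on a toy one-dimensional semi-infinite program. All variable names, step sizes, and the example problem are assumptions for illustration, not the authors' algorithm: minimize f(x) = (x − 1)² subject to g(x, t) = t·x − 0.5 ≤ 0 for all t ∈ [0, 1]. The binding constraint is t = 1, so the optimum is x* = 0.5.

```python
# Illustrative sketch of a primal-dual loop with three updates per iteration:
# ascent on the constraint parameter t, ascent on the dual variable lam,
# and descent on the primal variable x. Hypothetical names and step sizes.

def clip(v, lo, hi):
    return max(lo, min(hi, v))

def solve(iters=5000, eta=0.05):
    x, lam, t = 0.0, 0.0, 0.5   # primal, dual, constraint parameter
    for _ in range(iters):
        # 1) ascent on t, seeking the most violated constraint: d/dt g(x,t) = x
        t = clip(t + eta * x, 0.0, 1.0)
        # 2) ascent on the dual variable, projected onto lam >= 0:
        #    d/dlam L(x,lam,t) = g(x,t) = t*x - 0.5
        lam = max(0.0, lam + eta * (t * x - 0.5))
        # 3) descent on the primal variable:
        #    d/dx L(x,lam,t) = f'(x) + lam * dg/dx = 2(x-1) + lam*t
        x = x - eta * (2.0 * (x - 1.0) + lam * t)
    return x, lam, t

x, lam, t = solve()
print(x, lam, t)  # x -> 0.5, lam -> 1, t -> 1
```

With small step sizes the iterates settle at the saddle point (x, λ, t) ≈ (0.5, 1, 1); the accelerated variants analyzed in the paper add momentum to the two ascent steps to improve the iteration complexity.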
Mathematics - Optimization and Control
