Preprint
Deterministic and Stochastic Accelerated Gradient Method for Convex Semi-Infinite Optimization
arXiv.org
Cornell University
10/17/2023
DOI: 10.48550/arxiv.2310.10993
Abstract
This paper explores numerical methods for solving a convex differentiable semi-infinite program. We introduce a primal-dual gradient method that performs three updates per iteration: a momentum gradient ascent step to update the constraint parameters, a momentum gradient ascent step to update the dual variables, and a gradient descent step to update the primal variables. Our approach also extends to scenarios where gradients and function values are accessible only through stochastic oracles. This method extends recent primal-dual methods, for example, Hamedani and Aybat (2021) and Boob et al. (2022), for optimization with a finite number of constraints. We establish the iteration complexity of the proposed method for finding an ϵ-optimal solution under different convexity and concavity assumptions on the functions.
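The three-update iteration described in the abstract can be sketched on a hypothetical toy instance. Everything below is an assumption for illustration (the toy objective, constraint, step sizes, and momentum coefficient are not from the paper): we minimize f(x) = (x − 2)² subject to g(x, t) = t·x − 1 ≤ 0 for all t in [0, 1], whose binding constraint parameter is t = 1, giving optimum x* = 1 with dual value y* = 2.

```python
# Hedged sketch of the three-step primal-dual update from the abstract,
# on a hypothetical toy semi-infinite program (not taken from the paper):
#   minimize  f(x) = (x - 2)^2
#   s.t.      g(x, t) = t*x - 1 <= 0   for all t in [0, 1]
# The binding constraint parameter is t = 1, so the optimum is x* = 1.

def f_grad(x):      return 2.0 * (x - 2.0)   # gradient of the objective
def g(x, t):        return t * x - 1.0       # constraint function
def g_grad_t(x, t): return x                 # partial derivative in t
def g_grad_x(x, t): return t                 # partial derivative in x

x, y, t = 0.0, 0.0, 0.5   # primal variable, dual variable, constraint parameter
vt, vy = 0.0, 0.0         # momentum buffers for the two ascent steps
eta_x, eta_y, eta_t, beta = 0.05, 0.1, 0.5, 0.5  # illustrative step sizes

for _ in range(5000):
    # 1) momentum gradient ascent on the constraint parameter t,
    #    projected back onto the index set [0, 1]
    vt = beta * vt + eta_t * g_grad_t(x, t)
    t = min(max(t + vt, 0.0), 1.0)
    # 2) momentum gradient ascent on the dual variable y, kept nonnegative
    vy = beta * vy + eta_y * g(x, t)
    y = max(y + vy, 0.0)
    # 3) gradient descent on the primal variable x via the Lagrangian
    x -= eta_x * (f_grad(x) + y * g_grad_x(x, t))

print(x, y, t)   # should approach x ~ 1, y ~ 2, t = 1 for this toy instance
```

In the stochastic setting mentioned in the abstract, the exact gradients and function values above would be replaced by samples from stochastic oracles; the three-step structure of the iteration is unchanged.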
Details
- Title
- Deterministic and Stochastic Accelerated Gradient Method for Convex Semi-Infinite Optimization
- Creators
- Yao Yao; Qihang Lin; Tianbao Yang
- Resource Type
- Preprint
- Publication Details
- arXiv.org
- DOI
- 10.48550/arxiv.2310.10993
- eISSN
- 2331-8422
- Publisher
- Cornell University
- Language
- English
- Date posted
- 10/17/2023
- Academic Unit
- Computer Science; Business Analytics
- Record Identifier
- 9984484580402771