Abstract:
Nonconvex minimax problems frequently arise in machine learning, distributionally robust optimization, and many other research fields. In this paper, we propose a Stochastic Alternating Mirror Descent Ascent with Momentum (SAMDAM) algorithm to solve nonconvex-strongly concave minimax optimization problems. SAMDAM employs simple mirror descent ascent steps along with momentum acceleration to update the two variables alternately at each iteration. We further prove a gradient complexity bound for SAMDAM, stated in terms of the condition number κ of the problem, for finding an ε-stationary point in the stochastic nonconvex setting. Finally, computational experiments demonstrate that SAMDAM outperforms several state-of-the-art algorithms in distributionally robust optimization and fair classification tasks. © The Author(s) under exclusive licence to Korean Society for Informatics and Computational Applied Mathematics 2025.
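The abstract describes alternating descent/ascent steps with momentum for a minimax problem. The following is a minimal sketch of what such an alternating scheme might look like, not the paper's SAMDAM algorithm: the mirror map is assumed Euclidean (so the mirror steps reduce to plain gradient steps), and the toy objective f(x, y) = xᵀy − ½‖y‖² (strongly concave in y, saddle point at the origin), step sizes, momentum parameter, and noise level are all illustrative assumptions.

```python
import numpy as np

# Hypothetical toy objective: f(x, y) = x.y - 0.5*||y||^2
# (strongly concave in y; saddle point at x = y = 0).
rng = np.random.default_rng(0)
dim = 5
x = rng.normal(size=dim)
y = rng.normal(size=dim)

eta_x, eta_y = 0.05, 0.05   # step sizes (assumed)
beta = 0.9                  # momentum parameter (assumed)
noise = 0.01                # stochastic-gradient noise level (assumed)
v_x = np.zeros(dim)         # momentum buffer for the descent variable
v_y = np.zeros(dim)         # momentum buffer for the ascent variable

for _ in range(3000):
    # Stochastic gradient in x at the current (x, y); descent step.
    g_x = y + noise * rng.normal(size=dim)
    v_x = beta * v_x + (1.0 - beta) * g_x
    x = x - eta_x * v_x

    # Alternating update: the y-gradient is evaluated at the *new* x;
    # momentum-averaged ascent step on the strongly concave variable.
    g_y = x - y + noise * rng.normal(size=dim)
    v_y = beta * v_y + (1.0 - beta) * g_y
    y = y + eta_y * v_y

# Both iterates settle near the saddle point (0, 0),
# up to the stochastic-gradient noise floor.
final_x_norm = float(np.linalg.norm(x))
final_y_norm = float(np.linalg.norm(y))
```

The key structural point mirrored from the abstract is the *alternating* order: the ascent gradient is computed at the freshly updated x rather than simultaneously, and each variable carries its own momentum buffer.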
Source:
Journal of Applied Mathematics and Computing
ISSN: 1598-5865
Year: 2025
2.400 (JCR@2023)