Abstract:
Nonconvex minimax problems frequently arise in machine learning, distributionally robust optimization, and many other research fields. In this paper, we propose a Stochastic Alternating Mirror Descent Ascent with Momentum (SAMDAM) algorithm to solve nonconvex-strongly-concave minimax optimization problems. SAMDAM employs simple mirror descent ascent steps along with momentum acceleration to update the variables $x$ and $y$ alternately at each iteration. We further prove that SAMDAM achieves a gradient complexity of $\mathcal{O}(\kappa^{3}\epsilon^{-4})$ for finding an $\epsilon$-stationary point in stochastic nonconvex settings, where $\kappa$ denotes the condition number of the problem.
Finally, computational experiments demonstrate that SAMDAM outperforms several state-of-the-art algorithms in distributionally robust optimization and fair classification tasks.
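As a rough illustration of the update pattern the abstract describes (alternating descent in $x$ and ascent in $y$ with momentum), the following is a minimal, hedged sketch. It assumes Euclidean mirror maps, under which the mirror descent ascent steps reduce to plain gradient steps, and uses simple exponential-moving-average momentum on stochastic gradients; the paper's actual SAMDAM update rule, step sizes, and momentum schedule may differ.

```python
import numpy as np

def alternating_momentum_gda(grad_x, grad_y, x0, y0, eta_x=0.05, eta_y=0.1,
                             beta=0.9, iters=2000, noise=0.01, seed=0):
    """Hedged sketch of alternating stochastic descent-ascent with momentum.

    Euclidean mirror maps are assumed, so each mirror step is a plain
    gradient step; additive Gaussian noise stands in for mini-batch
    stochasticity. This is illustrative, not the authors' algorithm.
    """
    rng = np.random.default_rng(seed)
    x, y = float(x0), float(y0)
    mx = my = 0.0  # momentum (moving-average gradient) buffers
    for _ in range(iters):
        # descent step on x with a noisy gradient
        gx = grad_x(x, y) + noise * rng.standard_normal()
        mx = beta * mx + (1 - beta) * gx
        x -= eta_x * mx
        # alternation: the y-step already sees the updated x
        gy = grad_y(x, y) + noise * rng.standard_normal()
        my = beta * my + (1 - beta) * gy
        y += eta_y * my  # ascent step on y
    return x, y

# Toy saddle objective f(x, y) = x*y - y**2 / 2 (strongly concave in y);
# its unique stationary point is (0, 0).
x_star, y_star = alternating_momentum_gda(lambda x, y: y,
                                          lambda x, y: x - y,
                                          x0=2.0, y0=-1.0)
```

On this toy problem the iterates spiral toward the saddle point at the origin, which is the $\epsilon$-stationary behavior the complexity bound quantifies in the general stochastic nonconvex setting.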
Source:
JOURNAL OF APPLIED MATHEMATICS AND COMPUTING
ISSN: 1598-5865
Year: 2025
2.400 (JCR@2023)