Paper Title
Convergence of Proximal Point and Extragradient-Based Methods Beyond Monotonicity: the Case of Negative Comonotonicity
Paper Authors
Paper Abstract
Algorithms for min-max optimization and variational inequalities are often studied under monotonicity assumptions. Motivated by non-monotone machine learning applications, we follow the line of work [Diakonikolas et al., 2021, Lee and Kim, 2021, Pethick et al., 2022, Böhm, 2022] aiming to go beyond monotonicity by considering the weaker negative comonotonicity assumption. In particular, we provide tight complexity analyses for the Proximal Point, Extragradient, and Optimistic Gradient methods in this setting, closing some open questions about their convergence guarantees beyond monotonicity.
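For orientation, the following is a minimal sketch of the standard definitions behind the abstract, not a restatement of the paper's own results: F denotes the operator of the variational inequality, γ > 0 a step size, and ρ ≥ 0 the comonotonicity parameter (ρ = 0 recovers monotonicity); the paper's exact step-size conditions and constants may differ.

% Negative comonotonicity of the operator F (rho >= 0)
\langle F(x) - F(y),\, x - y \rangle \;\ge\; -\rho \,\|F(x) - F(y)\|^2 \qquad \forall x, y.

% Proximal Point: implicit update with step size gamma > 0
x^{k+1} = x^k - \gamma F(x^{k+1}).

% Extragradient: extrapolation step followed by the update step
\tilde{x}^{k} = x^k - \gamma F(x^k), \qquad x^{k+1} = x^k - \gamma F(\tilde{x}^{k}).

% Optimistic Gradient: reuses the previous operator evaluation instead of an extra one
x^{k+1} = x^k - \gamma F(x^k) - \gamma \big( F(x^k) - F(x^{k-1}) \big).

These are the classical forms of the three methods; the paper analyzes how large ρ can be (relative to γ and the Lipschitz constant of F) while the iterates still converge.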