

Poster

Delving into the Convergence of Minimax Optimization

Wenhan Xian · Ziyi Chen · Heng Huang


Abstract:

Minimax optimization is fundamental to numerous machine learning applications, such as generative adversarial networks, adversarial training, and robust optimization. Recently, a variety of minimax algorithms have been proposed with theoretical guarantees that rely on Lipschitz smoothness. However, these algorithms can fail to converge in practice because the requisite Lipschitz-smoothness condition may not hold even in some classic minimax problems; we present counterexamples that reveal this divergence issue. Motivated by this gap, we delve into the convergence analysis of minimax algorithms without the Lipschitz-smoothness assumption. We prove that an adaptive stepsize strategy improves the convergence of the basic minimax optimization algorithms GDA, SGDA, GDmax, and SGDmax under a relaxation of Lipschitz smoothness, so that their theoretical convergence guarantees extend to a wider range of applications. We also conduct a numerical experiment to validate the performance of our adaptive minimax algorithms.
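To make the idea of GDA with an adaptive stepsize concrete, here is a minimal Python sketch on a toy strongly-convex-strongly-concave saddle-point problem. The abstract does not specify the paper's adaptive rule, so the AdaGrad-style normalization, the toy objective, and all parameter values below are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Toy saddle-point objective (assumed for illustration):
#   f(x, y) = 0.5 * x**2 + x * y - 0.5 * y**2,
# which has its unique saddle point at (0, 0).

def grad_x(x, y):
    return x + y          # partial derivative of f with respect to x

def grad_y(x, y):
    return x - y          # partial derivative of f with respect to y

def adaptive_gda(x0=3.0, y0=-2.0, eta=0.5, steps=2000, eps=1e-8):
    """GDA with an AdaGrad-style adaptive stepsize (illustrative assumption)."""
    x, y = x0, y0
    gx_acc, gy_acc = 0.0, 0.0          # accumulated squared gradients
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        gx_acc += gx ** 2
        gy_acc += gy ** 2
        # The stepsize shrinks automatically where gradients are large,
        # which is the kind of behavior that lets analysis avoid a global
        # Lipschitz-smoothness constant.
        x -= eta / (np.sqrt(gx_acc) + eps) * gx   # descent step on x
        y += eta / (np.sqrt(gy_acc) + eps) * gy   # ascent step on y
    return x, y

if __name__ == "__main__":
    x_star, y_star = adaptive_gda()
    print(f"approximate saddle point: x={x_star:.4f}, y={y_star:.4f}")
```

The stochastic variant (SGDA) would replace the exact gradients with minibatch estimates, and GDmax/SGDmax would replace the single ascent step on y with an (approximate) inner maximization before each descent step on x.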
