Paper Title
FedNest: Federated Bilevel, Minimax, and Compositional Optimization
Paper Authors
Paper Abstract
Standard federated optimization methods successfully apply to stochastic problems with single-level structure. However, many contemporary ML problems -- including adversarial robustness, hyperparameter tuning, and actor-critic -- fall under nested bilevel programming that subsumes minimax and compositional optimization. In this work, we propose \fedblo: A federated alternating stochastic gradient method to address general nested problems. We establish provable convergence rates for \fedblo in the presence of heterogeneous data and introduce variations for bilevel, minimax, and compositional optimization. \fedblo introduces multiple innovations including federated hypergradient computation and variance reduction to address inner-level heterogeneity. We complement our theory with experiments on hyperparameter \& hyper-representation learning and minimax optimization that demonstrate the benefits of our method in practice. Code is available at https://github.com/ucr-optml/FedNest.
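For context, here is a minimal sketch of the nested problem class the abstract refers to, written in generic federated bilevel notation; the symbols $m$, $f_i$, $g_i$, and $y^*(x)$ are illustrative choices for this sketch, not necessarily the paper's own notation.

```latex
% Hedged sketch: a generic federated bilevel problem over m clients,
% where both the outer objective f and the inner objective g are client averages.
\begin{align*}
  \min_{x \in \mathbb{R}^{d_1}} \;
    & f(x) := \frac{1}{m} \sum_{i=1}^{m} f_i\bigl(x, y^*(x)\bigr) \\
  \text{s.t.} \;
    & y^*(x) \in \arg\min_{y \in \mathbb{R}^{d_2}}
      \; g(x, y) := \frac{1}{m} \sum_{i=1}^{m} g_i(x, y).
\end{align*}
% Minimax optimization is recovered as the special case g_i(x, y) = -f_i(x, y),
% and compositional optimization corresponds to an inner problem whose solution
% y^*(x) is a (stochastic) mapping of x that is fed into the outer loss.
```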