Paper Title

Meta-Learning Divergences of Variational Inference

Paper Authors

Ruqi Zhang, Yingzhen Li, Christopher De Sa, Sam Devlin, Cheng Zhang

Paper Abstract

Variational inference (VI) plays an essential role in approximate Bayesian inference due to its computational efficiency and broad applicability. Crucial to the performance of VI is the choice of the associated divergence measure, as VI approximates the intractable distribution by minimizing this divergence. In this paper, we propose a meta-learning algorithm to learn the divergence metric suited to the task of interest, automating the design of VI methods. In addition, when our method is deployed in few-shot learning scenarios, we learn the initialization of the variational parameters at no additional cost. We demonstrate that our approach outperforms standard VI on Gaussian mixture distribution approximation, Bayesian neural network regression, image generation with variational autoencoders, and recommender systems with a partial variational autoencoder.
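For context, here is a brief sketch of the objective the abstract refers to. Standard VI fits an approximate posterior by minimizing the (exclusive) KL divergence; one common way to make the divergence itself tunable, used here purely as an illustration rather than as the paper's exact family, is Rényi's α-divergence, which recovers KL in the limit α → 1. The notation below (q_φ, p(z|x), α) is generic and not taken from the paper.

```latex
% Standard VI: fit q_phi to the intractable posterior by minimizing KL.
\min_{\phi} \ \mathrm{KL}\big(q_\phi(z) \,\|\, p(z \mid x)\big)
  = \min_{\phi} \ \mathbb{E}_{q_\phi}\!\left[\log \frac{q_\phi(z)}{p(z \mid x)}\right]

% An illustrative parametrized divergence family: Renyi's alpha-divergence,
% which recovers the KL divergence in the limit alpha -> 1.
D_\alpha\big(q_\phi \,\|\, p\big)
  = \frac{1}{\alpha - 1} \log \mathbb{E}_{q_\phi}\!\left[
      \left(\frac{q_\phi(z)}{p(z \mid x)}\right)^{\alpha - 1}\right]
```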
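And a minimal, hypothetical sketch of the bi-level training loop such a method implies, assuming PyTorch, a diagonal-Gaussian variational family, and the variational Rényi bound of Li & Turner (2016) as the inner objective. The names `renyi_vi_loss`, `meta_step`, and `heldout_loss` are illustrative, not the authors' API, and the sketch omits the meta-learned initialization of the variational parameters that the paper also provides.

```python
# Minimal sketch (not the authors' code): meta-learn a divergence parameter
# alpha by bi-level optimization. Inner loop: run VI under D_alpha per task.
# Outer loop: update alpha based on held-out performance after adaptation.
import math
import torch

def renyi_vi_loss(phi, log_joint, alpha, n_samples=16):
    """Monte Carlo estimate of the negative variational Renyi bound.

    phi = (mu, log_std) parametrizes q(z) = N(mu, diag(std^2));
    log_joint(z) returns log p(x, z) per sample; alpha != 1 assumed
    (alpha -> 1 recovers the standard ELBO).
    """
    mu, log_std = phi
    std = log_std.exp()
    z = mu + std * torch.randn(n_samples, *mu.shape)   # reparametrization trick
    log_q = torch.distributions.Normal(mu, std).log_prob(z).sum(-1)
    log_w = log_joint(z) - log_q                       # log importance weights
    bound = (torch.logsumexp((1 - alpha) * log_w, dim=0)
             - math.log(n_samples)) / (1 - alpha)
    return -bound

def meta_step(tasks, alpha_raw, inner_steps=5, inner_lr=1e-2, meta_lr=1e-3):
    """One outer update of alpha_raw (a leaf tensor, requires_grad=True).

    Each task is a pair (log_joint_train, heldout_loss), where
    heldout_loss(phi) scores the adapted variational parameters
    on that task's held-out data.
    """
    alpha = torch.sigmoid(alpha_raw) * 2.0             # keep alpha in (0, 2)
    meta_loss = torch.zeros(())
    for log_joint_train, heldout_loss in tasks:
        phi = (torch.zeros(2, requires_grad=True),     # mu
               torch.zeros(2, requires_grad=True))     # log_std
        for _ in range(inner_steps):                   # inner loop: run VI
            loss = renyi_vi_loss(phi, log_joint_train, alpha)
            grads = torch.autograd.grad(loss, phi, create_graph=True)
            phi = tuple(p - inner_lr * g for p, g in zip(phi, grads))
        meta_loss = meta_loss + heldout_loss(phi)      # outer objective
    meta_loss.backward()                               # grad w.r.t. alpha_raw
    with torch.no_grad():
        alpha_raw -= meta_lr * alpha_raw.grad
        alpha_raw.grad = None
```

Usage under these assumptions: create `alpha_raw = torch.tensor(0.5, requires_grad=True)` and call `meta_step(tasks, alpha_raw)` repeatedly; the outer updates drive α toward whichever divergence adapts best on held-out data. Starting α away from 1 avoids the removable singularity in the Monte Carlo estimator above.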
