Paper Title
Denoising Deep Generative Models
Paper Authors
Paper Abstract
Likelihood-based deep generative models have recently been shown to exhibit pathological behaviour under the manifold hypothesis as a consequence of using high-dimensional densities to model data with low-dimensional structure. In this paper we propose two methodologies aimed at addressing this problem. Both are based on adding Gaussian noise to the data to remove the dimensionality mismatch during training, and both provide a denoising mechanism whose goal is to sample from the model as though no noise had been added to the data. Our first approach is based on Tweedie's formula, and the second on models which take the variance of added noise as a conditional input. We show that, surprisingly, while well motivated, these approaches only sporadically improve performance over not adding noise, and that other methods of addressing the dimensionality mismatch are more empirically adequate.
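As an illustrative sketch of the denoising mechanism behind the first approach (added here for clarity; the notation y, x, sigma and p_sigma is introduced for this sketch and does not come from the abstract): if noisy observations are formed as y = x + epsilon with Gaussian epsilon of variance sigma^2, and p_sigma denotes the marginal density of the noisy data, Tweedie's formula gives the posterior mean of the clean data in terms of the score of the noisy density:

\mathbb{E}[x \mid y] \;=\; y + \sigma^{2}\,\nabla_{y}\log p_{\sigma}(y),
\qquad y = x + \varepsilon,\quad \varepsilon \sim \mathcal{N}(0,\, \sigma^{2} I).

In principle, then, a model fit to the noise-perturbed data can be denoised at sampling time by shifting its samples by sigma^2 times the model's own score.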