Paper Title
Collapse by Conditioning: Training Class-conditional GANs with Limited Data
Paper Authors
Paper Abstract
Class-conditioning offers a direct means to control a Generative Adversarial Network (GAN) based on a discrete input variable. While necessary in many applications, the additional information provided by the class labels could even be expected to benefit the training of the GAN itself. On the contrary, we observe that class-conditioning causes mode collapse in limited data settings, where unconditional learning leads to satisfactory generative ability. Motivated by this observation, we propose a training strategy for class-conditional GANs (cGANs) that effectively prevents the observed mode-collapse by leveraging unconditional learning. Our training strategy starts with an unconditional GAN and gradually injects the class conditioning into the generator and the objective function. The proposed method for training cGANs with limited data results not only in stable training but also in generating high-quality images, thanks to the early-stage exploitation of the shared information across classes. We analyze the observed mode collapse problem in comprehensive experiments on four datasets. Our approach demonstrates outstanding results compared with state-of-the-art methods and established baselines. The code is available at https://github.com/mshahbazi72/transitional-cGAN
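The gradual injection of class conditioning described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' implementation (see the linked repository for that): the linear schedule boundaries `t_start`/`t_end` and the additive blending of the class embedding into the latent code are assumptions chosen only to show the idea of transitioning from unconditional to conditional generation.

```python
import numpy as np


def transition_weight(step: int, t_start: int, t_end: int) -> float:
    """Transition coefficient in [0, 1]: 0 means fully unconditional,
    1 means fully class-conditional. A simple linear ramp between the
    hypothetical schedule boundaries t_start and t_end."""
    if step <= t_start:
        return 0.0
    if step >= t_end:
        return 1.0
    return (step - t_start) / (t_end - t_start)


def conditioned_latent(z: np.ndarray, class_emb: np.ndarray, lam: float) -> np.ndarray:
    """Blend the latent code z with a scaled class embedding, so class
    information enters the generator gradually as lam grows from 0 to 1.
    At lam = 0 the generator input is purely unconditional."""
    return z + lam * class_emb
```

A training loop would call `transition_weight(step, ...)` each iteration and feed `conditioned_latent(z, emb, lam)` to the generator; the same coefficient could weight the conditional term of the objective.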