Paper Title

When Optimizing $f$-divergence is Robust with Label Noise

Authors

Jiaheng Wei, Yang Liu

Abstract

We show when maximizing a properly defined $f$-divergence measure with respect to a classifier's predictions and the supervised labels is robust with label noise. Leveraging its variational form, we derive a nice decoupling property for a family of $f$-divergence measures in the presence of label noise, where the divergence is shown to be a linear combination of the variational difference defined on the clean distribution and a bias term introduced due to the noise. The above derivation helps us analyze the robustness of different $f$-divergence functions. With established robustness, this family of $f$-divergence functions arises as useful metrics for the problem of learning with noisy labels, which do not require the specification of the labels' noise rate. When they are possibly not robust, we propose fixes to make them so. In addition to the analytical results, we present thorough experimental evidence. Our code is available at https://github.com/UCSC-REAL/Robust-f-divergence-measures.
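The variational form the abstract refers to estimates $D_f(P \| Q) = \sup_g \mathbb{E}_P[g] - \mathbb{E}_Q[f^*(g)]$, where $f^*$ is the convex conjugate of $f$. Below is a minimal NumPy sketch of this lower bound for one concrete choice, Total Variation, applied to scores on paired (prediction, label) samples versus independently shuffled pairs. The function name and the toy scores are illustrative assumptions, not the paper's implementation; see the linked repository for the authors' actual code.

```python
import numpy as np

def tv_variational_lower_bound(g_joint, g_product):
    """Monte-Carlo estimate of the variational lower bound on the
    Total-Variation divergence D_TV(P(Z,Y) || P(Z)P(Y)):
        E_{P(Z,Y)}[g] - E_{P(Z)P(Y)}[f*(g)].
    For TV, f(v) = |v - 1| / 2 and the conjugate is f*(t) = t,
    defined only on |t| <= 1/2, so g is clipped into that domain."""
    g_joint = np.clip(g_joint, -0.5, 0.5)
    g_product = np.clip(g_product, -0.5, 0.5)
    # f*(t) = t for TV, so the second term is just the mean of g
    return g_joint.mean() - g_product.mean()

# Toy usage: a hypothetical critic g scores correlated (prediction, label)
# pairs higher than independent (shuffled) pairs.
rng = np.random.default_rng(0)
scores_joint = rng.uniform(0.2, 0.5, size=1000)      # samples from the joint
scores_product = rng.uniform(-0.5, -0.2, size=1000)  # samples from the product
bound = tv_variational_lower_bound(scores_joint, scores_product)
```

Maximizing this bound over the classifier (which induces $g$) is what "maximizing a properly defined $f$-divergence measure" means operationally; the paper's decoupling result then separates the bound under noisy labels into a clean-distribution term plus a noise-induced bias term.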
