Paper Title

Importance Tempering: Group Robustness for Overparameterized Models

Paper Authors

Yiping Lu, Wenlong Ji, Zachary Izzo, Lexing Ying

Paper Abstract

Although overparameterized models have shown their success on many machine learning tasks, the accuracy could drop on the testing distribution that is different from the training one. This accuracy drop still limits applying machine learning in the wild. At the same time, importance weighting, a traditional technique to handle distribution shifts, has been demonstrated to have less or even no effect on overparameterized models both empirically and theoretically. In this paper, we propose importance tempering to improve the decision boundary and achieve consistently better results for overparameterized models. Theoretically, we justify that the selection of group temperature can be different under label shift and spurious correlation setting. At the same time, we also prove that properly selected temperatures can extricate the minority collapse for imbalanced classification. Empirically, we achieve state-of-the-art results on worst group classification tasks using importance tempering.
