Paper Title

You Can Have Your Data and Balance It Too: Towards Balanced and Efficient Multilingual Models

Paper Authors

Tomasz Limisiewicz, Dan Malkin, Gabriel Stanovsky

Paper Abstract

Multilingual models have been widely used for cross-lingual transfer to low-resource languages. However, the performance on these languages is hindered by their underrepresentation in the pretraining data. To alleviate this problem, we propose a novel multilingual training technique based on teacher-student knowledge distillation. In this setting, we utilize monolingual teacher models optimized for their language. We use those teachers along with balanced (sub-sampled) data to distill the teachers' knowledge into a single multilingual student. Our method outperforms standard training methods in low-resource languages and retains performance on high-resource languages while using the same amount of data. If applied widely, our approach can increase the representation of low-resource languages in NLP systems.
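The core mechanism the abstract describes is standard teacher-student knowledge distillation, with one monolingual teacher per language and a balanced data stream. Below is a minimal PyTorch sketch of a generic distillation objective (soft-target KL blended with hard-label cross-entropy); the function name, temperature, and alpha values are illustrative assumptions, not the authors' reported setup.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend a soft-target KL term (teacher -> student) with hard-label
    cross-entropy. `temperature` and `alpha` are illustrative values,
    not hyperparameters reported in the paper."""
    # Soften both distributions; scaling by T^2 keeps the soft term's
    # gradient magnitude comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage with random tensors. In the paper's setting, each batch would
# come from a balanced (sub-sampled) per-language stream and be scored by
# the monolingual teacher matching that batch's language; the tensors here
# are hypothetical stand-ins for those model outputs.
vocab_size = 100
student_logits = torch.randn(8, vocab_size, requires_grad=True)
teacher_logits = torch.randn(8, vocab_size)  # from the matching teacher
labels = torch.randint(0, vocab_size, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Because the data is sub-sampled to be balanced across languages, each language contributes roughly equally to the student's updates regardless of its raw corpus size, while the teachers supply the signal that would otherwise be lost by discarding high-resource data.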
