Paper Title

Ensemble Transformer for Efficient and Accurate Ranking Tasks: an Application to Question Answering Systems

Paper Authors

Yoshitomo Matsubara, Luca Soldaini, Eric Lind, Alessandro Moschitti

Paper Abstract

Large transformer models can greatly improve Answer Sentence Selection (AS2) tasks, but their high computational costs prevent their use in many real-world applications. In this paper, we explore the following research question: how can we make AS2 models more accurate without significantly increasing their model complexity? To address this question, we propose a Multiple Heads Student architecture (named CERBERUS), an efficient neural network designed to distill an ensemble of large transformers into a single smaller model. CERBERUS consists of two components: a stack of transformer layers used to encode inputs, and a set of ranking heads; unlike traditional distillation techniques, each head is trained by distilling a different large transformer architecture, in a way that preserves the diversity of the ensemble members. The resulting model captures the knowledge of heterogeneous transformer models using just a few extra parameters. We show the effectiveness of CERBERUS on three English datasets for AS2; our proposed approach outperforms all single-model distillations we consider, rivaling state-of-the-art large AS2 models that have 2.7x more parameters and run 2.5x slower. Code for our model is available at https://github.com/amazon-research/wqa-cerberus
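To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of the multiple-heads-student idea: a single shared transformer encoder plus several lightweight ranking heads, each of which would be distilled from a different teacher. All names here (CerberusStudent, num_heads, the MSE distillation loss, first-token pooling, score averaging at inference) are illustrative assumptions of ours, not the authors' actual implementation; for that, see the repository linked above.

```python
# Illustrative sketch only: shared encoder + K ranking heads for AS2 ranking.
# Class/parameter names and the loss choice are placeholders, not the paper's code.
import torch
import torch.nn as nn

class CerberusStudent(nn.Module):
    def __init__(self, vocab_size=30522, hidden_dim=768, num_layers=6, num_heads=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=12, batch_first=True
        )
        # One shared encoder stack: this is where almost all parameters live.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # K ranking heads, each later distilled from a *different* teacher;
        # each head adds only hidden_dim + 1 extra parameters.
        self.heads = nn.ModuleList(
            nn.Linear(hidden_dim, 1) for _ in range(num_heads)
        )

    def forward(self, input_ids):
        # Encode the (question, candidate answer) pair once.
        hidden = self.encoder(self.embed(input_ids))
        cls = hidden[:, 0]  # first-token pooling
        # One relevance score per head: shape (batch, num_heads).
        return torch.cat([head(cls) for head in self.heads], dim=-1)

def distillation_loss(student_scores, teacher_scores):
    # Each head regresses onto its own teacher's score (MSE is one plausible
    # choice), which keeps the heads as diverse as the original ensemble.
    return nn.functional.mse_loss(student_scores, teacher_scores)

model = CerberusStudent()
batch = torch.randint(0, 30522, (4, 128))  # 4 tokenized question-answer pairs
scores = model(batch)                      # (4, 3): one score per head
ranking_score = scores.mean(dim=-1)        # aggregate heads at inference
```

The key design point the sketch tries to convey is the cost profile: the expensive encoder is computed once per input, so adding or removing heads changes the parameter count and latency only marginally while still capturing knowledge from several heterogeneous teachers.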
