Paper Title

SMDT: Selective Memory-Augmented Neural Document Translation

Authors

Xu Zhang, Jian Yang, Haoyang Huang, Shuming Ma, Dongdong Zhang, Jinlong Li, Furu Wei

Abstract

Existing document-level neural machine translation (NMT) models have thoroughly explored different context settings to provide guidance for target generation. However, little attention has been paid to introducing more diverse contexts that supply richer contextual information. In this paper, we propose a Selective Memory-augmented Neural Document Translation (SMDT) model to handle documents with a large hypothesis space of contexts. Specifically, we retrieve similar bilingual sentence pairs from the training corpus to augment the global context, and then extend the two-stream attention model with a selective mechanism to capture both the local context and diverse global contexts. This unified approach allows our model to be trained elegantly on three public document-level machine translation datasets, and it significantly outperforms previous document-level NMT models.
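To make the retrieve-then-attend idea in the abstract concrete, below is a minimal, hypothetical sketch, not the authors' implementation: it ranks training pairs by a simple token-overlap similarity to stand in for the paper's retrieval step, then fuses a local-context attention stream with a retrieved-memory attention stream through a sigmoid gate as a stand-in for the selective mechanism. All names (retrieve_similar_pairs, selective_fusion, w_gate) and the similarity metric and gate parameterization are illustrative assumptions.

```python
import numpy as np

def retrieve_similar_pairs(source, corpus, k=2):
    """Toy retrieval: rank (source, target) training pairs by Jaccard
    token overlap with the current source sentence. The paper retrieves
    similar bilingual pairs from the training corpus; this metric is an
    illustrative stand-in, not the authors' choice."""
    src_tokens = set(source.split())
    def jaccard(pair):
        cand = set(pair[0].split())
        return len(src_tokens & cand) / max(len(src_tokens | cand), 1)
    return sorted(corpus, key=jaccard, reverse=True)[:k]

def attention(query, keys, values):
    """Standard scaled dot-product attention for a single query vector."""
    scores = keys @ query / np.sqrt(query.shape[-1])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values

def selective_fusion(query, local_ctx, memory_ctx, w_gate):
    """Hypothetical selective mechanism: attend separately over the local
    (surrounding-sentence) context and the retrieved global memory, then
    mix the two streams with a scalar sigmoid gate."""
    local_out = attention(query, local_ctx, local_ctx)
    memory_out = attention(query, memory_ctx, memory_ctx)
    gate = 1.0 / (1.0 + np.exp(-w_gate @ np.concatenate([local_out, memory_out])))
    return gate * local_out + (1.0 - gate) * memory_out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 8
    corpus = [
        ("the cat sat on the mat", "die Katze sass auf der Matte"),
        ("stock prices fell sharply", "die Aktienkurse fielen stark"),
        ("a cat sat on a chair", "eine Katze sass auf einem Stuhl"),
    ]
    print(retrieve_similar_pairs("the cat sat on the sofa", corpus, k=2))
    query = rng.standard_normal(d)
    local_ctx = rng.standard_normal((4, d))   # encoded neighbouring sentences
    memory_ctx = rng.standard_normal((6, d))  # encoded retrieved pairs
    w_gate = rng.standard_normal(2 * d)
    print(selective_fusion(query, local_ctx, memory_ctx, w_gate))
```

In a real model the retrieval would use stronger similarity search (e.g., over sentence embeddings) and the gate would be learned jointly with the two attention streams; the sketch only shows how a selective gate can arbitrate between local and retrieved global context.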
