Paper Title

Generating Math Word Problems from Equations with Topic Controlling and Commonsense Enforcement

Paper Authors

Tianyang Cao, Shuang Zeng, Songge Zhao, Mairgup Mansur, Baobao Chang

Abstract

Recent years have seen significant advancement in text generation tasks with the help of neural language models. However, one task remains challenging and has seen little progress so far: generating math problem text from mathematical equations. In this paper, we present a novel equation-to-problem text generation model. In our model, 1) we propose a flexible scheme to effectively encode math equations, and we further enhance the equation encoder with a Variational Autoencoder (VAE); 2) given a math equation, we perform topic selection, followed by a dynamic topic memory mechanism that restricts the topic distribution of the generator; 3) to avoid the commonsense violations common in traditional generation models, we pretrain word embeddings with a background knowledge graph (KG) and link decoded words to related words in the KG, with the aim of injecting background knowledge into our model. We evaluate our model through both automatic metrics and human evaluation; the experiments demonstrate that our model outperforms baselines and previous models in both the accuracy and the richness of the generated problem text.
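The abstract mentions a "flexible scheme to effectively encode math equations" before the sequence encoder. As a rough illustration of what such a scheme might look like (this is a hypothetical sketch, not the paper's actual implementation), one can split an equation string into typed tokens such as variables, numbers, and operators, which a sequence encoder can then consume:

```python
# Hypothetical sketch: turn an equation string into (token, type) pairs so a
# sequence model can encode it. Token types and names are illustrative only.
import re

# Matches numbers (including decimals), variable names, and single operators.
TOKEN_PATTERN = re.compile(r"\d+\.?\d*|[a-zA-Z]+|[+\-*/=()]")

def tokenize_equation(eq: str):
    """Split an equation into a list of (token, type) pairs."""
    tokens = []
    for tok in TOKEN_PATTERN.findall(eq):
        if tok[0].isdigit():
            kind = "NUM"   # numeric literal
        elif tok.isalpha():
            kind = "VAR"   # variable symbol
        else:
            kind = "OP"    # operator or parenthesis
        tokens.append((tok, kind))
    return tokens

print(tokenize_equation("x + 2*y = 25"))
# [('x', 'VAR'), ('+', 'OP'), ('2', 'NUM'), ('*', 'OP'),
#  ('y', 'VAR'), ('=', 'OP'), ('25', 'NUM')]
```

Typed tokens like these let the encoder treat all numbers (or all variables) uniformly, which is one plausible way to make the encoding "flexible" across equations with different surface forms.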
