Paper Title

KEML: A Knowledge-Enriched Meta-Learning Framework for Lexical Relation Classification

Authors

Wang, Chengyu, Qiu, Minghui, Huang, Jun, He, Xiaofeng

Abstract

Lexical relations describe how concepts are semantically related, in the form of relation triples. The accurate prediction of lexical relations between concepts is challenging, due to the sparsity of patterns indicating the existence of such relations. We propose the Knowledge-Enriched Meta-Learning (KEML) framework to address the task of lexical relation classification. In KEML, the LKB-BERT (Lexical Knowledge Base-BERT) model is presented to learn concept representations from massive text corpora, with rich lexical knowledge injected by distant supervision. A probabilistic distribution of auxiliary tasks is defined to increase the model's ability to recognize different types of lexical relations. We further combine a meta-learning process over the auxiliary task distribution and supervised learning to train the neural lexical relation classifier. Experiments over multiple datasets show that KEML outperforms state-of-the-art methods.
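The training pipeline the abstract describes, meta-learning over a distribution of auxiliary tasks followed by supervised training of the lexical relation classifier, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's actual method: a linear softmax classifier stands in for the neural classifier, synthetic feature vectors stand in for LKB-BERT concept representations, and a Reptile-style meta-update stands in for KEML's meta-learning process.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each concept pair is a fixed-size feature vector,
# and the classifier predicts one of K lexical relation types
# (e.g. hypernymy, synonymy, antonymy, meronymy).
DIM, K = 16, 4

def sample_task(n=32):
    """Sample a synthetic labeled batch; stands in for one auxiliary
    task drawn from KEML's auxiliary task distribution."""
    W_true = rng.normal(size=(DIM, K))
    X = rng.normal(size=(n, DIM))
    y = (X @ W_true).argmax(axis=1)
    return X, y

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sgd_steps(W, X, y, lr=0.1, steps=5):
    """A few full-batch cross-entropy gradient steps on one task."""
    for _ in range(steps):
        p = softmax(X @ W)
        p[np.arange(len(y)), y] -= 1.0   # dL/dlogits = softmax - one-hot
        W = W - lr * (X.T @ p) / len(y)
    return W

# Meta-learning phase (Reptile-style): nudge the shared initialization
# toward each auxiliary task's adapted weights.
W = np.zeros((DIM, K))
for _ in range(200):
    X, y = sample_task()
    W_adapted = sgd_steps(W, X, y)
    W += 0.1 * (W_adapted - W)           # meta-update

# Supervised phase: train the classifier on the target relation dataset,
# starting from the meta-learned initialization.
X_train, y_train = sample_task(n=256)
W = sgd_steps(W, X_train, y_train, steps=50)
acc = (softmax(X_train @ W).argmax(axis=1) == y_train).mean()
print(f"train accuracy: {acc:.2f}")
```

The two-phase structure mirrors the abstract: the loop over `sample_task` plays the role of meta-learning over the auxiliary task distribution, and the final `sgd_steps` call plays the role of supervised learning on the actual lexical relation data.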
