Paper Title
Cross-Domain Sentiment Classification with Contrastive Learning and Mutual Information Maximization
Paper Authors
Abstract
Contrastive learning (CL) has proven successful as a powerful representation learning method. In this work, we propose CLIM: Contrastive Learning with mutual Information Maximization, to explore the potential of CL for cross-domain sentiment classification. To the best of our knowledge, CLIM is the first method to adopt contrastive learning for natural language processing (NLP) tasks across domains. Due to the scarcity of labels on the target domain, we introduce mutual information maximization (MIM) in addition to CL to exploit the features that best support the final prediction. Furthermore, MIM maintains a relatively balanced distribution over the model's predictions and enlarges the margin between classes on the target domain. The larger margin increases our model's robustness and enables the same classifier to be optimal across domains. Consequently, we achieve new state-of-the-art results on the Amazon-review dataset as well as the airlines dataset, demonstrating the efficacy of our proposed method, CLIM.
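The abstract's two effects of MIM (balanced predictions, larger margins) both follow from the standard information-maximization objective I(x; ŷ) = H(ȳ) − E[H(ŷ|x)]: maximizing the entropy of the marginal prediction ȳ keeps class usage balanced, while minimizing the per-example conditional entropy pushes predictions away from the decision boundary. The sketch below is a minimal NumPy illustration of this standard formulation under our own naming; it is not the authors' released implementation, and the paper's full loss also includes a contrastive term not shown here.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mim_loss(logits, eps=1e-8):
    """Negative mutual information between inputs and predicted labels,
    I(x; y_hat) = H(marginal) - mean H(conditional), to be *minimized*.

    - Large H(marginal): the batch-averaged class distribution stays
      balanced (no collapse onto one class).
    - Small mean H(conditional): each example gets a confident,
      high-margin prediction.
    """
    probs = softmax(logits)                                   # (N, C)
    marginal = probs.mean(axis=0)                             # (C,)
    h_marginal = -np.sum(marginal * np.log(marginal + eps))
    h_cond = -np.sum(probs * np.log(probs + eps), axis=1).mean()
    return -(h_marginal - h_cond)
```

As a sanity check, confident but balanced predictions (each example committed to a class, classes used equally) score strictly better than uniform, uncommitted predictions, matching the abstract's claim that MIM enlarges margins while keeping the prediction distribution balanced.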