Paper Title
Demystifying the Base and Novel Performances for Few-shot Class-incremental Learning
Paper Authors
Paper Abstract
Few-shot class-incremental learning (FSCIL) addresses challenging real-world scenarios in which unseen novel classes continually arrive with only a few samples each. In these scenarios, a model must recognize the novel classes without forgetting prior knowledge. In other words, FSCIL aims to maintain the base performance while simultaneously improving the novel performance. However, few studies have investigated the two performances separately. In this paper, we first decompose the entire model into four types of parameters and demonstrate that the trends of the two performances vary greatly depending on which parameters are updated when novel classes appear. Based on this analysis, we propose a simple method for FSCIL, coined NoNPC, which uses a normalized prototype classifier without further training on the incremental novel classes. The results show that our straightforward method performs comparably to sophisticated state-of-the-art algorithms.
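The normalized prototype classifier at the heart of the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes features have already been extracted by a frozen backbone, builds each class prototype as the mean feature vector, and classifies queries by cosine similarity to the L2-normalized prototypes. All function and variable names here are hypothetical.

```python
import numpy as np

def l2_normalize(v, eps=1e-12):
    """L2-normalize vectors along the last axis."""
    return v / (np.linalg.norm(v, axis=-1, keepdims=True) + eps)

def build_prototypes(features, labels):
    """Prototype for each class = mean of that class's feature vectors.

    New (novel) classes are added by computing their prototypes from the
    few available samples -- no gradient-based training is involved.
    """
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def nonpc_predict(queries, prototypes):
    """Assign each query to the class whose normalized prototype is most
    similar under cosine similarity (dot product of unit vectors)."""
    classes = sorted(prototypes)
    proto_mat = l2_normalize(np.stack([prototypes[c] for c in classes]))
    sims = l2_normalize(queries) @ proto_mat.T
    return np.array(classes)[sims.argmax(axis=1)]
```

Because incremental classes only require computing and normalizing a new mean vector, the base-class prototypes (and hence base performance) are untouched when novel classes arrive, which mirrors the trade-off the abstract analyzes.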