Paper Title
Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition
Paper Authors
Paper Abstract
Pothole classification has become an important task for road inspection vehicles, sparing drivers from potential car accidents and repair bills. Given limited computational power and a fixed number of training epochs, we propose iterative self knowledge distillation (ISKD) to train lightweight pothole classifiers. Designed to improve both the teacher and student models over time in knowledge distillation, ISKD outperforms the state-of-the-art self knowledge distillation method on three pothole classification datasets across four lightweight network architectures, supporting the claim that self knowledge distillation should be performed iteratively rather than just once. The accuracy relation between the teacher and student models shows that a student model can still benefit from a moderately trained teacher model. Our results, which imply that better teacher models generally produce better student models, justify the design of ISKD. Beyond pothole classification, we also demonstrate the efficacy of ISKD on six additional datasets associated with generic classification, fine-grained classification, and a medical imaging application, supporting the claim that ISKD can serve as a general-purpose performance booster without requiring a given teacher model or extra trainable parameters.
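To make the core idea concrete, below is a minimal PyTorch sketch of one way the iterative scheme described in the abstract could be implemented: the model trained in each round becomes the frozen teacher for the next round, so no pre-trained teacher or extra trainable parameters are needed, and a fixed total epoch budget is split across rounds. The function names (`kd_loss`, `iskd_train`), the hyperparameters (temperature, `alpha`, number of rounds, epochs per round), and the choice of the standard cross-entropy-plus-KL distillation loss are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch of iterative self knowledge distillation (ISKD).
# Assumptions (not from the paper): kd_loss form, all hyperparameter values,
# and the round/epoch schedule.
import copy
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard distillation loss: cross-entropy on labels plus
    temperature-scaled KL divergence to the teacher's softened outputs."""
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return (1 - alpha) * ce + alpha * kl

def iskd_train(model, loader, rounds=4, epochs_per_round=5, lr=0.01, device="cpu"):
    """Train `model` with ISKD: plain supervised training in round 0, then
    each subsequent round distills from a frozen copy of the previous round."""
    model.to(device)
    teacher = None  # no teacher in the first round
    for _ in range(rounds):
        optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
        for _ in range(epochs_per_round):
            for images, labels in loader:
                images, labels = images.to(device), labels.to(device)
                logits = model(images)
                if teacher is None:
                    loss = F.cross_entropy(logits, labels)
                else:
                    with torch.no_grad():
                        t_logits = teacher(images)
                    loss = kd_loss(logits, t_logits, labels)
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
        # The model just trained becomes the frozen teacher for the next round,
        # so the teacher and student improve together over the rounds.
        teacher = copy.deepcopy(model).eval()
    return model
```

In this sketch the self-distillation step repeats every round rather than running once, which is the distinction the abstract draws between ISKD and one-shot self knowledge distillation.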