Paper Title

Improving Sample Efficiency with Normalized RBF Kernels

Authors

Sebastian Pineda-Arango, David Obando-Paniagua, Alperen Dedeoglu, Philip Kurzendörfer, Friedemann Schestag, Randolf Scholz

Abstract

In deep learning models, learning more from less data is becoming increasingly important. This paper explores how neural networks with normalized Radial Basis Function (RBF) kernels can be trained to achieve better sample efficiency. Moreover, we show how this kind of output layer can find embedding spaces in which the classes are compact and well separated. To achieve this, we propose a two-phase method for training this type of neural network on classification tasks. Experiments on CIFAR-10 and CIFAR-100 show that, with the proposed method, networks with normalized kernels as the output layer achieve higher sample efficiency, greater compactness, and better class separability than networks with a SoftMax output layer.
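The abstract does not spell out the layer's formulation, but a normalized RBF output layer is conventionally understood as RBF kernel responses to per-class centers, normalized to sum to one. The following NumPy sketch illustrates that standard form; the function name, the `gamma` parameter, and the toy centers are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def normalized_rbf_output(x, centers, gamma=1.0):
    """Hypothetical sketch of a normalized RBF output layer.

    x: (d,) embedding vector; centers: (k, d), one center per class.
    Returns a probability-like vector of k normalized RBF responses.
    """
    # Squared Euclidean distance from the embedding to each class center.
    sq_dists = np.sum((centers - x) ** 2, axis=1)
    # RBF kernel response: close embeddings give values near 1.
    responses = np.exp(-gamma * sq_dists)
    # Normalization step: responses sum to 1, like a SoftMax output.
    return responses / responses.sum()

# Toy usage: 3 classes with centers in a 2-D embedding space
# (all values are illustrative).
centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
x = np.array([0.2, 0.1])  # embedding lying close to class 0's center
probs = normalized_rbf_output(x, centers)
```

Because the response decays with distance to each class center, training such a layer pushes embeddings toward their class center and away from others, which is one intuition for the compactness and separability the abstract reports.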
