Title
Differential invariants for SE(2)-equivariant networks
Authors
Abstract
Symmetry is present in many tasks in computer vision, where the same class of objects can appear transformed, e.g. rotated due to different camera orientations, or scaled due to perspective. Knowledge of such symmetries in the data, coupled with equivariance of neural networks, can improve their generalization to new samples. Differential invariants are equivariant operators computed from the partial derivatives of a function. In this paper we use differential invariants to define equivariant operators that form the layers of an equivariant neural network. Specifically, we derive invariants of the Special Euclidean group SE(2), composed of rotations and translations, and apply them to construct an SE(2)-equivariant network, called SE(2) Differential Invariants Network (SE2DINNet). The network is subsequently tested on classification tasks which require a degree of equivariance or invariance to rotations. The results compare positively with the state of the art, even though the proposed SE2DINNet has far fewer parameters than the compared models.
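To make the notion of a differential invariant concrete, the following is a minimal sketch, not the architecture or derivation from the paper: it computes two classical differential invariants of SE(2), the squared gradient magnitude and the Laplacian, via finite differences. Both quantities are unchanged in value under rotations and translations of the image plane, which is the property the abstract refers to. All function names (`derivatives`, `se2_differential_invariants`) and the finite-difference scheme are illustrative assumptions.

```python
import numpy as np

def derivatives(f):
    """Central finite-difference estimates of first- and second-order
    partial derivatives of a 2D array f (periodic boundary for simplicity)."""
    fx = (np.roll(f, -1, axis=1) - np.roll(f, 1, axis=1)) / 2.0
    fy = (np.roll(f, -1, axis=0) - np.roll(f, 1, axis=0)) / 2.0
    fxx = np.roll(f, -1, axis=1) - 2.0 * f + np.roll(f, 1, axis=1)
    fyy = np.roll(f, -1, axis=0) - 2.0 * f + np.roll(f, 1, axis=0)
    return fx, fy, fxx, fyy

def se2_differential_invariants(f):
    """Two classical SE(2) differential invariants (illustrative only):
    the squared gradient magnitude |∇f|^2 and the Laplacian Δf.
    Both are built from partial derivatives and are invariant in value
    under rotations and translations of the coordinates."""
    fx, fy, fxx, fyy = derivatives(f)
    grad_sq = fx**2 + fy**2   # |∇f|^2: rotation-invariant combination
    laplacian = fxx + fyy     # Δf: trace of the Hessian, rotation-invariant
    return np.stack([grad_sq, laplacian], axis=0)

# Toy usage: the invariants of an isotropic Gaussian bump are themselves isotropic.
y, x = np.mgrid[-32:32, -32:32]
f = np.exp(-(x**2 + y**2) / (2.0 * 8.0**2))
inv = se2_differential_invariants(f)
print(inv.shape)  # (2, 64, 64): two invariant feature maps
```

Stacking such invariant feature maps and feeding them through pointwise nonlinearities is one plausible way such operators could serve as layers of a rotation-equivariant network; the actual SE2DINNet layers are defined in the paper itself.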