Paper Title

Convolutional layers are equivariant to discrete shifts but not continuous translations

Paper Authors

Nick McGreivy, Ammar Hakim

Paper Abstract

The purpose of this short and simple note is to clarify a common misconception about convolutional neural networks (CNNs). CNNs are made up of convolutional layers, which are shift equivariant due to weight sharing. However, convolutional layers are not translation equivariant, even when boundary effects are ignored and when pooling and subsampling are absent. This is because shift equivariance is a discrete symmetry while translation equivariance is a continuous symmetry. This fact is well known among researchers in equivariant machine learning, but is usually overlooked among non-experts. To minimize confusion, we suggest using the term 'shift equivariance' to refer to discrete shifts in pixels and 'translation equivariance' to refer to continuous translations.
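To make the abstract's distinction concrete, below is a minimal NumPy sketch (our illustration, not code from the paper). It assumes a "convolutional layer" means circular cross-correlation with shared weights followed by a pointwise ReLU (so boundary effects are absent and there is no pooling or subsampling), and it defines continuous translation of sampled signals via bandlimited (Fourier) interpolation; `conv_layer`, `fourier_translate`, and the test signal `f` are all illustrative choices, not names from the paper.

```python
import numpy as np

def conv_layer(x, w):
    """A convolutional layer: circular cross-correlation with shared
    weights w, followed by a pointwise ReLU. The circular wrap-around
    removes boundary effects; there is no pooling or subsampling."""
    pre = sum(w[j] * np.roll(x, -j) for j in range(len(w)))
    return np.maximum(pre, 0.0)

def fourier_translate(x, s):
    """Translate the samples x by s pixels (s may be fractional) using
    bandlimited (Fourier) interpolation of the sampled signal."""
    phase = np.exp(-2j * np.pi * np.fft.fftfreq(len(x)) * s)
    return np.real(np.fft.ifft(np.fft.fft(x) * phase))

# Sample a bandlimited periodic signal f on an n-pixel grid.
n = 64
f = lambda u: np.sin(2 * np.pi * u / n) + 0.5 * np.cos(6 * np.pi * u / n)
x = f(np.arange(n))
w = np.random.default_rng(0).standard_normal(5)

# Shift equivariance (discrete symmetry): shifting the input by one
# pixel, then applying the layer, equals applying the layer and then
# shifting the output. This holds exactly, by weight sharing.
print(np.allclose(conv_layer(np.roll(x, 1), w),
                  np.roll(conv_layer(x, w), 1)))      # True

# Translation equivariance (continuous symmetry): translate f by half
# a pixel before sampling, versus translating the layer's output by
# half a pixel. The two disagree.
s = 0.5
lhs = conv_layer(f(np.arange(n) - s), w)    # layer applied to translated input
rhs = fourier_translate(conv_layer(x, w), s)  # translated layer output
print(np.allclose(lhs, rhs))                # False
print(np.abs(lhs - rhs).max())              # noticeably nonzero
```

In this sketch the integer-shift check succeeds to machine precision, while the half-pixel check fails by a visibly nonzero amount. Under these (standard but particular) choices of nonlinearity and interpolation, the discrepancy arises because the ReLU output is no longer bandlimited, so sampling and sub-pixel translation no longer commute; this is one concrete instance of the note's general claim.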
