Paper Title


On Numerosity of Deep Neural Networks

Authors

Xi Zhang, Xiaolin Wu

Abstract


Recently, a provocative claim was published that number sense spontaneously emerges in a deep neural network trained merely for visual object recognition. If true, this has far-reaching significance for the fields of machine learning and cognitive science alike. In this paper, we show the above claim to be, unfortunately, incorrect. The statistical analysis supporting the claim is flawed: the sample set used to identify number-aware neurons is too small compared to the huge number of neurons in the object recognition network. By this flawed analysis, one could mistakenly identify number-sensing neurons in any randomly initialized deep neural network that has not been trained at all. With the above critique in mind, we ask: what if a deep convolutional neural network is carefully trained for numerosity? Our findings are mixed. Even after being trained with number-depicting images, the deep learning approach still has difficulty acquiring the abstract concept of number, a cognitive task that preschoolers perform with ease. On the other hand, we do find some encouraging evidence suggesting that deep neural networks are more robust to distribution shift for small numbers than for large numbers.
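The statistical flaw the abstract describes can be illustrated with a toy simulation (a hypothetical sketch, not the authors' actual analysis): with a small stimulus set and a naive selection criterion applied to many units, a sizable number of purely random, untrained "neurons" will pass a numerosity-tuning test by chance alone. All sizes and the correlation threshold below are assumptions chosen only to make the multiple-comparisons effect visible.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 10_000    # a large layer, as in an object-recognition CNN (assumption)
n_numerosities = 8    # small set of tested numerosities (assumption)
n_repeats = 10        # few stimulus samples per numerosity (assumption)

# Responses of completely random, untrained "neurons": pure noise,
# statistically independent of the numerosity being shown.
responses = rng.normal(size=(n_neurons, n_numerosities, n_repeats))

# Naive selection: call a neuron "number-sensing" if the correlation between
# its mean response and the numerosity label exceeds a fixed threshold.
numbers = np.arange(1, n_numerosities + 1)
mean_resp = responses.mean(axis=2)              # shape (n_neurons, n_numerosities)
centered = mean_resp - mean_resp.mean(axis=1, keepdims=True)
num_c = numbers - numbers.mean()
corr = centered @ num_c / (
    np.linalg.norm(centered, axis=1) * np.linalg.norm(num_c)
)

selected = np.abs(corr) > 0.7  # |r| = 0.7 is roughly the p = 0.05 cutoff for 8 points
print(f"'number-sensing' neurons found in pure noise: {selected.sum()}")
```

Because the per-neuron test is applied to thousands of units without any correction for multiple comparisons, hundreds of noise neurons clear the threshold, which is the essence of the critique: a small sample set cannot distinguish genuine numerosity tuning from chance selectivity in a network this large.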
