Paper Title

On bounds for norms of reparameterized ReLU artificial neural network parameters: sums of fractional powers of the Lipschitz norm control the network parameter vector

Paper Authors

Arnulf Jentzen, Timo Kröger

Abstract

It is an elementary fact in the scientific literature that the Lipschitz norm of the realization function of a feedforward fully-connected rectified linear unit (ReLU) artificial neural network (ANN) can, up to a multiplicative constant, be bounded from above by sums of powers of the norm of the ANN parameter vector. Roughly speaking, in this work we reveal that, in the case of shallow ANNs, the converse inequality is also true. More formally, we prove that the norm of the equivalence class of ANN parameter vectors with the same realization function is, up to a multiplicative constant, bounded from above by the sum of powers of the Lipschitz norm of the ANN realization function (with the exponents $ 1/2 $ and $ 1 $). Moreover, we prove that this upper bound only holds when employing the Lipschitz norm but holds neither for Hölder norms nor for Sobolev–Slobodeckij norms. Furthermore, we prove that this upper bound only holds for sums of powers of the Lipschitz norm with the exponents $ 1/2 $ and $ 1 $ but does not hold for the Lipschitz norm alone.
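For orientation, the two directions described in the abstract can be sketched informally as follows (the notation below is ours, not necessarily the paper's: $\theta$ denotes the ANN parameter vector, $\mathcal{R}_\theta$ its realization function, $\operatorname{Lip}(\cdot)$ the Lipschitz norm, and $\kappa$ a generic multiplicative constant):

```latex
% Classical direction: for a shallow ReLU network
% \mathcal{R}_\theta(x) = \sum_i v_i \max\{ w_i^\top x + b_i, 0 \} + c,
% the Lipschitz norm is bounded by a power of the parameter norm, e.g.
\operatorname{Lip}(\mathcal{R}_\theta) \le \kappa \, \| \theta \|^2 .

% Converse (main result, shallow case): the norm of the equivalence class of
% parameter vectors with the same realization function is bounded by the sum
% of the powers 1/2 and 1 of the Lipschitz norm of the realization function:
\inf_{\vartheta \,:\, \mathcal{R}_\vartheta = \mathcal{R}_\theta} \| \vartheta \|
  \le \kappa \Bigl( \operatorname{Lip}(\mathcal{R}_\theta)^{1/2}
        + \operatorname{Lip}(\mathcal{R}_\theta) \Bigr) .
```

The infimum over the equivalence class is essential: reparameterizations (e.g. rescaling $w_i \mapsto \lambda w_i$, $v_i \mapsto v_i/\lambda$) leave the realization function unchanged while making individual parameter norms arbitrarily large.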
