Paper Title

Neural Conservation Laws: A Divergence-Free Perspective

Paper Authors

Jack Richter-Powell, Yaron Lipman, Ricky T. Q. Chen

Paper Abstract

We investigate the parameterization of deep neural networks that by design satisfy the continuity equation, a fundamental conservation law. This is enabled by the observation that any solution of the continuity equation can be represented as a divergence-free vector field. We hence propose building divergence-free neural networks through the concept of differential forms, and with the aid of automatic differentiation, realize two practical constructions. As a result, we can parameterize pairs of densities and vector fields that always exactly satisfy the continuity equation, foregoing the need for extra penalty methods or expensive numerical simulation. Furthermore, we prove these models are universal and so can be used to represent any divergence-free vector field. Finally, we experimentally validate our approaches by computing neural network-based solutions to fluid equations, solving for the Hodge decomposition, and learning dynamical optimal transport maps.
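
To make the abstract's central observation concrete: the continuity equation for a time-varying density $\rho(t, x)$ and velocity field $v(t, x)$ on $\mathbb{R}^d$ reads

$$\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho v) = 0.$$

Stacking the pair into a single space-time vector field $u = (\rho, \rho v) : \mathbb{R}^{d+1} \to \mathbb{R}^{d+1}$, its space-time divergence is

$$\mathrm{div}_{(t,x)}\, u = \frac{\partial \rho}{\partial t} + \nabla_x \cdot (\rho v),$$

so $(\rho, v)$ satisfies the continuity equation exactly when $u$ is divergence-free. Any parameterization of divergence-free fields $u$ with nonvanishing first component therefore yields valid pairs by reading off $\rho = u_1$ and $v = (u_2, \dots, u_{d+1}) / \rho$.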
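
As a minimal sketch of how automatic differentiation makes such constructions practical (an illustration in JAX, not the authors' released code), one classical divergence-free construction subsumed by the differential-forms view takes the row-wise divergence of an antisymmetric matrix-valued field: if $A(x) = -A(x)^T$, then $v_i = \sum_j \partial A_{ij} / \partial x_j$ has zero divergence, because $\sum_{i,j} \partial_i \partial_j A_{ij}$ cancels term-by-term under antisymmetry and equality of mixed partials. The raw_matrix map below is a hypothetical toy stand-in for a neural network.

import jax
import jax.numpy as jnp

def raw_matrix(x):
    # Toy smooth map R^n -> R^{n x n}; in practice this would be an MLP.
    return jnp.outer(jnp.sin(x), jnp.cos(2.0 * x))

def antisym_field(x):
    # Antisymmetrize so that A(x) = -A(x)^T holds by construction.
    A = raw_matrix(x)
    return A - A.T

def div_free_field(x):
    # v_i(x) = sum_j dA_ij/dx_j; divergence-free since
    # sum_{i,j} d^2 A_ij / (dx_i dx_j) vanishes for antisymmetric A.
    J = jax.jacfwd(antisym_field)(x)  # J[i, j, k] = dA_ij/dx_k
    return jnp.einsum('ijj->i', J)

def divergence(f, x):
    # Numerical check: trace of the Jacobian of f at x.
    return jnp.trace(jax.jacfwd(f)(x))

x = jnp.array([0.3, -1.2, 0.7])
print(div_free_field(x))              # a nonzero vector field value
print(divergence(div_free_field, x))  # ~0 up to floating-point error

Building the density-velocity pair from the display above then amounts to applying such a construction in space-time and normalizing by the first component.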
