Title

$Δ$-PINNs: physics-informed neural networks on complex geometries

Authors

Francisco Sahli Costabal, Simone Pezzuto, Paris Perdikaris

Abstract

Physics-informed neural networks (PINNs) have demonstrated promise in solving forward and inverse problems involving partial differential equations. Despite recent progress on expanding the class of problems that can be tackled by PINNs, most existing use cases involve simple geometric domains. To date, there is no clear way to inform PINNs about the topology of the domain in which the problem is being solved. In this work, we propose a novel positional encoding mechanism for PINNs based on the eigenfunctions of the Laplace-Beltrami operator. This technique allows us to create an input space for the neural network that represents the geometry of a given object. We approximate the eigenfunctions, as well as the operators involved in the partial differential equations, with finite elements. We extensively test and compare the proposed methodology against traditional PINNs on complex shapes, such as a coil, a heat sink, and a bunny, with different physics, such as the Eikonal equation and heat transfer. We also study the sensitivity of our method to the number of eigenfunctions used, as well as to the discretization used for the eigenfunctions and the underlying operators. Our results show excellent agreement with the ground-truth data in cases where traditional PINNs fail to produce a meaningful solution. We envision that this new technique will expand the effectiveness of PINNs to more realistic applications.
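
The abstract describes representing the geometry through the eigenfunctions of the Laplace-Beltrami operator, approximated with finite elements, and feeding them to the network as a positional encoding. As a rough illustration only (not the authors' code), the sketch below assembles the standard cotangent stiffness matrix and lumped mass matrix on a triangle mesh and solves the generalized eigenproblem with SciPy; the resulting per-vertex eigenfunctions would then replace or augment the raw coordinates as PINN inputs. The function name and the mesh arrays `verts`/`faces` are assumptions made for this example.

```python
# Minimal sketch of the eigenfunction-based positional encoding, assuming a
# triangle mesh given as `verts` (n, 3) vertex positions and `faces` (m, 3)
# vertex indices. This is an illustrative approximation, not the paper's code.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def laplace_beltrami_eigenfunctions(verts, faces, k=32):
    """First k non-constant eigenpairs of K phi = lambda M phi, where K is the
    cotangent (linear FEM) stiffness matrix and M the lumped mass matrix."""
    n = verts.shape[0]
    i0, i1, i2 = faces[:, 0], faces[:, 1], faces[:, 2]

    rows, cols, vals = [], [], []
    lumped_area = np.zeros(n)
    for a, b, c in [(i0, i1, i2), (i1, i2, i0), (i2, i0, i1)]:
        # Cotangent of the angle at vertex c, which weights the edge (a, b).
        e1, e2 = verts[a] - verts[c], verts[b] - verts[c]
        double_area = np.linalg.norm(np.cross(e1, e2), axis=1)
        cot = (e1 * e2).sum(axis=1) / double_area
        # Stiffness contributions: off-diagonal -cot/2, diagonal +cot/2.
        rows += [a, b, a, b]
        cols += [b, a, a, b]
        vals += [-0.5 * cot, -0.5 * cot, 0.5 * cot, 0.5 * cot]
        # Lumped mass: one third of the triangle area per incident vertex.
        np.add.at(lumped_area, c, double_area / 6.0)

    K = sp.csr_matrix(
        (np.concatenate(vals), (np.concatenate(rows), np.concatenate(cols))),
        shape=(n, n),
    )
    M = sp.diags(lumped_area)

    # Shift-invert around a small negative value so the singular K is not
    # factorized exactly at zero; drop the constant first eigenfunction.
    lam, phi = spla.eigsh(K, k=k + 1, M=M, sigma=-1e-6, which="LM")
    return lam[1:], phi[:, 1:]
```

A network would then be evaluated on these features, u_theta(x) = MLP(phi_1(x), ..., phi_k(x)), so that points that are close in Euclidean space but far apart on the surface (for example, adjacent turns of the coil) receive distinct inputs.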
