Abstract
Physics-informed neural networks (PINNs) are a novel deep-learning approach to solving partial differential equations (PDEs). They offer a unified framework for solving forward and inverse problems, which benefits many engineering applications, including heat transfer analysis. However, traditional PINNs suffer from limited accuracy and efficiency owing to their fully connected network architecture and the way physical laws are incorporated. In this paper, a novel physics-informed learning architecture, named physics-informed fully convolutional networks (PIFCNs), is developed to solve forward and inverse heat conduction problems simultaneously. The use of fully convolutional networks (FCNs) significantly reduces the density of connections and hence the computational cost. Because FCNs provide a node-level correspondence between inputs and outputs, the predicted solution can be used directly to formulate discretized PDEs via the finite difference method, which is more accurate and efficient than the traditional approach in PINNs. The results demonstrate that PIFCNs can flexibly enforce Dirichlet and Neumann boundary conditions when predicting temperature distributions. Remarkably, PIFCNs can also estimate unknown thermal diffusivity with an accuracy exceeding 99%, even with incomplete boundaries and limited sampling data. In all tests, PIFCNs outperform PINNs.
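To illustrate the idea of forming a physics loss directly from node-matched network outputs, the sketch below computes the mean-squared residual of a finite-difference discretization of the 2D transient heat equation, ∂T/∂t = α∇²T, on a uniform grid. This is a minimal NumPy illustration under assumed function and variable names (`fd_physics_loss`, `T_prev`), not the paper's exact loss formulation.

```python
import numpy as np

def fd_physics_loss(T, T_prev, alpha, dx, dt):
    """Mean-squared residual of the discretized 2D heat equation
    dT/dt = alpha * (T_xx + T_yy), evaluated at interior nodes with
    a 5-point stencil. T and T_prev are nodal temperature fields at
    consecutive time steps (hypothetical names for illustration)."""
    # 5-point finite-difference Laplacian on interior nodes
    lap = (T[1:-1, 2:] + T[1:-1, :-2] + T[2:, 1:-1] + T[:-2, 1:-1]
           - 4.0 * T[1:-1, 1:-1]) / dx**2
    # Backward-difference time derivative at interior nodes
    dTdt = (T[1:-1, 1:-1] - T_prev[1:-1, 1:-1]) / dt
    residual = dTdt - alpha * lap
    return float(np.mean(residual**2))

# A steady uniform field satisfies the PDE exactly, so the loss is zero.
T = np.ones((8, 8))
loss = fd_physics_loss(T, T_prev=T, alpha=0.1, dx=0.01, dt=0.001)
# loss == 0.0
```

In a PIFCN-style setup this residual would be evaluated on the network's predicted temperature field and minimized together with boundary-condition terms; because the output is already nodal, no automatic differentiation through collocation points is needed.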