GNOT: A General Neural Operator Transformer for Operator Learning
Publication: 6427874
arXiv: 2302.14376
MaRDI QID: Q6427874
Author name not available
Publication date: 28 February 2023
Abstract: Learning the solution operators of partial differential equations (PDEs) is an essential problem in machine learning. However, learning operators in practical applications poses several challenges, such as irregular meshes, multiple input functions, and the complexity of PDE solutions. To address these challenges, we propose the General Neural Operator Transformer (GNOT), a scalable and effective transformer-based framework for learning operators. By designing a novel heterogeneous normalized attention layer, our model is highly flexible and can handle multiple input functions and irregular meshes. In addition, we introduce a geometric gating mechanism, which can be viewed as a soft domain decomposition, to solve multi-scale problems. The large model capacity of the transformer architecture allows our model to scale to large datasets and practical problems. We conduct extensive experiments on multiple challenging datasets from different domains and achieve a remarkable improvement over alternative methods. Our code and data are publicly available at https://github.com/thu-ml/GNOT.
Has companion code repository: https://github.com/thu-ml/gnot
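The abstract names two architectural ideas: a normalized attention layer that handles irregular meshes and multiple input functions, and a geometric gating mechanism acting as a soft domain decomposition. The sketch below is a minimal, hypothetical PyTorch rendering of those ideas, assuming an efficient-attention-style normalization (softmax over the feature axis for queries, over the point axis for keys, giving linear cost in the number of mesh points) and a coordinate-conditioned mixture of expert feed-forward networks. All class and parameter names here are illustrative assumptions; the authors' actual implementation is in the repository linked above.

```python
import torch
import torch.nn as nn

class NormalizedCrossAttention(nn.Module):
    """Cross-attention with softmax applied to Q and K separately
    (a common linear-attention normalization), so the cost is linear
    in the number of query and input points rather than quadratic."""
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, x_query, x_input):
        # x_query: (B, Nq, d) embeddings of query mesh points
        # x_input: (B, Nk, d) embeddings of one input function
        q = self.q(x_query).softmax(dim=-1)   # normalize queries over features
        k = self.k(x_input).softmax(dim=1)    # normalize keys over points
        v = self.v(x_input)
        context = torch.einsum("bnd,bne->bde", k, v)   # (B, d, d) summary
        z = torch.einsum("bnd,bde->bne", q, context)   # (B, Nq, d)
        return self.out(z)

class GeometricGate(nn.Module):
    """Soft domain decomposition (hypothetical simplification): expert
    FFN outputs are mixed with weights predicted from the query-point
    coordinates, so different spatial regions favor different experts."""
    def __init__(self, coord_dim: int, dim: int, n_experts: int = 2):
        super().__init__()
        self.gate = nn.Linear(coord_dim, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
            for _ in range(n_experts)
        )

    def forward(self, coords, h):
        # coords: (B, Nq, coord_dim); h: (B, Nq, dim)
        w = self.gate(coords).softmax(dim=-1)               # (B, Nq, E)
        ys = torch.stack([e(h) for e in self.experts], -1)  # (B, Nq, dim, E)
        return torch.einsum("bne,bnde->bnd", w, ys)

# Example: 64 query points, 128 input-function samples, width 32.
attn = NormalizedCrossAttention(32)
z = attn(torch.randn(2, 64, 32), torch.randn(2, 128, 32))   # (2, 64, 32)
gate = GeometricGate(coord_dim=2, dim=32)
y = gate(torch.randn(2, 64, 2), z)                          # (2, 64, 32)
```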