Computer Vision Group
TUM School of Computation, Information and Technology
Technical University of Munich
Summary of "Fourier Neural Operator for Parametric Partial Differential Equations" by Ferdinand

The paper develops a method for learning the solution operator of parametric PDEs, yielding solutions without having to specify the exact parameter values, boundary conditions, or discretization. It exploits the fact that the kernel integral operator at the heart of the architecture is, for a translation-invariant kernel, a convolution, and a convolution becomes a pointwise (per-mode) linear transformation in Fourier space; the parameters are therefore learned directly on the Fourier modes. Because the learned weights are tied to frequencies rather than grid points, the model can be projected up or down to any discretization, including much higher resolutions than were used in training. One caveat is that the accuracy is still significantly lower than that of traditional discretized solvers (which the authors chose not to compare FNO against). This may be excused by the low training time, and by an inference time roughly three orders of magnitude lower than the computation time of a conventional solver for the 2D Navier-Stokes equations.
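To make the Fourier-space parameterization concrete, here is a minimal sketch of a 1D spectral convolution layer in PyTorch, following the recipe the summary describes (truncate to the lowest Fourier modes, apply a learned per-mode linear map, transform back). The class name, shapes, and initialization are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """One Fourier layer (sketch): FFT -> learned per-mode linear map -> inverse FFT.

    The weights live on Fourier modes, not grid points, so the same layer
    accepts inputs of any resolution.
    """

    def __init__(self, channels: int, n_modes: int):
        super().__init__()
        self.n_modes = n_modes  # number of low-frequency modes to keep
        scale = 1.0 / (channels * channels)
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, n_modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, n_grid), sampled on a uniform grid of any size
        x_ft = torch.fft.rfft(x)              # complex spectrum, length n_grid//2 + 1
        out_ft = torch.zeros_like(x_ft)
        k = min(self.n_modes, x_ft.shape[-1])
        # learned linear map applied independently to each of the lowest k modes
        out_ft[..., :k] = torch.einsum(
            "bik,iok->bok", x_ft[..., :k], self.weight[..., :k]
        )
        return torch.fft.irfft(out_ft, n=x.shape[-1])  # back to physical space
```

Stacking a few such layers, each combined with a pointwise linear skip connection and a nonlinearity, gives the full operator; since nothing in the weights depends on the grid size, a model trained on a coarse grid can be evaluated on a much finer one.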


Summary of "Message Passing Neural PDE Solvers" by Nils

The paper aims to learn the solution to various kinds of partial differential equations with an autoregressive model. It deviates from the more typical "LLM-style" autoregressive approach in two main ways, which the authors call the pushforward trick and temporal bundling.

The pushforward trick is best motivated, as in the paper, by the distribution-shift problem. Each time the model predicts the next timestep(s), the prediction carries a slight error. During training this is not a big problem, since we have the correct label for the next timestep and can simply use it for further predictions. At test time (or whenever the model is actually used), however, we have no labels and must feed the model's previous predictions back in, so the test-time input distribution drifts further and further from training as prediction errors accumulate. The paper combats this with an adversarial stability loss, implemented via a shortened backward pass through the network: the model is first unrolled one step without gradients, so that it learns on its own perturbed predictions. This is what they call the pushforward trick (see the sketch below).

Temporal bundling is conceptually simpler: the model predicts several timesteps at once from the current state. This also helps with distribution shift, since the model is called less often.

For the architecture they use a relatively simple GNN with message passing, motivated by the observation that the finite difference method, the finite volume method, and WENO can all be seen as specific forms of message passing (a sketch of this correspondence follows at the end of the summary). The model is evaluated on three types of equations: Burgers' equation without diffusion, Burgers' equation with variable diffusion, and a mixed scenario, with performance compared against WENO and two different Fourier Neural Operators (FNOs). It outperforms all of them in both speed and minimum error, except for the FNO on Burgers' equation without diffusion.
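A minimal sketch of the pushforward trick as described above, in PyTorch; the function name, signature, and the one-step setup are illustrative assumptions, not the paper's code.

```python
import torch

def pushforward_loss(model, u_t, u_target, loss_fn):
    """Training loss with the pushforward trick (sketch).

    The model is first rolled forward WITHOUT gradients, so the loss is
    computed on its own slightly-wrong prediction rather than on a clean
    ground-truth input. Training thereby sees the error-perturbed inputs
    that occur at test time, while the shortened backward pass only runs
    through the final prediction step.
    """
    with torch.no_grad():
        u_perturbed = model(u_t)       # distribution-shifted input, no gradient
    u_pred = model(u_perturbed)        # only this step is differentiated
    return loss_fn(u_pred, u_target)   # u_target: ground truth two steps ahead


# With temporal bundling, `model` would instead return a bundle of K future
# timesteps per call, reducing how often the rollout loop invokes it.
```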

Alternate title: Message Passing GNNs are all you need for solving PDEs
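To make the message-passing correspondence concrete, below is a sketch of a single message-passing layer in PyTorch. With learned MLPs it is the generic GNN layer; hard-wiring the message and aggregation instead recovers classical stencils, which is the correspondence motivating the architecture. All names and shapes are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """One message-passing step over a grid graph (sketch)."""

    def __init__(self, hidden: int):
        super().__init__()
        # message MLP: (receiver state, sender state, relative position) -> message
        self.message = nn.Sequential(
            nn.Linear(2 * hidden + 1, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        # update MLP: (node state, aggregated messages) -> new node state
        self.update = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )

    def forward(self, h, edge_index, rel_pos):
        # h: (n_nodes, hidden); edge_index: (2, n_edges) with rows (src, dst);
        # rel_pos: (n_edges, 1), relative positions x_src - x_dst
        src, dst = edge_index
        m = self.message(torch.cat([h[dst], h[src], rel_pos], dim=-1))
        agg = torch.zeros_like(h).index_add_(0, dst, m)  # sum messages per receiver
        return self.update(torch.cat([h, agg], dim=-1))
```

A fixed message m = (h_src - h_dst) / dx**2 summed over a node's two grid neighbours is exactly the 1D finite-difference Laplacian, so classical schemes sit inside this parameterization as special cases.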
