Exploring Physical Latent Spaces for Deep Learning - Télécom Paris
Preprint, Working Paper. Year: 2022

Exploring Physical Latent Spaces for Deep Learning

Abstract

We explore training deep neural network models in conjunction with physical simulations via partial differential equations (PDEs), using the simulated degrees of freedom as the latent space for the neural network. In contrast to previous work, we do not impose constraints on the simulated space, but rather treat its degrees of freedom purely as tools to be used by the neural network. We demonstrate this concept for learning reduced representations. It is typically extremely challenging for conventional simulations with traditional, reduced representations to faithfully preserve the correct solutions over long time spans. This problem is particularly pronounced for solutions with large amounts of small-scale features. Here, data-driven methods can learn to restore the details as required for accurate solutions of the underlying PDE problem. We explore the use of a physical, reduced latent space within this context, and train models such that they can modify the content of the physical states as much as needed to best satisfy the learning objective. Surprisingly, this autonomy allows the neural network to discover alternate dynamics that enable significantly improved performance on the given tasks. We demonstrate this concept for a range of challenging test cases, among others for Navier-Stokes-based turbulence simulations.
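The coupling described above can be sketched as an unrolled simulation in which a learned correction acts directly on the simulated degrees of freedom after every solver step. The sketch below is illustrative only: it uses a 1D heat equation as a stand-in PDE, and the `correct` callable is a hypothetical placeholder for a trained network, not the authors' actual model or code.

```python
import numpy as np

def diffusion_step(u, nu=0.1, dt=0.1):
    # Explicit finite-difference step of the 1D heat equation
    # with periodic boundary conditions (unit grid spacing).
    lap = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)
    return u + dt * nu * lap

def rollout(u0, correct, steps=20):
    # Unrolled simulation: the correction modifies the simulated
    # state (the "physical latent space") after each solver step,
    # without any constraint forcing it to stay physically faithful.
    u = u0
    for _ in range(steps):
        u = correct(diffusion_step(u))
    return u

# Hypothetical stand-in for a trained correction network:
# the identity leaves the coarse dynamics unchanged.
identity = lambda u: u

u0 = np.sin(np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False))
uT = rollout(u0, identity)
```

In training, `correct` would be a neural network optimized end-to-end through the unrolled solver steps, so it is free to store whatever content in the state best serves the downstream objective.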

Dates and versions

hal-04084065, version 1 (27-04-2023)

Identifiers

Cite

Chloe Paliard, Nils Thuerey, Kiwon Um. Exploring Physical Latent Spaces for Deep Learning. 2023. ⟨hal-04084065⟩
