    Data-driven latent representations for time-dependent problems - Lecture 2

    Multi angle
    Authors: Zepeda-Núñez, Leonardo (Conference speaker)
    CIRM (Publisher)


    Abstract: High-fidelity numerical simulation of physical systems modeled by time-dependent partial differential equations (PDEs) has been at the center of many technological advances in the last century. However, for engineering applications such as design, control, optimization, data assimilation, and uncertainty quantification, which require repeated model evaluation over a potentially large number of parameters or initial conditions, these simulations remain prohibitively expensive, even with state-of-the-art PDE solvers. The need to reduce the overall cost of such downstream applications has led to the development of surrogate models, which capture the core behavior of the target system at a fraction of the cost. In this context, new advances in machine learning provide a new path for developing surrogate models, particularly when the PDEs are not known and the system is advection-dominated. In a nutshell, we seek a data-driven latent representation of the state of the system and then learn the latent-space dynamics. This allows us to compress the information and evolve it in compressed form, thereby accelerating the models. In this series of lectures, I will present recent advances on two fronts: deterministic and probabilistic modeling of latent representations. In particular, I will introduce the notions of hyper-networks, neural networks that output other neural networks, and diffusion models, a framework that allows us to represent probability distributions of trajectories directly. I will provide the foundations of these methodologies, explain how they can be adapted to scientific computing, and discuss which physical properties they need to satisfy. Finally, I will provide several examples of applications to scientific computing.
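
    The latent-space idea described above can be illustrated with a small, self-contained example. The sketch below (in JAX, with made-up dimensions, placeholder data, and hypothetical names; it is not the lecturer's implementation) shows the deterministic variant: an autoencoder compresses PDE snapshots into a low-dimensional latent state, a small network advances that latent state one step at a time, and rollouts are then performed entirely in the compressed space.

    # Minimal sketch (illustrative only): learn a latent representation of PDE
    # snapshots with an autoencoder plus a one-step latent dynamics model, so
    # trajectories can be evolved in the compressed latent space.
    import jax
    import jax.numpy as jnp

    STATE_DIM, LATENT_DIM, HIDDEN = 256, 16, 128  # hypothetical sizes

    def init_mlp(key, sizes):
        """Initialize a simple MLP as a list of (W, b) pairs."""
        keys = jax.random.split(key, len(sizes) - 1)
        return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
                for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

    def mlp(params, x):
        """Apply the MLP: tanh on hidden layers, linear output layer."""
        for W, b in params[:-1]:
            x = jnp.tanh(x @ W + b)
        W, b = params[-1]
        return x @ W + b

    key = jax.random.PRNGKey(0)
    k_enc, k_dec, k_dyn, k_data = jax.random.split(key, 4)
    params = {
        "enc": init_mlp(k_enc, [STATE_DIM, HIDDEN, LATENT_DIM]),   # u_t -> z_t
        "dec": init_mlp(k_dec, [LATENT_DIM, HIDDEN, STATE_DIM]),   # z_t -> u_t
        "dyn": init_mlp(k_dyn, [LATENT_DIM, HIDDEN, LATENT_DIM]),  # z_t -> z_{t+1}
    }

    def loss(params, u_t, u_next):
        """Reconstruction error plus one-step latent-dynamics consistency."""
        z_t = mlp(params["enc"], u_t)
        recon = jnp.mean((mlp(params["dec"], z_t) - u_t) ** 2)
        z_pred = mlp(params["dyn"], z_t)
        dyn = jnp.mean((mlp(params["dec"], z_pred) - u_next) ** 2)
        return recon + dyn

    # Placeholder snapshot pairs (u_t, u_{t+1}); real data would come from a PDE solver.
    u_t = jax.random.normal(k_data, (32, STATE_DIM))
    u_next = jnp.roll(u_t, 1, axis=-1)  # stand-in for an advected state

    grads = jax.grad(loss)(params, u_t, u_next)  # one plain gradient step
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)

    # At inference time: encode an initial condition once, then roll out in latent space.
    z = mlp(params["enc"], u_t[:1])
    for _ in range(10):
        z = mlp(params["dyn"], z)
    u_rollout = mlp(params["dec"], z)
    print(u_rollout.shape)  # (1, STATE_DIM)

    In the hyper-network variant discussed in the lectures, one network outputs the weights of another network; in the probabilistic variant, the deterministic latent stepper is replaced by a diffusion model over trajectories. Neither is shown in this sketch.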

    Keywords: scientific machine learning; generative methods; sampling; diffusion models; hyper-networks

    MSC codes:
    65C20 - Models (numerical methods)
    65L20 - Stability of numerical methods
    37N30 - Dynamical systems in numerical analysis

      Video information

      Director: Récanzone, Luca
      Language: English
      Publication date: 04/08/2023
      Recording date: 20/07/2023
      Subcollection: Research School
      arXiv category: Machine Learning ; Artificial Intelligence
      Domain: Numerical Analysis & Scientific Computing ; Computer Science ; Dynamical Systems & ODE
      Format: MP4 (.mp4) - HD
      Duration: 01:11:38
      Audience: Researchers ; Graduate Students ; Doctoral Students ; Post-Doctoral Students
      Download: https://videos.cirm-math.fr/2023-07-20_Zepeda_Nunez_2.mp4

    Meeting information

    Meeting name: CEMRACS 2023: Scientific Machine Learning
    Meeting organizers: Auroux, Didier ; Campos Pinto, Martin ; Després, Bruno ; Dolean, Victorita ; Frénod, Emmanuel ; Lanteri, Stéphane ; Michel-Dansac, Victor
    Dates: 17/07/2023 - 21/07/2023
    Meeting year: 2023
    Conference URL: https://conferences.cirm-math.fr/2904.html

    Citation data

    DOI: 10.24350/CIRM.V.20072303
    Cite this video: Zepeda-Núñez, Leonardo (2023). Data-driven latent representations for time-dependent problems - Lecture 2. CIRM. Audiovisual resource. doi:10.24350/CIRM.V.20072303
    URI: http://dx.doi.org/10.24350/CIRM.V.20072303

    Bibliography

    • FINZI, Marc Anton, BORAL, Anudhyan, WILSON, Andrew Gordon, et al. User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems. In : International Conference on Machine Learning. PMLR, 2023. p. 10136-10152. - https://proceedings.mlr.press/v202/finzi23a/finzi23a.pdf

    • WAN, Zhong Yi, BAPTISTA, Ricardo, CHEN, Yi-fan, et al. Debias Coarsely, Sample Conditionally: Statistical Downscaling through Optimal Transport and Probabilistic Diffusion Models. arXiv preprint arXiv:2305.15618, 2023. - https://doi.org/10.48550/arXiv.2305.15618

    • BORAL, Anudhyan, WAN, Zhong Yi, ZEPEDA-NÚÑEZ, Leonardo, et al. Neural Ideal Large Eddy Simulation: Modeling Turbulence with Neural Stochastic Differential Equations. arXiv preprint arXiv:2306.01174, 2023. - https://doi.org/10.48550/arXiv.2306.01174

    • WAN, Zhong Yi, ZEPEDA-NÚÑEZ, Leonardo, BORAL, Anudhyan, et al. Evolve Smoothly, Fit Consistently: Learning Smooth Latent Dynamics For Advection-Dominated Systems. arXiv preprint arXiv:2301.10391, 2023. - https://doi.org/10.48550/arXiv.2301.10391

    • DRESDNER, Gideon, KOCHKOV, Dmitrii, NORGAARD, Peter, et al. Learning to correct spectral methods for simulating turbulent flows. arXiv preprint arXiv:2207.00556, 2022. - https://doi.org/10.48550/arXiv.2207.00556


