
Documents 65Nxx: 5 results

This talk will be devoted to the use of new discretization schemes on polyhedral meshes in an industrial context. These discretizations, called Compatible Discrete Operator (CDO) [1, 2] or Hybrid High-Order (HHO) [3, 4] schemes, have recently been implemented in Code Saturne [5]. Code Saturne is an open-source code developed at EDF R&D aimed at simulating single-phase flows. First, the advantages of robust polyhedral discretizations will be recalled. Then, the underpinning principles of CDO schemes will be presented, as well as some applications: diffusion equations, transport problems, groundwater flows, and the discretization of the Stokes equations. High-Performance Computing (HPC) aspects will also be discussed, as they are essential in an industrial context, either to address large and complex computational domains or to get a quick answer. To conclude, some highlights of the main outlooks will be given.

65Nxx ; 65N50 ; 76S05
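As a point of comparison for the diffusion application mentioned above, the following sketch shows the simplest cell-centred finite-volume (two-point flux) discretization of a 1D diffusion problem. This is not the CDO or HHO scheme itself, only the elementary low-order construction that those schemes generalize to polyhedral meshes; the function name and mesh setup are illustrative choices.

```python
import numpy as np

# Hedged illustration: two-point flux finite-volume scheme for
# -u'' = f on (0, 1), u(0) = u(1) = 0, on a uniform mesh of n cells.
# Unknowns live at cell centres; Dirichlet values enter through the
# half-cell boundary fluxes (hence the 3/h diagonal at the ends).
def solve_diffusion_fv(n_cells, f):
    h = 1.0 / n_cells
    x = (np.arange(n_cells) + 0.5) * h            # cell centres
    main = np.full(n_cells, 2.0 / h)              # interior: two faces, flux 1/h each
    main[0] = main[-1] = 3.0 / h                  # boundary: 1/h interior + 2/h boundary flux
    A = np.diag(main) + np.diag(np.full(n_cells - 1, -1.0 / h), 1) \
                      + np.diag(np.full(n_cells - 1, -1.0 / h), -1)
    b = f(x) * h                                  # cell-averaged source (midpoint rule)
    return x, np.linalg.solve(A, b)

# Manufactured solution u(x) = sin(pi x), so f = pi^2 sin(pi x)
x, u = solve_diffusion_fv(64, lambda x: np.pi**2 * np.sin(np.pi * x))
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

Refining the mesh and re-running shows the error shrinking with h, which is the kind of convergence behaviour the polyhedral schemes above establish in far more general settings.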


Gradient discretisations: tools and applications - Eymard, Robert (Conference speaker) | CIRM H

Post-edited

Some convergence properties for the approximation of second-order elliptic problems with a variety of boundary conditions (homogeneous Dirichlet, homogeneous or non-homogeneous Neumann, or Fourier boundary conditions), using a given discretisation method, can be obtained when this method is plugged into the Gradient Discretisation Method (GDM) framework.
Instead of defining one GDM framework for each of these boundary conditions, we show that these properties can be stated using the same abstract tools for all the above boundary conditions. These tools then enable the application of the GDM to a larger class of elliptic problems.

65J05 ; 65Nxx ; 47A58


Learning operators - Lecture 1 - Mishra, Siddhartha (Conference speaker) | CIRM H

Multi angle

Operators are mappings between infinite-dimensional spaces which arise in the context of differential equations. Learning operators is challenging due to the inherent infinite-dimensional context. In this course, we present different architectures for learning operators from data. These include operator networks such as DeepONets and neural operators such as Fourier Neural Operators (FNOs) and their variants. We will present theoretical results showing that these architectures learn operators arising from PDEs. A large number of numerical examples will be provided to illustrate them.

65Mxx ; 65Nxx ; 68Txx
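The DeepONet architecture mentioned in the abstract can be sketched in a few lines: a "branch" network encodes the input function through its values at fixed sensor points, a "trunk" network encodes the query location, and the operator output is their dot product. The weights below are random and untrained, the layer sizes are arbitrary illustrative choices, and only the forward pass is shown, not the training loop covered in the course.

```python
import numpy as np

# DeepONet forward pass:  G(u)(y) ~ sum_k branch_k(u(x_1..x_m)) * trunk_k(y)
rng = np.random.default_rng(0)

def mlp(sizes):
    """Small random MLP with tanh hidden activations; returns a callable."""
    params = [(rng.normal(0.0, 0.5, (m, n)), np.zeros(n))
              for m, n in zip(sizes[:-1], sizes[1:])]
    def forward(x):
        for i, (W, b) in enumerate(params):
            x = x @ W + b
            if i < len(params) - 1:
                x = np.tanh(x)
        return x
    return forward

m, p = 32, 16                           # number of sensors, latent dimension
branch = mlp([m, 64, p])                # encodes the sampled input function u
trunk = mlp([1, 64, p])                 # encodes the query location y

sensors = np.linspace(0.0, 1.0, m)
u_samples = np.sin(2 * np.pi * sensors)         # one input function u
y = np.array([[0.3], [0.7]])                    # two query points

G_u_at_y = trunk(y) @ branch(u_samples)         # approximations of G(u)(0.3), G(u)(0.7)
```

Because the trunk takes an arbitrary query point, the learned operator can be evaluated at locations never seen during training, which is one way these architectures cope with the infinite-dimensional setting.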


Learning operators - Lecture 2 - Mishra, Siddhartha (Conference speaker) | CIRM H

Multi angle

Operators are mappings between infinite-dimensional spaces which arise in the context of differential equations. Learning operators is challenging due to the inherent infinite-dimensional context. In this course, we present different architectures for learning operators from data. These include operator networks such as DeepONets and neural operators such as Fourier Neural Operators (FNOs) and their variants. We will present theoretical results showing that these architectures learn operators arising from PDEs. A large number of numerical examples will be provided to illustrate them.

65Mxx ; 65Nxx ; 68Txx


Learning operators - Lecture 3 - Mishra, Siddhartha (Conference speaker) | CIRM H

Multi angle

Operators are mappings between infinite-dimensional spaces which arise in the context of differential equations. Learning operators is challenging due to the inherent infinite-dimensional context. In this course, we present different architectures for learning operators from data. These include operator networks such as DeepONets and neural operators such as Fourier Neural Operators (FNOs) and their variants. We will present theoretical results showing that these architectures learn operators arising from PDEs. A large number of numerical examples will be provided to illustrate them.

65Mxx ; 65Nxx ; 68Txx
