
Section: Overall Objectives


Numerical simulation has boomed over the last thirty years, thanks to increasingly powerful numerical methods, computer-aided design (CAD) and mesh generation for complex 3D geometries, and the advent of supercomputers (HPC). The discipline is now mature and has become an integral part of design in science and engineering applications. This new status has led scientists and engineers to consider the numerical simulation of problems of ever-increasing geometrical and physical complexity. A simple observation of the classical computational pipeline

CAD → Mesh → Solver → Visualization / Analysis

shows that no mesh means no simulation, and that a "bad" mesh means a wrong simulation. The mesh is thus at the core of the classical computational pipeline and a key component of any significant improvement. Meshing methods therefore face an ever-increasing demand, of ever-increasing difficulty, to produce high-quality meshes that enable reliable solution predictions in an automated manner. These requirements on meshing, or on equivalent technologies, cannot be removed, and all approaches face similar issues.

In this context, the Gamma team was created in 1996 and focused on the development of robust automated 3D mesh generation methods, which were clearly a bottleneck at a time when most numerical simulations were 2D. The team has been very successful in tetrahedral meshing with the well-known software Ghs3d [53], [56], which has been distributed worldwide, and in hexahedral meshing with the software Hexotic [68], [69], which was the first automated all-hexahedral mesher. The team has also worked on surface meshing with Yams [49] and BLSurf [45], and on visualization with Medit; before Medit, 3D meshes could not be visualized in real time.

In 2010, the Gamma3 team replaced Gamma, with a sharper focus on meshing for numerical simulation. The main goal was to emphasize and strengthen the link between meshing technologies and numerical methods (flow or structure solvers). The metric-based anisotropic mesh adaptation strategy has been very successful, with the development of many error estimates, the generation of highly anisotropic meshes, its application to the compressible Euler and Navier-Stokes equations [41], and its extension to unsteady problems with moving geometries [43], leading to several software packages: Feflo.a/AMG-Lib, Wolf, Metrix, and Wolf-Interpol. A significant accomplishment was the high-fidelity prediction of the sonic boom emitted by supersonic aircraft [40]: thanks to mesh adaptation, we were the first to compute a certified propagation of an aircraft's sonic boom through the atmosphere. The team also started to work on parallelism, with the development of the multi-threaded library LPlib, efficient memory management using space-filling curves, and the generation of large meshes (a billion elements) [66]. Theoretical work on high-order meshes has also been carried out [54].
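As background, the core idea of the metric-based framework mentioned above can be stated in one formula; this is the standard formulation from the mesh-adaptation literature, not an excerpt from the team's software:

```latex
% Length of an edge e = [a,b] under a Riemannian metric field M(x):
%   the metric M(x) is a symmetric positive-definite matrix that
%   prescribes desired sizes (eigenvalues) and directions (eigenvectors).
\ell_{\mathcal{M}}(e) \;=\; \int_0^1
  \sqrt{\,\overrightarrow{ab}^{\,T}\,
        \mathcal{M}\!\left(a + t\,\overrightarrow{ab}\right)\,
        \overrightarrow{ab}\,}\;\mathrm{d}t .
% An adapted mesh is a "unit mesh" with respect to M: every edge
% satisfies l_M(e) ~ 1, so anisotropy follows directly from M.
```

In this view, generating an anisotropic adapted mesh reduces to generating a uniform (unit) mesh in the Riemannian space defined by the metric field.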

Today, numerical simulation is an integral part of design in engineering applications, with the main goal of reducing costs and speeding up the creation of new designs. Four main issues for industry are:

- generating a suitable surface mesh from a CAD geometry definition;
- making all simulation tools compliant with HPC environments;
- controlling the cost of numerical simulations;
- checking and certifying the errors and uncertainties of the simulations.

Let us now discuss each of these issues in more detail.

Generating a discrete surface mesh from a CAD geometry definition has been the Achilles' heel of numerical simulation for the last 30 years. Significant issues are far too common, ranging from persistent translation problems between systems, which can produce ill-defined geometry definitions, to the overwhelming complexity of full configurations with all their components. An ill-defined geometry definition often fails to capture the geometry's features exactly, leading to a bad mesh and a broken simulation. Unfortunately, CAD system design is essentially decoupled from the needs of numerical simulation and is largely driven by those of manufacturing and other areas. As a result, this step of the numerical simulation pipeline is still labor-intensive and the most time-consuming. There is a need to develop alternative geometry processes and models that are more suitable for numerical simulation.

Companies working on high-tech projects with high added value (Boeing, Safran, Dassault-Aviation, Ariane Group, ...) consider their design pipeline inside an HPC framework. Indeed, they perform complex numerical simulations on complex geometries on a daily basis, and they aim to use them in shape-optimization loops. Therefore, any tool added to their numerical platforms must be HPC compliant. This means that all developments should consider hybrid parallelism, i.e., be compatible with both distributed-memory (MPI) and shared-memory (multi-threaded) architectures, in order to achieve scalable parallelism.

One of the main goals of numerical simulation is to reduce the cost of creating new designs (e.g., reducing the number of wind-tunnel and flight tests in the aircraft industry). The emergence of 3D printers is, in some cases, making physical tests easier, faster and cheaper to perform. It is thus mandatory to control the cost of the numerical simulations; in other words, it is important to use fewer resources to achieve the same accuracy. This cost takes into account the engineer's time as well as the computing resources needed to perform the simulation. The cost of one simulation can vary from 15 euros for simple models (1D-2D), to 150 euros for steady Reynolds-averaged Navier-Stokes (3D) models, and up to 15 000 euros for unsteady models like LES or Lattice-Boltzmann (source: Valéo and Safran Tech). It is important to note that one design loop amounts to performing between 100 and 1 000 numerical simulations. Consequently, the need for more efficient algorithms and processes remains a key factor.
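To make these orders of magnitude concrete, here is a back-of-the-envelope calculation combining the per-simulation costs quoted above with the size of a design loop (the variable and model names are illustrative, not from any cited source):

```python
# Per-simulation costs in euros, as quoted in the text
# (source: Valéo and Safran Tech).
COST_PER_SIM = {
    "1D-2D model": 15,
    "steady RANS (3D)": 150,
    "unsteady LES / Lattice-Boltzmann": 15_000,
}

# One design loop amounts to roughly 100 to 1 000 simulations.
LOOP_SIZE = (100, 1_000)


def design_loop_cost(model: str) -> tuple[int, int]:
    """Return the (min, max) cost in euros of one full design loop."""
    per_sim = COST_PER_SIM[model]
    return per_sim * LOOP_SIZE[0], per_sim * LOOP_SIZE[1]


for model in COST_PER_SIM:
    low, high = design_loop_cost(model)
    print(f"{model}: {low:,} to {high:,} euros per design loop")
```

For the unsteady models, a single design loop thus ranges from 1.5 to 15 million euros, which is why even modest per-simulation efficiency gains matter at the industrial scale.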

Another crucial point is the checking and certification of errors and uncertainties in high-fidelity numerical simulations. These errors can come from several sources: i) modeling error (for example, from turbulence models or initial conditions), ii) discretization error (due to the mesh), iii) geometry error (due to the representation of the design), and iv) implementation errors in the software under consideration. The error-assessment and mesh-generation procedures employed for CFD simulations in the aerospace industry rely heavily on the experience of the CFD user. The inadequacy of this practice, even for geometries frequently encountered in engineering, has been highlighted in studies of the AIAA (American Institute of Aeronautics and Astronautics) CFD Drag Prediction Workshops [72] and High-Lift Prediction Workshops [85], [84]. These studies suggest that the range of scales present in turbulent flow cannot be adequately resolved using meshes generated with what is considered current best practice. In this regard, anisotropic mesh adaptation is considered the way forward, as stated in the NASA report "CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences" [87] and in the study dedicated to mesh adaptation [77].

These concerns form the core of the Gamma project's scientific program. To address the first issue, Gamma will focus on designing and developing a geometry modeling framework specifically intended for mesh generation and numerical simulation. This is a mandatory step toward automated geometry-mesh and mesh adaptation processes with an integrated geometry model. To address the other three issues, the Gamma team will work on the development of a high-order mesh-adaptive solution platform compatible with HPC environments. To this end, Gamma will pursue its work on advanced mesh generation methods providing the following capabilities: i) geometric adaptivity, ii) solution adaptivity, iii) high order, iv) multiple element types (structured or not), and v) hybrid scalable parallelism. Note that items i) to iv) rest on the well-posed metric-based theoretical framework. Moreover, Gamma will continue to work on robust flow solvers for the turbulent Navier-Stokes equations, from second order, using a mixed Finite Volume - Finite Element scheme, to higher order, using the Flux Reconstruction (FR) method.

The combination of adaptation, high order and multiple element types, coupled with appropriate error estimates, is for the team the way forward to reduce the cost of numerical simulations while ensuring high fidelity in a fully automated framework.