Entropy production and mesh refinement – application to wave breaking

We propose an adaptive numerical scheme for hyperbolic conservation laws based on the numerical density of entropy production (the amount of violation of the theoretical entropy inequality). It is used as an a posteriori error indicator that shows where the mesh should be refined, in the regions where discontinuities occur, and where it can be coarsened, in the regions where the solution remains smooth. Nevertheless, the CFL stability condition restricts the time step and leads to time-consuming simulations; we therefore propose a local time stepping algorithm. This approach is validated on classical 1D numerical tests. We then present a 3D application: based on an Eulerian bi-fluid formulation at low Mach number, a hyperbolic system of conservation laws allows easy parallelization and mesh refinement by "blocks".


Introduction
The present study deals with the numerical simulation of multi-fluid flows such as wave propagation, wave breaking and wave impact on structures. Numerical simulations of these processes generally involve the complete resolution of the Navier-Stokes equations in both the air and water phases. This approach is widely used for its physical relevance, but its cost in terms of cpu-time remains a major drawback: three-dimensional simulations still require a significant effort in software development and mesh refinement techniques on powerful computers [1][2][3][4][5][6]. Simplified models based on potential theory or shallow-water equations [7][8][9] can be used to reduce the cpu-time. Unfortunately, they are not able to adequately represent the complex free surface dynamics occurring during wave breaking. An alternative, intermediate formulation is used here, aiming to be physically more relevant than shallow-water models and less cpu-time consuming than full Navier-Stokes simulations. The model is based on a fast 3D two-phase flow solver: viscous effects are neglected and an artificial compressibility approach is used, leading to a hyperbolic system of conservation laws. This formulation, combined with explicit time integration, allows an efficient parallel implementation. Our model has been successively validated on experimental and numerical test cases [10], improved by the selection of an isothermal model [11] and recently tested on a breaking wave problem over a sloping bottom with and without roughness elements [12]. In order to keep simulations accurate within a reasonable cpu-time, it is relevant to use an adaptive mesh refinement technique based on an efficient criterion. Recent developments proved that the numerical production of entropy is a very powerful error-like indicator [13][14][15][16]. This approach leads to a super-convergent scheme [17].

(Corresponding author: yushchen@univ-tln.fr)
Coupled to a local time stepping scheme, it allows a significant reduction of the computational time. The method is recalled in the first part of the paper. The second part concerns the numerical validation on one-dimensional tests. An application to a wave breaking problem is shown in the third part, while the last section is dedicated to conclusions and prospects.

Entropy production
Article published by EDP Sciences

We are interested in the numerical integration of nonlinear hyperbolic systems of conservation laws of the form (expressed here in 1D for the sake of simplicity):

∂_t w + ∂_x f(w) = 0.  (1)

Solving (1) with high accuracy is a challenging problem since it is well known that solutions can and will break down in finite time, even if the initial data are smooth, and develop complex structures (shock wave interactions). In such a situation, the uniqueness of the (weak) solution is lost; it is recovered by completing system (1) with an entropy inequality of the form:

∂_t s(w) + ∂_x ψ(w) ≤ 0,  (2)

where (s, ψ) stands for a convex entropy-entropy flux pair. This inequality allows one to select the physically relevant solution. Moreover, the entropy satisfies a conservation equation only in regions where the solution is smooth, and a strict inequality when the solution develops shocks. In simple cases, it can be proved that the term missing in (2) to make it an equality is a Dirac mass. Numerical approximation of Equations (1)-(2) leads to the so-called numerical density of entropy production, which is a measure of the amount of violation of the entropy equation (in the spirit of the local residuals used in [18][19][20]). As a consequence, the numerical density of entropy production provides information on the need to locally refine the mesh (e.g. if the solution develops discontinuities) or to coarsen it (e.g. if the solution is smooth and well approximated), as already used by Puppo [14][15][16] and Golay [13]. Even if shocks are well captured on coarse grids by finite volume schemes, such an indicator is able (as shown by Puppo [16]) not only to provide an efficient a posteriori error estimate, but also to reproduce the qualitative structure of the solution and to drive the adaptive scheme. Explicit adaptive schemes are well known to be time consuming because of the CFL stability condition: the cpu-time increases rapidly as the mesh is refined. Nevertheless, the cpu-time can be significantly reduced using a local time stepping algorithm (see e.g. [17][21][22][23][24]).
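To fix ideas, this construction can be sketched on Burgers' equation with a first order Rusanov scheme, computing the numerical entropy flux by applying the same scheme to the entropy pair s(u) = u²/2, ψ(u) = u³/3. This is only a minimal Python sketch under these assumptions (the function name and the choice of scheme are ours, not the scheme of [17]):

```python
import numpy as np

def entropy_production_burgers(u, dt, dx):
    """One forward-Euler/Rusanov step for Burgers' equation, returning
    the updated state and the numerical density of entropy production
    (discrete residual of the entropy equation: approximately zero where
    the solution is smooth, large and negative at shocks)."""
    f = 0.5 * u**2                    # physical flux f(u) = u^2/2
    s = 0.5 * u**2                    # convex entropy s(u) = u^2/2
    psi = u**3 / 3.0                  # entropy flux psi(u) = u^3/3
    a = np.maximum(np.abs(u[:-1]), np.abs(u[1:]))    # local wave speed
    # Rusanov numerical fluxes for the conserved variable and the entropy
    F = 0.5 * (f[:-1] + f[1:]) - 0.5 * a * (u[1:] - u[:-1])
    Psi = 0.5 * (psi[:-1] + psi[1:]) - 0.5 * a * (s[1:] - s[:-1])
    u_new = u.copy()
    u_new[1:-1] -= dt / dx * (F[1:] - F[:-1])
    # numerical density of entropy production per interior cell
    P = np.zeros_like(u)
    P[1:-1] = (0.5 * u_new[1:-1]**2 - s[1:-1]) / dt + (Psi[1:] - Psi[:-1]) / dx
    return u_new, P
```

On a Riemann problem with a shock, the residual concentrates into a large negative value at the discontinuity and vanishes in the constant states, which is exactly the behaviour the indicator exploits.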
The numerical scheme is presented in detail in [17], including the local time stepping scheme and the mesh refinement procedure based on a dyadic tree. The reader can find more details about the 3D approach in [11,12].

One-dimensional test case
We now present some results using the adaptive multiscale scheme. Numerical solutions are computed using the one-dimensional gas dynamics equations for an ideal gas:

∂_t ρ + ∂_x (ρu) = 0,
∂_t (ρu) + ∂_x (ρu² + p) = 0,
∂_t (ρE) + ∂_x (u(ρE + p)) = 0,

where ρ, u, p, γ, E are respectively the density, the velocity, the pressure, the ratio of specific heats (set to 1.4) and the total energy, with E = ε + u²/2 (where ε is the specific internal energy) and p = (γ − 1)ρε. Using the conservative variables w = (ρ, ρu, ρE)^T, we classically define the entropy by s(w) = −ρ ln(p/ρ^γ) and the entropy flux by ψ(w) = u s(w). In what follows, we perform several numerical tests. We will refer to AB1 as the first order scheme, AB2 as the second order Adams-Bashforth scheme and RK2 as the second order Runge-Kutta scheme. AB2 and RK2 use a MUSCL reconstruction. Moreover, all computations are made with a dynamic grid. We also append the letter "M" when the local time stepping algorithm is employed. N_Lmax stands for the average number of cells used during a simulation of an adaptive mesh refinement scheme with maximum level L_max.
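For illustration, the entropy pair above can be evaluated directly from the conservative variables (a minimal Python sketch; the function name and array layout are ours):

```python
import numpy as np

GAMMA = 1.4   # ratio of specific heats

def entropy_pair(w):
    """Entropy s(w) = -rho*ln(p/rho^gamma) and entropy flux
    psi(w) = u*s(w) for the 1D Euler equations, w = (rho, rho*u, rho*E)."""
    rho, rho_u, rho_E = w
    u = rho_u / rho
    p = (GAMMA - 1.0) * (rho_E - 0.5 * rho * u**2)   # ideal-gas pressure
    s = -rho * np.log(p / rho**GAMMA)
    return s, u * s
```

For the rest state (ρ, u, p) = (1, 0, 1) both s and ψ vanish, which is a convenient sanity check.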
We consider the Shu and Osher problem [25], with the initial conditions given in [25]. As a reference solution, we compute the solution on a uniform fixed grid (20 000 cells) with the RK2 scheme. This solution being computed on a very fine fixed grid, as predicted by the theory, the density of entropy production is almost entirely concentrated at the shocks. Even if small productions are present for 0.5 ≤ x ≤ 0.75, one can consider such a solution as an "exact" one. In Figure 1, we plot the density of the reference solution, the solutions obtained with the AB1, AB2 and RK2 schemes, and their numerical densities of entropy production. Starting from 500 cells, the adaptive schemes lead to very close solutions, the numerical density of entropy production vanishes everywhere the solution is smooth, and every solution fits the reference solution. However, focusing closely on the oscillating area 0.5 ≤ x ≤ 0.7, one can observe that the standard ranking of the methods holds: AB1, then AB2, then RK2. Table 1 summarizes the total entropy production P (integral of (2) in time and space), the discrete l¹_x norm of the error on the density, the cpu-time, the average number of cells and the maximum number of cells at the final time. It is well known that the AB2 scheme is less stable and less accurate than the RK2 scheme. Nonetheless, in the framework of local time stepping, for almost the same accuracy the AB2M scheme computes 3 times faster than RK2, which corresponds to a significant gain in time.
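The two scalar diagnostics of Table 1 can be assembled from cell data as sketched below (a hypothetical helper, assuming a uniform grid, a stored history of the production field, and that the space-time integral is taken of the magnitude of the production; names are ours):

```python
import numpy as np

def diagnostics(rho, rho_ref, P_hist, dx, dt):
    """Total entropy production P (space-time integral of the magnitude of
    the numerical density of entropy production) and discrete l1 error on
    the density against a reference solution on the same grid."""
    total_P = sum(np.sum(np.abs(P)) * dx * dt for P in P_hist)
    err_l1 = np.sum(np.abs(rho - rho_ref)) * dx
    return total_P, err_l1
```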

Dam-break problem
We now present a result computed using the isothermal bi-fluid model of [11], where ϕ denotes the fraction of water, together with its associated entropy. Instead of octree meshing, we use Cartesian block meshing. The computational domain is split into many "blocks", each of which is assigned to a parallel process. According to the average entropy production, and thus the mesh refinement level N, each block is meshed in a Cartesian way with 2^(N−1) n_x × 2^(N−1) n_y × 2^(N−1) n_z cells. The interface between two blocks is therefore, most of the time, non-conforming. The model is applied to the classical dam-break problem with an obstacle (as described by Koshizuka in [26]). The computational domain (584 mm × 584 mm × 0.5 mm) is split into 321 blocks which are dispatched over 120 cores. The air-water interface is well described thanks to the mesh refinement procedure, which follows the high values of the numerical production of entropy. Let us denote p_e the numerical density of entropy production (2) and p̄_e its average value over the domain at time t. If the average value of p_e over a block is less than 2% of p̄_e, the block is coarsened; if it is more than 20% of p̄_e, the block is refined. Following [17], the level difference between two adjacent blocks never exceeds 2, in order to avoid oscillations. Figure 2 (resp. Figure 3) represents the results at t = 0.2 s (resp. t = 0.4 s). Without any restriction, the maximum number of cells remains between 70 000 and 100 000 during the whole simulation, for an elapsed computing time of about 5 h. In this simulation each block is meshed in a Cartesian way with 2 × 2^(N−1) cells in the x and y directions and always 1 cell in the z direction, which leads to mesh sizes from 12 mm down to 0.75 mm in this example.
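The block adaptation rule above can be sketched as follows (a hypothetical helper; the 2% and 20% thresholds are those of the paper, while the data layout is ours and equal-sized blocks are assumed so that the mean of the block averages equals the domain average):

```python
import numpy as np

def adapt_blocks(block_pe, coarsen_frac=0.02, refine_frac=0.20):
    """Decide, for each block, whether to coarsen (-1), keep (0) or
    refine (+1), by comparing the block-averaged entropy production
    with its average over the whole domain."""
    pe_mean = np.mean(block_pe)                     # domain average of p_e
    flags = np.zeros(len(block_pe), dtype=int)
    flags[block_pe < coarsen_frac * pe_mean] = -1   # below 2%: coarsen
    flags[block_pe > refine_frac * pe_mean] = +1    # above 20%: refine
    return flags
```

In the actual procedure the flags would then be corrected so that the level difference between adjacent blocks stays within the bound stated above.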

Conclusion
In this paper, we use first and second order methods in space and time, coupled with an adaptive algorithm employing local time stepping. In this adaptive numerical scheme the grid is locally refined or coarsened according to the entropy indicator. Several numerical tests have been performed and show an impressive improvement with respect to uniform grids, even when a large number of cells is used.
All numerical tests also show that the numerical density of entropy production, combined with the proposed mesh refinement parameter, is a relevant local error indicator (everywhere the solution remains smooth) and discontinuity detector: strong shocks and oscillating solutions are very well captured. Moreover, we have shown that the implementation of the local time stepping algorithm can significantly reduce the computational time while keeping the same order of accuracy. Applied to complex air-water flows, it is an accurate and powerful numerical tool. This approach can be further improved by balancing the tasks of the parallel process more efficiently.