Publications

Theses defended at CMAP are available at the following link:
Discover the CMAP theses

Listed below, by year, are the publications available in the HAL open archive.

2023

  • Properties of Discrete Sliced Wasserstein Losses
    • Tanguy Eloi
    • Flamary Rémi
    • Delon Julie
    Mathematics of Computation, American Mathematical Society, 2023. The Sliced Wasserstein (SW) distance has become a popular alternative to the Wasserstein distance for comparing probability measures. Widespread applications include image processing, domain adaptation and generative modelling, where it is common to optimise some parameters in order to minimise SW, which serves as a loss function between discrete probability measures (since measures admitting densities are numerically unattainable). All these optimisation problems bear the same sub-problem, which is minimising the Sliced Wasserstein energy. In this paper we study the properties of E : Y ↦ SW_2^2(γ_Y, γ_Z), i.e. the SW distance between two uniform discrete measures with the same number of points, as a function of the support Y ∈ R^{n×d} of one of the measures. We investigate the regularity and optimisation properties of this energy, as well as its Monte Carlo approximation E_p (estimating the expectation in SW using only p samples), and show convergence results on the critical points of E_p to those of E, as well as an almost-sure uniform convergence. Finally, we show that in a certain sense, Stochastic Gradient Descent methods minimising E and E_p converge towards (Clarke) critical points of these energies. (10.1090/mcom/3994)
    DOI : 10.1090/mcom/3994
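The Monte Carlo approximation E_p described in the abstract above has a compact numerical form: for uniform discrete measures with the same number of points, the 1D Wasserstein distance along each random direction reduces to matching sorted projections. A minimal sketch (not the authors' code; all names and parameter choices are ours):

```python
import numpy as np

def sliced_w2_squared(Y, Z, p=100, rng=None):
    """Monte Carlo estimate E_p of SW_2^2 between the uniform discrete
    measures supported on the rows of Y and Z (same number of points).
    Along each random direction theta, the 1D W_2^2 between the projected
    clouds is the mean squared difference of the sorted projections."""
    rng = np.random.default_rng(rng)
    n, d = Y.shape
    thetas = rng.normal(size=(p, d))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)  # uniform on the sphere
    proj_Y = np.sort(Y @ thetas.T, axis=0)  # shape (n, p), sorted per direction
    proj_Z = np.sort(Z @ thetas.T, axis=0)
    return np.mean((proj_Y - proj_Z) ** 2)
```

The paper's optimisation results concern exactly this kind of estimator, minimised over the support Y by (stochastic) gradient methods.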
  • Moments approaches for asymptotic inverse problems of depolymerisation and fragmentation systems
    • Doumic Marie
    , 2024. Shrinkage of large particles, either through depolymerisation (i.e. progressive shortening) or through fragmentation (breakage into smaller pieces) may be modelled by discrete equations, of Becker-Döring type, or by continuous ones. In this note, we review two kinds of inverse problems: the first is the estimation of the initial size-distribution from moments measurements in a depolymerising system, in collaboration with Philippe Moireau and inspired by experiments carried out by Human Rezaei's team; the second is the inference of fragmentation characteristics from size distribution samples, in collaboration with Miguel Escobedo and Magali Tournus, based on biological questions and experiments of Wei-Feng Xue's team.
  • Damping optimization of viscoelastic thin structures, application and analysis
    • Joubert Antoni
    • Allaire Grégoire
    • Amstutz Samuel
    • Diani Julie
    Structural and Multidisciplinary Optimization, Springer Verlag, 2023, 66 (7), pp.149. In this work, damping properties of bending viscoelastic thin structures are enhanced by topology optimization. Homogeneous linear viscoelastic plates are optimized and compared when modeled by either the Kirchhoff-Love or Reissner-Mindlin plate theories, as well as by the bulk 3D viscoelastic constitutive equations. Mechanical equations are numerically solved by the finite element method and designs are represented by the level-set approach. High-performance computing techniques make it possible to solve the transient viscoelastic problem for very thin 3D meshes, enabling a wider range of applications. The considered isotropic material is characterized by a generalized Maxwell model accounting for the viscoelasticity of both the Young modulus and Poisson's ratio. Numerical results show considerable design differences according to the chosen mechanical model, and highlight a counterintuitive section-shrinking phenomenon discussed at length. The final numerical example extends the problem to an actual shoe sole application, performing its damping optimization in an industrial context. (10.1007/s00158-023-03602-z)
    DOI : 10.1007/s00158-023-03602-z
  • Gradient estimates for the Schrödinger potentials: convergence to the Brenier map and quantitative stability
    • Chiarini Alberto
    • Conforti Giovanni
    • Greco Giacomo
    • Tamanini Luca
    Communications in Partial Differential Equations, Taylor & Francis, 2023, 48 (6), pp.895-943. (10.1080/03605302.2023.2215527)
    DOI : 10.1080/03605302.2023.2215527
  • Adaptive importance sampling based on fault tree analysis for piecewise deterministic Markov process
    • Chennetier Guillaume
    • Chraibi Hassane
    • Dutfoy Anne
    • Garnier Josselin
    , 2023. Piecewise deterministic Markov processes (PDMPs) can be used to model complex dynamical industrial systems. The counterpart of this modeling capability is their simulation cost, which makes reliability assessment intractable with standard Monte Carlo methods. A significant variance reduction can be obtained with an adaptive importance sampling (AIS) method based on a cross-entropy (CE) procedure. The success of this method relies on the selection of a good family of approximations of the committor function of the PDMP. In this paper, original families are proposed. They are well adapted to high-dimensional industrial systems. Their forms are based on reliability concepts related to fault tree analysis: minimal path sets and minimal cut sets. The proposed method is discussed in detail and applied to academic systems and to a realistic system from the nuclear industry. 
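The cross-entropy mechanism behind such AIS schemes can be illustrated on a toy rare event, far simpler than a PDMP: estimating P(X > q) for X ~ N(0, 1), with a one-parameter Gaussian sampling family standing in for the paper's committor-based families. A hedged sketch (all names, parameters and the sampling family are our own choices):

```python
import numpy as np

def ce_importance_sampling(threshold=4.0, n=10_000, n_iters=10, rho=0.1, rng=0):
    """Toy cross-entropy adaptive importance sampling for P(X > threshold),
    X ~ N(0,1), with sampling family N(mu, 1) (a stand-in for the
    committor-based families of the paper). Each iteration re-centres mu
    on the elite samples; the final estimate reweights by the likelihood ratio."""
    rng = np.random.default_rng(rng)
    mu = 0.0
    for _ in range(n_iters):
        x = rng.normal(mu, 1.0, size=n)
        # adaptive level: the (1 - rho) sample quantile, capped at the threshold
        level = min(np.quantile(x, 1 - rho), threshold)
        mu = x[x >= level].mean()  # CE update for N(mu, 1): mean of elite samples
        if level >= threshold:
            break                  # the sampler now targets the rare region
    x = rng.normal(mu, 1.0, size=n)
    weights = np.exp(-mu * x + 0.5 * mu**2)  # density ratio phi(x) / phi_mu(x)
    return np.mean((x > threshold) * weights)
```

The PDMP setting of the paper replaces both the sampling family and the rare event with far richer objects, but the adaptive loop has the same shape.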
  • Influence of the choice of the seismic intensity measure on fragility curves estimation in a Bayesian framework based on reference prior
    • Van Biesbroeck Antoine
    • Gauchy Clément
    • Feau Cyril
    • Garnier Josselin
    , 2023, pp.94-111. Seismic fragility curves are key quantities of the Seismic Probabilistic Risk Assessment studies carried out on industrial facilities. They express the probability of failure of a mechanical structure conditional to a scalar value derived from the seismic ground motion called Intensity Measure (IM). Estimating such curves is a daunting task because for most structures of interest few data are available. For this reason, some methods of the literature rely on the use of the parametric log-normal model coupled with Bayesian approaches for parameter estimation. In this work we are interested in binary datasets (failure or no failure) and we investigate the influence of the choice of the prior and of the IM – Peak Ground Acceleration vs. Pseudo Spectral Acceleration – on the convergence of the estimates for a given seismic scenario. We show that Bayesian fragility curve estimation results based on Jeffreys' prior outperform the ones based on classical priors whatever the IM. We also show that, when an IM is more correlated to the structural response, the differences between the results obtained with the different priors are less marked. These results testify to the fact that an IM more correlated to the response of the structure induces a lower variability of the estimate of the median of the log-normal model. This is not the case for the log standard deviation, whose estimate is affected by samples which are more degenerate with this kind of IM, namely which are partitioned into two disjoint subsets when classified according to IM values: the subset for which there is no failure and the other one for which there is failure, these two subsets possibly being present in the same sample or separately. Such degeneracy affects all methods to varying degrees, but with Jeffreys' prior the results are clearly better, attesting to its robustness against such inevitable situations in practice. (10.7712/120223.10327.19899)
    DOI : 10.7712/120223.10327.19899
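The parametric log-normal model named in the abstract above has a compact closed form: the failure probability given an intensity measure value a is Φ(ln(a/α)/β), with median capacity α and log standard deviation β. A minimal sketch (the parameter values are illustrative placeholders, not from the paper):

```python
import math

def fragility(a, alpha=0.5, beta=0.4):
    """Log-normal fragility curve: P(failure | IM = a) = Phi(ln(a / alpha) / beta),
    where alpha is the median seismic capacity (same unit as the IM, e.g. PGA in g)
    and beta is the log standard deviation. Phi is evaluated via math.erf."""
    return 0.5 * (1.0 + math.erf(math.log(a / alpha) / (beta * math.sqrt(2.0))))
```

By construction the curve equals 1/2 at a = α and increases monotonically in a; the Bayesian question studied in the paper is how the posterior on (α, β) behaves under different priors and IMs.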
  • Entropic Optimal Planning for Path-Dependent Mean Field Games
    • Ren Zhenjie
    • Tan Xiaolu
    • Touzi Nizar
    • Yang Junjian
    SIAM Journal on Control and Optimization, Society for Industrial and Applied Mathematics, 2023, 61 (3), pp.1415-1437. (10.1137/22M1484444)
    DOI : 10.1137/22M1484444
  • From the distributions of times of interactions to preys and predators dynamical systems
    • Bansaye Vincent
    • Cloez Bertrand
    Journal of Mathematical Biology, Springer, 2023, 87 (2). We consider a stochastic individual based model where each predator searches during a random time and then manipulates its prey or rests. The time distributions may be non-exponential. An age structure allows us to describe these interactions and obtain a Markovian setting. The process is characterized by a measure-valued stochastic differential equation. We prove averaging results in this infinite dimensional setting and get the convergence of the slow-fast macroscopic prey-predator process to a two-dimensional dynamical system. We recover classical functional responses. We also get new forms arising in particular when births and deaths of predators are affected by the lack of food. (10.1007/s00285-023-01925-5)
    DOI : 10.1007/s00285-023-01925-5
  • Modèles structurés multi-niveaux de dynamiques épidémiques
    • Kubasch Madeleine
    • Bansaye Vincent
    • Deslandes François
    • Vergu Elisabeta
    , 2023.
  • Large population games with interactions through controls and common noise: convergence results and equivalence between open-loop and closed-loop controls
    • Djete Mao Fabrice
    ESAIM: Control, Optimisation and Calculus of Variations, EDP Sciences, 2023, 29, pp.39. In the presence of a common noise, we study the convergence problems in mean field game (MFG) and mean field control (MFC) problems where the cost function and the state dynamics depend upon the joint conditional distribution of the controlled state and the control process. In the first part, we consider the MFG setting. We start by recalling the notions of measure-valued MFG equilibria and of approximate closed-loop Nash equilibria associated to the corresponding N-player game. Then, we show that all convergent sequences of approximate closed-loop Nash equilibria, when N → ∞, converge to measure-valued MFG equilibria. And conversely, any measure-valued MFG equilibrium is the limit of a sequence of approximate closed-loop Nash equilibria. In other words, measure-valued MFG equilibria are the accumulation points of the approximate closed-loop Nash equilibria. Previous work has shown that measure-valued MFG equilibria are the accumulation points of the approximate open-loop Nash equilibria. Therefore, we obtain that the limits of approximate closed-loop Nash equilibria and approximate open-loop Nash equilibria are the same. In the second part, we deal with the MFC setting. After recalling the closed-loop and open-loop formulations of the MFC problem, we prove that they are equivalent. We also provide some convergence results related to approximate closed-loop Pareto equilibria. (10.1051/cocv/2023005)
    DOI : 10.1051/cocv/2023005
  • Equilibrium in Functional Stochastic Games with Mean-Field Interaction
    • Abi Jaber Eduardo
    • Neuman Eyal
    • Voss Moritz
    , 2023. We consider a general class of finite-player stochastic games with mean-field interaction, in which the linear-quadratic cost functional includes linear operators acting on controls in L2. We propose a novel approach for deriving the Nash equilibrium of the game explicitly in terms of operator resolvents, by reducing the associated first-order conditions to a system of stochastic Fredholm equations of the second kind and deriving their closed-form solution. Furthermore, by proving stability results for the system of stochastic Fredholm equations, we derive the convergence of the equilibrium of the N-player game to the corresponding mean-field equilibrium. As a by-product we also derive an ε-Nash equilibrium for the mean-field game, which is valuable in this setting as we show that the conditions for existence of an equilibrium in the mean-field limit are less restrictive than in the finite-player game. Finally we apply our general framework to solve various examples, such as stochastic Volterra linear-quadratic games, models of systemic risk and advertising with delay, and optimal liquidation games with transient price impact.
  • Decentralized Finance & Blockchain Technology
    • Gobet Emmanuel
    • Melachrinos Anastasia
    , 2023. Decentralized finance has soared in popularity since 2020, accounting for hundreds of billions of dollars in trade volume over the past two years. During this tutorial, we will begin by overviewing the core differences between centralized and decentralized finance, with a focus on those platforms' underlying infrastructure (blockchain), their design, their microstructure (order books and Automated Market Makers), and the impact those differences have on the methodology used to value cryptocurrencies in a hybrid (on-chain and off-chain) world. Topics covered in depth include the impact of the blockchain's inherent latency on design choices (AMMs vs order books, or hybrid DeFi derivatives platforms), and similarities between high-frequency trading attacks and attacks observed on DeFi protocols. The DeFi protocol categories we will go through include decentralized exchanges, lending protocols, and DeFi derivatives. This tutorial will address these topics both from a descriptive point of view, to understand how they work in practice, and from a quantitative point of view, to model them. The presentation will also include open problems and future perspectives.
  • Multistep interface coupling for high-order adaptive black-box multiphysics simulations
    • François L.
    • Massot M.
    , 2023. Many multiphysics problems can be described by the coupling of several models through physical surfaces. Relying on existing model-specific solvers is very desirable; however, they must be coupled in a way that ensures an accurate and stable coupled simulation. In this contribution, we present a multistep coupling scheme which relies on the history of the exchanged quantities to enable high-order accurate coupling with time adaptation. Explicit and implicit variants are discussed in detail. Numerical experiments conducted with an open-source demonstrator on a conjugate heat transfer problem show that high-order convergence is attained, and that stability compares favourably with other classical approaches. (10.23967/c.coupled.2023.027)
    DOI : 10.23967/c.coupled.2023.027
  • An expansion formula for Hawkes processes and application to cyber-insurance derivatives
    • Hillairet Caroline
    • Réveillac Anthony
    • Rosenbaum Mathieu
    Stochastic Processes and their Applications, Elsevier, 2023, 160, pp.89-119. In this paper we provide an expansion formula for Hawkes processes which involves the addition of jumps at deterministic times to the Hawkes process, in the spirit of the well-known integration by parts formula (or more precisely the Mecke formula) for Poisson functionals. Our approach allows us to provide an expansion of the premium of a class of cyber-insurance derivatives (such as reinsurance contracts including generalized Stop-Loss contracts) or risk management instruments (like Expected Shortfall) in terms of so-called shifted Hawkes processes. From the actuarial point of view, these processes can be seen as "stressed" scenarios. Our expansion formula for Hawkes processes enables us to provide lower and upper bounds on the premium (or the risk evaluation) of such cyber contracts and to quantify the surplus of premium compared to the standard modeling with a homogeneous Poisson process. (10.1016/j.spa.2023.02.012)
    DOI : 10.1016/j.spa.2023.02.012
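For background, the self-exciting dynamics of a Hawkes process (before any deterministic-time jumps are added, as in the expansion above) can be simulated by Ogata's thinning algorithm. A minimal sketch for an exponential kernel; the parameter values are our own illustrative choices:

```python
import numpy as np

def hawkes_times(mu=1.2, alpha=0.6, beta=2.0, T=10.0, rng=0):
    """Ogata thinning simulation of a Hawkes process on [0, T] with baseline mu
    and exponential kernel alpha * exp(-beta * (t - s)); the process is
    stationary when alpha / beta < 1. Returns the list of event times."""
    rng = np.random.default_rng(rng)
    t, times = 0.0, []
    while True:
        # the intensity decays between events, so its current value bounds the future
        lam_bar = mu + alpha * np.sum(np.exp(-beta * (t - np.array(times))))
        t += rng.exponential(1.0 / lam_bar)
        if t > T:
            return times
        lam_t = mu + alpha * np.sum(np.exp(-beta * (t - np.array(times))))
        if rng.random() * lam_bar <= lam_t:  # thinning: accept w.p. lam_t / lam_bar
            times.append(t)
```

With these parameters the branching ratio is alpha / beta = 0.3, so the expected number of events on [0, T] is mu * T / (1 - 0.3).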
  • Hi! PARIS: IA et Sciences des données pour la société
    • Richard Gael
    • Vieille Nicolas
    • Moulines Eric
    Télécom : revue de l'Association Amicale des ingénieurs de l'Ecole Nationale Supérieure des télécommunications, 2023 (#209).
  • Numerical analysis of lattice Boltzmann schemes: from fundamental issues to efficient and accurate adaptive methods
    • Bellotti Thomas
    , 2023. The work presented in this thesis falls within the field of the analysis of numerical methods for Partial Differential Equations and pays particular attention to lattice Boltzmann schemes. This class of schemes has been used since the end of the 1980s, particularly in fluid mechanics, and is characterised by its great computational efficiency. However, lattice Boltzmann methods are very demanding in terms of memory space and are designed for uniform Cartesian meshes. Moreover, we lack general theoretical tools allowing us to analyse their consistency, stability and finally convergence. The work of the thesis is organised around two main axes. The first consists in proposing a strategy to apply lattice Boltzmann methods to non-uniform grids adapted in time, in order to reduce the computing and storage costs. The ability to control the error and to use the same approach irrespective of the underlying lattice Boltzmann scheme are additional constraints to be taken into account. To this end, we propose to dynamically adapt the lattice and to adjust any lattice Boltzmann method to non-uniform meshes by relying on multiresolution analysis. This allows us to propose an innovative framework for moving meshes while respecting the stated constraints. We then demonstrate that the proposed method has excellent properties in terms of the perturbations of the original scheme, and that it thus reduces the spurious phenomena linked to adapted meshes. The implementation of this procedure in open-source software, allowing adapted grids to be represented and managed by different approaches in a unified and innovative framework, is then addressed. The second line of research consists in giving a mathematically rigorous framework to lattice Boltzmann methods, concerning in particular their consistency with respect to the target PDEs, their stability, and thus their convergence. For this purpose, we propose a procedure, based on algebraic results, to eliminate the non-conservative moments of any lattice Boltzmann scheme by recasting it as a multi-step Finite Difference scheme on the conserved moments. The notions of consistency and stability relevant to lattice Boltzmann methods are therefore those of Finite Difference schemes. In particular, all the results concerning the latter, among others the Lax theorem, transpose naturally to lattice Boltzmann schemes. A further step consists in studying consistency and stability directly on the original scheme, without having to compute its "corresponding" Finite Difference method. This allows us to obtain the modified equations and to show the validity of the von Neumann stability analyses commonly used within the community. This new theoretical framework also makes it possible to study the influence of the initialization of the methods on the result of the simulations, and to initiate preliminary studies on the monotonicity of lattice Boltzmann schemes and on their boundary conditions, which constitute openings for future work.
  • Mean estimation for Randomized Quasi Monte Carlo method
    • Gobet Emmanuel
    • Lerasle Matthieu
    • Métivier David
    , 2023.
  • Strong Gaussian approximation of metastable density-dependent Markov chains on large time scales
    • Prodhomme Adrien
    Stochastic Processes and their Applications, Elsevier, 2023, 160 (June), pp.218-264. Density-dependent Markov chains form an important class of continuous-time Markov chains in population dynamics. On any fixed time window [0, T], when the scale parameter K > 0 is large such chains are well approximated by the solution of an ODE (the fluid limit), with Gaussian fluctuations superimposed upon it. In this paper we quantify the period of time during which this Gaussian approximation remains precise, uniformly on the trajectory, in the case where the fluid limit converges to an exponentially stable equilibrium point. We provide a new coupling between the density-dependent chain and the approximating Gaussian process, based on a construction of Kurtz using the celebrated Komlós-Major-Tusnády theorem for random walks. We show that under mild hypotheses the time T(K) necessary for the strong approximation error to reach a threshold ε(K) ≪ 1 is at least of order exp(V K ε(K)), for some constant V > 0. This notably entails that the Gaussian approximation yields the correct asymptotics regarding the time scales of moderate deviations. We also present applications to the Gaussian approximation of the logistic birth-and-death process conditioned to survive, and to the estimation of a quantity modeling the cost of an epidemic. (10.1016/j.spa.2023.01.018)
    DOI : 10.1016/j.spa.2023.01.018
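As a toy instance of the density-dependent chains discussed above, the logistic birth-and-death process can be simulated by the Gillespie algorithm and its density n/K compared with the fluid-limit ODE x' = x(b - d - x), whose stable equilibrium is b - d. A hedged sketch with rates and parameters of our own choosing:

```python
import numpy as np

def logistic_bd_path(K, b=2.0, d=1.0, T=5.0, rng=0):
    """Gillespie simulation of the logistic birth-and-death process with
    birth rate b*n and death rate n*(d + n/K), started at density n/K = 1.
    Returns the density at time T; for large K it stays close to the
    fluid-limit equilibrium b - d, with fluctuations of order 1/sqrt(K)."""
    rng = np.random.default_rng(rng)
    n, t = K, 0.0
    while n > 0:
        birth, death = b * n, n * (d + n / K)
        t += rng.exponential(1.0 / (birth + death))  # time to the next event
        if t > T:
            break
        n += 1 if rng.random() * (birth + death) < birth else -1
    return n / K
```

Here the scale parameter K plays exactly the role it does in the paper: the larger it is, the longer the Gaussian fluctuation picture around the equilibrium remains valid.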
  • On the estimation of extreme quantiles with neural networks
    • Allouche Michaël
    • Girard Stéphane
    • Gobet Emmanuel
    , 2023. We propose new parametrizations for neural networks in order to estimate extreme quantiles in both non-conditional and conditional heavy-tailed settings. All proposed neural network estimators feature a bias correction based on an extension of the usual second-order condition to an arbitrary order. The convergence rate of the uniform error between extreme log-quantiles and their neural network approximation is established. The finite sample performances of the non-conditional neural network estimator are compared to other bias-reduced extreme-value competitors on simulated data. It is shown that our method outperforms them in difficult heavy-tailed situations where other estimators almost all fail. The source code is available at https://github.com/michael-allouche/nn-quantile-extrapolation.git. Finally, the conditional neural network estimators are implemented to investigate the behaviour of extreme rainfalls as functions of their geographical location in the southern part of France.
  • Tutorial "Quantitative issues in Centralised and Decentralised Finance"
    • Gobet Emmanuel
    • Melachrinos Anastasia
    , 2023.
  • The quintic Ornstein-Uhlenbeck volatility model that jointly calibrates SPX & VIX smiles
    • Abi Jaber Eduardo
    • Illand Camille
    • Li Shaun Xiaoyuan
    Risk, Infopro Digital, 2023, Cutting edge section. The quintic Ornstein-Uhlenbeck volatility model is a stochastic volatility model where the volatility process is a polynomial function of degree five of a single Ornstein-Uhlenbeck process with fast mean reversion and large vol-of-vol. The model is able to achieve remarkable joint fits of the SPX-VIX smiles with only 6 effective parameters and an input curve that allows certain term structures to be matched. Even better, the model remains very simple and tractable for pricing and calibration: the VIX squared is again polynomial in the Ornstein-Uhlenbeck process, leading to efficient VIX derivative pricing by a simple integration against a Gaussian density; simulation of the volatility process is exact; and pricing SPX products can be done efficiently and accurately by standard Monte Carlo techniques with suitable antithetic and control variates.
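The two computational ingredients mentioned in the abstract above, exact simulation of the Ornstein-Uhlenbeck factor and pricing by integration against a Gaussian density, can be sketched as follows. The dynamics, time scales and any polynomial coefficients are placeholders of our own, not the paper's calibrated specification:

```python
import numpy as np

def simulate_ou_exact(n_steps, dt, eps, rng=0):
    """Exact simulation of dX_t = -(1/eps) X_t dt + (1/sqrt(eps)) dW_t using
    its Gaussian transition; the stationary variance is 1/2 regardless of eps."""
    rng = np.random.default_rng(rng)
    a = np.exp(-dt / eps)
    std = np.sqrt((1.0 - a * a) / 2.0)  # exact conditional std of one step
    x = np.empty(n_steps + 1)
    x[0] = 0.0
    noise = std * rng.normal(size=n_steps)
    for k in range(n_steps):
        x[k + 1] = a * x[k] + noise[k]
    return x

def gaussian_poly_expectation(coeffs, mean, var, n_nodes=20):
    """E[p(Z)] for Z ~ N(mean, var), with p given by coeffs (lowest degree
    first), via Gauss-Hermite quadrature -- the 'integration against a
    Gaussian density' used for VIX-type quantities."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    z = mean + np.sqrt(var) * nodes
    return weights @ np.polynomial.polynomial.polyval(z, coeffs) / np.sqrt(2 * np.pi)
```

A degree-five volatility p(X_t) is then just np.polynomial.polynomial.polyval(x, alphas) for a length-six coefficient vector alphas, and moments such as E[p(X_T)^2] reduce to the quadrature above since X_T is Gaussian.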
  • Optimal ecological transition path of a credit portfolio distribution, based on Multidate Monge-Kantorovich formulation
    • Gobet Emmanuel
    • Lage Clara
    Annals of Operations Research, Springer Verlag, 2023. Accounting for climate transition risks is one of the most important challenges in the transition to a low-carbon economy. Banks are encouraged to align their investment portfolios with CO2 trajectories fixed by international agreements, showing the necessity of a quantitative methodology to implement it. We propose a mathematical formulation for this problem and a multistage optimization criterion for a transition between the current bank portfolio and a target one. The optimization problem combines the Monge-Kantorovich formulation of optimal transport, for which the cost is defined according to the financial context, and a credit risk measure. We show that the problem is well-posed and can be embedded into a saddle-point problem for which Primal-Dual algorithms can be used. We design a numerical scheme that is able to solve the problem within the available computation time, with good scalability in the number of decision times; its numerical convergence is analysed. Last, we test the model using real financial data, illustrating that the optimal portfolio alignment may differ from the naive interpolation between the initial portfolio and the target. (10.1007/s10479-023-05385-4)
    DOI : 10.1007/s10479-023-05385-4
  • Local volatility under rough volatility
    • Bourgey Florian
    • de Marco Stefano
    • Friz Peter
    • Pigato Paolo
    Mathematical Finance, Wiley, 2023, 33 (4), pp.1119-1145. Several asymptotic results for the implied volatility generated by a rough volatility model have been obtained in recent years (notably in the small-maturity regime), providing a better understanding of the shapes of the volatility surface induced by rough volatility models, and supporting their calibration power to S&P 500 option data. Rough volatility models also generate a local volatility surface, via the so-called Markovian projection of the stochastic volatility. We complement the existing results on implied volatility by studying the asymptotic behavior of the local volatility surface generated by a class of rough stochastic volatility models, encompassing the rough Bergomi model. Notably, we observe that the celebrated "1/2 skew rule" linking the short-term at-the-money skew of the implied volatility to the short-term at-the-money skew of the local volatility, a consequence of the celebrated "harmonic mean formula" of [Berestycki et al. (2002), Quantitative Finance, 2, 61-69], is replaced by a new rule: the ratio of the at-the-money implied and local volatility skews tends to the constant 1/(H + 3/2) (as opposed to the constant 1/2), where H is the regularity index of the underlying instantaneous volatility process. (10.1111/mafi.12392)
    DOI : 10.1111/mafi.12392
  • Tropical linear regression and mean payoff games: or, how to measure the distance to equilibria
    • Akian Marianne
    • Gaubert Stéphane
    • Qi Yang
    • Saadi Omar
    SIAM Journal on Discrete Mathematics, Society for Industrial and Applied Mathematics, 2023, 37 (2), pp.632-674. We study a tropical linear regression problem consisting in finding the best approximation of a set of points by a tropical hyperplane. We establish a strong duality theorem, showing that the value of this problem coincides with the maximal radius of a Hilbert's ball included in a tropical polyhedron. We also show that this regression problem is polynomial-time equivalent to mean payoff games. We illustrate our results by solving an inverse problem from auction theory. In this setting, a tropical hyperplane represents the set of equilibrium prices. Tropical linear regression allows us to quantify the distance of a market to the set of equilibria, and infer secret preferences of a decision maker. (10.1137/21M1428297)
    DOI : 10.1137/21M1428297
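Concretely, in the max-plus setting used above, a point x lies on the tropical hyperplane with coefficients a when the maximum of a_i + x_i is attained at least twice; this "tropical vanishing" set is the equilibrium set that the regression approximates. A small illustration of our own (not the authors' code):

```python
import numpy as np

def on_tropical_hyperplane(x, a, tol=1e-9):
    """True iff max_i (a_i + x_i) is attained at least twice, i.e. the point x
    lies on the max-plus tropical hyperplane with coefficient vector a."""
    v = np.asarray(a, dtype=float) + np.asarray(x, dtype=float)
    return int(np.sum(v >= v.max() - tol)) >= 2
```

In the auction-theory application of the paper, a would encode equilibrium prices, and the regression problem measures how far observed points sit from such a hyperplane in Hilbert's seminorm.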
  • Generative modeling of extremes with neural networks
    • Allouche Michaël
    • Girard Stéphane
    • Gobet Emmanuel
    , 2023.