Past Event: PhD Dissertation Defense
12 – 1 PM
Friday Jul 7, 2023
This work presents new approaches to model order reduction for a wide class of parameterized, time-dependent partial differential equations (PDEs). Our objective is to non-intrusively construct inexpensive computational models that can be solved rapidly to map parameter values to approximate PDE solutions. Such parameterized reduced-order models may be used as physics-based surrogates for uncertainty quantification and inverse problems that require many forward solves of parametric PDEs. Our approach is based on Operator Inference, a scientific machine learning framework combining data-driven learning and physics-based modeling. Traditional model order reduction methods construct reduced-order models through direct, intrusive reduction of simulation codes, which is often infeasible for complex production-level codes. Operator Inference, by contrast, constructs reduced-order models using only knowledge of the structure of the governing equations for the physical system and available simulation data.
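As a rough illustration of the non-intrusive idea described above (not the author's actual implementation), a minimal Operator Inference sketch in NumPy might compress synthetic snapshot data with a POD basis and fit a linear reduced operator by least squares; all dimensions and data below are hypothetical stand-ins:

```python
import numpy as np

# Hypothetical snapshot data: n-dimensional states at k time steps.
n, k, r = 200, 500, 5                    # full dim, num snapshots, reduced dim
rng = np.random.default_rng(0)
Q = rng.standard_normal((n, k))          # stand-in for PDE solution snapshots
dQdt = np.gradient(Q, 0.01, axis=1)      # time derivatives via finite differences

# 1. POD basis: leading r left singular vectors of the snapshot matrix.
V = np.linalg.svd(Q, full_matrices=False)[0][:, :r]

# 2. Project the data to the reduced space (non-intrusive: no access to
#    the simulation code, only to its outputs).
Qh = V.T @ Q                             # reduced states, shape (r, k)
dQh = V.T @ dQdt                         # reduced time derivatives, shape (r, k)

# 3. Infer a linear reduced operator A_hat from data alone:
#    minimize || Qh.T @ A_hat.T - dQh.T ||_F over A_hat.
A_hat = np.linalg.lstsq(Qh.T, dQh.T, rcond=None)[0].T   # shape (r, r)
```

The learned `A_hat` then defines a cheap surrogate model d/dt q̂ = A_hat q̂ in the r-dimensional reduced space; richer structure (quadratic terms, inputs) extends the regression with additional data columns.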
The major contributions of this work are threefold: (i) improving the robustness and algorithmic scalability of the Operator Inference approach by requiring stability from the learned reduced-order models through appropriate regularization, (ii) efficiently quantifying the errors and uncertainties associated with learning reduced-order models from data alone, and (iii) adapting the Operator Inference framework to the parametric setting for a wide class of problems. A customizable regularization is introduced to the operator regression problem to avoid overfitting, improve conditioning in the regression, and promote stability in the model. The task of determining an optimal regularization is posed as an optimization problem that balances training error and stability of long-time integration dynamics. This regularization has a statistical interpretation when the task of learning a reduced-order model from data is posed as a Bayesian inverse problem. We further extend the framework to two classes of parametric problems: PDE systems with time-periodic solutions, and those with affine-parametric dependencies. In the first case, the theory of linear time-periodic systems motivates a linear reduced-order model with a new choice of inputs; in the second case, the parametric structure of the governing equations is embedded directly into the reduced-order model and the corresponding operator regression. We also state and prove conditions for well-posedness of the associated learning problem. Finally, we present an open-source implementation of this approach in Python.
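The interplay of regularization, training error, and stability can be sketched with a toy example (synthetic data and a hypothetical `fit`/`is_stable` helper pair, not the dissertation's algorithm): fit a Tikhonov-regularized linear operator for each candidate regularization weight, discard weights whose learned model is unstable, and keep the stable one with the smallest training error:

```python
import numpy as np

rng = np.random.default_rng(1)
r, k, dt = 4, 300, 0.01
# Hypothetical reduced-state trajectory of a stable linear system q' = A q.
A_true = np.diag([-1.0, -2.0, -3.0, -4.0])
Qh = np.empty((r, k))
Qh[:, 0] = rng.standard_normal(r)
for j in range(1, k):                    # forward-Euler trajectory
    Qh[:, j] = Qh[:, j - 1] + dt * (A_true @ Qh[:, j - 1])
dQh = A_true @ Qh + 1e-3 * rng.standard_normal((r, k))   # noisy derivatives

def fit(lam):
    """Tikhonov-regularized operator regression:
    minimize ||Qh.T A.T - dQh.T||_F^2 + lam^2 ||A||_F^2 over A."""
    D, R = Qh.T, dQh.T
    return np.linalg.solve(D.T @ D + lam**2 * np.eye(r), D.T @ R).T

def is_stable(A):
    # Proxy stability check: continuous-time eigenvalues in left half-plane.
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# Grid search over regularization weights, balancing training error
# against stability of the learned model.
candidates = [(np.linalg.norm(Qh.T @ fit(lam).T - dQh.T), lam)
              for lam in np.logspace(-8, 2, 11) if is_stable(fit(lam))]
train_err, lam_best = min(candidates)
```

Here stability is judged from eigenvalues of the learned operator; for nonlinear models the dissertation instead monitors boundedness of long-time integrated trajectories, which the eigenvalue check merely approximates in this linear toy setting.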
Our approach applies to a wide range of scientific and engineering problems and is demonstrated in this work on multiple examples, including the FitzHugh-Nagumo neuron model, a single-injector combustion process, and a plasma flow model for a glow discharge device. With appropriate regularization and an informed selection of learning variables, the reduced-order models exhibit high accuracy in re-predicting the training regime and acceptable accuracy in predicting future dynamics, while achieving computational speedups of several orders of magnitude.
Shane A. McQuarrie is a CSEM fellow and PhD candidate at the Oden Institute for Computational Engineering and Sciences at The University of Texas at Austin. Prior to studying in Texas, he earned a Bachelor of Science degree in mathematics (applied and computational mathematics emphasis) and a Master of Science degree in mathematics from Brigham Young University. Shane is the recipient of the 2023 John von Neumann Fellowship in Computational Science at Sandia National Laboratories. Shane is married to Laura Hilton, and they are the parents of three active children.