# CRUNCH Seminar

## May 24, 2023

Speaker: Dr. **Sathesh Mariappan**

**Abstract**

**Understanding vortex-acoustic lock-in in gas turbine combustors**

Lean premixed, pre-vaporized combustion systems are prone to large-amplitude, self-sustained, detrimental oscillations, termed combustion instability. Traditionally, combustion instability is understood to be caused by a positive feedback loop between unsteady combustion and the acoustic field of the combustion chamber. Modern combustors/afterburners have a swirler, a bluff body, or a combination of the two to anchor the flame. These burner geometries cause vortex shedding, which perturbs the flame strongly and, in many cases, becomes the dominant mechanism causing instability. One of the important and exciting features occurring in such vortex-shedding combustors is the phenomenon of vortex-acoustic lock-in. In general, the frequencies of vortex shedding and of the acoustic field of the combustor are different. During instability, however, it was found experimentally that vortex shedding and acoustic oscillations occur at a common frequency. The study of lock-in is thus an interesting academic problem with essential practical relevance.

In the seminar, I will present our salient experimental observations as the combustor dynamics transition from the unlocked to the locked-in state. The observations are modeled using a lower-order, integrate-and-fire-type model for vortex shedding coupled with the acoustic field. The coupled model is studied analytically in the context of p:q lock-in. Bifurcation types and lock-in boundaries are identified. Various generic conclusions about lock-in and its relevance to combustion instability are discussed.
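The integrate-and-fire modeling idea mentioned in the abstract can be caricatured in a few lines. The sketch below is a generic toy, not the speaker's actual model: all parameter values, the threshold form, and the coupling are illustrative assumptions. A circulation-like variable integrates toward a threshold that the acoustic velocity modulates; when it fires (a vortex sheds), it kicks a damped acoustic oscillator, closing the feedback loop that can produce lock-in.

```python
import math

def simulate_lock_in(f_shed=180.0, f_ac=200.0, coupling=0.1,
                     kick=0.05, zeta=0.005, dt=1e-5, t_end=0.5):
    """Toy integrate-and-fire caricature of vortex-acoustic coupling.

    A circulation-like variable grows at the natural shedding rate and
    'fires' (a vortex sheds) when it crosses a threshold modulated by the
    acoustic velocity; each shedding event kicks a damped acoustic
    oscillator. Parameter values are illustrative, not from the talk.
    """
    w = 2.0 * math.pi * f_ac       # acoustic angular frequency (rad/s)
    p, u = 0.0, 0.0                # acoustic pressure / velocity proxies
    gamma = 0.0                    # integrate-and-fire (circulation) variable
    shed_times = []
    t = 0.0
    while t < t_end:
        # semi-implicit Euler keeps the oscillator numerically stable
        u += dt * (-2.0 * zeta * w * u - w * w * p)
        p += dt * u
        # circulation integrates toward a threshold that the acoustic
        # velocity raises or lowers -- the coupling that enables lock-in
        gamma += dt * f_shed
        threshold = max(0.2, 1.0 - coupling * u)
        if gamma >= threshold:
            gamma = 0.0
            u += kick              # shedding kicks the acoustic field back
            shed_times.append(t)
        t += dt
    return shed_times

events = simulate_lock_in()
gaps = [b - a for a, b in zip(events, events[1:])]
f_eff = 1.0 / (sum(gaps) / len(gaps))   # effective shedding frequency, Hz
```

Sweeping `f_ac` and `coupling` in such a toy traces out lock-in regions reminiscent of Arnold tongues; the analytical study in the talk characterizes these boundaries rigorously for the actual model.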

## Friday Seminar

**We are at the crossroads of computational mathematics!** The machine learning revolution is real this time around and is changing our field in a fundamental way! We may experience the sudden death of FEM and other classical numerical methods, and the rise of new and simpler methods using **Deep Learning**. No longer do we have to spend days building elaborate grids or agonizing over solution smoothness and precise boundary conditions. Instead, we will be able to produce realistic solutions for non-sterilized computational problems in diverse physical and biological sciences. Most importantly, we will be able to discover new equations from all this data!

**At CRUNCH we developed PINNs and DeepONet.** We now lead the way in this new revolution in scientific machine learning for diverse applications. *The postdocs, students, and visitors of CRUNCH lead this revolution with a bold spirit and no fear of new directions and new challenging applications!*

**It’s all about PINNs!** That is, Physics-Informed Neural Networks. The thrust of the CRUNCH group’s research is the development of data-driven stochastic multiscale methods, specifically numerical algorithms, for physical and biological applications. We also employ visualization methods and parallel software for continuum and atomistic simulations in biophysics, soft matter and functional materials, fluid and solid mechanics, biomedicine, and related applications. Scientific machine learning is a new (disruptive) area that we emphasize, i.e., encoding conservation laws into kernels to build physics-informed learning machines, or into neural networks to build Physics-Informed Neural Networks.
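The core of the PINN idea, encoding the governing equation into the training objective, can be isolated in a few lines. The sketch below is a hedged illustration, not the CRUNCH implementation: `pinn_style_loss` is a hypothetical helper, the ODE is a toy example, and a real PINN would replace the analytic functions with a neural network differentiated by automatic differentiation.

```python
import math

def pinn_style_loss(u, du_dx, xs, u0=1.0):
    """Physics-informed loss for the toy ODE u'(x) = -u(x), u(0) = 1.

    A PINN replaces `u` with a neural network and obtains du/dx by
    automatic differentiation; here we pass analytic functions to isolate
    the idea: penalize the equation residual at collocation points plus
    the mismatch in the initial condition.
    """
    residual = sum((du_dx(x) + u(x)) ** 2 for x in xs) / len(xs)
    initial_condition = (u(0.0) - u0) ** 2
    return residual + initial_condition

xs = [i / 10 for i in range(1, 11)]           # collocation points in (0, 1]

# exact solution u(x) = exp(-x): residual vanishes, loss is zero
loss_exact = pinn_style_loss(lambda x: math.exp(-x),
                             lambda x: -math.exp(-x), xs)

# a wrong candidate u(x) = 1 - x: nonzero residual, larger loss
loss_wrong = pinn_style_loss(lambda x: 1.0 - x,
                             lambda x: -1.0, xs)
```

Minimizing such a loss over network parameters drives the network toward a function that satisfies both the equation and the boundary/initial data — the essence of encoding physics into the learning machine.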

**It’s all about DeepONet!** We were the first to develop neural operators based on rigorous mathematical theory in 2019; see the original paper here. DeepONet is blazingly fast compared to numerical solvers, according to Quanta Magazine. Irina Higgins of DeepMind wrote in Nature Machine Intelligence: “…Once DeepONet is trained, it can be applied to new input functions, thus producing new results substantially faster than numerical solvers. Another benefit of DeepONet is that it can be applied to simulation data, experimental data or both, and the experimental data may span multiple orders of magnitude in spatiotemporal scales, thus allowing scientists to estimate dynamics better by pooling the existing data. While DeepONet is still only a first step towards building truly powerful and scalable universal operator approximators, it opens up exciting opportunities, like modeling the dynamics of complex systems where no analytical descriptions exist, for example social dynamics…”
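The branch/trunk structure that makes this possible can be sketched minimally. This is a structural toy with random, untrained weights, assumed sizes (`m` sensors, `p` latent features), and single linear-plus-tanh layers; a trained DeepONet uses deep networks for both parts, but the forward pass has exactly this dot-product form.

```python
import math
import random

random.seed(0)

class TinyDeepONet:
    """Structural sketch of DeepONet.

    The branch net encodes the input function sampled at m sensor points;
    the trunk net encodes the query location y; the output is the dot
    product of the two p-dimensional feature vectors. Untrained,
    illustrative only.
    """

    def __init__(self, m=10, p=8):
        # branch weights: p features from m sensor values
        self.Wb = [[random.gauss(0, 1 / math.sqrt(m)) for _ in range(m)]
                   for _ in range(p)]
        # trunk weights and biases: p features from the scalar location y
        self.Wt = [random.gauss(0, 1.0) for _ in range(p)]
        self.bt = [random.gauss(0, 1.0) for _ in range(p)]

    def __call__(self, u_sensors, y):
        branch = [math.tanh(sum(w * u for w, u in zip(row, u_sensors)))
                  for row in self.Wb]
        trunk = [math.tanh(w * y + b) for w, b in zip(self.Wt, self.bt)]
        return sum(b * t for b, t in zip(branch, trunk))

net = TinyDeepONet()
sensors = [i / 9 for i in range(10)]           # m = 10 sensor locations
u = [math.sin(math.pi * x) for x in sensors]   # input function samples
out = net(u, 0.5)                              # evaluate G(u) at y = 0.5
```

Because the trunk takes the query location as an ordinary input, a trained operator can be evaluated at any new `y` (or any new input function `u`) with a single forward pass, which is why inference is so much faster than re-running a numerical solver.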

**Long history of pioneering research:** Numerical methods previously developed at CRUNCH include spectral/hp element methods, multi-element polynomial chaos, stochastic molecular dynamics (DPD), and spectral and high-order methods for fractional partial differential equations. The CRUNCH group has pioneered such methods, e.g., the spectral element method on unstructured meshes (1995), generalized polynomial chaos (gPC) for uncertainty quantification, rigorous coarse-grained molecular methods (2010), and poly-fractonomials for fractional operators (2015). More recently we have focused on PINNs, hidden fluid mechanics, and numerical Gaussian processes that allow us to solve PDEs from noisy measurements only, without the tyranny of building elaborate grids! Currently we employ deep neural networks to solve complex PDEs in continuous space-time domains! We are also interested in employing fractional operators and Gaussian processes to discover hidden physics models. Our group has pioneered this!

Funding is currently provided by DOE, AFOSR, DARPA, ARO, ARL, NIH, NSF, Cummins, Ansys, FEA, and others.

## CRUNCH is the cathedral of interdisciplinary research

CRUNCH supports diversity and inclusion. It has supported the MET School @ PVD, WISE@Brown, and the Association for Women in Mathematics@Brown.

### Math + Machine Learning + X

The CRUNCH group is the research team led by Professor George Em Karniadakis in the Division of Applied Mathematics at Brown University. CRUNCH members have diverse interdisciplinary backgrounds, and they work at the interface of Computational Mathematics + Machine Learning + X, where X may be problems in biology, geophysics, soft matter, functional materials, physical chemistry, or fluid and solid mechanics.

CRUNCH’s leader, Professor George Karniadakis, was elected to the National Academy of Engineering (Class of 2022) in recognition of his contributions to engineering for “computational tools, from high-accuracy algorithms to machine learning, and applications to complex flows, stochastic processes, and microfluidics.”

Professor Karniadakis received the 2021 SIAM/ACM Prize in Computational Science and Engineering for “advancing spectral elements, reduced-order modeling, uncertainty quantification, dissipative particle dynamics, fractional PDEs, and scientific machine learning. He accomplished all this while pushing applications to extreme computational scales and mentoring many leaders.” Professor Karniadakis has devoted over 30 years of mentorship to his Ph.D. students from over 20 institutions and 10 different nationalities!

The CRUNCH group welcomes collaborators and visitors with bold ideas from across different fields. Our new emphasis is on Scientific Machine Learning and on PINNs, which the CRUNCH group pioneered. The industry likes it (thanks to ANSYS and NVIDIA), and everyone copies us shamelessly, but we like it! PINNs are Physics-Informed Neural Networks, and we have a whole alphabet of PINNs: cPINNs (conservative); vPINNs (variational); pPINNs (parareal); nPINNs (nonlocal); B-PINNs (Bayesian), etc. You can find all the papers here!

The PINNs paper is the most downloaded paper in JCP. DeepONet is the new game changer for operator regression.

- An atomistic fingerprint algorithm for learning *ab initio* molecular force fields
- Understanding of the interactions between sickle cell fibers