Wednesday Lunch Seminar

Date: Wednesday, January 24, 2024, 12:00 PM ET

Dear All,
This week Professor Steven Frankel will present a study on “GPU-Accelerated Simulations of Hypersonic Propulsion.” Here is a summary of the talk:

In this talk, we will present details of our recent developments related to GPU-accelerated high-fidelity numerical methods for hypersonic propulsion.  Specifically, we will present a series of validation cases related to hypersonic boundary layer transition culminating in recent simulations of Mach 6 flow over a cone-cylinder-flare geometry.  In addition, efforts related to simulating multi-scalar mixing in cavity-based scramjets will be presented.  The talk will conclude with some comments related to next steps including possible roles for machine learning to aid in further simulations.

This week, the seminar takes place over Zoom on Wednesday at 12:00 PM ET. Please join the seminar using the following link:

Zoom Link

Crunch Seminar

Presentation #1 (12pm-1pm): Efficient and Physically Consistent Surrogate Modeling of Chemical Kinetics Using Deep Operator Networks
Anuj Kumar, North Carolina State University

Abstract: In this talk, we’ll explore a new combustion chemistry acceleration scheme we’ve developed for reacting flow simulations, utilizing deep operator networks (DeepONets). The scheme, implemented on a subset of thermochemical scalars crucial for the chemical system’s evolution, advances the current solution vector by adaptive time steps. In addition, the original DeepONet architecture is modified to incorporate the parametric dependence of the stiff ODEs associated with chemical kinetics. Unlike previous DeepONet training approaches, our training is conducted over short time windows, using intermediate solutions as initial states. An additional framework of latent-space kinetics identification with a modified DeepONet is proposed, which enhances the computational efficiency and widens the applicability of the proposed scheme. The scheme is demonstrated on the “simple” chemical kinetics of hydrogen oxidation and the more complex high- and low-temperature chemical kinetics of n-dodecane. The proposed framework accurately learns the chemical kinetics and efficiently reproduces species and temperature temporal profiles. Moreover, a very large speed-up with good extrapolation capability is also observed with the proposed scheme. An additional framework is proposed that incorporates physical constraints, such as total mass and elemental conservation, into the training of DeepONets for a subset of the thermochemical scalars of complex reaction mechanisms. Leveraging the strong correlation between the full set of scalars and the subset, the framework establishes an accurate and physically consistent mapping. This framework is demonstrated on the chemical kinetics of CH4 oxidation.
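The core idea above (a branch net encoding the current thermochemical state, a trunk net encoding the time step, and an autoregressive rollout over short windows) can be sketched as follows. This is a minimal illustration with random untrained weights and an invented four-scalar reduced state, not the speaker's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random-weight MLP parameters (stand-in for trained weights)."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

n_scalars, p = 4, 16                           # reduced state size, latent width (illustrative)
branch = mlp([n_scalars, 32, n_scalars * p])   # branch net: encodes the current state
trunk  = mlp([1, 32, p])                       # trunk net: encodes the (adaptive) time step

def deeponet_step(state, dt):
    """Advance the state by dt via an inner product of branch and trunk outputs."""
    b = forward(branch, state).reshape(n_scalars, p)
    t = forward(trunk, np.array([dt]))
    return state + b @ t                       # residual update of the solution vector

# Short-window autoregressive rollout from an illustrative initial state
# (hypothetical mass fractions plus a scaled temperature).
state = np.array([0.7, 0.2, 0.05, 0.6])
for _ in range(3):
    state = deeponet_step(state, dt=1e-6)
```

In the actual scheme the networks are trained over short time windows with intermediate solutions as initial states, which is what makes this kind of rollout stable over long horizons.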

Presentation #2 (1pm-2pm): SNIP: Bridging Mathematical Symbolic and Numeric Realms with Unified Pre-training
Kazem Meidani, Carnegie Mellon University

Abstract: In an era where symbolic mathematical equations are indispensable for modeling complex natural phenomena, scientific inquiry often involves collecting observations and translating them into mathematical expressions. Recently, deep learning has emerged as a powerful tool for extracting insights from data. However, existing models typically specialize in either numeric or symbolic domains, and are usually trained in a supervised manner tailored to specific tasks. This approach neglects the substantial benefits that could arise from a task-agnostic, unified understanding between symbolic equations and their numeric counterparts. To bridge this gap, we introduce SNIP, a Symbolic-Numeric Integrated Pre-training framework, which employs joint contrastive learning between symbolic and numeric domains, enhancing their mutual similarities in the pre-trained embeddings. By performing latent space analysis, we observe that SNIP provides cross-domain insights into the representations, revealing that symbolic supervision enhances the embeddings of numeric data and vice versa. We evaluate SNIP across diverse tasks, including symbolic-to-numeric mathematical property prediction and numeric-to-symbolic equation discovery, commonly known as symbolic regression. Results show that SNIP effectively transfers to various tasks, consistently outperforming fully supervised baselines and competing strongly with established task-specific methods, especially in few-shot learning scenarios where available data is limited.
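The joint contrastive objective described above can be sketched as an InfoNCE-style loss that pulls matched (equation, data) pairs together in a shared embedding space. The encoders here are random linear maps standing in for SNIP's actual transformer encoders, and all dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
d_sym, d_num, d_emb, batch = 8, 6, 4, 5        # illustrative sizes

W_sym = rng.standard_normal((d_sym, d_emb))    # stand-in symbolic encoder
W_num = rng.standard_normal((d_num, d_emb))    # stand-in numeric encoder

def embed(X, W):
    Z = X @ W
    return Z / np.linalg.norm(Z, axis=1, keepdims=True)   # unit-norm embeddings

def contrastive_loss(Zs, Zn, tau=0.1):
    """InfoNCE-style loss: matched (symbolic, numeric) pairs sit on the diagonal."""
    logits = Zs @ Zn.T / tau
    logits -= logits.max(axis=1, keepdims=True)           # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

X_sym = rng.standard_normal((batch, d_sym))    # stand-in for tokenized equations
X_num = rng.standard_normal((batch, d_num))    # stand-in for sampled (x, y) observations
loss = contrastive_loss(embed(X_sym, W_sym), embed(X_num, W_num))
```

Minimizing this loss makes each equation's embedding most similar to its own numeric data among the batch, which is what yields the cross-domain transfer the abstract reports.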


Zoom Link

CRUNCH Seminar YouTube channel

Welcome to the CRUNCH Seminars hosted by our dynamic research group every Friday! Join us for a captivating exploration of the cutting-edge field of Scientific Machine Learning as we welcome expert speakers from around the globe. These engaging online sessions provide a unique platform for knowledge exchange and collaboration, attracting a diverse and global audience. Each week, our speakers share insights, innovations, and advancements in the exciting intersection of science and machine learning. Be part of the CRUNCH community, where ideas converge, and connections flourish, as we delve into the forefront of interdisciplinary research.

All CRUNCH Seminars are thoughtfully recorded for your convenience, ensuring that you can revisit and share the wealth of knowledge presented. Catch up on past talks at your own pace by exploring our comprehensive collection on our YouTube channel.