Prof. Jian-xun Wang's research group -- we advance knowledge at the interface of scientific AI and computational physics (scientific machine learning, data assimilation, physics-informed deep learning, Bayesian learning, differentiable programming, uncertainty quantification).
I am very happy to share that I delivered a 3-hour short course on Scientific Machine Learning (SciML) for computational physics at the ASME Summer Heat Transfer Conference (SHTC 2024) this week. https://event.asme.org/SHTC/Program/Short-Courses
During the course, we explored:
Using physics to regularize ML training and inform neural architecture design (a minimal sketch of this idea follows the list)
Integrating ML with traditional numerical solvers using differentiable programming
Leveraging generative AI for modeling stochastic spatiotemporal processes
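To make the first topic above concrete, here is a minimal sketch, not taken from the course materials, of physics-regularized training: a small network u_theta(x, t) is fit to sparse data on a 1D heat equation u_t = nu * u_xx, and the PDE residual (computed by automatic differentiation) is added to the loss as a soft constraint. The network sizes, diffusivity nu, and weight lam are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

nu = 0.1   # assumed diffusivity (illustrative)
lam = 1.0  # assumed weight of the physics (residual) term


def init_params(key, sizes=(2, 32, 32, 1)):
    """Initialize a small MLP; input is (x, t), output is u."""
    params = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (n_in, n_out)) / jnp.sqrt(n_in),
                       jnp.zeros(n_out)))
    return params


def u_net(params, x, t):
    """Scalar surrogate u_theta(x, t)."""
    h = jnp.array([x, t])
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]


def pde_residual(params, x, t):
    """Residual of u_t - nu * u_xx via automatic differentiation."""
    u_t = jax.grad(u_net, argnums=2)(params, x, t)
    u_xx = jax.grad(jax.grad(u_net, argnums=1), argnums=1)(params, x, t)
    return u_t - nu * u_xx


def loss(params, data, colloc):
    """Data misfit on observations plus PDE residual on collocation points."""
    xd, td, ud = data
    pred = jax.vmap(lambda x, t: u_net(params, x, t))(xd, td)
    data_mse = jnp.mean((pred - ud) ** 2)
    xr, tr = colloc
    res = jax.vmap(lambda x, t: pde_residual(params, x, t))(xr, tr)
    return data_mse + lam * jnp.mean(res ** 2)
```

Training would then simply take gradients of this combined loss, e.g. `grads = jax.grad(loss)(params, data, colloc)`, so the physics term steers the network wherever data are sparse.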
It was wonderful to see so many participants and to connect with researchers and students.
Title: Neural Differentiable Physics: Unifying Numerical PDEs and Deep Learning for Data-Augmented Computational Physics
Abstract:
Predictive modeling and simulation are essential for understanding, predicting, and controlling complex physical processes across many engineering disciplines. However, traditional numerical models, which are based on first principles, face significant challenges, especially for complex systems involving multiple interacting physics across a wide range of spatial and temporal scales. A primary obstacle stems from our often-incomplete understanding of the underlying physics, which results in inadequate mathematical models that fail to accurately capture system behavior. Another substantial barrier is the high computational demand of traditional solvers, especially when real-time control or many repeated model queries are required, as in design optimization, inference, and uncertainty quantification. Fortunately, the continual evolution of sensing technology and the exponential increase in data availability have opened new avenues for data-driven computational modeling frameworks. Bolstered by advanced machine learning and GPU computing techniques, these models hold the promise of greatly enhancing our predictive capabilities and effectively tackling the challenges posed by traditional numerical models. While data science and machine learning offer novel methods for computational mechanics, challenges persist, including the need for extensive data, limited generalizability, and lack of interpretability. Addressing these challenges requires innovative computational methods that integrate advanced machine learning techniques with physics principles. This talk will introduce some of our efforts in this direction, spotlighting Neural Differentiable Physics, a novel SciML framework that unifies classic numerical PDEs and advanced deep learning models for computational modeling of complex physical systems. Our approach centers on integrating numerical PDE operators into neural architectures, enabling the fusion of prior knowledge of known physics, multi-resolution data, numerical techniques, and deep neural networks through differentiable programming. This way of integrating physics into deep learning models represents a departure from existing SciML frameworks, such as Physics-Informed Neural Networks (PINNs). By combining the strengths of known physical principles and established numerical techniques with cutting-edge deep learning and AI technology, this framework promises to inaugurate a new era in the understanding and modeling of complex physical systems, with far-reaching implications for science and engineering applications.
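The sketch below is our own minimal, hypothetical illustration of the general pattern the abstract describes (embedding a numerical PDE operator inside a differentiable program alongside a neural component); it is not the talk's actual implementation. A finite-difference diffusion step supplies the known physics, a small stencil network supplies a learned correction, and the rollout remains differentiable end to end. All names, shapes, and values are assumptions.

```python
import jax
import jax.numpy as jnp

nu, dx, dt = 0.05, 0.02, 1e-4  # assumed diffusivity and grid/step sizes


def fd_diffusion_step(u):
    """Known physics: explicit finite-difference step of u_t = nu * u_xx
    on a periodic 1D grid."""
    u_xx = (jnp.roll(u, -1) - 2.0 * u + jnp.roll(u, 1)) / dx**2
    return u + dt * nu * u_xx


def neural_correction(params, u):
    """Learned closure term acting on local stencils of u."""
    feats = jnp.stack([jnp.roll(u, 1), u, jnp.roll(u, -1)], axis=-1)
    W1, b1, W2, b2 = params  # shapes (3, H), (H,), (H, 1), (1,)
    h = jnp.tanh(feats @ W1 + b1)
    return (h @ W2 + b2)[..., 0]


def rollout(params, u0, n_steps):
    """Hybrid solver: numerical operator plus neural correction each step."""
    def step(u, _):
        u_next = fd_diffusion_step(u) + dt * neural_correction(params, u)
        return u_next, u_next
    _, traj = jax.lax.scan(step, u0, None, length=n_steps)
    return traj


def loss(params, u0, u_obs):
    """Trajectory misfit; gradients flow through every numerical step."""
    traj = rollout(params, u0, u_obs.shape[0])
    return jnp.mean((traj - u_obs) ** 2)


# Differentiable programming gives gradients through the whole rollout:
# grads = jax.grad(loss)(params, u0, u_obs)
```

Because the numerical operator and the network live in one differentiable computation graph, observation data can correct the model where the known physics is incomplete while the established discretization carries the rest.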
Our group, CoMSAIL, will be delivering 6 presentations at APS DFD 2023. If you are also attending the conference in DC, please stop by and check out our talks (focusing on differentiable physics, GPU computing, and hybrid neural modeling for fluid flow).