Latest News

New publication in Computer Methods in Applied Mechanics and Engineering

L. Sun*, H. Gao*, S. Pan, J.-X. Wang. Surrogate Modeling for Fluid Flows Based on Physics-Constrained Deep Learning Without Simulation Data. Computer Methods in Applied Mechanics and Engineering, 2019 (forthcoming); see preprint

Numerical simulations of fluid dynamics problems rely primarily on spatial and/or temporal discretization of the governing equations into finite-dimensional algebraic systems solved by computers. Due to the complicated nature of the physics and geometry, this process can be computationally prohibitive for most real-time applications and many-query analyses. Therefore, developing a cost-effective surrogate model is of great practical significance. Deep learning (DL) has shown new promise for surrogate modeling due to its capability of handling strong nonlinearity and high dimensionality. However, off-the-shelf DL architectures fail to operate when data become sparse. Unfortunately, data are often insufficient in most parametric fluid dynamics problems, since each data point in the parameter space requires an expensive numerical simulation based on first principles, e.g., the Navier–Stokes equations. In this paper, we present a physics-constrained DL approach for surrogate modeling of fluid flows without relying on any simulation data. Specifically, a structured deep neural network (DNN) architecture is devised to enforce the initial and boundary conditions, and the governing partial differential equations are incorporated into the loss function of the DNN to drive the training. Numerical experiments are conducted on a number of internal flows relevant to hemodynamics applications, and the forward propagation of uncertainties in fluid properties and domain geometry is also studied. The results show excellent agreement on the flow fields and forward-propagated uncertainties between the DL surrogate approximations and the first-principle numerical simulations.
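The two ingredients described above, a structured network output that enforces boundary conditions exactly and a training loss built from the governing-equation residual rather than labeled data, can be illustrated with a minimal sketch. The snippet below is not the paper's code: it uses PyTorch and a toy 1D Poisson problem, and the network architecture and hyperparameters are assumptions chosen purely for illustration.

```python
# Minimal sketch (assumed, not the authors' implementation): train a surrogate
# with no simulation data by penalizing the PDE residual on collocation points.
# Toy problem: u''(x) = -pi^2 * sin(pi*x) on (0, 1), with u(0) = u(1) = 0.
import math
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def u_hat(x):
    # Structured output: the factor x*(1 - x) makes the prediction satisfy the
    # homogeneous boundary conditions exactly, so no boundary data is needed.
    return x * (1.0 - x) * net(x)

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    # Collocation points sampled in the interior of the domain; no labels used.
    x = torch.rand(128, 1, requires_grad=True)
    u = u_hat(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    # "Equation loss": the PDE residual replaces the usual data-mismatch term.
    residual = d2u + math.pi**2 * torch.sin(math.pi * x)
    loss = (residual**2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, u_hat approximates sin(pi*x) without any simulation data.
```

In the paper's setting the same idea is applied to parametric Navier–Stokes problems, where the residual involves the momentum and continuity equations and the collocation points span both the spatial domain and the parameter space.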

Luning and Han are second-year PhD students in J-X. Wang’s group. Congrats!

Presented PCML paper at the USC Workshop on Research Challenges at the Interface of Machine Learning and Uncertainty Quantification

Presented a paper entitled Surrogate Modeling for Fluid Flows Based on Physics-Constrained, Label-Free Deep Learning at the USC Workshop on Research Challenges at the Interface of Machine Learning and Uncertainty Quantification. Please check out http://hyperion.usc.edu/MLUQ/agenda.html

Invited talk at the Workshop on Machine Learning for Computational Fluid and Solid Dynamics

Tuesday, February 19 – Thursday, February 21, 2019, Santa Fe [www]

Recent breakthroughs in machine learning (ML), including the stunning successes of AlphaZero and AlphaGo, have demonstrated tremendous potential for transforming scientific computing. The application of these exciting advances in algorithms and computer architectures to the computational modeling and simulation community introduces several additional requirements and challenges beyond traditional applications such as data analytics and computer vision. These include physical constraints (the subject of CNLS Physics-Informed Machine Learning workshops in 2016 and 2018), the need for uncertainty quantification (UQ), and computational requirements for embedded ML models, e.g., for parameter tuning, sub-scale physics models, optimization, UQ, or data assimilation. This workshop will bring together international leaders in the development and application of ML methods for fluid and solid dynamics.

Welcome to Jian-Xun Wang’s Research Group

Dr. Wang’s current research focuses on data-enabled, physics-based computational modeling of a number of physical systems, including cardiovascular/cerebrovascular flows, the intracranial system, turbulent flows, and other computational-mechanics problems. The main idea is to develop accurate physics-based computational models by leveraging available data from high-fidelity simulations, experiments, and clinical measurements, using advanced data assimilation and machine learning techniques. He is also interested in quantifying and reducing the uncertainties associated with the developed computational models.