|
|
|
|
|
Research Interests
My research interests are in numerical analysis, scientific computing,
algorithms, partial differential equations, and tangential suspension systems.
I have been working on:
- Tangential wheel suspension systems.
- Moving mesh finite element methods.
- Conservative front tracking.
- Back and forth error compensation and correction (BFECC) method, with
applications in level set interface computation, fluid simulation, and the convenient computation of electromagnetic waves in complicated geometries.
[Figure: a smoke simulation by NVIDIA using BFECC.]
- Central schemes and central discontinuous Galerkin methods on overlapping cells for conservation laws
and associated differential equations.
- Non-oscillatory hierarchical reconstruction (HR) for discontinuous Galerkin methods,
central discontinuous Galerkin methods, central schemes, and finite volume schemes. See also: a common
mistake in the implementation of HR.
- Neural Networks with Local Converging Inputs (NNLCI; preprint, publication). NNLCI predicts smooth and non-smooth PDE solutions across diverse complex domain geometries, achieving orders-of-magnitude reductions in computational complexity while requiring minimal training data and exhibiting strong generalization. It predicts solutions containing discontinuities--such as shocks, contact discontinuities, and their interactions--with sharp resolution and high efficiency; see [1] (1D) and [2] (2D, 2023 featured article, Commun. Comput. Phys.). It predicts electromagnetic waves scattered off complicated perfect electric conductors (with training and prediction in different domains). It predicts supersonic flows in irregular domains using unstructured grids--even when training and prediction occur on different geometries--work selected by the AIAA Journal Seminar Series Committee as one of 16 open-access highlights spanning 2021-2025, and further distinguished among the top 5 papers featured in a globally broadcast author seminar. It also predicts solutions of a PNP ion-channel model: nonlinear elliptic systems defined over multiple domains with complex interface conditions and singular source terms.
Key features of NNLCI include:
1. The neural network architecture can remain small and simple.
2. A single fine-grid simulation can yield hundreds to thousands of local samples for training.
3. Fine-grid simulations for training can be sparsely distributed across the parameter space, enabling strong generalization.
4. It accurately captures both smooth solution features and discontinuities, including complex shock interactions, with sharp resolution.
5. It is well-suited for complex domain geometries, and training and prediction can be performed on different domains.
6. 2D numerical experiments demonstrate a two-order-of-magnitude reduction in computational complexity for shock-interaction problems, and roughly a 500-fold reduction for smooth solutions such as electromagnetic waves.
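The BFECC method mentioned above can be sketched in a few lines: advect forward, advect backward with the reversed velocity, use half the round-trip difference as an error estimate to correct the initial data, then advect the corrected field. Below is a minimal, illustrative 1D sketch (not code from any of the papers); the constant-velocity setup, grid, and function names are my own assumptions, with a first-order semi-Lagrangian scheme as the underlying advection operator.

```python
import numpy as np

def advect(phi, u, dt, dx):
    """First-order semi-Lagrangian advection on a periodic 1D grid."""
    n = phi.size
    xd = np.arange(n) - u * dt / dx          # departure points, in grid units
    i0 = np.floor(xd).astype(int)
    w = xd - i0                              # linear interpolation weight
    return (1.0 - w) * phi[i0 % n] + w * phi[(i0 + 1) % n]

def bfecc_step(phi, u, dt, dx):
    """One BFECC step built on the advection operator above."""
    phi1 = advect(phi, u, dt, dx)            # forward advection
    phi2 = advect(phi1, -u, dt, dx)          # backward advection (reversed velocity)
    phi_tilde = phi + 0.5 * (phi - phi2)     # compensate half the round-trip error
    return advect(phi_tilde, u, dt, dx)      # advect the corrected field

# Demo: transport a Gaussian once around a periodic domain and compare errors.
n = 200
dx = 1.0 / n
xg = np.arange(n) * dx
phi0 = np.exp(-200.0 * (xg - 0.5) ** 2)
u, dt = 1.0, 0.5 * dx                        # CFL number 0.5
p, b = phi0.copy(), phi0.copy()
for _ in range(400):                         # 400 steps = one full period
    p = advect(p, u, dt, dx)
    b = bfecc_step(b, u, dt, dx)
err_plain = np.abs(p - phi0).max()
err_bfecc = np.abs(b - phi0).max()
```

With linear interpolation the plain semi-Lagrangian step is quite diffusive; the BFECC correction recovers much of the lost accuracy at the cost of two extra advection calls per step.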
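The core NNLCI idea of feeding a small network local values from two converging coarse numerical solutions and predicting the converged solution can be illustrated with a toy sketch. Everything below is an illustrative assumption, not the published method's code: a synthetic "solution family" with a first-order error model stands in for real coarse-grid simulations, and a tiny hand-rolled network is trained on samples from two parameter values, then predicts at an unseen parameter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: exact solutions u(x; k) and two numerical approximations
# on successively refined grids, modeled with a first-order error pattern.
def exact(x, k):
    return np.sin(2.0 * np.pi * k * x)

def approx(x, k, h):
    return exact(x, k) + h * np.cos(2.0 * np.pi * k * x)

# Training data: many local samples from solutions at parameters k = 1 and 2.
# Inputs are the two converging coarse values at a point; target is the exact value.
x_tr = rng.uniform(0.0, 1.0, 400)
k_tr = rng.choice([1.0, 2.0], 400)
X = np.stack([approx(x_tr, k_tr, 0.5), approx(x_tr, k_tr, 0.25)], axis=1)
y = exact(x_tr, k_tr)

# Small one-hidden-layer network, full-batch gradient descent on the MSE loss.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.02
for _ in range(8000):
    H = np.tanh(X @ W1 + b1)
    pred = (H @ W2).ravel() + b2[0]
    g = 2.0 * (pred - y) / y.size            # d(MSE)/d(pred)
    gW2 = H.T @ g[:, None]; gb2 = g.sum()
    gH = g[:, None] @ W2.T * (1.0 - H ** 2)  # backprop through tanh
    gW1 = X.T @ gH; gb1 = gH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Prediction at an unseen parameter k = 3: this solution was never simulated.
x_te = np.linspace(0.0, 1.0, 200)
X_te = np.stack([approx(x_te, 3.0, 0.5), approx(x_te, 3.0, 0.25)], axis=1)
pred_te = (np.tanh(X_te @ W1 + b1) @ W2).ravel() + b2[0]
err_model = np.abs(pred_te - exact(x_te, 3.0)).mean()
err_coarse = np.abs(approx(x_te, 3.0, 0.25) - exact(x_te, 3.0)).mean()
```

The sketch mirrors two of the listed features: one solve per training parameter yields hundreds of local samples, and the trained network beats the finer of the two coarse inputs at a parameter it never saw. The actual method uses local patches (not just point values) and real coarse-grid solver output.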
|