Da Long

Ph.D. Student in Computer Science

Kahlert School of Computing, University of Utah

About Me

I am a Ph.D. student in computer science at the Kahlert School of Computing, University of Utah. I earned my Bachelor of Science degree from the University of Arizona, with a double major in computer science and mathematics.

My research primarily focuses on AI for Science. This includes long-term forecasting of multi-scale and multi-physics dynamics, developing generative and deterministic neural operators for functional mappings, scientific large language models, and foundation models for physics. I am advised by Dr. Shandian Zhe.

My research interests include:

  • AI for Science: Foundation Models for Physics, Spatio-temporal Forecasting, Surrogate Modeling
  • Probabilistic Learning: Scientific Large Language Models, Diffusion Models for Operator Learning, Gaussian Processes

I am looking for a machine learning internship for summer 2025. Please contact me via email if you are interested.

Publications

(2025). Invertible Fourier Neural Operators for Tackling Both Forward and Inverse Problems. AISTATS.

(2024). Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels. AISTATS.

(2024). A Kernel Approach for PDE Discovery and Operator Learning. Physica D: Nonlinear Phenomena.

(2023). Solving High Frequency and Multi-Scale PDEs with Gaussian Processes. ICLR.

(2022). AutoIP: A Unified Framework to Integrate Physics into Gaussian Processes. ICML.

Preprints

(2024). Spatio-temporal Fourier Transformer (StFT) for Long-term Dynamics Prediction. Preprint.

(2024). Arbitrarily-Conditioned Multi-Functional Diffusion for Multi-Physics Emulation. Preprint.

(2024). Toward Efficient Kernel-Based Solvers for Nonlinear PDEs. Preprint.

(2024). Pseudo Physics-Informed Neural Operators. Preprint.

Teaching

Teaching Mentorships

  • CS 6190 Probabilistic Machine Learning (Spring 2023)
  • CS 6350 Machine Learning (Fall 2022)