Da Long

Ph.D. Student in Computer Science

Kahlert School of Computing, University of Utah

About Me

I am a Ph.D. student in computer science at the Kahlert School of Computing, University of Utah. I earned my Bachelor of Science degree from the University of Arizona, with double majors in computer science and mathematics.

My research primarily focuses on AI for Science. This includes long-term forecasting of multi-scale and multi-physics dynamics, developing generative and deterministic neural operators for functional mappings, scientific large language models, and foundation models for physics. I am advised by Dr. Shandian Zhe.

My research interests include:

  • AI for Science: Foundation Model for Physics, Spatio-temporal Forecasting, Surrogate Modeling
  • Probabilistic Learning: Scientific Large Language Model, Diffusion Model for Operator Learning, Gaussian Process

I am on the job market for full-time roles starting in summer/fall 2026. Please feel free to reach out if you are hiring; any opportunity is appreciated.

Publications

(2025). Arbitrarily-Conditioned Multi-Functional Diffusion for Multi-Physics Emulation. ICML.

PDF

(2025). Toward Efficient Kernel-Based Solvers for Nonlinear PDEs. ICML.

PDF

(2025). Invertible Fourier Neural Operators for Tackling Both Forward and Inverse Problems. AISTATS.

PDF

(2024). Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels. AISTATS.

PDF

(2024). A Kernel Approach for PDE Discovery and Operator Learning. Physica D: Nonlinear Phenomena.

PDF

Preprints

(2024). Spatio-temporal Fourier Transformer (StFT) for Long-term Dynamics Prediction. Preprint.

PDF

(2024). Pseudo Physics-Informed Neural Operators. Preprint.

PDF

Teaching

Teaching Mentorships

  • CS 6190 Probabilistic Machine Learning (Spring 2023)
  • CS 6350 Machine Learning (Fall 2022)