Juncai He


R.H. Bing Postdoctoral Fellow
Department of Mathematics
The University of Texas at Austin
2515 Speedway, PMA
Austin, TX 78712

Phone: +1 616-227-5319
Email: jhe@utexas.edu

About me

I am currently an R.H. Bing Postdoctoral Fellow in the Department of Mathematics at UT Austin.

I received my B.S. degree in Mathematics and Applied Mathematics from Sichuan University in 2014. In the summer of 2019, I received my Ph.D. degree in Computational Mathematics under the supervision of Prof. Jinchao Xu and Prof. Jun Hu at Peking University in Beijing, China. From 2019 to 2020, I worked as a Postdoctoral Scholar supervised by Prof. Jinchao Xu in the Center for Computational Mathematics and Applications (CCMA) in the Department of Mathematics at The Pennsylvania State University, University Park.


Research Interests

  • Deep Learning, Stochastic Optimization.

  • Numerical Analysis, Finite Element Methods, Multigrid Methods.

My research interests lie in algorithm development and theoretical analysis for machine learning and for numerical methods for partial differential equations (PDEs). I have received broad and in-depth training in finite element methods, multigrid (MG) methods, and machine learning. I have studied the finite element exterior calculus (FEEC) method, both its theoretical analysis and its application to structure-preserving discretization of multi-physics systems. I have also applied techniques from numerical PDEs to understand and improve deep learning models and algorithms in data science. In particular, I have worked on three different but related topics:

  1. finite element methods and deep neural networks (DNNs);

  2. multigrid methods and architecture of convolutional neural networks;

  3. stochastic optimization methods.
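As a small illustration of the first topic (a standard observation underlying the ReLU-DNN/finite-element correspondence, not code from any of the papers below): a one-dimensional linear finite element "hat" basis function is exactly representable by a one-hidden-layer ReLU network with three neurons.

```python
import numpy as np

def relu(x):
    """Rectified linear unit, applied elementwise."""
    return np.maximum(x, 0.0)

def hat(x):
    """Piecewise-linear FEM hat function on [0, 2], peaking at x = 1,
    written exactly as a linear combination of three ReLU neurons."""
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

# The network reproduces the hat function at every point,
# e.g. hat(0.5) == 0.5, hat(1.0) == 1.0, hat(2.0) == 0.0.
print(hat(np.array([0.0, 0.5, 1.0, 1.5, 2.0])))
```

Summing shifted copies of such hat functions reproduces any continuous piecewise linear function, which is the starting point for relating ReLU networks to linear finite element spaces.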


Publications and Preprints

  • J. He, L. Li, J. Xu. Approximation Properties of Deep ReLU CNNs. arXiv: 2109.00190, 2021.

  • Q. Chen, W. Hao, J. He. A Weight Initialization Based on the Linear Product Structure for Neural Networks. arXiv: 2109.00125, 2021.

  • J. He, L. Li, J. Xu. ReLU Deep Neural Networks from the Hierarchical Basis Perspective. arXiv: 2105.04156, 2021.

  • Q. Chen, W. Hao, J. He. Power Series Expansion Neural Network. arXiv: 2102.13221, 2021.

  • J. He, X. Jia, J. Xu, L. Zhang and L. Zhao. Make ℓ1 Regularization Effective in Training Sparse CNN. Computational Optimization and Applications. 2020. https://doi.org/10.1007/s10589-020-00202-1. arXiv: 1807.04222v4.

  • J. He, L. Li, J. Xu, and C. Zheng. ReLU Deep Neural Networks and Linear Finite Elements. Journal of Computational Mathematics. 38(3): 502-527, 2020. https://doi.org/10.4208/jcm.1810-m2018-0096.

  • J. He, K. Hu, J. Xu. Generalized Gaffney Inequality and Discrete Compactness for Discrete Differential Forms. Numerische Mathematik. 2019. https://doi.org/10.1007/s00211-019-01076-0.

  • J. He and J. Xu. MgNet: A Unified Framework of Multigrid and Convolutional Neural Network. Science China Mathematics. 62(7): 1331–1354, 2019. https://doi.org/10.1007/s11425-019-9547-2.

  • J. He, Y. Chen, L. Zhang and J. Xu. Constrained Linear Data-feature Mapping in Image Classification. arXiv: 1911.10428, 2019.



Teaching

Instructor at UT Austin for:

Teaching assistant at Penn State and Peking University for:

  • MATH 497: Deep Learning Algorithms and Analysis, Penn State University, Jul. 2019

  • An Introduction to Applied Mathematics, Peking University, Feb. 2017 - Jun. 2017

  • Advanced Linear Algebra I, Peking University, Sept. 2016 - Jan. 2017

  • Calculus, Peking University, Sept. 2015 - Jan. 2016

Workshops/Minisymposia Organized