Research

Research Interests

  • Code Generation and Optimization
  • Deep Learning Systems
    • Frameworks for deploying deep learning workloads on distributed architectures. Related publications: arXiv 2021.
    • Graph compilers that explore the fusion of computation graph nodes. Related publications: MLSys 2022, OSDI 2023.
    • Operator compilers that investigate automatic loop transformations and memory management for tensor operations. Related publications: PLDI 2021, PACT 2022, MLSys 2023.
  • Numerical Program Analysis
    • Floating-point error detection. Related publications: ASE 2023.
    • Repairing numerical programs through expression rewriting.

Funding & Grants

  • 2021.01 - 2024.12
    Deep Learning and Tensor Compilers Based on the Polyhedral Model
    • National Natural Science Foundation of China (Grant No. U20A20226)
    • A joint project with the team of Prof. Jidong Zhai and OneFlow Research
    • Principal Investigator of the Information Engineering University subproject
    • ¥ 2,600,000 in total; ¥ 800,000 for the Information Engineering University
  • 2019.01 - 2021.12
    Analysis and Optimization of the Precision of Mathematical Functions on Domestic Processors
    • National Natural Science Foundation of China (Grant No. 61802434)
    • Researcher (Principal Investigator: Prof. Jinchen Xu)
    • ¥ 250,000
  • 2018.01 - 2020.12
    Polyhedral Compilation Techniques for Heterogeneous Architectures
    • National Natural Science Foundation of China (Grant No. 61702546)
    • Principal Investigator
    • ¥ 240,000

Honors and Awards

  • 2023.10
    • ACM SIGHPC China Rising Star
  • 2023.09
    • Outstanding Young Scholar (Class A) of Renmin University of China
  • 2020.10
    • IEEE/ACM MICRO-53 Best Paper Nominee
  • 2019.08
    • HPC China 2019 Outstanding Paper Award
  • 2017.11
    • SIGPLAN Grant for PLDI 2017
  • 2016.04
    • SIGPLAN PLMW Scholarship for OOPSLA 2016
  • 2013.01
    • Excellent MPhil Dissertation Award, 1st place