Jinda Jia
Email: jindjia@iu.edu | LinkedIn
Ph.D. in progress, Indiana University Bloomington
M.S., University of Florida
B.S., Shandong University
Bio
I am a second-year Ph.D. student at Indiana University, currently supervised by Dr. Dingwen Tao and Dr. Fengguang Song. My research interests lie in Machine Learning Systems, High-Performance Computing, and Distributed Training Systems.
Currently, I am working on projects that reduce communication overhead in large language model (LLM) training. My research explores techniques such as gradient compression and computation-communication overlap to improve training efficiency.
Posts
- Dec 2024: I will present our work, SDP4Bit, as a poster at NeurIPS 2024 in Vancouver.
- Nov 2024: Attending SC2024 in Atlanta from November 17 to 22.
- Nov 2024: Congrats! Our paper, COMPSO, was accepted to PPoPP 2025.
- Oct 2024: Congrats! Our paper, SDP4Bit, was accepted to NeurIPS 2024, and I am honored to be a co-first author.
Publications
- COMPSO: Optimizing Gradient Compression for Distributed Training with Second-Order Optimizers
  Baixi Sun, Weijin Liu, J. Gregory Pauloski, Jiannan Tian, Jinda Jia, Daoce Wang, Boyuan Zhang, Mingkai Zheng, Sheng Di, Sian Jin, Zhao Zhang, Xiaodong Yu, Kamil A. Iskra, Pete Beckman, Guangming Tan, Dingwen Tao
  PPoPP 2025
- SDP4Bit: Toward 4-bit Communication Quantization in Sharded Data Parallelism for LLM Training
  Jinda Jia*, Cong Xie*, Hanlin Lu, Daoce Wang, Hao Feng, Chengming Zhang, Baixi Sun, Haibin Lin, Zhi Zhang, Xin Liu, Dingwen Tao (* equal contribution)
  NeurIPS 2024
  Paper | Code