Ning Ding

Ph.D. student, Computer Science, Tsinghua University.

Tsinghua University

Beijing, China


I’m a final-year Ph.D. student in the Department of Computer Science and Technology, Tsinghua University, advised by Prof. Hai-Tao Zheng and co-advised by Prof. Zhiyuan Liu.


My research spans natural language processing and machine learning. At the current stage, I am particularly interested in advanced stimulation of language models. My research aims to develop theory, tools, and algorithms to effectively and efficiently drive language models (especially large ones), and to establish a deeper understanding of them by observing model behaviors.

Selected Papers

  1. Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models
    Ning Ding, Yujia Qin, Guang Yang, Fuchao Wei, Zonghan Yang, Yusheng Su, Shengding Hu, Yulin Chen, Chi-Min Chan, Weize Chen, Jing Yi, Weilin Zhao, Zhiyuan Liu, Hai-Tao Zheng, Jianfei Chen, Yang Liu, Jie Tang, Juanzi Li, and Maosong Sun
    In arXiv Preprint
  2. OpenPrompt: An Open-source Framework for Prompt-learning
    🏆 Best Demo Paper Award
    Ning Ding, Shengding Hu, Weilin Zhao, Yulin Chen, Zhiyuan Liu, Hai-Tao Zheng, and Maosong Sun
    In ACL System Demonstration 2022
  3. Few-NERD: A Few-shot Named Entity Recognition Dataset
    Ning Ding, Guangwei Xu, Yulin Chen, Xiaobin Wang, Xu Han, Pengjun Xie, Hai-Tao Zheng, and Zhiyuan Liu
    In Annual Meeting of the Association for Computational Linguistics (ACL), 2021
  4. Prototypical Representation Learning for Relation Extraction
    Ning Ding, Xiaobin Wang, Yao Fu, Guangwei Xu, Rui Wang, Pengjun Xie, Ying Shen, Fei Huang, Hai-Tao Zheng, and Rui Zhang
    In International Conference on Learning Representations (ICLR), 2021


Mar 23, 2022 Our paper about Delta Tuning is released!
Oct 1, 2021 Check out OpenPrompt, built with wonderful collaborators.
May 25, 2021 Check out our survey of PTMs.
May 25, 2021 Check out PTR.
May 21, 2021 Check out Few-NERD (to appear in ACL 2021).
