About Me

I am a Ph.D. student in the School of Computer Science & Technology at Beijing Institute of Technology.

My research interests mainly focus on natural language generation (especially text summarization in cross-lingual scenarios) with deep learning approaches.

As large language models continue to gain popularity, I am curious about why they are competent at so many challenging tasks. Hence, I am currently working on in-context learning, an iconic emergent ability of LLMs, to better understand their implicit working mechanism.

I am also passionate about extending the impressive abilities of LLMs to languages beyond English.

My Ph.D. supervisors are Prof. Heyan Huang and Assoc. Prof. Yang Gao.

Download my résumé.

Interests
  • In-context learning
  • Text summarization
  • Low-resource language generation
Education
  • Ph.D. Student, 2019 - 2025

    Beijing Institute of Technology

  • B.S., 2015 - 2019

    China University of Geosciences (Beijing)

News

  • [Jan 23, 2024] A new preprint about the working mechanism of in-context learning has been released on ArXiv!
  • [May 17, 2023] It is finally confirmed that I will visit the McGill NLP Group, supervised by Prof. Jackie C. K. Cheung, for a year starting on July 1st. I hope to make some academic contributions during my time in Montreal!
  • [August 15, 2022] Another paper on which I am the third author has been accepted by COLING 2022!
  • [April 20, 2022] A paper on which I am the third author has been accepted by IJCAI 2022!
  • [March 31, 2022] A paper has been accepted by SIGIR 2022! See you at the conference!
  • [July 9, 2021] My volunteer application for ACL 2021 has been accepted. See you at the virtual conference!
  • [June 28, 2021] I joined Alibaba DAMO Academy as a research intern!
  • [May 6, 2021] A paper has been accepted by ACL-IJCNLP 2021!

Recent Publications

(2023). DePA: Improving Non-autoregressive Translation with Dependency-Aware Decoder. IWSLT 2023.

PDF Cite

(2022). Unifying Cross-lingual Summarization and Machine Translation with Compression Rate. SIGIR 2022.

PDF Cite Code

(2022). PSP: Pre-trained Soft Prompts for Few-Shot Abstractive Summarization. COLING 2022.

PDF Cite

(2021). Exploring Explainable Selection to Control Abstractive Summarization. AAAI 2021.

PDF Cite Code

(2019). Multiple perspective answer reranking for multi-passage reading comprehension. NLPCC 2019.

PDF Cite Code

Contact