I am currently a Ph.D. student at the National University of Defense Technology (NUDT), supervised by Prof. Xinjun Mao and co-supervised by Prof. Yue Yu. I received my bachelor's degree from the Tsien Hsue-shen Class at NUDT in June 2020. My research interests include AI/LLM4SE, code reuse, code snippet adaptation, and empirical software engineering.

🔥 News

  • 2025.09:  🎉🎉 Our paper on AdaptEval, a benchmark for code snippet adaptation, was accepted by ASE 2025 after major revision! This is the first ASE paper from our group!
  • 2025.06:  🎉🎉 One paper was accepted by ICSME 2025.
  • 2025.04:  🎉🎉 Thrilled to announce that our ICPC paper won the 🏆ACM SIGSOFT Distinguished Paper Award!
  • 2025.01:  🎉🎉 One paper was accepted by ICPC 2025.
  • 2024.12:  🎉🎉 One paper was accepted by SANER 2025.
  • 2024.10:  🎉🎉 Our paper on LLM-based code adaptation was accepted by ICSE 2025! This is the first CCF-A SE conference paper from our group!

📝 Publications

Representative Works

ASE 2025

AdaptEval: A Benchmark for Evaluating Large Language Models on Code Snippet Adaptation

Tanghaoran Zhang, Xinjun Mao, Shangwen Wang, Yuxin Zhao, Yao Lu, Jin Zhang, Zhang Zhang, Kang Yang and Yue Yu.

ASE 2025 (CCF-A)

Project

  • We construct AdaptEval, the first benchmark for evaluating LLM-based code snippet adaptation. It incorporates three distinctive features: (1) practical context derived from developers' practices, preserving rich contextual information from the Stack Overflow and GitHub communities; (2) multi-granularity annotations, supporting the evaluation of LLMs across diverse adaptation scenarios; and (3) fine-grained evaluation via a two-tier testing framework, which enables assessing LLMs' performance on individual adaptations. We evaluate six instruction-tuned LLMs and three reasoning LLMs on the benchmark.

ICSE 2025

Instruct or Interact? Exploring and Eliciting LLMs’ Capability in Code Snippet Adaptation Through Prompt Engineering

Tanghaoran Zhang, Yue Yu, Xinjun Mao, Shangwen Wang, Kang Yang, Yao Lu, Zhang Zhang and Yuxin Zhao.

ICSE 2025 (CCF-A)

Project

  • We first empirically investigate the capability of Large Language Models (LLMs) on the code adaptation task and find that their sub-optimal performance is caused by three main issues: (1) Unclear Requirement, (2) Requirement Misalignment, and (3) Context Misapplication. To resolve these issues, we propose an interactive prompting approach that elicits LLMs' capability in code snippet adaptation.

TSE 2024

How Do Developers Adapt Code Snippets to Their Contexts? An Empirical Study of Context-Based Code Snippet Adaptations

Tanghaoran Zhang, Yao Lu, Yue Yu, Xinjun Mao, Yang Zhang and Yuxin Zhao.

TSE (CCF-A, SCI-Q1), 2024

Project

  • We investigate how developers adapt code snippets to their project contexts through a semi-structured interview study and a quantitative analysis of 300 real-world adaptation cases. We identify current challenges in adaptation practice and derive four typical context-based adaptation patterns.

All Publications

🎖 Honors and Awards

  • 2025.04, 🏆ACM SIGSOFT Distinguished Paper Award at ICPC 2025.
  • 2023.03, 💰Second-Prize Merit Scholarship, NUDT.
  • 2020.05, 💰Qiangjun Scholarship, NUDT.
  • 2019.10, 🏅Outstanding Winner of M* Modeling Contest.
  • 2019.10, 🏆CCF Outstanding Undergraduate, CCF.
  • 2019.05, 💰Yinhe Scholarship, College of Computer Science and Technology, NUDT.
  • 2019.05, 🏅Meritorious Winner of MCM/ICM.
  • 2018.05, 🏅Meritorious Winner of MCM/ICM.

📖 Education

  • 2020.09 - present, National University of Defense Technology (NUDT), Ph.D. student in Software Engineering.
  • 2016.09 - 2020.06, Tsien Hsue-shen Class (1/30), National University of Defense Technology (NUDT), B.E. in Software Engineering.
  • 2010.09 - 2016.06, The High School Affiliated to Renmin University of China (RDFZ), Middle and High School.

⚙️ Services

  • Program Committee
    • ICSE’26 Shadow PC
  • Reviewer
    • EMSE, KAIS
  • External Reviewer
    • Journal: TSE, TOSEM, EMSE, JCST, JoS
    • Conference: ICLR’25, ASE’24, ESEM’24

💬 Invited Talks

  • 2024.12, ICSE 2025 Paper Pre-conference Presentation | Video.

💻 Internships

  • 2025.03 - 2025.09, Visiting student at Peng Cheng Laboratory, Shenzhen, Guangdong.
  • 2019.07 - 2019.10, Mitacs Research Internship at SEAL, Queen’s University, Canada, supervised by Prof. Ying Zou.