We are recruiting undergraduate research interns!

Laboratory Introduction

The Large-scale Intelligence and Knowledge Laboratory (NJU-LINK) at Nanjing University, led by Professor Zhaoxiang Zhang and Assistant Professor Jiaheng Liu, focuses on cutting-edge large-model technologies. The lab collaborates closely with leading domestic internet technology companies and first-class research institutions worldwide to pursue frontier research in today's most active AI directions.

Introduction to the Mentor

Jiaheng Liu is currently an Assistant Professor and Special-Term Researcher at the School of Artificial Intelligence, Nanjing University.

After earning his Ph.D. in 2023, Dr. Liu was selected for the Alibaba Star program and worked at Alibaba until February 2025. He has published over 60 papers in internationally renowned academic journals and conferences.

A summary of Dr. Liu’s recent achievements includes:

  • ACL 2025 (13 papers accepted, 10 Main + 3 Findings): Covering code intelligence, long-chain reasoning, large model security, factual correctness evaluation, complex instruction following, multimodal understanding, large model alignment, planning and decision-making, and more.
  • ICLR 2025 (4 papers accepted): Covering agents, large model reasoning, music generation, code intelligence, and more.
  • NAACL 2025 (2 papers accepted, 1 Main + 1 Findings): Covering large model security, large model alignment, and more.
  • AAAI 2025 (2 papers accepted): Covering large model reasoning, table understanding based on large models, and more.
  • NeurIPS 2024 (4 papers accepted): Covering large model distillation, vertical domain large model training, agents, large model evaluation, and more.
  • ACL 2024 (7 papers accepted, 4 Main + 3 Findings, including 1 Outstanding Paper Award): Covering large model safety alignment, long-text enhancement for large models, multi-turn dialogue, code intelligence, large model evaluation, and more.
  • EMNLP 2024 (1 paper accepted, 1 Findings): Focused on long-text understanding in large models.
  • ICML 2024 (1 paper accepted): Focused on large model compression.

Our Collaborators

The laboratory currently maintains deep cooperation with Kuaishou, ByteDance, Alibaba, Shanghai AI Lab, StepFun, and other organizations, with abundant research projects and internship opportunities.

Our Resources

The laboratory has long-term access to substantial computational resources and ample research funding. We are also happy to recommend laboratory members for internships and further studies at our partner organizations.

Lab Atmosphere

  • We maintain an open attitude towards all research ideas and concepts
  • The lab supports remote research internships, flexible work arrangements, and no clock-in requirements
  • Faculty and students communicate openly, with no generation gap (if you like our logo, you’ll also like the atmosphere here)

Recruitment Information

Target Candidates

This research intern recruitment targets undergraduates in related majors from the Class of 2024 (entering their sophomore year in September) and the Class of 2023 (entering their junior year in September).

Research Directions

The laboratory welcomes undergraduates interested in the following research directions to join us in frontline research:

  • Large model pre-training, post-training, evaluation
  • Multimodal large models, code intelligence, agent systems, large language models

Our Commitments

  • All undergraduate students participating in research projects will be credited as co-authors on resulting papers.
  • Undergraduates who demonstrate strong responsibility and research ability may, with supervisor approval, independently lead a project and publish first-author papers at top conferences during their undergraduate studies.

Our Expectations for You

  • Solid Academic Foundation: Strong interest in research with good professional fundamentals.
  • Responsibility and Execution: Ability to take research tasks seriously, balance academics with research under supervisor guidance, and meet established goals on time.
  • Teamwork and Communication: Positive and optimistic attitude with excellent communication skills, ability to integrate quickly into the team and collaborate effectively.

Contact Us

Contact Person: Yanghai Wang

When reaching out, please state your purpose and attach your resume or a brief self-introduction.

Jiaheng Liu
Assistant Professor, PhD Supervisor

Alibaba Star, one of the founding members of Multimodal Art Projection (M-A-P). Expert in large language models and multimodal large models.

Yanghai Wang
Master Student

Master student at NJU-LINK Laboratory, focused on artificial intelligence and machine learning research.