Shimao Zhang (张世茂)


Master's Student
School of Computer Science
Nanjing University, Xianlin Campus
163 Xianlin Avenue, Qixia District
Nanjing 210023, China
E-mail: smzhang@smail.nju.edu.cn

About me

I'm currently a master's student in the School of Computer Science at Nanjing University. I received my B.Sc. degree from Nanjing University in 2023. During my undergraduate years, I spent an unforgettable period in the LAMDA Group, supervised by Prof. Lijun Zhang. I was then admitted to pursue my M.Sc. degree at Nanjing University without an entrance examination. I'm now a member of the NJU NLP Group, co-advised by Prof. Shujian Huang and Prof. Jiajun Chen.

In the summer of 2025, I was a research intern at the NU MLL Lab at Northwestern University, advised by Prof. Manling Li.

Previously, I was a research intern at Microsoft Research Asia (MSRA) (working with Dr. Yeyun Gong and Dr. Xiao Liu), Ant Research (working with Dr. Jian Guan), and the Meituan M17 Foundation Model Group.

I am seeking Fall 2026 PhD opportunities! Please feel free to contact me!

Feel free to follow my accounts listed in the menu. If you are interested in collaborating with me, please get in touch.

CV [English Version] (please contact me directly for the latest version)

Research Interests

My current research interests include:

  • Large Language Models (LLMs)

  • Reasoning

  • Interpretability

  • Trustworthy AI

I'm also highly interested in extending my research to multimodal scenarios in the future, including the physical world.

Publications

* Equal Contribution   † Corresponding Author

  • How does Alignment Enhance LLMs' Multilingual Capabilities? A Language Neurons Perspective

    Shimao Zhang*, Zhejian Lai*, Xiang Liu*, Shuaijie She, Xiao Liu, Yeyun Gong, Shujian Huang, and Jiajun Chen

    AAAI 2026 (Oral Presentation)

    [arXiv] [Code]

  • Process-based Self-Rewarding Language Models

    Shimao Zhang, Xiao Liu, Xin Zhang, Junxiao Liu, Zheheng Luo, Shujian Huang, and Yeyun Gong

    ACL 2025 - Findings

    [arXiv] [Code]

  • Getting More from Less: Large Language Models are Good Spontaneous Multilingual Learners

    Shimao Zhang, Changjiang Gao, Wenhao Zhu, Jiajun Chen, Xin Huang, Xue Han, Junlan Feng, Chao Deng, and Shujian Huang

    EMNLP 2024 (Oral Presentation)

    [arXiv] [Code]

  • Distributed Projection-free Online Learning for Smooth and Convex Losses

    Yibo Wang*, Yuanyu Wan*, Shimao Zhang, and Lijun Zhang

    AAAI 2023 (Oral Presentation)

    [PDF] [Bibtex]

  • EDT: Improving Large Language Models' Generation by Entropy-based Dynamic Temperature Sampling

    Shimao Zhang, Yu Bao, and Shujian Huang

    arXiv:2403.14541

    [arXiv]

  • PATS: Process-Level Adaptive Thinking Mode Switching

    Yi Wang*, Junxiao Liu*, Shimao Zhang*, Jiajun Chen, and Shujian Huang

    arXiv:2505.19250

    [arXiv] [Code]

More information about my publications is available on Google Scholar.

Awards & Honors

  • First-class Academic Scholarship, Nanjing University, 2023, 2024, 2025

  • Huatai Securities Technology Scholarship, Nanjing University, 2025

  • Outstanding Graduate Student, Nanjing University, 2024, 2025 (Pacesetter)

  • BYD Scholarship, Nanjing University, 2024

  • People's Scholarship, Nanjing University, 2021, 2022

  • First Prize, the 34th National High School Mathematics League, Chinese Mathematical Society, Oct. 2018




Last Updated 2025-12-27 17:14