About Me
I am an undergraduate student in the School of Future Technology at Dalian University of Technology, majoring in Artificial Intelligence. My recent work centers on AI for science, with a particular interest in protein optimization, molecular representation learning, and language-model-driven research systems.
I also write on my personal blog and keep notes in my personal wiki.
Research Interests
- Agentic research and AI scientist systems
- AI for biology, especially protein optimization
- Molecular representation learning for large language models and graph neural networks
News
- Mar. 2026 Our paper Language-guided expertise evolution for protein optimization was accepted as a poster at the ICLR 2026 Workshop on AI with Recursive Self-Improvement.
- Jan. 2026 I was invited to interview with MiraclePlus (奇绩创坛) about my work on protein optimization and ranked in the top 10%.
- Sep. 2025 I was awarded First Prize in the National Mechanical Engineering Innovation Competition.
- Aug. 2025 I received Third Prize in the Northeast regional round of the China Collegiate Computing Contest AIGC Innovation Competition.
- Jun. 2025 I received First Prize in the Dalian University of Technology Information Security Competition.
Research
Protein optimization with language-guided expertise evolution
In Language-guided expertise evolution for protein optimization, we explore a multi-agent framework that adapts frozen language models to protein optimization by evolving an external expertise pool with dense language feedback. This line of work aims to make scientific optimization more scalable and interpretable than parameter-heavy reinforcement learning approaches.
Protein hydration and crystallization condition prediction
I am also working on protein hydration and crystallization condition prediction. This project uses large language models to extract structured crystallization conditions from free-text Protein Data Bank records, resulting in a curated sub-database of roughly 7,000 proteins with complete crystallization metadata. On top of that dataset, I use graph convolutional models to predict major crystallization variables such as PEG concentration and polymerization degree, and I am exploring how hydration information can strengthen protein representations.
Molecular representation learning for large language models
In From Graphs to Tokens: Substructure-Aware Molecular Representation for Large Language Models, we study substructure-aware tokenization for molecular graphs so that large language models can better capture chemical structure and generalize to unseen molecules.
Motif-driven graph representation learning
In Motif-driven molecular graph representation learning, we investigate a universal way to integrate molecular motifs into graph neural networks, improving model expressiveness while remaining compatible with different GNN architectures.
Selected Publications
- Xingyue Liu, Zijie Xing, Runze Wang, Luoming Hu, and Yanming Shen. Language-guided expertise evolution for protein optimization. ICLR 2026 Workshop on AI with Recursive Self-Improvement.
- Runze Wang, Zijie Xing, Xingyue Liu, Mingqi Yang, Che He, and Yanming Shen. From Graphs to Tokens: Substructure-Aware Molecular Representation for Large Language Models. Information Processing and Management, 2026.
- Runze Wang, Yuting Ma, Xingyue Liu, Zijie Xing, and Yanming Shen. Motif-driven molecular graph representation learning. Expert Systems with Applications, 2025.
More details are available on the publications page and my Google Scholar profile.
Selected Honors
- National Scholarship, 2024-2025
- Academic Excellence Scholarship, 2023-2024 and 2024-2025
- National First Prize, Nondestructive Testing AI Film Evaluation Contest (无损检测 AI 评片赛), National Mechanical Engineering Innovation Competition, 2025
- Third Prize, Northeast regional round, China Collegiate Computing Contest AIGC Innovation Competition, 2025
- First Prize, Dalian University of Technology Information Security Competition, 2025
