I will join the Geometric Machine Learning Group at Harvard as a postdoc in June 2026. I received my PhD in computer science from the University of California, San Diego, advised by Rose Yu. I received my master's degree in computer science from the Georgia Institute of Technology, and bachelor's degrees in computer science and physics from the University of Illinois at Urbana-Champaign.

I am interested in mathematical structures in deep learning. My research focuses on symmetries in neural networks, as well as their impact on optimization, loss landscapes, deep learning theory, and applications. For an overview, see our TMLR survey paper, Symmetry in Neural Network Parameter Spaces.

Publications

Conference

  • Emergence of Hierarchical Emotion Organization in Large Language Models
    Bo Zhao*, Maya Okawa*, Eric J. Bigelow, Rose Yu, Tomer Ullman, Ekdeep Singh Lubana, Hidenori Tanaka (*equal contribution)
    International Conference on Machine Learning (ICML), 2026
    [arXiv] [blog]

  • Demystifying Mergeability: Interpretable Properties to Predict Model Merging Success
    Luca Zhou, Bo Zhao, Rose Yu, Emanuele Rodolà
    International Conference on Machine Learning (ICML), 2026
    [arXiv]

  • Understanding Mode Connectivity via Parameter Space Symmetry
    Bo Zhao, Nima Dehmamy, Robin Walters, Rose Yu
    International Conference on Machine Learning (ICML), 2025
    [arXiv]

  • Understanding the Difficulty of Solving Cauchy Problems with PINNs
    Tao Wang, Bo Zhao, Sicun Gao, Rose Yu
    Learning for Dynamics and Control Conference (L4DC), 2024
    [arXiv] [code]

  • Improving Convergence and Generalization Using Parameter Symmetries
    Bo Zhao, Robert M. Gower, Robin Walters, Rose Yu
    International Conference on Learning Representations (ICLR), 2024 (Oral presentation)
    [arXiv] [code]

  • DYffusion: A Dynamics-informed Diffusion Model for Spatiotemporal Forecasting
    Salva Rühling Cachay, Bo Zhao, Hailey Joren, Rose Yu
    Advances in Neural Information Processing Systems (NeurIPS), 2023
    [arXiv] [code] [blog]

  • Symmetries, Flat Minima, and the Conserved Quantities of Gradient Flow
    Bo Zhao*, Iordan Ganev*, Robin Walters, Rose Yu, Nima Dehmamy (*equal contribution)
    International Conference on Learning Representations (ICLR), 2023
    [arXiv] [code]

  • Symmetry Teleportation for Accelerated Optimization
    Bo Zhao, Nima Dehmamy, Robin Walters, Rose Yu
    Advances in Neural Information Processing Systems (NeurIPS), 2022
    [arXiv] [code] [video]

  • LIMO: Latent Inceptionism for Targeted Molecule Generation
    Peter Eckmann, Kunyang Sun, Bo Zhao, Mudong Feng, Michael Gilson, Rose Yu
    International Conference on Machine Learning (ICML), 2022
    [paper] [code] [video] [website]

  • Concentric Spherical Neural Network for 3D Representation Learning
    James Fox, Bo Zhao, Beatriz Gonzalez Del Rio, Sivasankaran Rajamanickam, Rampi Ramprasad, Le Song
    International Joint Conference on Neural Networks (IJCNN), 2022
    [paper] [pdf] [code]

Journal

  • Symmetry in Neural Network Parameter Spaces
    Bo Zhao, Robin Walters, Rose Yu
    Transactions on Machine Learning Research (TMLR), 2026
    [arXiv]

  • Multiple Aging Mechanisms in Ferroelectric Deuterated Potassium Dihydrogen Phosphate
    Gregory A. Fields, Samuel F. Cieszynski, Bo Zhao, Kidan A. Tadesse, Mohammed A. Sheikh, Eugene V. Colla, M. B. Weissman
    Journal of Applied Physics 125, 194102, 2019
    [paper] [pdf]

Selected preprints and workshops

  • A Survey of Weight Space Learning: Understanding, Representation, and Generation
    Xiaolong Han, Zehong Wang, Bo Zhao, Binchi Zhang, Jundong Li, Damian Borth, Rose Yu, Haggai Maron, Yanfang Ye, Lu Yin, Ferrante Neri
    arXiv preprint arXiv:2603.10090, 2026
    [arXiv]

  • Improving Learning to Optimize Using Parameter Symmetries
    Guy Zamir, Aryan Dokania, Bo Zhao, Rose Yu
    Workshop on Neural Network Weights as a New Data Modality at ICLR 2025
    [arXiv] [code]

  • Data-Free Transformer Quantization Using Parameter-Space Symmetry
    Lucas Laird, Bo Zhao, Rose Yu, Robin Walters
    Workshop on High-dimensional Learning Dynamics (HiLD) at ICML 2025

  • Optimizing Reasoning Efficiency through Prompt Difficulty Prediction
    Bo Zhao, Berkcan Kapusuzoglu, Kartik Balasubramaniam, Sambit Sahu, Supriyo Chakraborty, Genta Indra Winata
    Workshop on Efficient Reasoning at NeurIPS 2025
    [arXiv]

Teaching

  • Teaching Assistant, CS 291A Generative AI, UC San Diego (Fall 2025)

  • Teaching Assistant, CS 4641 Machine Learning, Georgia Tech (Fall 2020)

  • Lead Course Assistant, CS 225 Data Structures, UIUC (Spring 2019)

  • Course Assistant, CS 225 Data Structures, UIUC (Fall 2017, Spring 2018, Fall 2018)

Academic Service

  • Reviewer: ICML (2022–2026), NeurIPS (2022–2025), ICLR (2024–2026), AISTATS (2024–2026), AAAI (2025–2026), TMLR (2024–2026)

Experience

  • Capital One, Applied Research Intern, June–August 2025

  • NTT Research, Research Intern, June–September 2024

  • IBM, AI Intern, June–September 2022

Honors and Awards

  • Rising Stars in EECS, MIT and Boston University, 2025

  • Rising Stars in Data Science, Stanford University, 2025

  • NVIDIA Graduate Fellowship finalist, 2025

  • Qualcomm Innovation Fellowship finalist, 2024

  • DeepMind PhD Fellowship, 2023

Education

  • Ph.D. in Computer Science, University of California San Diego, 2021–2026

  • M.S. in Computer Science, Georgia Institute of Technology, 2019–2021

  • B.S. in Computer Science, University of Illinois at Urbana-Champaign, 2016–2019

  • B.S. in Physics, University of Illinois at Urbana-Champaign, 2016–2019