Hongyang (Ryan) Zhang | Address: 177 Huntington Ave, 22nd Floor, Room 2211 | Email: ho.zhang@northeastern.edu
I am an Assistant Professor of Computer Science at Northeastern University, Boston. My research interests lie at the intersection of machine learning, algorithms, statistical learning theory, and data mining, including topics such as:
Efficient algorithms for training and fine-tuning neural networks, as well as understanding the generalization properties of neural networks with complex structure by examining the Hessian.
Efficient foundation models, mostly focusing on multitask learning and fine-tuning.
Learning with graph-structured data, including an application to analyzing traffic accident patterns on road networks.
I received my Ph.D. in computer science from Stanford and my B.Eng. in computer science from Shanghai Jiao Tong University. I spent about a year as a postdoc in the Department of Statistics and Data Science at the University of Pennsylvania. For more information about me, please see my CV.
I enjoy working on technically challenging problems, and strive for broader impact by creating new knowledge and improving access to education. I support accessible and reproducible research; see our experiment code on GitHub. Here is a list of my recent activities.
Generalization of neural networks, optimization algorithms, statistical learning theory (talk slides):
Noise Stability Optimization for Finding Flat Minima: A Hessian-based Regularization Approach. H. R. Zhang, D. Li, and H. Ju. Transactions on Machine Learning Research (TMLR), 2024
Generalization in Graph Neural Networks: Improved PAC-Bayesian Bounds on Graph Diffusion. H. Ju, D. Li, A. Sharma, and H. R. Zhang. Artificial Intelligence and Statistics (AISTATS), 2023
On the Generalization Effects of Linear Transformations in Data Augmentation. S. Wu*, H. R. Zhang*, G. Valiant, and C. Ré. International Conference on Machine Learning (ICML), 2020
Algorithmic Regularization in Over-parameterized Matrix Sensing and Neural Networks with Quadratic Activations. Y. Li*, T. Ma*, and H. Zhang*. Annual Conference on Learning Theory (COLT), 2018
Foundation models, multitask learning, fine-tuning (some recent talk slides):
Identification of Negative Transfers in Multitask Learning Using Surrogate Models. D. Li, H. L. Nguyen, and H. R. Zhang. Transactions on Machine Learning Research (TMLR), 2023. Featured Certification
Robust Fine-Tuning of Deep Neural Networks with Hessian-based Generalization Guarantees. H. Ju, D. Li, and H. R. Zhang. International Conference on Machine Learning (ICML), 2022
Understanding and Improving Information Transfer in Multi-Task Learning. S. Wu*, H. R. Zhang*, and C. Ré. International Conference on Learning Representations (ICLR), 2020
Manuscripts
Precise High-Dimensional Asymptotics for Quantifying Heterogeneous Transfers. F. Yang, H. R. Zhang, S. Wu, C. Ré, and W. Su
You may also see my Google Scholar profile for a complete list of publications.
DS 5220, Supervised Machine Learning and Learning Theory, Fall 2021, Fall 2022, Fall 2024.
CS 6140, Machine Learning, Fall 2023.
CS 7140, Advanced Machine Learning, Spring 2021, Spring 2022, Spring 2023.
DS 4400, Machine Learning and Data Mining I, Spring 2023.
CS 7180, Special Topics in Artificial Intelligence (Algorithmic and Statistical Aspects of Deep Learning), Fall 2020.
Link to our lab page. Current lab members:
Dongyue Li, PhD
Zhenshuo Zhang, PhD
Haotian Ju, MS RA
Abhinav Nippani, MS RA
Prospective students: Please take a look at our lab page and GitHub. Students with a strong background in mathematics or programming are generally a good fit for these projects. If you are interested, please contact me at hongyang90@gmail.com.
Students who wish to visit me at 177 Huntington Ave: Note that this building has access control. If you plan to visit in person, please confirm the scheduled time with me before arrival so that I can grant you permission to enter the building at security.
Program Committee Member: ICML (2019-2023; meta-reviewer, 2024), NeurIPS (2019-2024), ICLR (2021-2024), COLT (2024), AISTATS (2021-2022; meta-reviewer, 2023-2025), ALT (meta-reviewer, 2024), KDD (2023-2024), AAAI (2020-2022; meta-reviewer, 2025), WWW 2022, WSDM (2022-2024).
External Conference/Journal Reviewer: STOC (2017, 2018, 2022), FOCS (2015, 2024), SODA (2016, 2021), ITCS (2018, 2019), WINE (2014), ICALP (2014); IEEE Transactions on Information Theory (2022), JMLR (2022), TMLR (2023-).
Conference Organization: INFORMS session chair (2023-2024).
October 2019 to July 2020: Postdoctoral Researcher, University of Pennsylvania
September 2013 to September 2019: Ph.D., Stanford University
July 2012 to July 2013: Research Assistant, Nanyang Technological University
September 2008 to September 2012: B.Eng., Shanghai Jiao Tong University
My Chinese name is written as 张泓洋.