Hongyang (Ryan) Zhang
Hi! I am an Assistant Professor of Computer Science at Northeastern University in Boston, working at the intersection of machine learning, the design and analysis of algorithms, learning theory, data, and networks. Some of the topics I work on are:
Efficient learning algorithms (e.g., for learning with large neural networks).
Multitask learning, fine-tuning.
Generalization of neural networks, for instance, measuring geometric properties of large models using the Hessian.
Matrix and tensor methods, nonconvex optimization.
Data augmentation, data selection, networks.
Transportation, road safety.
I received a Ph.D. in computer science from Stanford, where I worked within the theoretical computer science and statistical machine learning groups. I then spent a brief stint as a postdoc at UPenn. More information about me can be found in my CV.
I enjoy working on technically challenging problems, and strive for broader impact by creating new knowledge and improving access to education. I support accessible and reproducible research. See our experiment code on GitHub. Here is a list of my recent activities.
See below for some of my representative papers.
Identification of Negative Transfers in Multitask Learning Using Surrogate Models, Transactions on Machine Learning Research (TMLR) 2023, Featured Certification
D. Li, H. L. Nguyen, and H. R. Zhang
Generalization in Graph Neural Networks: Improved PAC-Bayesian Bounds on Graph Diffusion, Artificial Intelligence and Statistics (AISTATS) 2023
H. Ju, D. Li, A. Sharma, and H. R. Zhang
Robust Fine-Tuning of Deep Neural Networks with Hessian-based Generalization Guarantees, International Conference on Machine Learning (ICML) 2022
H. Ju, D. Li, and H. R. Zhang
On the Generalization Effects of Linear Transformations in Data Augmentation, International Conference on Machine Learning (ICML) 2020
S. Wu*, H. R. Zhang*, G. Valiant, and C. Ré
Understanding and Improving Information Transfer in Multi-Task Learning, International Conference on Learning Representations (ICLR) 2020
S. Wu*, H. R. Zhang*, and C. Ré
Algorithmic Regularization in Over-parameterized Matrix Sensing and Neural Networks with Quadratic Activations, Conference on Learning Theory (COLT) 2018
Y. Li*, T. Ma*, and H. Zhang*
Preprints
Noise Stability Optimization for Flat Minima with Tight Rates
H. Ju, D. Li, and H. R. Zhang
Precise High-Dimensional Asymptotics for Quantifying Heterogeneous Transfers
F. Yang, H. R. Zhang, S. Wu, C. Ré, and W. Su
You may also see my Google Scholar profile for publication information.
Here are the slides that present our recent work on multitask learning, fine-tuning, and a boosting framework.
DS 5220, Supervised Machine Learning and Learning Theory, Fall 2021, Fall 2022, Fall 2024.
CS 6140, Machine Learning, Fall 2023.
CS 7140, Advanced Machine Learning, Spring 2021, Spring 2022, Spring 2023.
DS 4400, Machine Learning and Data Mining I, Spring 2023.
CS 7180, Special Topics in Artificial Intelligence (Algorithmic and Statistical Aspects of Deep Learning), Fall 2020.
Dongyue Li, CS PhD
Michael Zhang, CS PhD
Zhenshuo Zhang, CS PhD
Haotian Ju, Master's student RA
Abhinav Nippani, Master's student RA
Prospective students: We are actively looking for students to join the lab. If you have ideas you would like to discuss, you are welcome to contact me. We host visiting students who are passionate about research, and we also work with students who are already on campus. Before reaching out, you may want to look at our recent papers and projects. We are particularly excited about students with a strong background in mathematics or programming. You can email me at hongyang90@gmail.com.
Note for student visitors: The 177 building has access control. If you plan to visit me in person, please confirm the scheduled time with me before arrival so that I can arrange your building access with security.
Program Committee Member: ICML (2019-2023; meta-reviewer, 2024), NeurIPS (2019-2024), ICLR (2021-2024), COLT (2024), AISTATS (2021-2022; meta-reviewer, 2023-2024), ALT (meta-reviewer, 2024), KDD (2023-2024), AAAI (2020-2022; meta-reviewer, 2025), WWW 2022, WSDM (2022-2024).
External Conference/Journal Reviewer: STOC (2017, 2018, 2022), FOCS (2015, 2024), SODA (2016, 2021), ITCS (2018, 2019), WINE (2014), ICALP (2014); IEEE Transactions on Information Theory (2022), JMLR (2022), TMLR (2023-).
Conference Organization: INFORMS session chair (2023-2024).
October 2019 to July 2020: Postdoctoral Researcher, University of Pennsylvania
September 2013 to September 2019: Ph.D., Stanford University
July 2012 to July 2013: Research Assistant, Nanyang Technological University
September 2008 to September 2012: B.Eng., Shanghai Jiao Tong University
I grew up in Tianmen, Hubei, China, and my parents currently live in Tianjin. My Chinese name is written as 张泓洋.