Assistant Professor of Computer Science
Email: ho.zhang@northeastern.edu
Address: 177 Huntington Ave, Room 2211
Hi! I have been an assistant professor of computer science at Northeastern University since Fall 2020. My research interests lie at the intersection of machine learning (including deep learning), statistical learning theory, and network analysis, with a particular emphasis on the design and analysis of algorithms and the generalization properties of deep neural networks. I received my Ph.D. from Stanford, advised by Ashish Goel and Greg Valiant. During my Ph.D., I had the great fortune of working with the Theoretical Computer Science and Machine Learning groups at Stanford. Before moving to Boston, I spent ten months as a postdoc in UPenn's statistics department. I received my B.Eng. from Shanghai Jiao Tong University.
My current research focuses on methodological and theoretical questions in machine learning and network analysis, including understanding generalization in deep and pretrained neural networks using data-dependent measures such as Hessians, designing principled methods for robustly learning from multiple tasks or heterogeneous subpopulations, and designing algorithms for social and mobility network data.
Representative Works
Generalization in Graph Neural Networks: Improved PAC-Bayesian Bounds on Graph Diffusion
Haotian Ju, Dongyue Li, Aneesh Sharma, and H. R. Zhang
Artificial Intelligence and Statistics (AISTATS) 2023. To appear
Robust Fine-Tuning of Deep Neural Networks with Hessian-based Generalization Guarantees
Haotian Ju, Dongyue Li, and H. R. Zhang
International Conference on Machine Learning (ICML) 2022
[Presented at ICML Workshop 2022], [code]
On the Generalization Effects of Linear Transformations in Data Augmentation
Sen Wu*, H. R. Zhang*, Gregory Valiant, and Christopher Ré
International Conference on Machine Learning (ICML) 2020
[code], [blog: Automating the Art of Data Augmentation – Part III Theory]
Understanding and Improving Information Transfer in Multi-Task Learning
Sen Wu*, H. R. Zhang*, and Christopher Ré
International Conference on Learning Representations (ICLR) 2020
[video], [blog: When Multi-Task Learning Works – And When It Doesn’t]
Algorithmic Regularization in Over-parameterized Matrix Sensing and Neural Networks with Quadratic Activations
Yuanzhi Li, Tengyu Ma, and H. R. Zhang
Conference on Learning Theory (COLT) 2018. Best Paper Award
You can also see a more complete list of my publications, sorted either chronologically or by category.
I work with an amazing group of students, including Dongyue Li (Ph.D., since 2021), Haotian Ju (MS in Data Analytics, since 2021), Shreya Singh (MS in CS, since 2022), and Jack Wilkins (Undergraduate in CS, since 2022).
Graduated students: Virender Singh (MS 2021; first employment: Data Scientist at Salesforce), Minghao Liu (MS 2022; first employment: SDE at Palantir).
Prospective students: Please take a look at my recent papers and recent projects on GitHub. Undergraduate students interested in pursuing research are also encouraged to contact me. Students from underrepresented populations are especially encouraged to apply to our Ph.D. in Computer Science program and will be eligible for two years of fellowship. A list of currently open project topics includes:
Multitask learning and federated learning.
Deep learning theory, e.g., generalization and optimization.
The ideal student is self-motivated and has a strong background in programming. If you are interested in working with my students and me as a research assistant, please fill out this form.
Recent Activities
January 2023 (paper): Excited about a new paper that will be presented at AISTATS’23! We prove tight generalization bounds for message-passing neural networks that scale with the spectral norm of the graph diffusion matrices.
December 2022 (paper): Excited about a recent paper that will be presented at SDM’23! We present algorithms to reduce epidemic spread on weighted graphs and time-varying graphs.
December 2022 (service): Panelist of NSF Core programs.
November 2022 (talk): Excited to be invited to present at the AAAI’23 New Faculty Highlights program!
October 2022 (talk): Gave a talk on transfer learning and random matrix theory at INFORMS.
September 2022 (service): Excited to be an area chair of AISTATS’23!
August 2022 (service): PC member of WSDM’23 and AAAI’23.
August 2022 (paper): Presented a new algorithm for reducing epidemic spread at the epiDAMIK workshop at KDD’22.
A list of old updates
Teaching
CS 7140, Advanced Machine Learning, Spring 2021, Spring 2022, Spring 2023.
DS 4400, Machine Learning and Data Mining I, Spring 2023.
DS 5220, Supervised Machine Learning and Learning Theory, Fall 2021, Fall 2022.
CS 7180, Special Topics in AI: Algorithmic and Statistical Aspects of Deep Learning, Fall 2020.
Short Bio
2020: Postdoc at UPenn.
2019: Ph.D. from Stanford (advisors: Ashish Goel and Greg Valiant).
2013: Research assistant at Nanyang Technological University (advisors: Ning Chen and Xiaotie Deng).
2012: B.Eng. from Shanghai Jiao Tong University (advisors: Ning Chen and Pinyan Lu, part of the ACM class advised by Yong Yu).