I am currently a second-year Ph.D. student at Princeton University, where I am fortunate to be advised by Prof. Sanjeev Arora.

I completed my undergraduate studies in Andrew Chi-Chih Yao’s CS pilot class at Tsinghua University, where I was advised by Prof. Wei Chen. Previously, I was also a research intern in the Microsoft Research Asia Theory group, where my mentor was Prof. Wei Chen. In my junior year, I visited Duke University, where I had a great time working with Prof. Rong Ge.

Research Interests

I am broadly interested in theoretical machine learning. My particular interests include theory for natural language processing, deep learning theory, optimization theory, federated learning, online algorithms, and online learning.

News

  • [Jan. 2022] New paper out: BEER: Fast O(1/T) Rate for Decentralized Nonconvex Optimization with Communication Compression.
  • [Dec. 2021] New paper out: Faster Rates for Compressed Federated Learning with Client-Variance Reduction.
  • [Aug. 2021] I gave a talk at FLOW: Federated Learning One World Seminar about our new federated learning paper.

Contact Information

haoyu AT princeton DOT edu

zhaohy16 AT mail DOT tsinghua DOT org DOT cn

thomaszhao1998 AT gmail DOT com