Chenyu Wang

Massachusetts Institute of Technology


About Me

I am a second-year PhD student at MIT EECS, advised by Tommi Jaakkola and Caroline Uhler. I am also affiliated with the Eric and Wendy Schmidt Center (EWSC) at the Broad Institute. My research interests lie broadly in machine learning, representation learning, and AI for science. My recent work focuses on multimodal representation learning and perturbation modelling for drug discovery. I am also interested in foundation models for science and spatiotemporal modelling in systems biology.

Before my PhD, I obtained my Bachelor’s degree from Tsinghua University, where I worked as a research assistant in the Tsinghua Universal Machine Learning (THUML) group under the supervision of Mingsheng Long. I was also fortunate to work as a research intern with Mengdi Wang at Princeton University and with Cyrus Shahabi at the University of Southern California.

Google Scholar / LinkedIn / Twitter

Resume (updated January 2024)

Publications & Preprints

Removing Biases from Molecular Representations via Information Maximization
Chenyu Wang, Sharut Gupta, Caroline Uhler, Tommi Jaakkola
International Conference on Learning Representations. ICLR 2024.
[Paper] [Code]

We propose InfoCORE to mitigate the confounding factors in multimodal molecular representation learning from multiple information sources, in particular the confounding batch effects in drug screening data.

Tree-Based Neural Bandits for High-Value Protein Design
Chenyu Wang*, Joseph Kim*, Le Cong, Mengdi Wang (* Equal Contribution)
56th Annual Conference on Information Sciences and Systems. CISS 2022.

We propose an MCTS-guided neural contextual bandit algorithm that uses a modified upper-confidence-bound rule to accelerate the search for optimal protein designs.

HAGEN: Homophily-Aware Graph Convolutional Recurrent Network for Crime Forecasting
Chenyu Wang*, Zongyu Lin*, Xiaochen Yang, Mingxuan Yue, Jiao Sun, Cyrus Shahabi (* Equal Contribution)
AAAI Conference on Artificial Intelligence. AAAI 2022. (Oral Presentation)
[Paper] [Code] [Talk at TGL]

We present HAGEN, an end-to-end graph convolutional recurrent network with a novel homophily-aware graph learning module to learn spatiotemporal dynamics for crime forecasting.

Open Domain Generalization with Domain-Augmented Meta-Learning
Yang Shu*, Zhangjie Cao*, Chenyu Wang, Jianmin Wang, Mingsheng Long (* Equal Contribution)
IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2021.
[Paper] [Code]

We propose a novel Domain-Augmented Meta-Learning framework (DAML) to address the new open domain generalization problem, which conducts meta-learning over domains augmented at both feature-level and label-level.


Massachusetts Institute of Technology
PhD student in Computer Science
Advisors: Tommi Jaakkola, Caroline Uhler

Tsinghua University
B.S. in Economics
Minor in Data Science and Technology
Advisor: Mingsheng Long
Mentor: Yang Shu

Internship Experience

Princeton University
Research intern
Advisors: Mengdi Wang, Le Cong
Mentors: Joseph Kim, Huazheng Wang

University of Southern California
Research intern
Advisor: Cyrus Shahabi
Mentors: Jiao Sun, Mingxuan Yue

Selected Awards


I enjoy (and am perhaps good at) sports. During undergrad, I was an active member of my school’s track and soccer teams, taking 1st place in the 4×400m relay, 3rd place in the 1500m, and the women’s soccer championship, among others. I am also a fan of literature and classical music, and I enjoy travelling and tasting local delicacies.

Powered by Jekyll and the Minimal Light theme.