Hao Wang Profile
Hao Wang

@HaoGarfield

Followers: 551
Following: 288
Media: 30
Statuses: 234

Assistant Professor of Machine Learning at @RutgersCS, Scientist, MLer. Previously Postdoc at MIT. https://t.co/4NToayWNhi

New Brunswick, NJ
Joined October 2011
@HaoGarfield
Hao Wang
6 days
By popular request, here’s the recording of my Test of Time Award presentation at KDD 2025 @kdd_news: https://t.co/sJshmXO6VG, and here are the slides: https://t.co/bOUjEpln6d. Many thanks again to Geoff Webb for the warm introduction!
@HaoGarfield
Hao Wang
1 month
It is gratifying to see how an idea that began with precipitation nowcasting has grown into part of a foundation for many exciting directions, thanks to the creativity and efforts of so many in the community. @sxjscience
@HaoGarfield
Hao Wang
1 month
especially in Earth science and weather forecasting, e.g., Google DeepMind’s GraphCast (in @ScienceMagazine) and GenCast (in @Nature).
@HaoGarfield
Hao Wang
1 month
contributing to video generation, prediction, sequence modeling, and computer vision. It also found real-world use, for example, in Microsoft’s MSN Weather forecasting models. Over the past decade, it has helped inspire impactful work in AI for Science,
@HaoGarfield
Hao Wang
1 month
Ten years ago at @NeurIPSConf 2015, we introduced ConvLSTM. It’s humbling to see it ranked as the most cited paper from NeurIPS 2015 by Semantic Scholar: https://t.co/kas6LmKKcT This was one of the first deep models for spatiotemporal learning,
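For readers unfamiliar with ConvLSTM, below is a minimal, illustrative PyTorch sketch of a ConvLSTM cell: it replaces the fully connected gate transforms of a standard LSTM with convolutions so that the hidden and cell states keep their spatial layout. This is a simplified rendering (the peephole terms of the original formulation are omitted), and all shapes and hyperparameters are invented for the example.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    def __init__(self, in_channels, hidden_channels, kernel_size=3):
        super().__init__()
        # One convolution produces all four gate pre-activations at once.
        self.gates = nn.Conv2d(in_channels + hidden_channels, 4 * hidden_channels,
                               kernel_size, padding=kernel_size // 2)

    def forward(self, x, state):
        h, c = state                                   # hidden/cell states: (B, C_h, H, W)
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)                  # convolutional state update
        h = o * torch.tanh(c)
        return h, c

# Roll the cell over a toy sequence of frames, e.g. radar maps of shape (B, T, C, H, W).
frames = torch.randn(2, 10, 1, 64, 64)
cell = ConvLSTMCell(in_channels=1, hidden_channels=16)
h = torch.zeros(2, 16, 64, 64)
c = torch.zeros(2, 16, 64, 64)
for t in range(frames.size(1)):
    h, c = cell(frames[:, t], (h, c))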
@HaoGarfield
Hao Wang
2 months
who ensured vibrant discussions and engaging scientific exchange despite the challenges. I am immensely grateful for the resilience and collaborative spirit of our community, and deeply appreciative of the organizers for making the workshops such a success.
@HaoGarfield
Hao Wang
2 months
Although the workshop days almost completely overlapped with the Air Canada strike, we were still able to run the program smoothly. This was possible thanks to the extraordinary dedication, flexibility, and commitment of all 30 workshop organizing teams,
@HaoGarfield
Hao Wang
2 months
It has been an inspiring and rewarding experience to oversee 30 workshops at @IJCAIconf as the Workshop Chair, together with Gemma Moran.
@HaoGarfield
Hao Wang
2 months
Average: 3.45 | Median: 3.13 | Maximum: 4.75 | Minimum: 2.25 | 25th percentile: 2.63 | 75th percentile: 4.44
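For illustration only, summary statistics like these can be reproduced in a few lines of NumPy; the scores array below is hypothetical, not the actual AC-batch data.

import numpy as np

scores = np.array([2.25, 2.63, 3.13, 3.45, 4.00, 4.44, 4.75])  # hypothetical scores
print("Average:", scores.mean().round(2), "Median:", np.median(scores).round(2))
print("Maximum:", scores.max(), "Minimum:", scores.min())
print("25th percentile:", np.percentile(scores, 25).round(2),
      "75th percentile:", np.percentile(scores, 75).round(2))
# Percentile rank of a given score, e.g. 4.0 (fraction of scores at or below it):
print("Percentile rank of 4.0:", round(float((scores <= 4.0).mean()) * 100))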
@HaoGarfield
Hao Wang
2 months
(1) a score of 4.0 is at the 64th percentile; (2) many papers were withdrawn before the rebuttal, compared to previous ICML/ICLR cycles.
Before rebuttal: Average: 3.16 | Median: 3.13 | Maximum: 4.00 | Minimum: 2.25 | 25th percentile: 2.63 | 75th percentile: 3.69
After rebuttal:
@HaoGarfield
Hao Wang
2 months
Some statistics from my #NeurIPS AC batch: a huge boost in scores after the rebuttal, possibly related to my encouragement of active discussion. A few interesting observations:
@HaoGarfield
Hao Wang
3 months
Thank you to everyone who’s been part of this journey. KDD 2015 Paper: https://t.co/RJR161EpvF Review/Survey papers on (Hierarchical) Bayesian Deep Learning: https://t.co/sSb0P1HCT7 and
@HaoGarfield
Hao Wang
3 months
From the technical perspective, it unifies classic collaborative filtering (hence “collaborative” in the title) and modern deep learning into a Bayesian deep learning framework.
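To make the unification above concrete, here is a highly simplified, illustrative PyTorch sketch of the general idea: an item-content autoencoder is trained jointly with matrix factorization, with a term that ties item latent factors to the content encoding. This is not the paper's actual hierarchical Bayesian model; all dimensions, data, and loss weights are invented for the example.

import torch
import torch.nn as nn

n_users, n_items, content_dim, k = 100, 200, 50, 16

encoder = nn.Sequential(nn.Linear(content_dim, 64), nn.ReLU(), nn.Linear(64, k))
decoder = nn.Sequential(nn.Linear(k, 64), nn.ReLU(), nn.Linear(64, content_dim))
U = nn.Parameter(torch.randn(n_users, k) * 0.1)   # user latent factors
V = nn.Parameter(torch.randn(n_items, k) * 0.1)   # item latent factors

content = torch.rand(n_items, content_dim)                  # item content (e.g. bag-of-words)
ratings = torch.randint(0, 2, (n_users, n_items)).float()   # toy implicit feedback

params = list(encoder.parameters()) + list(decoder.parameters()) + [U, V]
opt = torch.optim.Adam(params, lr=1e-2)
for _ in range(100):
    z = encoder(content)
    loss_ae = ((decoder(z) - content) ** 2).mean()   # reconstruct item content
    loss_cf = ((U @ V.T - ratings) ** 2).mean()      # collaborative filtering fit
    loss_link = ((V - z) ** 2).mean()                # tie item factors to content encoding
    loss = loss_cf + loss_ae + 0.1 * loss_link
    opt.zero_grad()
    loss.backward()
    opt.step()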
@HaoGarfield
Hao Wang
3 months
Our Collaborative Deep Learning (CDL) paper introduced one of the first deep learning methods for recommender systems, effectively kick-starting the era of deep recommender systems.
@HaoGarfield
Hao Wang
3 months
After a decade, it is humbling and exciting to see the lasting impact of our work in the field of deep-learning (and LLM-based) recommender systems. It would not have been possible without my incredible coauthors back in 2015, Naiyan Wang and Dit-Yan Yeung, and the broader community.
@HaoGarfield
Hao Wang
3 months
#KDD2025 #TestOfTimeAward Deeply honored to receive the ACM SIGKDD Test of Time Award at KDD 2025 for our 2015 paper “Collaborative Deep Learning for Recommender Systems.”
@HaoGarfield
Hao Wang
6 months
If you happen to be at ICLR, come check out our poster #228 in Hall 3 & Hall 2B at 3pm-5:30pm, Apr 25 (Friday). My student Zihao Xu will be presenting. This is joint work with Zhuowei Li, Zihao Xu, Ligong Han, Yunhe Gao, Song Wen, Di Liu, and my colleague Dimitris Metaxas.
@HaoGarfield
Hao Wang
6 months
I2CL summarizes in-context examples and performs probabilistic in-context learning in the latent space, making it much more efficient. In fact, our method can scale to hundreds of in-context examples with minimal computational overhead.
@HaoGarfield
Hao Wang
6 months
#ICLR2025 #BayesDL #LLM #ICL Can LLMs enjoy the accuracy of many-shot in-context learning (ICL) with only the inference cost of zero-shot learning? To address this question, we proposed implicit in-context learning (I2CL).
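As a rough illustration of the general idea of summarizing demonstrations into latent vectors and injecting them at inference time (this is not the actual I2CL algorithm, just a sketch under simplified assumptions), the snippet below averages the per-layer last-token hidden states of a few demonstrations into context vectors and adds them, with an arbitrary scale of 0.1, to GPT-2's residual stream while answering a zero-shot query. The demonstrations and prompt are made up for the example.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

demos = ["great movie -> positive", "terrible plot -> negative"]  # toy demonstrations

# 1) Summarize the demonstrations into one context vector per layer.
with torch.no_grad():
    ctx = []
    for d in demos:
        out = model(**tok(d, return_tensors="pt"), output_hidden_states=True)
        # Last token's hidden state at each transformer block (skip the embedding layer).
        ctx.append(torch.stack([h[0, -1] for h in out.hidden_states[1:]]))
    ctx = torch.stack(ctx).mean(dim=0)   # shape: (n_layers, hidden_dim)

# 2) Inject the context vectors into the residual stream at inference time.
def make_hook(vec, scale=0.1):
    def hook(module, inputs, output):
        if isinstance(output, tuple):
            return (output[0] + scale * vec,) + output[1:]
        return output + scale * vec
    return hook

hooks = [blk.register_forward_hook(make_hook(vec))
         for blk, vec in zip(model.transformer.h, ctx)]

query = tok("the acting was wonderful ->", return_tensors="pt")
with torch.no_grad():
    print(tok.decode(model.generate(**query, max_new_tokens=3)[0]))

for hk in hooks:
    hk.remove()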