Shuiwang Ji

@ShuiwangJi

Followers: 2,561
Following: 3,136
Media: 3
Statuses: 1,636

Machine Learning, AI for Science. Professor and Presidential Impact Fellow, Texas A&M University. Fellow, IEEE and AIMBE.

College Station, TX
Joined June 2012
@ShuiwangJi
Shuiwang Ji
3 years
Two ICML papers on GNN explainability and generation were accepted, with code available: On Explainability of Graph Neural Networks via Subgraph Explorations; GraphDF: A Discrete Flow Model for Molecular Graph Generation. 1/2
2
9
137
@ShuiwangJi
Shuiwang Ji
3 years
NeurIPS 2021 Track Datasets and Benchmarks Round2 AC recommendation. Very hard to see any logic: "While all reviewers agree on acceptance, this is still a borderline paper, tending to reject, due to the lack of enthusiasm from the reviewers."
7
11
92
@ShuiwangJi
Shuiwang Ji
3 years
DIG: A research-oriented library that includes unified and extensible implementations of algorithms for (1) graph generation, (2) self-supervised learning on graphs, (3) explainability of graph neural networks, and (4) deep learning on 3D graphs.
1
23
74
@ShuiwangJi
Shuiwang Ji
10 months
🔬 An in-depth yet intuitive discussion on symmetry, as well as explainability, out-of-distribution generalization, large language models, and uncertainty. 📖 Access categorized lists of resources to enhance learning and education. 🌐 Website: .
0
17
70
@ShuiwangJi
Shuiwang Ji
3 years
Interested in GNN and molecular property prediction? We have been working on a major project with many talented students. The result is a comprehensive ML/DL software package for graphs, sequences, and molecular property prediction, combining new methods with turnkey software.
2
11
69
@ShuiwangJi
Shuiwang Ji
5 months
I will be at NeurIPS 12/12-12/16 and speak at the Generative AI workshop. Excited to chat with anyone on AI/LLMs for science.
1
4
72
@ShuiwangJi
Shuiwang Ji
3 years
Proud to join other distinguished colleagues as a Distinguished Member of ACM.
2
0
42
@ShuiwangJi
Shuiwang Ji
9 months
1
2
39
@ShuiwangJi
Shuiwang Ji
1 year
Changing my hat from ICML author to AC.
0
0
37
@ShuiwangJi
Shuiwang Ji
2 years
Delighted and honored to be elected to the AIMBE College of Fellows.
@aimbe
AIMBE
2 years
Congratulations to the 153 individuals elected to the AIMBE College of Fellows Class of 2022! We're so glad to welcome you to the AIMBE community. You can browse the full list of the new cohort, here:
5
17
120
3
2
39
@ShuiwangJi
Shuiwang Ji
6 months
@SharonYixuanLi Sometimes, slow is fast.
2
0
36
@ShuiwangJi
Shuiwang Ji
2 years
New work on Crystal Material Property Prediction using transformers, accepted to NeurIPS 2022.
@Arxiv_Daily
arXiv Daily
2 years
Periodic Graph Transformers for Crystal Material Property Prediction by @KeqiangY et al. including @lost_his_way , @ShuiwangJi #ComputerScience #Learning
0
6
20
3
5
32
@ShuiwangJi
Shuiwang Ji
3 years
Proud to join the fifth class of Presidential Impact Fellows
5
0
31
@ShuiwangJi
Shuiwang Ji
3 years
DIG appears in JMLR
0
2
32
@ShuiwangJi
Shuiwang Ji
3 years
Interested in self-supervised learning and denoising? A demo of our NeurIPS paper Noise2Same by Abubakar Abid @abidlabs . Thank you!
1
7
26
@ShuiwangJi
Shuiwang Ji
4 years
Interested in building deeper GNNs? Here you go.
@mengliu_1998
Meng Liu
4 years
Glad to share our #kdd2020 work "Towards Deeper Graph Neural Networks". Joint work with @xiao9aba2008 and @ShuiwangJi . Paper: Code: (0/4)
1
7
32
2
8
23
@ShuiwangJi
Shuiwang Ji
7 months
The Department of Computer Science and Engineering at Texas A&M University invites applications for two full-time tenure-track or tenured positions in the area of Data Science.
0
10
24
@ShuiwangJi
Shuiwang Ji
1 year
Will be at NeurIPS next week and giving an invited talk at the AI for Science Workshop on Modeling Complete Potentials in Crystal Materials.
0
1
24
@ShuiwangJi
Shuiwang Ji
4 years
Check out our work accepted to NeurIPS 2020, including paper, code.
@_akhaliq
AK
4 years
Noise2Same: Optimizing A Self-Supervised Bound for Image Denoising pdf: abs: github:
0
5
46
0
1
23
@ShuiwangJi
Shuiwang Ji
10 months
Highlights: 🌟 A 263-page paper by 63 authors from 14 institutions, including 41 figures and 36 tables. 🔍 A spectrum of scales: subatomic (wavefunctions, electron density), atomic (molecules, proteins, materials, interactions), and macro (fluids, climates, subsurface).
0
2
23
@ShuiwangJi
Shuiwang Ji
3 years
0
2
22
@ShuiwangJi
Shuiwang Ji
4 years
Line Graph Neural Networks for Link Prediction
@Arxiv_Daily
arXiv Daily
4 years
Line Graph Neural Networks for Link Prediction by Lei Cai et al. including @ShuiwangJi #DeepLearning #ComputerScience
0
3
5
0
3
14
@ShuiwangJi
Shuiwang Ji
4 years
Congrats to Hongyang! Great job!
0
0
12
@ShuiwangJi
Shuiwang Ji
3 years
Our new work on Augmented Equivariant Attention Networks for Electron Microscopy Image Super-Resolution
@Arxiv_Daily
arXiv Daily
3 years
Augmented Equivariant Attention Networks for Electron Microscopy Image Super-Resolution by Yaochen Xie et al. including @ShuiwangJi #AttentionModels #DeepLearning
0
0
1
0
4
14
@ShuiwangJi
Shuiwang Ji
1 year
Two news items dominate Twitter: ICML papers and US News rankings.
1
0
14
@ShuiwangJi
Shuiwang Ji
4 years
Our new work accepted to KDD 2020 on dramatically accelerating attention models on high order data.
@Arxiv_Daily
arXiv Daily
4 years
Kronecker Attention Networks by Hongyang Gao et al. #Vector #Tensor
0
6
25
1
1
12
@ShuiwangJi
Shuiwang Ji
4 years
Capturing long-range info in GNNs? Check out our recent work.
@urialon1
Uri Alon
4 years
The reason is the exponentially-growing receptive field, that squashes exponentially-growing information into fixed-length vectors. If you want to pass a message to a distance of K - the cost is O(degree^K) messages that are squashed into a single vector. (2/n)
1
0
6
0
4
12
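The quoted thread's back-of-envelope argument can be checked numerically. This is an illustrative sketch of the counting argument only (not code from any paper mentioned here): to pass a message to distance K in a graph of fixed degree, O(degree^K) messages must be squashed into one fixed-length vector.

```python
# Illustrative sketch of the over-squashing argument: in a graph where
# every node has `degree` neighbors, roughly degree**k messages must be
# compressed into a single fixed-length node embedding to cover a
# receptive field of radius k.
def messages_to_reach(degree: int, k: int) -> int:
    # O(degree^k) messages squashed into one vector
    return degree ** k

# The receptive field explodes even for modest degree and depth:
growth = [messages_to_reach(4, k) for k in range(1, 6)]
# growth == [4, 16, 64, 256, 1024]
```

The exponential blow-up, not the GNN architecture itself, is what makes long-range information hard to preserve.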
@ShuiwangJi
Shuiwang Ji
4 years
We have a complete explanation of PCA/SVD using only linear algebra. If you are teaching PCA/SVD, try our material
@badityap
B. Aditya Prakash
4 years
This is amazing. Have not seen this type of true excitement for SVD since my PhD days with my advisor :-) cc @vagelispapalex @PoloChau @danaikoutra @alexbeutel
1
0
8
1
1
11
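The claim in the tweet above — that PCA can be explained entirely through the SVD, using only linear algebra — can be demonstrated in a few lines. This is a generic sketch of that standard equivalence (not the teaching material the tweet links to): the right singular vectors of the centered data matrix are the principal directions, and the squared singular values give the explained variances.

```python
import numpy as np

# PCA purely from the SVD of the centered data matrix X = U S V^T:
# columns of V are the principal directions, and U S are the PCA scores.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Xc = X - X.mean(axis=0)                    # center the data

U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores_svd = U * S                         # PCA scores via SVD (equals Xc @ Vt.T)

# Cross-check against the eigendecomposition of the covariance matrix.
cov = Xc.T @ Xc / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending order
order = np.argsort(eigvals)[::-1]          # sort descending to match SVD

# Explained variances agree: S^2 / (n-1) equals the sorted eigenvalues.
assert np.allclose(S**2 / (len(X) - 1), eigvals[order])
# Projections agree up to an arbitrary sign per component.
assert np.allclose(np.abs(Xc @ eigvecs[:, order]), np.abs(scores_svd))
```

The sign ambiguity in the last check is inherent: each singular vector is only defined up to a factor of ±1.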
@ShuiwangJi
Shuiwang Ji
2 years
@NeurIPSConf @kchonyc Will reviewer-AC discussions be visible to authors? If yes, will authors be able to respond to discussions if necessary? So many authors have worked very hard on rebuttals, but many reviewers did not even click the ACK box.
2
2
12
@ShuiwangJi
Shuiwang Ji
3 years
@jure @BaiduResearch @DeepMind @Synerise Thank you for organizing this great event. We are excited to be one of the few teams from academia on the winners' list. Our tool is available at which also includes code and tools to achieve #1 on the AI Cures open challenge.
0
1
12
@ShuiwangJi
Shuiwang Ji
2 years
New paper alert: Data Augmentations for Graphs
@Arxiv_Daily
arXiv Daily
2 years
Automated Data Augmentations for Graph Classification by @LuoYouzhi et al. including @ShuiwangJi #Statistics #Probability
0
3
15
0
1
12
@ShuiwangJi
Shuiwang Ji
3 years
Interested in Interpretable Metric learning? Here you go: Towards Improved and Interpretable Deep Metric Learning via Attentive Grouping
@Arxiv_Daily
arXiv Daily
3 years
Towards Improved and Interpretable Deep Metric Learning via Attentive Grouping by Xinyi Xu et al. including @ShuiwangJi #EvaluationMetrics #ConvolutionalNeuralNetwork
0
0
3
0
0
11
@ShuiwangJi
Shuiwang Ji
4 years
Our new work on DL for image transformations.
@Arxiv_Daily
arXiv Daily
4 years
Global Voxel Transformer Networks for Augmented Microscopy by Zhengyang Wang et al. #DeepLearning #NeuralNetwork
0
24
23
0
1
11
@ShuiwangJi
Shuiwang Ji
3 years
@SergeyI49013776 Ultimately research should be evaluated by how it impacts science and society, not where it was published.
0
0
11
@ShuiwangJi
Shuiwang Ji
3 years
NeurIPS: I have a reviewer who raised a few concerns a few days ago. Most of these concerns are caused by misunderstandings. Once we pointed these out in response, this reviewer lowered his/her score without even responding with a word. What can we do?
2
0
10
@ShuiwangJi
Shuiwang Ji
2 years
0
0
10
@ShuiwangJi
Shuiwang Ji
3 years
2/2 MoleculeKit: Machine Learning Methods for Molecular Property Prediction and Drug Discovery Paper: Code:
0
3
10
@ShuiwangJi
Shuiwang Ji
4 years
Our team DIVE @TAMU is now ranked #1 on the AI Cures open challenge leaderboard
0
1
9
@ShuiwangJi
Shuiwang Ji
4 years
Interested in GNNs and pooling on graphs? Building on our graph U-Nets work, we push the field one step forward by incorporating graph topology. Enjoy.
@Arxiv_Daily
arXiv Daily
4 years
Topology-Aware Graph Pooling Networks by Hongyang Gao, @lost_his_way , and @ShuiwangJi #NaturalLanguageProcessing #ComputerVision
0
0
2
0
2
7
@ShuiwangJi
Shuiwang Ji
2 years
0
0
9
@ShuiwangJi
Shuiwang Ji
3 years
The Department of Computer Science and Engineering at Texas A&M University invites applications for multiple full-time tenured or tenure-track positions. Open rank, open area.
0
6
9
@ShuiwangJi
Shuiwang Ji
2 years
@denny_zhou At least the reviews are manually written.
1
0
7
@ShuiwangJi
Shuiwang Ji
5 months
@chaitjo I think the major challenge of applying GDL to physics and science problems is how to encode physics priors like symmetries into the model in a flexible and efficient manner. Of course, the GNN community might have a different focus/challenge.
1
0
9
@ShuiwangJi
Shuiwang Ji
3 years
Is anyone aware of reliable Drug-target Interaction datasets?
3
1
9
@ShuiwangJi
Shuiwang Ji
4 years
Congrats Yi Liu for the great job!
@cremi_challenge
CREMI Challenge
4 years
We have new leaders in the synaptic cleft detection category of the #CREMIChallenge . Congratulations to groups DIVE and CleftKing!
1
1
3
0
0
7
@ShuiwangJi
Shuiwang Ji
3 years
@wellingmax @jbrandi6 @robdhess @ElisevanderPol @erikjbekkers Great to see a leap after our submission, which is #2 now.
0
1
8
@ShuiwangJi
Shuiwang Ji
9 months
0
0
8
@ShuiwangJi
Shuiwang Ji
3 years
@ylecun Small batch size makes learning quick, but could be noisy. Quick science might be good, but is too noisy. If I have to make a binary decision between quick and slow science, I will choose slow.
2
1
8
@ShuiwangJi
Shuiwang Ji
4 years
My 3DCNN work has been ranked as one of TPAMI's top 5 most popular articles since June 2016.
0
0
7
@ShuiwangJi
Shuiwang Ji
2 years
@tommmitchell Finally, the most expensive ML book is free.
1
0
7
@ShuiwangJi
Shuiwang Ji
2 years
@RuxandraTeslo This is what I have and many people like it:
0
0
7
@ShuiwangJi
Shuiwang Ji
4 years
Second-Order Pooling for Graph Neural Networks, TPAMI
@Arxiv_Daily
arXiv Daily
4 years
Second-Order Pooling for Graph Neural Networks by Zhengyang Wang et al. #NeuralNetwork #ComputerScience
0
14
41
0
3
6
@ShuiwangJi
Shuiwang Ji
1 year
@LogConference Please try to be inclusive and accept all good papers. This area is burgeoning with a lot of good papers each year.
1
0
7
@ShuiwangJi
Shuiwang Ji
3 years
Full texts are available now
0
1
7
@ShuiwangJi
Shuiwang Ji
2 years
@beenwrekt One cause is the large number of submissions, leading to large numbers of (thus potentially noisy) reviewers, ACs, etc. How about limiting the number of submissions under each name?
1
0
7
@ShuiwangJi
Shuiwang Ji
1 year
Looking forward to the ICML 2023 machine learning conference starting July 17.
0
0
6
@ShuiwangJi
Shuiwang Ji
6 months
@jmuiuc Let's focus on solving problems, and do not worry about numbers.
0
0
6
@ShuiwangJi
Shuiwang Ji
2 years
Actions and decisions are reversed, but the hurt is done and can never be undone.
@yangyang_cheng
Yangyang Cheng
2 years
DOJ just ended the China Initiative for "broader approach" to counter threats incl foreign exploitation of US science. I write @WIRED why fixation on borders & national interest obscures fundamental ethical concerns & incurs much more profound loss on all:
3
32
137
0
1
6
@ShuiwangJi
Shuiwang Ji
1 year
ICML 2023 accepted papers:
0
1
6
@ShuiwangJi
Shuiwang Ji
1 year
@jo_brandstetter GPT-4 is coming timely to help with ICML rebuttals.
0
0
6
@ShuiwangJi
Shuiwang Ji
3 years
@SimonShaoleiDu We have lecture notes that formally describe the relations between DL and kernel methods.
0
0
6
@ShuiwangJi
Shuiwang Ji
5 months
@PetarV_93 If symmetry is not important, please use MLP only, not even CNN, for everything.
0
0
6
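The point in the exchange above — that a known symmetry should be built into the model rather than left for optimization to discover — can be made concrete with a tiny numerical check. This is a generic sketch using numpy (not from any work cited in this feed, and `W` is just a hypothetical linear layer): sum pooling over a set of node features is permutation invariant by construction, while a model on the flattened input is not.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))            # a "set" of 5 feature vectors
perm = np.array([1, 2, 3, 4, 0])       # a non-trivial reordering of the set

# Sum pooling: invariant to the permutation by construction.
assert np.allclose(X.sum(axis=0), X[perm].sum(axis=0))

# Flatten + linear layer (a stand-in for an MLP on raw input):
# the same permutation changes the output, so the symmetry would have
# to be learned from data instead of being guaranteed.
W = rng.normal(size=(15, 4))           # hypothetical first-layer weights
out_original = X.reshape(-1) @ W
out_permuted = X[perm].reshape(-1) @ W
assert not np.allclose(out_original, out_permuted)
```

The same contrast underlies CNNs (translation symmetry baked in) versus plain MLPs on pixels.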
@ShuiwangJi
Shuiwang Ji
2 years
@GalChechik Both the author (failed to disclose COI) and the reviewer should be banned from this and related conferences for N years.
0
0
6
@ShuiwangJi
Shuiwang Ji
1 year
@ylecun @ericxing @YiMaTweets There is randomness in any review process, and limiting the number of accepted papers may only increase randomness. I found that many of the criticisms we received during peer review were constructive, and addressing them led to better quality work.
1
0
6
@ShuiwangJi
Shuiwang Ji
3 years
Nice. Although I am not aware of a formal paper/document on this, I have found some discussions and hints on the relations. I compiled lecture notes used in my DL/ML class here
@pmddomingos
Pedro Domingos
3 years
Astonishing - deep networks are just kernel machines, regardless of architecture:
14
54
264
0
2
5
@ShuiwangJi
Shuiwang Ji
2 years
@mmbronstein There seems to be quite a significant mismatch between papers and reviewers this year. We got a generic, one-sentence comment for a paper.
1
0
5
@ShuiwangJi
Shuiwang Ji
5 months
@chaitjo If there is a known prior, you had better encode it explicitly, rather than letting data and optimization figure it out. If optimization worked perfectly, would we need ResNet?
0
0
7
@ShuiwangJi
Shuiwang Ji
2 years
@tdietterich @percyliang “Foundation Models” is certainly not my favorite name as it carries zero info.
1
0
4
@ShuiwangJi
Shuiwang Ji
2 years
@thegautamkamath this book has an e-chapter on NN with great explanations of BP.
0
0
5
@ShuiwangJi
Shuiwang Ji
2 years
@kchonyc @NeurIPSConf Any reason for excluding authors? The author-reviewer discussions are not mediated by ACs/SACs, and thus many reviewers did not even read the rebuttals. It would be beneficial to allow author participation during reviewer-AC discussions.
0
0
5
@ShuiwangJi
Shuiwang Ji
9 months
@peter_richtarik Very true, and the root cause is the large number of submissions. Either group leaders should do their own QC, or conferences should put a limit on the number of submissions each author can make.
1
1
5
@ShuiwangJi
Shuiwang Ji
3 years
@mmbronstein @PetarV_93 This could happen sometimes, but Petar's situation is just absurd.
1
0
4
@ShuiwangJi
Shuiwang Ji
3 years
@ylecun All your reasoning makes sense if "noise" does not lead to unnecessary death.
0
0
4
@ShuiwangJi
Shuiwang Ji
4 years
@badityap @UIowaCS @bijayaAdh Great. My student Hongyang Gao will join CS of Iowa State. Congrats!
0
0
4
@ShuiwangJi
Shuiwang Ji
10 months
@AnimaAnandkumar @jo_brandstetter @rejuvyesh @Mniepert Not a surprise. We have worked very extensively on all three and have tried to optimize everything we can. One thing we observed was that conclusions can be overturned easily if models are not optimized properly.
0
0
4
@ShuiwangJi
Shuiwang Ji
9 months
@MountainOfMoon Not a surprise. I would explain every abbreviation other than LLM.
0
0
4
@ShuiwangJi
Shuiwang Ji
4 years
Postdoc opening to work with Dr. Jafari and myself:
0
1
3
@ShuiwangJi
Shuiwang Ji
1 year
@pmddomingos Electricity?
0
0
4
@ShuiwangJi
Shuiwang Ji
4 years
@ylecun Can you share the slides? Thank you!
1
0
4