Ipsit @ CVPR 2025

@SriIpsit

Followers
478
Following
176
Media
13
Statuses
125

Research Intern @INSIGHTLabBGU || Interned @SonyAI_global Tokyo || MS'25 @PurdueCS || CMMRS'23 || BTech EE'22 @IITBombay

Beer-Sheva, Israel
Joined July 2020
@SriIpsit
Ipsit @ CVPR 2025
22 days
Excited to be at #CVPR2025 in Nashville! We're presenting our paper, "DiTASK: Multi-Task Fine-Tuning with Diffeomorphic Transformations." 🧵 (1/n) Come chat with us at our poster session:
🗓️ Sun, Jun 15
⏰ 10:30 AM - 12:30 PM CDT
📍 ExHall D, Poster #399
Tweet media one
1
3
5
@SriIpsit
Ipsit @ CVPR 2025
22 days
🔗 Project Page: 📰 Paper: 🧑‍💻 Code: (n/n)
0
0
3
@SriIpsit
Ipsit @ CVPR 2025
22 days
We're here @CVPR all week and would love to meet! Let's talk about #MTL, #PEFT, #VisionTransformers, and diffeomorphisms. Reply here to connect with me and the team (@ChaimBaskin and Moshe Eliasof)! #AI #ComputerVision #MachineLearning #DeepLearning #CVPR2025 #Diffeomorphisms
1
1
2
@SriIpsit
Ipsit @ CVPR 2025
22 days
The Results: DiTASK sets a new SOTA with 75% fewer parameters!
✅ +3.22% avg. improvement on PASCAL MTL
✅ >2x improvement on NYUDv2 tasks
✅ 22% faster inference than MTLoRA
✅ The performance boost gets even bigger with larger models!
Tweet media one
Tweet media two
1
0
2
@SriIpsit
Ipsit @ CVPR 2025
22 days
Our solution: DiTASK! We introduce a new way to fine-tune that preserves the core visual knowledge in pre-trained models. We freeze the singular vectors (feature directions) and only adapt the singular values (feature importance) using efficient, learnable diffeomorphisms.
Tweet media one
1
0
1
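The idea in the tweet above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's actual implementation: `adapt_singular_values` and its affine `slope`/`shift` map are assumptions standing in for DiTASK's richer learnable diffeomorphisms.

```python
import numpy as np

def adapt_singular_values(W, slope=1.0, shift=0.0):
    """Freeze the singular vectors of a pretrained weight W and adapt only
    its singular values with a monotone affine map (slope > 0 keeps the map
    invertible, i.e. a 1-D diffeomorphism). In training, slope and shift
    would be the (few) learnable parameters."""
    U, S, Vh = np.linalg.svd(W, full_matrices=False)   # frozen directions
    S_new = slope * S + shift                          # adapted importances
    return (U * S_new) @ Vh                            # reassembled weight
```

At `slope=1, shift=0` the map is the identity and the pretrained weight is recovered exactly; any `slope > 0` keeps the map invertible, so the pretrained feature directions are never destroyed, only reweighted per task.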
@SriIpsit
Ipsit @ CVPR 2025
22 days
๐“๐ก๐ž ๐๐ซ๐จ๐›๐ฅ๐ž๐ฆ: Fine-tuning large models for Multi-Task Learning (MTL) is hard. Methods like LoRA force different tasks to compete within a limited subspace, which can hurt performance. This "task interference" is a major hurdle.
Tweet media one
1
0
2
@SriIpsit
Ipsit @ CVPR 2025
1 month
Excited to announce I'm joining @INSIGHTLabBGU w/ @ChaimBaskin at BGU's School of ECE this summer! I'll be working on ODE-inspired methods to advance representation learning! Huge thanks to Prof. Baskin for making this possible!
0
0
4
@SriIpsit
Ipsit @ CVPR 2025
2 months
RT @beabevi_: 🚨 New paper at #ICLR2025! 🧵 Introducing Holographic Node Representations, pretrained general-purpose node embeddings that ada…
0
9
0
@SriIpsit
Ipsit @ CVPR 2025
5 months
This is the sign I needed today! 🤡
Tweet media one
0
0
2
@SriIpsit
Ipsit @ CVPR 2025
5 months
2025 is also the Silver Jubilee of my life 🤪
0
0
0
@SriIpsit
Ipsit @ CVPR 2025
5 months
It just hit me: this Valentine's Day marks the Silver Jubilee of no gf. I deserve a commemorative plaque and a ceremony. 🤡😭🎊💍
1
0
2
@SriIpsit
Ipsit @ CVPR 2025
6 months
RT @brunofmr: Slides of my presentation "Mathematical Foundations of Graph Foundation Models" yesterday at the AMS Session of the #JMM2025…
0
30
0
@SriIpsit
Ipsit @ CVPR 2025
7 months
Thrilled to present DiGRAF at @NeurIPSConf with @brunofmr and Moshe Eliasof. Big thanks to everyone who stopped by, asked questions, and offered feedback. DM to grab a coffee and discuss ideas. We're excited about the directions this work will take; stay tuned for more to come!
Tweet media one
0
0
7
@SriIpsit
Ipsit @ CVPR 2025
7 months
RT @beabevi_: If you are at #NeurIPS2024, don't miss Moshe Eliasof presenting our GRANOLA: Adaptive Normalization for Graph Neural Networks…
0
7
0
@SriIpsit
Ipsit @ CVPR 2025
7 months
👇 Below we show that DiGRAF learns distinct activation functions for different graphs. DiGRAF learns a diffeomorphism using an additional shallow GNN on the input graph and uses it as the activation function. We show consistent performance improvements across various tasks and faster…
Tweet media one
Tweet media two
0
0
1
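The mechanism described above can be caricatured in a few lines. This is a toy sketch under stated assumptions, not the released DiGRAF code: `graph_adaptive_activation`, the one-layer mean-aggregation "GNN", and the affine monotone map are illustrative stand-ins for the paper's learned diffeomorphisms.

```python
import numpy as np

def graph_adaptive_activation(H, A, w1=0.0, w2=0.0):
    """Toy graph-adaptive activation: a one-layer mean-aggregation 'GNN'
    summarizes the graph (adjacency A, node features H), and that summary
    parameterizes a strictly increasing (hence invertible) elementwise map
    applied to H. w1, w2 are the illustrative learnable scalars."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)     # node degrees
    pooled = ((A @ H) / deg).mean()                    # whole-graph summary
    slope = np.exp(w1 * pooled)                        # exp(.) > 0
    shift = w2 * pooled
    return slope * H + shift                           # per-graph activation
```

Because the slope and shift depend on the pooled graph summary, two different graphs induce two different activation functions, which is the "graph-adaptive" property the tweet describes; positivity of the slope keeps each per-graph map invertible.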
@SriIpsit
Ipsit @ CVPR 2025
7 months
📣 Stop by our @NeurIPSConf Poster #3103 today at 11 AM for DiGRAF: Diffeomorphic Graph Adaptive Activation Function, an activation function that adapts to the graph structure by learning a diffeomorphism. Joint work with Xinzhi Wang (equal contrib.), @caromitreka, @brunofmr,
Tweet media one
2
5
23
@SriIpsit
Ipsit @ CVPR 2025
7 months
Graph structure typically influences only node embeddings in GNNs. What if activation functions were graph-adaptive too? Introducing DiGRAF: a novel activation function for GNNs powered by learnable diffeomorphisms. Discover more at our poster #33 tomorrow at @LogConference 🧠📈
Tweet media one
0
1
8
@SriIpsit
Ipsit @ CVPR 2025
7 months
This app's icon is similar to \mathbb{X}.
0
0
0
@SriIpsit
Ipsit @ CVPR 2025
8 months
Entered my buzz cut era. 🙄
0
0
1
@SriIpsit
Ipsit @ CVPR 2025
8 months
Browsing the PASCAL dataset to cherry-pick predictions is fun. Lots of meme-worthy images. @CVPR #CVPR2025
0
0
1