
Danfei Xu
@danfei_xu
Followers: 8K · Following: 4K · Media: 67 · Statuses: 853
Faculty at Georgia Tech @ICatGT, researcher at @NVIDIAAI | Ph.D. @StanfordAILab | Making robots smarter | all opinions are my own
Atlanta, GA
Joined August 2013
RT @kexinrong: 🎉 Super proud of my student Peng Li (co-advised with Chu Xu) for receiving the 2025 Jim Gray Doctoral Dissertation Award for…
Happening now!
Excited to announce EgoAct 🥽🤖: the 1st Workshop on Egocentric Perception & Action for Robot Learning @ #RSS2025 in LA! We’re bringing together researchers exploring how egocentric perception can drive next-gen robot learning! Full info: @RoboticsSciSys
It’s also pretty cool that we are now bounded by hardware limits: for example, the gripper can’t close fast enough to match the arm motion.
We also found that SAIL is often bounded by hardware limits, such as gripper speed. SAIL could feasibly result in faster motions with more capable hardware. 9/
Despite advances in end-to-end policies, robots powered by these systems operate far below industrial speeds. What will it take to get e2e policies running at speeds that would be productive in a factory? It turns out simply speeding up NN inference isn’t enough. This requires…
Tired of slow-moving robots? Want to know how learning-driven robots can move closer to industrial speeds in the real world? Introducing SAIL, a system for speeding up the execution of imitation learning policies by up to 3.2x on real robots. A short thread: 1/
RT @NadunRanawakaA: Tired of slow-moving robots? Want to know how learning-driven robots can move closer to industrial speeds in the real w…
Yayyy! Really really really excited to have Sidd join us at @ICatGT!
Thrilled to share that I'll be starting as an Assistant Professor at Georgia Tech (@ICatGT / @GTrobotics / @mlatgt) in Fall 2026. My lab will tackle problems in robot learning, multimodal ML, and interaction. I'm recruiting PhD students this next cycle – please apply/reach out!
Super cool demos. Congrats @andyzeng_ and @peteflorence! Robot learning is a full-stack domain. Even with e2e learning, you need capable robots & good controllers to move fast at high precision.
Today we’re excited to share a glimpse of what we’re building at Generalist. As a first step towards our mission of making general-purpose robots a reality, we’re pushing the frontiers of what end-to-end AI models can achieve in the real world. Here’s a preview of our early…
Great design. Looking forward to it!
OpenArm is a fully open-sourced humanoid robot arm built for physical AI research and deployment in contact-rich environments. Check out what we’ve been building lately for the upcoming release. Launching v1.0 within a month 🚀 #robotics #OpenSource #physicalAI #humanoid
Join us at the RSS 2025 EgoAct workshop June 21st morning session, where the @meta_aria team will demonstrate the Aria Gen 2 device and talk about its awesome features for robotics and beyond!
Aria Gen 2 glasses mark a significant leap in wearable technology, offering enhanced features and capabilities that cater to a broader range of applications and researcher needs. We believe researchers from industry and academia can accelerate their work in machine perception…
RT @AIatMeta: Aria Gen 2 glasses mark a significant leap in wearable technology, offering enhanced features and capabilities that cater to…
RT @DoubleHan07: Collecting manipulation data with DexUMI. Let’s scale up together! 📈 Big shoutout to my amazing project co-lead Mengda @me…
RT @LerrelPinto: Imagine robots learning new skills—without any robot data. Today, we're excited to release EgoZero: our first steps in tr…
For those of you at ICRA: the team will be showing off the robot today (Wed) 1-5pm at the demo area in the exhibition hall. Simar Kareer will be giving a talk tomorrow (Thu) in the Imitation Learning for Manipulation 1 session (room 411) at 10am! Poster session in the same…
Looking forward to officially presenting EgoMimic at ICRA25! The entire team will be there. Reach out and chat about cool new ideas on human data for robot learning!
I wonder what the human-robot data ratio is 👀.
I want to make clear how crazy impressive this result is. We can now do bi-manual, dexterous manipulation across a wide range of tasks with barely any data on these skills coming from teleoperation. As we know, teleop does not scale! But it turns out human video does! This means…
Looking forward to officially presenting EgoMimic at ICRA25! The entire team will be there. Reach out and chat about cool new ideas on human data for robot learning!
Introducing EgoMimic: just wear a pair of Project Aria @meta_aria smart glasses 👓 to scale up your imitation learning datasets! Check out what our robot can do. A thread below 👇
Looking forward to welcoming fellow roboticists to ATL!
The robots 🤖 are coming… to Atlanta. The College’s faculty and students are contributing to the latest in robotics research at the field’s largest and most cutting-edge conference, #ICRA2025, May 19-23. Discover all of @GeorgiaTech’s experts at @ieee_ras_icra, with…