
Xiaomeng Xu
@XiaomengXu11
Followers: 714 · Following: 319 · Media: 11 · Statuses: 52
PhD student in robotics @Stanford | Interning @ToyotaResearch | Prev @Tsinghua_Uni
Palo Alto, CA
Joined July 2022
RT @ThomasYuxinChen: Can we let an arm-mounted quadrupedal robot perform tasks with both arms and legs? Introducing ReLIC: Reinforceme…
Replies: 0 · Retweets: 50 · Likes: 0
RT @yswhynot: Missed our RSS workshop? Our recordings are online: All talks were awesome, and we had a very fun pa…
Replies: 0 · Retweets: 3 · Likes: 0
RT @StanfordAILab: Robot learning has largely focused on standard platforms, but can it embrace robots of all shapes and sizes? In @Xiaomeng…
Replies: 0 · Retweets: 23 · Likes: 0
Happening right now at EEB 248!
Enjoying the first day of #RSS2025? Consider coming to our workshop, Robot Hardware-Aware Intelligence, on Wed! @RoboticsSciSys Thank you to everyone who contributed. We'll have 16 lightning talks and 11 live demos! More info:
Replies: 0 · Retweets: 2 · Likes: 15
RT @StanfordAILab: In Los Angeles for RSS 2025? Be sure to check out the great work by students from the Stanford AI Lab!
Replies: 0 · Retweets: 2 · Likes: 0
I'll present RoboPanoptes at #RSS2025 tomorrow 6/22.
Spotlight talk: 9:00-10:30am (Bovard Auditorium)
Poster: 12:30-2:00pm, poster #31 (Associates Park)
Can robots leverage their entire body to sense and interact with their environment, rather than just relying on a centralized camera and end-effector? Introducing RoboPanoptes, a robot system that achieves whole-body dexterity through whole-body vision.
Replies: 0 · Retweets: 14 · Likes: 106
RT @yswhynot: Enjoying the first day of #RSS2025? Consider coming to our workshop Robot Hardware-Aware Intelligence on Wed! @RoboticsSciSy…
Replies: 0 · Retweets: 7 · Likes: 0
Perception is inherently active. With a flexible neck, our robot learns how humans adjust their viewpoint to search, track, and focus, unlocking more capable manipulation. Check out Vision in Action.
Your bimanual manipulators might need a Robot Neck. Introducing Vision in Action: Learning Active Perception from Human Demonstrations. ViA learns task-specific, active perceptual strategies, such as searching, tracking, and focusing, directly from human demos, enabling robust…
Replies: 0 · Retweets: 2 · Likes: 34
Steering a diffusion policy at inference time with dynamics guidance!
Normally, changing robot policy behavior means changing its weights or relying on a goal-conditioned policy. What if there were another way? Check out DynaGuide, a novel policy-steering approach that works on any pretrained diffusion policy.
Replies: 1 · Retweets: 0 · Likes: 6
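For readers curious how inference-time steering can work at all: the usual recipe is to leave the pretrained policy's weights frozen and add a guidance gradient to each denoising step of the action sampler. Below is a minimal Python sketch of that generic idea, not DynaGuide's actual method; `denoiser`, `guidance_fn`, and all shapes are hypothetical placeholders.

import torch

def guided_sample(denoiser, guidance_fn, obs, horizon=16, act_dim=7,
                  num_steps=50, guidance_scale=0.1):
    # Sketch only. Assumed placeholders (not DynaGuide's real API):
    #   denoiser(x, t, obs) -> predicted noise from a frozen, pretrained policy
    #   guidance_fn(x)      -> differentiable score of how desirable the noisy
    #                          action trajectory x is (e.g., a learned dynamics
    #                          model's closeness to a target outcome)
    betas = torch.linspace(1e-4, 0.02, num_steps)   # standard DDPM noise schedule
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    x = torch.randn(1, horizon, act_dim)            # start from pure noise
    for t in reversed(range(num_steps)):
        # Gradient of the guidance score w.r.t. the current noisy actions.
        x = x.detach().requires_grad_(True)
        grad = torch.autograd.grad(guidance_fn(x).sum(), x)[0]

        with torch.no_grad():
            eps = denoiser(x, t, obs)               # policy weights stay frozen
            # Usual DDPM posterior mean, then a nudge along the guidance gradient.
            mean = (x - betas[t] / torch.sqrt(1.0 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
            x = mean + guidance_scale * grad
            if t > 0:                               # add noise except at the final step
                x = x + torch.sqrt(betas[t]) * torch.randn_like(x)
    return x.detach()

A single guidance_scale knob then trades off staying close to the pretrained behavior against satisfying the guidance signal.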
RT @ZhaoMandi: Our lab at Stanford usually does research in AI & robotics, but very occasionally we indulge in being functional alcoholics --…
Replies: 0 · Retweets: 10 · Likes: 0
RT @priyasun_: How can we move beyond static-arm lab setups and learn robot policies in our messy homes? We introduce HoMeR, an imitation l…
Replies: 0 · Retweets: 49 · Likes: 0
RT @lukas_m_ziegler: It's a 3D printer and a 3D assembly station! The Functgraph, developed at Meiji University, starts as a regular 3D pr…
Replies: 0 · Retweets: 272 · Likes: 0
RT @ZhaoMandi: How to learn dexterous manipulation for any robot hand from a single human demonstration? Check out DexMachina, our new RL…
Replies: 0 · Retweets: 102 · Likes: 0
The DexUMI exoskeleton makes YOUR hand move like the robot hand, so demonstrations you collect transfer directly to the robot. Zero retargeting!
Can we collect dexterous robot-hand data directly with a human hand? Introducing DexUMI: a dexterous-hand data-collection system with zero teleoperation and zero re-targeting that autonomously completes precise, long-horizon, and contact-rich tasks. Project Page:
Replies: 1 · Retweets: 0 · Likes: 14
RT @SongShuran: This is so cool! Imagine pairing this robot hardware platform with generative hardware design (like the one from @Xiaome…
Replies: 0 · Retweets: 6 · Likes: 0
RT @calvinyluo: Internet-scale datasets of videos and natural language are a rich training source! But can they be used to facilitate nove…
Replies: 0 · Retweets: 9 · Likes: 0
RoboPanoptes is accepted to #RSS2025! Everything is open-sourced:
Can robots leverage their entire body to sense and interact with their environment, rather than just relying on a centralized camera and end-effector? Introducing RoboPanoptes, a robot system that achieves whole-body dexterity through whole-body vision.
Replies: 0 · Retweets: 3 · Likes: 51
We're hosting the #RSS2025 Robot Hardware-Aware Intelligence Workshop! Join us to explore how hardware design + learning unlock new robot capabilities. Submit your latest paper and demo!
Excited to announce the 1st Workshop on Robot Hardware-Aware Intelligence @ #RSS2025 in LA! We're bringing together interdisciplinary researchers exploring how to unify hardware design and intelligent algorithms in robotics! Full info: @RoboticsSciSys
Replies: 0 · Retweets: 0 · Likes: 11
RT @bi1nwo: Want a haptic force feedback glove? Meet DOGlove! A precise, low-cost (~$600), open-source glove for dexterous manipulation…
Replies: 0 · Retweets: 33 · Likes: 0
RT @HaochenShi74: Time to democratize humanoid robots! Introducing ToddlerBot, a low-cost ($6K), open-source humanoid for robotics and AI r…
Replies: 0 · Retweets: 108 · Likes: 0