Qi Wu
@Qi_Wu577
436 Followers · 118 Following · 1 Media · 25 Statuses
robotics PhD student @Cornell, prev @stanford @Tsinghua_Uni
California
Joined August 2023
Introducing Helpful DoggyBot🐕, a legged mobile manipulation system:
- A quadruped with a mouth
- Agile whole-body skills like climbing and tilting
- Open-world object fetching using VLMs
- No real-world training data required!
6
42
256
8
67
398
Everything you love about generative models — now powered by real physics! Announcing the Genesis project — after a 24-month large-scale research collaboration involving over 20 research labs — a generative physics engine able to generate 4D dynamical worlds powered by a physics
562
3K
16K
🚀 Introducing MENTOR: Mixture-of-Experts Network with Task-Oriented Perturbation for Visual Reinforcement Learning!
🌟 We propose a strong model-free visual RL algorithm that can learn robust visuomotor policies from scratch – in the real world! 💪🤖
🌐 Check out the project
1
32
149
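For readers new to the idea, here is a minimal sketch of a soft-gated mixture-of-experts layer in PyTorch. It illustrates the general MoE concept only, with made-up layer sizes; it is not MENTOR's architecture or code.

```python
# Illustrative soft-gated mixture-of-experts layer (NOT MENTOR's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftMoE(nn.Module):
    """Blend the outputs of several expert MLPs with a learned gate."""
    def __init__(self, dim: int, num_experts: int = 4, hidden: int = 256):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                          nn.Linear(hidden, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.gate(x), dim=-1)                # (B, E)
        outs = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, D)
        return (weights.unsqueeze(-1) * outs).sum(dim=1)         # (B, D)

# Example: blend expert outputs for a batch of visual features.
feats = torch.randn(8, 128)
print(SoftMoE(dim=128)(feats).shape)  # torch.Size([8, 128])
```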
Open-ended object retrieval combining:
- sim2real for low-level locomotion skills
- VLMs for high-level semantic understanding
Project page: https://t.co/ZelaTV331O
Open-source code:
github.com
Helpful DoggyBot: Open-World Object Fetching using Legged Robots and Vision-Language Models - WooQi57/Helpful-Doggybot
Introducing Helpful DoggyBot🐕, a legged mobile manipulation system:
- A quadruped with a mouth
- Agile whole-body skills like climbing and tilting
- Open-world object fetching using VLMs
- No real-world training data required!
4
22
153
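To make the division of labor concrete, here is a minimal sketch of the two-level design: the VLM only localizes the target in the image, and the sim-trained locomotion policy only ever consumes a velocity command. Every name here (query_vlm, pixel_to_velocity_cmd, WalkPolicy) is a hypothetical placeholder, not the released API.

```python
# Sketch of the high-level (VLM) / low-level (sim2real policy) split.
# All function and class names are hypothetical, not the repo's API.
import numpy as np

def query_vlm(image: np.ndarray, instruction: str) -> tuple[float, float]:
    """Stub for a vision-language model call that localizes the target
    object and returns its normalized (x, y) position in the image."""
    return 0.6, 0.4  # placeholder answer

def pixel_to_velocity_cmd(x: float, y: float) -> np.ndarray:
    """Turn the target's image position into a (vx, vy, yaw_rate) command:
    steer toward the object horizontally, walk forward at fixed speed."""
    yaw_rate = 2.0 * (0.5 - x)  # turn toward the object
    return np.array([0.5, 0.0, yaw_rate])

class WalkPolicy:
    """Stand-in for the sim-trained locomotion policy; it consumes
    proprioception and a velocity command, never raw semantics."""
    def act(self, obs: np.ndarray, cmd: np.ndarray) -> np.ndarray:
        return np.zeros(12)  # 12 joint targets (placeholder)

policy = WalkPolicy()
image, obs = np.zeros((480, 640, 3)), np.zeros(48)
x, y = query_vlm(image, "fetch the plush toy")
joint_targets = policy.act(obs, pixel_to_velocity_cmd(x, y))
```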
Check out our "DoggyBot" series of works: Helpful DoggyBot. A 1-DoF gripper + whole-body movements enable helpful fetching and agile traversal of home environments! Feel free to include "DoggyBot" in your project to explore more of what quadrupeds can do!
Introducing Helpful DoggyBot🐕, a legged mobile manipulation system:
- A quadruped with a mouth
- Agile whole-body skills like climbing and tilting
- Open-world object fetching using VLMs
- No real-world training data required!
1
12
68
Happy to introduce a DoggyBot project, Playful DoggyBot🐶: Learning Agile and Precise Quadrupedal Locomotion. We open-sourced everything. Hope others can build on our code and start a series of projects named "xxx DoggyBot".
We've all seen a trained dog expertly chase a fast-moving frisbee and leap up to catch it just before it hits the ground. Now, can robots join the fun? Introducing Playful DoggyBot🐶: Learning Agile and Precise Quadrupedal Locomotion 1/3
2
3
17
Check out the other DoggyBot project: Playful DoggyBot🐶 - Learning Agile and Precise Quadrupedal Locomotion
We've all seen a trained dog expertly chase a fast-moving frisbee and leap up to catch it just before it hits the ground. Now, can robots join the fun? Introducing Playful DoggyBot🐶: Learning Agile and Precise Quadrupedal Locomotion 1/3
0
1
7
Introducing the DoggyBot🐕 series: quadrupeds can also do manipulation. It's been a fruitful 4-year journey working on robot dogs, from walking, to parkour, to now useful agility. We open-sourced everything. Hope others can build on our code and start a series of projects named "xxx DoggyBot".
Introducing Helpful DoggyBot🐕, a legged mobile manipulation system:
- A quadruped with a mouth
- Agile whole-body skills like climbing and tilting
- Open-world object fetching using VLMs
- No real-world training data required!
2
24
125
Project website: https://t.co/MNh29RuEwq
Authors: @Qi_Wu577 @zipengfu @xuxin_cheng @xiaolonw and @chelseabfinn
0
0
7
The system uses pre-trained vision-language models (VLMs) with a third-person fisheye and an egocentric RGB camera for semantic understanding and command generation. We evaluate our system in two unseen environments without any real-world data collection or training.
2
0
5
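One plausible way to implement command generation with an off-the-shelf VLM is to prompt it for a bounding box and parse the textual reply. The prompt wording and parse logic below are assumptions for illustration, not the paper's actual interface.

```python
# Hypothetical sketch: turn a VLM's textual answer into a target box.
import json
import re

def make_prompt(obj: str) -> str:
    """Assumed prompt asking the VLM for a normalized bounding box."""
    return ('You see a fisheye image of a room. Return JSON '
            '{"box": [x0, y0, x1, y1]} for the object: ' + obj)

def parse_box(vlm_answer: str):
    """Extract the first JSON object in the reply and read its 'box'."""
    match = re.search(r"\{.*\}", vlm_answer, flags=re.S)
    if match is None:
        return None
    try:
        return json.loads(match.group(0)).get("box")
    except json.JSONDecodeError:
        return None

# Example with a canned reply standing in for the real VLM call:
reply = 'Sure! {"box": [0.42, 0.31, 0.58, 0.55]}'
print(parse_box(reply))  # [0.42, 0.31, 0.58, 0.55]
```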
Our system uses a front-mounted gripper for object manipulation. The low-level controller is trained in simulation using egocentric depth for agile skills like climbing and whole-body tilting.
1
0
6
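As a rough picture of what a depth-conditioned low-level controller can look like, here is a PyTorch policy-network sketch; the observation layout, layer sizes, and 12-joint action space are illustrative assumptions, not the trained policy.

```python
# Minimal sketch of a depth-conditioned locomotion policy network,
# roughly the shape of controllers trained with RL in simulation.
import torch
import torch.nn as nn

class DepthPolicy(nn.Module):
    def __init__(self, proprio_dim: int = 48, num_joints: int = 12):
        super().__init__()
        # Small CNN encodes the egocentric depth image.
        self.depth_enc = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2), nn.ELU(),
            nn.Conv2d(16, 32, 3, stride=2), nn.ELU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP fuses depth features with proprioception + velocity command.
        self.head = nn.Sequential(
            nn.Linear(32 + proprio_dim + 3, 256), nn.ELU(),
            nn.Linear(256, 128), nn.ELU(),
            nn.Linear(128, num_joints),  # joint position targets
        )

    def forward(self, depth, proprio, cmd):
        z = self.depth_enc(depth)
        return self.head(torch.cat([z, proprio, cmd], dim=-1))

policy = DepthPolicy()
depth = torch.zeros(1, 1, 64, 64)       # egocentric depth frame
proprio = torch.zeros(1, 48)            # joint angles, velocities, ...
cmd = torch.tensor([[0.5, 0.0, 0.0]])   # desired vx, vy, yaw rate
print(policy(depth, proprio, cmd).shape)  # torch.Size([1, 12])
```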
I've been training dogs since middle school. It's about time I train robot dogs too 😛 Introducing UMI on Legs, an approach for scaling manipulation skills on robot dogs🐶 It can toss, push heavy weights, and make your ~existing~ visuo-motor policies mobile!
13
83
442
Have you seen a humanoid wearing a shoe? Our 6ft humanoid at Stanford can put on a Nike shoe, tie shoelaces, stand up, and walk autonomously! It’s been a super fun project to work on.
Our 6ft humanoid HumanPlus at Stanford can autonomously put on a Nike skateboard shoe, tie shoelaces, stand up and walk. Using two transformers & dual RGB vision, it integrates two recipes of general robotics end-to-end:
- imitating humans in the real world
- large-scale RL in sim
2
9
46
Our robot can autonomously put on a shoe, tie shoelaces, stand up and walk!
Our 6ft humanoid HumanPlus at Stanford can autonomously put on a Nike skateboard shoe, tie shoelaces, stand up and walk. Using two transformers & dual RGB vision, it integrates two recipes of general robotics end-to-end:
- imitating humans in the real world
- large-scale RL in sim
3
7
49
Really exciting progress towards fully autonomous humanoids! Fantastic work, team HumanPlus!
Introducing HumanPlus - Shadowing part
Humanoids are born to use human data. We build a real-time shadowing system using a single RGB camera and a whole-body policy for cloning human motion. Examples:
- boxing🥊
- playing the piano🎹/ping pong
- tossing
- typing
Open-sourced!
1
7
26
How can we train full-size humanoid robots? New paper introducing:
- a learned controller for shadowing humans
- imitation learning from demos collected via shadowing
Website with code & videos:
Introducing HumanPlus - Autonomous Skills part
Humanoids are born to use human data. Imitating humans, our humanoid learns to:
- fold sweatshirts
- unload objects from warehouse racks
- perform diverse locomotion skills (squatting, jumping, standing)
- greet another robot
Open-sourced!
4
29
171
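The imitation half essentially reduces to behavior cloning: supervised regression from observations to the actions recorded during shadowing. A minimal sketch, assuming made-up dataset shapes and network sizes rather than the HumanPlus training code:

```python
# Behavior-cloning sketch for demos collected via shadowing.
import torch
import torch.nn as nn

obs_dim, act_dim = 64, 19
demos_obs = torch.randn(1024, obs_dim)  # stand-in for logged observations
demos_act = torch.randn(1024, act_dim)  # stand-in for logged joint targets

policy = nn.Sequential(
    nn.Linear(obs_dim, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, act_dim),
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)

for step in range(100):
    idx = torch.randint(0, len(demos_obs), (64,))  # minibatch of demos
    loss = nn.functional.mse_loss(policy(demos_obs[idx]), demos_act[idx])
    opt.zero_grad()
    loss.backward()
    opt.step()
```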
Introducing HumanPlus - Shadowing part
Humanoids are born to use human data. We build a real-time shadowing system using a single RGB camera and a whole-body policy for cloning human motion. Examples:
- boxing🥊
- playing the piano🎹/ping pong
- tossing
- typing
Open-sourced!
16
158
743
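Conceptually, the shadowing loop runs: RGB frame -> human pose estimate -> retarget to the humanoid's joints -> whole-body tracking policy. Below is a stubbed sketch of that data flow; every function is a hypothetical placeholder for the real component.

```python
# Pipeline sketch for real-time shadowing (all functions are stubs).
import numpy as np

def estimate_human_pose(frame: np.ndarray) -> np.ndarray:
    """Stub for a monocular human pose estimator (body joint angles)."""
    return np.zeros(25)  # placeholder body pose

def retarget(human_pose: np.ndarray) -> np.ndarray:
    """Map human body pose onto the humanoid's joint space; a real
    retargeter handles differing limb lengths and joint limits."""
    return human_pose[:19]  # assume 19 actuated joints

def whole_body_policy(target_joints: np.ndarray,
                      proprio: np.ndarray) -> np.ndarray:
    """Stand-in for the learned controller tracking the retargeted
    pose while keeping balance."""
    return target_joints  # placeholder joint targets

frame, proprio = np.zeros((480, 640, 3)), np.zeros(60)
action = whole_body_policy(retarget(estimate_human_pose(frame)), proprio)
```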
It's an honor to work on this project! HumanPlus is a full-stack system for imitation and autonomous skills from human data. Check out our website:
Introducing HumanPlus - Autonomous Skills part
Humanoids are born to use human data. Imitating humans, our humanoid learns to:
- fold sweatshirts
- unload objects from warehouse racks
- perform diverse locomotion skills (squatting, jumping, standing)
- greet another robot
Open-sourced!
0
3
14