
Runpei Dong (@RunpeiDong)
392 Followers · 1K Following · 3 Media · 86 Statuses
CS PhD student @UofIllinois | Previously @Tsinghua_IIIS and XJTU | Interested in robot learning & machine learning
Champaign, IL · Joined April 2020
When @Xialin_He and I started working on our new G1 robot, we often found that every time the robot fell, picking it up manually was exhausting. Although the robot might get a few scratches, we were the ones getting a serious workout 💪 from lifting it repeatedly (save a need to…
RT @CyberRobooo: AGIBOT X2-N (Nazhe) new video. Shows the ability to carry goods blindly on stairs and slopes 📦. The robot autonomously swit…
#RSS2025 Excited to be presenting our work HumanUP tomorrow at the Humanoids Session (Sunday, June 22, 2025). 📺 Spotlight talk: 4:30–5:30 pm, Bovard Auditorium. 📜 Poster: 6:30–8:00 pm, #3, Associates Park.
Motion tracking is a hard problem, especially when you want to track a lot of motions with only a single policy. Good to know that the MoE-distilled student works so well. Congrats @C___eric417 on such exciting results!
🚀Introducing GMT — a general motion tracking framework that enables high-fidelity motion tracking on humanoid robots by training a single policy from large, unstructured human motion datasets. 🤖A step toward general humanoid controllers. Project Website:
RT @TheHumanoidHub: This is 🤯. Figure 02 autonomously sorting and scanning packages, including deformable ones. The speed and dexterity are…
RT @SOTAMak1r: Can you imagine playing various games through an AI model? Like Black Myth: Wukong. 🤩 Sharing our latest work: DeepVerse, an a…
Very impressive results! I want my G1 to serve me a beer as well🍻.
🤖 Can a humanoid robot carry a full cup of beer 🍺 without spilling while walking? Hold my beer! Introducing Hold My Beer 🍺: Learning Gentle Humanoid Locomotion and End-Effector Stabilization Control. Project: see more details below 👇
RT @omarsar0: Reasoning Models Thinking Slow and Fast at Test Time. Another super cool work on improving reasoning efficiency in LLMs. The…
Thanks @AK for sharing! I will post an introduction to our new work, AlphaOne, soon. Stay tuned!
RT @TongZha22057330: 🤖 Can a humanoid robot hold extreme single-leg poses like Bruce Lee's Kick or the Swallow Balance? 🤸 💥 YES. Meet H…
RT @Yuanhang__Zhang: 🦾 How can humanoids unlock real strength for heavy-duty loco-manipulation? Meet FALCON 🦅: Learning Force-Adaptive Human…
RT @arthurallshire: our new system trains humanoid robots using data from cell phone videos, enabling skills such as climbing stairs and si…
RT @ZeYanjie: 🤖 Introducing TWIST: Teleoperated Whole-Body Imitation System. We develop a humanoid teleoperation system to enable coordinat…
Now we are really seeing a humanoid robot working in daily human life. Exciting progress by @xuxin_cheng. Looking forward to seeing their presentation at RSS!
Meet AMO — our universal whole-body controller that unleashes the full kinematic workspace of humanoid robots to the physical world. AMO is a single policy trained with RL + Hybrid Mocap & Trajectory-Opt. Accepted to #RSS2025. Try our open models & more 👉
RT @t_k_233: Humanoid robots should not be black boxes 🔒 or budget-busters 💸! Meet Berkeley Humanoid Lite! ▹ 100% open source & under $5k…
I am not at #ICLR this year, but my collaborator from Tsinghua, @yuang_peng, will be presenting our work DreamBench++ this afternoon, 3–5:30 pm! If you are interested in benchmarking image generation, don't miss our poster at 📍 Hall 3 + Hall 2B, #142.