@zhengyiluo
Zhengyi “Zen” Luo
3 months
Early footage of motion imitation for the H1 humanoid developed in the PHC codebase. Spoiler: these motions do not transfer to the real robot (as of today; not sure if ever).

Replies

@zhengyiluo
Zhengyi “Zen” Luo
3 months
To see what is transferable to the real robot, check out our work H2O. Thread:
@zhengyiluo
Zhengyi “Zen” Luo
3 months
🤔 Ever wondered if simulation-based animation/avatar learnings can be applied to a real humanoid in real time? 🤖 Introducing H2O (Human2HumanOid):
- 🧠 An RL-based human-to-humanoid real-time whole-body teleoperation framework
- 💃 Scalable retargeting and training using large…
@zhaomingxie
Xie Zhaoming
3 months
@zhengyiluo Interesting! It will be very cool if these can be put on the robot! I wonder how you decide which motions are transferable? Have you tried a lot of them and picked manually?
@zhengyiluo
Zhengyi “Zen” Luo
3 months
@zhaomingxie These can’t haha. The ones in H2O are more plausible. We use an imitator similar to this one to filter out the implausible ones.
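A minimal sketch of the filtering idea mentioned above: roll the imitation policy over each retargeted clip in simulation and keep only the clips it can track within a tolerance. The function names and error threshold are illustrative assumptions, not the actual PHC/H2O code.

```python
# Hypothetical filter: keep a motion clip only if an imitation policy
# can track it with low keypoint error in simulation.
import numpy as np


def track_clip(clip: np.ndarray) -> np.ndarray:
    """Placeholder for rolling out the imitator on one clip in simulation.

    Returns per-frame tracked keypoints with the same shape as `clip`
    (num_frames, num_keypoints, 3). Small noise stands in for tracking error.
    """
    return clip + np.random.normal(scale=0.01, size=clip.shape)


def is_plausible(clip: np.ndarray, max_mean_error_m: float = 0.05) -> bool:
    """Keep a clip if the imitator's mean per-keypoint error stays under the threshold."""
    tracked = track_clip(clip)
    mean_error = np.linalg.norm(tracked - clip, axis=-1).mean()
    return mean_error < max_mean_error_m


def filter_dataset(clips: list[np.ndarray]) -> list[np.ndarray]:
    """Drop clips the imitator cannot reproduce, i.e. the implausible ones."""
    return [clip for clip in clips if is_plausible(clip)]
```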
@philfung
Philip Fung
3 months
@zhengyiluo how do you think this compares to DeepMimic or AMP?