Tianhao Wei Profile
Tianhao Wei

@wei_tianhao

Followers
124
Following
92
Media
1
Statuses
21

PhD at Carnegie Mellon. @ICL_at_CMU @CMU_Robotics @CMU_ECE. Robotics, safe control, embodied AI, neural network verification.

Pittsburgh, PA
Joined August 2018
@wei_tianhao
Tianhao Wei
1 year
🤔 How can a robot accomplish heterogeneous tasks, with precision and guarantees? 🚀 Introducing Meta-Control: Automating model-based control synthesis for heterogeneous skills with LLM!
2
13
55
@wei_tianhao
Tianhao Wei
4 months
RT @Xiaofeng2Guo: 🚀 Can we have a freely moving hand in the air for the manipulation policy to directly command in the real world? We intro…
0
28
0
@wei_tianhao
Tianhao Wei
6 months
RT @TairanHe99: 🚀 Can we make a humanoid move like Cristiano Ronaldo, LeBron James and Kobe Bryant? YES! 🤖 Introducing ASAP: Aligning Sim…
0
200
0
@wei_tianhao
Tianhao Wei
9 months
RT @YifanSun98: [1/4] 🌟 Sneak Peek: SPARK in Action! 🦾 Previewing Safe Protective & Assistive Robot Kit (SPARK), a modular toolbox designed t…
0
34
0
@wei_tianhao
Tianhao Wei
1 year
Compositional systems are key to versatile and reliable control synthesis. We share a similar insight in our recent work Meta-Control. Congrats @ShuoCheng94 @AjayMandlekar!
@ShuoCheng94
Shuo Cheng
1 year
Can we teach a robot hundreds of tasks with only dozens of demos? Introducing NOD-TAMP: A framework that chains together manipulation skills from as few as one demo per skill to compositionally generalize across long-horizon tasks with unseen objects and scenes. (1/N)
0
1
5
@wei_tianhao
Tianhao Wei
1 year
Opening cabinets can be unexpectedly hard for robots. It's one of the major motivations for our recent work on LLM-empowered model-based control. Congrats @arjun__gupta!
@arjun__gupta
Arjun Gupta @ RSS 2025
1 year
Introducing: Opening Cabinets and Drawers in the Real World using a Commodity Mobile Manipulator. We develop a system to open unseen cabinets and drawers *zero-shot* from novel environments using the Stretch RE2:
0
0
2
@wei_tianhao
Tianhao Wei
1 year
RT @MiZawalski: 🤖 Can robots think through complex tasks step-by-step like language models? We present Embodied Chain-of-Thought Reasoning (…
0
21
0
@wei_tianhao
Tianhao Wei
1 year
Zero-shot deployment without human-crafted skills is very important. Congrats @DJiafei!
@DJiafei
Jiafei Duan
1 year
🚀 Excited to share our latest work: MANIPULATE-ANYTHING! 🦾 This scalable method pushes the boundaries of real-world robotic manipulation through zero-shot task execution and automated BC data generation. Here's a quick overview: 👇
0
0
2
@wei_tianhao
Tianhao Wei
1 year
Language encodes so much about how the world works that we are not even aware of it all. There is definitely much more to explore.
@svlevine
Sergey Levine
1 year
Chain-of-thought reasoning is a powerful tool to enable language models to work through complex problems. Can we use this with robots? With embodied chain-of-thought, vision-language-action (VLA) models can think through perception and planning! A 🧵👇
0
0
0
@wei_tianhao
Tianhao Wei
1 year
Very impressive work. Congrats @Ying_yyyyyyyy @QinYuzhe!
@Ying_yyyyyyyy
Ying Yuan
1 year
Code released for Robot Synesthesia: In-Hand Manipulation with Visuotactile Sensing! Project page: Code:
0
0
1
@wei_tianhao
Tianhao Wei
1 year
Glad to see robots are actually making the world better. Congrats @priyasun_!
@priyasun_
Priya Sundaresan
1 year
Introducing FLAIR: Feeding via Long-horizon AcquIsition of Realistic dishes! Our system merges a library of skills with foundation models for efficient robotic feeding, tailored to user preferences. 🔗📃 To appear at RSS '24. 1/N
0
0
0
@wei_tianhao
Tianhao Wei
1 year
RT @xiaolonw: Imagine if you have an object in hand, you can rotate the object by feeling without even looking. This is what we enable the…
0
44
0
@wei_tianhao
Tianhao Wei
1 year
Audio for tactile sensing is definitely underexplored. I still remember how amazed I was when I first learned about this direction from Prof. @OliverKroemer. Congrats @Liu_Zeyi_!
@Liu_Zeyi_
Zeyi Liu
1 year
🔊 Audio signals contain rich information about daily interactions. Can our robots learn from videos with sound? Introducing ManiWAV, a robotic system that learns contact-rich manipulation skills from in-the-wild audio-visual data. See thread for more details (1/4) 👇
1
0
1
@wei_tianhao
Tianhao Wei
1 year
Very interesting work, congrats @Xiaofeng2Guo!
@Xiaofeng2Guo
Xiaofeng Guo
1 year
We introduce 𝐅𝐥𝐲𝐢𝐧𝐠 𝐂𝐚𝐥𝐥𝐢𝐠𝐫𝐚𝐩𝐡𝐞𝐫, an aerial manipulation system that can draw various calligraphy artworks:
🎯 Contact-aware trajectory planning and hybrid control
✏️ Intuitive user interface and novel end-effector design
🧑‍🎨 UAM can draw letters with changing
1
0
0
@wei_tianhao
Tianhao Wei
1 year
Tactile sensing sim-to-real is a crucial step toward generating Internet-scale data for robotics. Congrats @HaozhiQ!
@HaozhiQ
Haozhi Qi
1 year
Introducing tactile skin sim-to-real for dexterous in-hand translation! We propose a simulation model for ReSkin, a magnetic tactile sensing skin. It can simulate ternary shear and binary normal forces. More:
1
0
1
@wei_tianhao
Tianhao Wei
1 year
A big step towards a robot foundation model! Congrats @TairanHe99
@TairanHe99
Tairan He
1 year
Introducing OmniH2O, a learning-based system for whole-body humanoid teleop and autonomy:
🦾 Robust loco-mani policy
🦸 Universal teleop interface: VR, verbal, RGB
🧠 Autonomy via @chatgpt4o or imitation
🔗 Release of the first whole-body humanoid dataset
0
0
1
@wei_tianhao
Tianhao Wei
1 year
Meta-Control harnesses the power of LLMs to automate expert thought processes, creating customized model-based control for heterogeneous skills and paving the way for universal robotic foundations.
0
0
1
@wei_tianhao
Tianhao Wei
1 year
🦾 Diverse heterogeneous manipulation tasks? ✔️
🛠️ Customized state representations & control strategies? ✔️
🤖 Achieving higher autonomy without human intervention? ✔️
📊 Rigorous analysis, generalizability, & robustness? ✔️
⏱️ Real-time, efficient, reliable execution? ✔️
1
0
1