Huihan Liu

@huihan_liu

Followers: 3K | Following: 884 | Media: 29 | Statuses: 255

PhD @UTAustin | 👩🏻-in-the-Loop Learning for 🤖 | prev @AIatMeta @MSFTResearch @berkeley_ai | undergrad @UCBerkeley 🐻

Austin, TX
Joined November 2020
@huihan_liu
Huihan Liu
3 months
Meet Casper 👻, a friendly robot sidekick who shadows your day, decodes your intents on the fly, and lends a hand while you stay in control! Instead of passively receiving commands, what if a robot actively senses what you need in the background and steps in when confident? (1/n)
6
37
154
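To make the "steps in when confident" idea concrete, here is a minimal sketch of confidence-gated assistance in Python. The helper names (estimator.infer, robot.user_confirms, robot.execute_autonomously, robot.apply_teleop) and the 0.8 threshold are illustrative assumptions, not Casper's released API.

from dataclasses import dataclass

@dataclass
class IntentEstimate:
    task: str          # e.g. "hand_me_the_cup" (hypothetical task name)
    confidence: float  # in [0, 1]

CONFIDENCE_THRESHOLD = 0.8  # assumed value; tune per deployment

def control_step(observation, user_input, estimator, robot):
    # The human stays in control by default; the robot only steps in
    # when its intent estimate is confident and the user confirms it.
    estimate = estimator.infer(observation, user_input)
    if estimate.confidence >= CONFIDENCE_THRESHOLD and robot.user_confirms(estimate.task):
        robot.execute_autonomously(estimate.task)
    else:
        robot.apply_teleop(user_input)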
@huihan_liu
Huihan Liu
1 month
Excited that Casper 👻 has been accepted to CoRL 2025! #CoRL2025 A big thank you to all the collaborators :)
3
2
77
@huihan_liu
Huihan Liu
2 months
RT @siqi_shang: 3D print tactile sensors anywhere inside your fin-ray fingers! We present FORTE - a solution to sensorize compliant fingers….
0
36
0
@huihan_liu
Huihan Liu
2 months
RT @agiachris: What makes data “good” for robot learning? We argue: it’s the data that drives closed-loop policy success!. Introducing CUPI….
0
20
0
@huihan_liu
Huihan Liu
2 months
📢 Our #RSS2025 workshop on OOD generalization in robotics is happening live now! 📍 EEB 132. Join us with a superb lineup of invited speakers and panelists: @lschmidt3 @DorsaSadigh @andrea_bajcsy @HarryXu12 @MashaItkina @Majumdar_Ani @KarlPertsch.
@RohanSinhaSU
Rohan Sinha
4 months
📢 Excited for the second workshop on Out-of-Distribution Generalization in Robotics: Towards Reliable Learning-based Autonomy at RSS! #RSS2025 🎯 How can we build reliable robotic autonomy for the real world? 📅 Short papers due 05/25/25. 🌐 🧵 (1/4)
0
2
13
@huihan_liu
Huihan Liu
2 months
RT @ArthurKZhang: Interested in deploying real robots in open-world, outdoor environments? Come to our presentation this Tuesday at 9:30AM,….
0
3
0
@huihan_liu
Huihan Liu
2 months
RT @DoubleHan07: Excited to present DOGlove at #RSS2025 today! We’ve brought the glove with us, come by and try it out! 📌 Poster: All day a….
0
4
0
@huihan_liu
Huihan Liu
2 months
RT @arjun__gupta: How can we build mobile manipulation systems that generalize to novel objects and environments? Come check out MOSART at….
0
10
0
@huihan_liu
Huihan Liu
2 months
RSS Pioneers poster happening live on the grass at @USC!! 😛😛 Come to Associates Park, poster #8 to chat more about continual robot learning, human-in-the-loop, and reliable deployment! #RSS2025
@huihan_liu
Huihan Liu
4 months
Honored to be part of the RSS Pioneers 2025 cohort! Looking forward to meeting everyone @RoboticsSciSys in LA this year!
0
5
67
@huihan_liu
Huihan Liu
3 months
Workshop on Mobile Manipulation in #RSS2025 happening now!! Join us at Hughes Aircraft Electrical Engineering Center, Room 132 if you’re here in person, or join us on Zoom. Website:
0
1
16
@huihan_liu
Huihan Liu
3 months
RT @rkjenamani: Most assistive robots live in labs. We want to change that. FEAST enables care recipients to personalize mealtime assistan….
0
69
0
@huihan_liu
Huihan Liu
3 months
Check out our paper and website for more details. A huge thank you to the team: @rutavms @dafeijing Jack Pittenger @kiwi_sherbet @YuchenCui1 @ybisk @RobobertoMM @yukez! @texas_robotics @UCLAComSci @CSDatCMU
0
0
3
@huihan_liu
Huihan Liu
3 months
🔑Key insight from the user studies: VLM-based commonsense reasoning is crucial for diverse intent inference in real-world assistive tasks. Casper consistently outperforms the baselines on user workload and user satisfaction, as well as task performance metrics. (7/n)
1
0
4
@huihan_liu
Huihan Liu
3 months
🙋🏻‍♀️We conduct extensive user studies on multi-step mobile manipulation tasks. At each step, the robot disambiguates user intent among multiple plausible goals, selecting the correct one based on user inputs and visual context. (6/n)
1
0
3
@huihan_liu
Huihan Liu
3 months
Casper's key idea #2: Use a parameterized skill library to fulfill intents. Once confirmed by the user, Casper executes the corresponding skill with estimated parameters. (5/n)
1
0
5
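A minimal sketch of what a parameterized skill library could look like. The skill names, parameters, and registration decorator below are made up for illustration; the tweet does not specify Casper's actual skills or their signatures.

from typing import Callable, Dict

SKILLS: Dict[str, Callable[..., None]] = {}

def skill(name: str):
    # Register a function as a named, parameterized skill.
    def register(fn):
        SKILLS[name] = fn
        return fn
    return register

@skill("pick")
def pick(object_pose):
    print(f"picking object at {object_pose}")

@skill("open_door")
def open_door(handle_pose, swing_direction):
    print(f"opening door at {handle_pose}, swinging {swing_direction}")

def fulfill(intent: str, **estimated_params):
    # After the user confirms an intent, run the matching skill
    # with the parameters estimated from observations.
    SKILLS[intent](**estimated_params)

# Example: the user confirms "open_door", so the skill runs with estimated parameters.
fulfill("open_door", handle_pose=(1.2, 0.4, 0.9), swing_direction="inward")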
@huihan_liu
Huihan Liu
3 months
Casper's key idea #1: Use VLM commonsense reasoning to infer diverse human intents. Casper generates task candidates from observations and infers intent from user inputs among the task candidates, repeating until predictions are self-consistent. (4/n)
1
0
4
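A minimal sketch of the self-consistency check described above: sample the VLM's intent prediction several times and accept it only when enough samples agree. query_vlm and the thresholds are assumed placeholders, not Casper's actual prompting pipeline.

from collections import Counter

def infer_intent(observation, user_input, query_vlm, candidates,
                 n_samples=5, agreement=0.8, max_rounds=3):
    # Repeatedly sample an intent prediction over the task candidates and
    # return it only once a large enough fraction of samples agree.
    for _ in range(max_rounds):
        votes = Counter(
            query_vlm(observation, user_input, candidates) for _ in range(n_samples)
        )
        intent, count = votes.most_common(1)[0]
        if count / n_samples >= agreement:
            return intent, count / n_samples  # self-consistent prediction
    return None, 0.0  # not confident yet: stay passive, keep the human in control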
@huihan_liu
Huihan Liu
3 months
Given user teleoperation input, Casper predicts user intent in real time. Upon user confirmation, it fulfills the intent with autonomous execution. Casper's background reasoning runs in parallel with foreground human control to minimize user disruption. (3/n)
1
0
6
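One plausible way to run the slow VLM reasoning in the background without stalling teleoperation is a two-thread setup like the sketch below; all helper names are hypothetical, and the paper and website describe the actual system.

import queue
import threading
import time

intent_queue = queue.Queue(maxsize=1)  # holds only the freshest intent estimate

def background_reasoner(get_observation, get_user_input, infer_intent, stop):
    # Slow reasoning loop: never blocks the control loop.
    while not stop.is_set():
        estimate = infer_intent(get_observation(), get_user_input())
        if estimate is not None:
            try:
                intent_queue.put_nowait(estimate)
            except queue.Full:
                pass  # an estimate is already pending; skip this one

def foreground_control(robot, get_user_input, stop, hz=30):
    # Fast teleoperation loop: hands over to autonomy only on a confirmed intent.
    while not stop.is_set():
        try:
            estimate = intent_queue.get_nowait()
        except queue.Empty:
            estimate = None
        if estimate is not None and robot.user_confirms(estimate):
            robot.execute_autonomously(estimate)
        else:
            robot.apply_teleop(get_user_input())
        time.sleep(1.0 / hz)

# Usage: start the reasoner as a daemon thread, run control in the main thread.
# stop = threading.Event()
# threading.Thread(target=background_reasoner,
#                  args=(obs_fn, input_fn, infer_fn, stop), daemon=True).start()
# foreground_control(robot, input_fn, stop)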
@huihan_liu
Huihan Liu
3 months
RT @ZEYULIU10: LLMs trained to memorize new facts can’t use those facts well. 🤔 We apply a hypernetwork to ✏️edit✏️ the gradients for fact….
0
66
0