Kenny Shaw

@kenny__shaw

Followers: 1K · Following: 12K · Media: 24 · Statuses: 768

2nd Year PhD (Fall 2024) in Robotics at Carnegie Mellon with Prof. Deepak Pathak. Working on Robot Hands such as LEAP Hand. NSF GRF.

Pittsburgh PA
Joined June 2014
@kenny__shaw
Kenny Shaw
8 months
Teaching bimanual robot hands to perform very complex tasks has been notoriously challenging. In our work, Bidex: Bimanual Dexterity for Complex Tasks, we’ve developed a low-cost system that completes a wide range of highly dexterous tasks in real-time.
7
54
244
@kenny__shaw
Kenny Shaw
1 day
RT @_tonytao_: Want to add diverse, high-quality data to your robot policy?. Happy to share that the DexWild Dataset is now fully public, h….
0
8
0
@kenny__shaw
Kenny Shaw
9 days
RT @TheHumanoidHub: Got to visit the Robotics Institute at CMU today. The institute has a long legacy of pioneering research and pushing t….
0
22
0
@kenny__shaw
Kenny Shaw
14 days
RT @mangahomanga: Presenting DemoDiffusion: An extremely simple approach enabling a pre-trained 'generalist' diffusion policy to follow a h….
0
47
0
@kenny__shaw
Kenny Shaw
15 days
RT @xiaolonw: 3 years of dexterous manipulation workshops down the road since 2023: Great to see the progress in t….
0
1
0
@kenny__shaw
Kenny Shaw
15 days
Excited to be organizing the dexterous manipulation workshop at #RSS2025 — great energy and lots of interest in dexterous manipulation! Come by in OHE 122!
1
3
33
@kenny__shaw
Kenny Shaw
16 days
RT @HaozhiQ: We are excited to host the 3rd Workshop on Dexterous Manipulation at RSS tomorrow!. Join us at OHE 122 starting at 9:00 AM! Se….
0
7
0
@kenny__shaw
Kenny Shaw
16 days
RT @JasonJZLiu: Come check out the LEAP Hand and DexWild live in action at #RSS2025 today!
0
10
0
@kenny__shaw
Kenny Shaw
22 days
Cool idea, nice robot neck 🦒.
@Haoyu_Xiong_
Haoyu Xiong
22 days
Your bimanual manipulators might need a Robot Neck 🤖🦒. Introducing Vision in Action: Learning Active Perception from Human Demonstrations. ViA learns task-specific, active perceptual strategies—such as searching, tracking, and focusing—directly from human demos, enabling robust
0
2
12
@kenny__shaw
Kenny Shaw
24 days
RT @_tonytao_: 🦾 DexWild is now open-source!. Scaling up in-the-wild data will take a community effort, so let’s work together. Can’t wait….
0
31
0
@kenny__shaw
Kenny Shaw
29 days
RT @tennyyin: 🔎Can robots search for objects like humans?.Humans explore unseen environments intelligently—using prior knowledge to activel….
0
49
0
@kenny__shaw
Kenny Shaw
1 month
RT @unnatjain2010: ✨New edition of our community-building workshop series!✨ . Tomorrow at @CVPR, we invite speakers to share their stories,….
0
15
0
@kenny__shaw
Kenny Shaw
1 month
RT @priyasun_: How can we move beyond static-arm lab setups and learn robot policies in our messy homes?.We introduce HoMeR, an imitation l….
0
49
0
@kenny__shaw
Kenny Shaw
1 month
At the high-level design stage, the field is still split between motors in the joints and tendon-driven hands. Motors in the joints often lead to hands that are too big or not strong enough. Tendon-driven hands can use strong motors at the wrist, but they are still often too complicated to produce, maintain, and control.
@adcock_brett
Brett Adcock
2 months
Singapore's Sharpa unveiled SharpaWave, a lifelike robotic hand. Features 22 DOF to balance dexterity and strength. Each fingertip has 1,000+ tactile sensing pixels and 5 mN pressure sensitivity. AI models adapt the hand's grip and modulate force.
0
1
39
@kenny__shaw
Kenny Shaw
1 month
This is a nice demo, but there's still a long way to go to unlock true dexterity: using the individual fingers and surfaces to do more complex manipulation tasks.
@TheHumanoidHub
The Humanoid Hub
1 month
Brett Adcock says the latest autonomous demo of Figure 02 is fully end-to-end and uses a single neural network – camera frames in, actions out. “You cannot code your way out of this problem.”
1
0
11
@kenny__shaw
Kenny Shaw
1 month
RT @HaozhiQ: We have extended our submission deadline to 11:59 PM (PT) on June 1! Please consider submitting your work to our workshop.
0
3
0
@kenny__shaw
Kenny Shaw
1 month
RT @SongShuran: Meet the newest member of the UMI family: DexUMI! .Designed for intuitive data collection — and it fixes a few things the o….
0
17
0
@kenny__shaw
Kenny Shaw
1 month
RT @mihirp98: Excited to share our work: Maximizing Confidence Alone Improves Reasoning. Humans rely on confidence to learn when answer key….
0
37
0
@kenny__shaw
Kenny Shaw
2 months
RT @HaozhiQ: Recording available at
0
4
0
@kenny__shaw
Kenny Shaw
2 months
RT @NVIDIARobotics: Congratulations to Shuran Song, Abhishek Gupta, and @yukez on receiving the @ieeeras 2025 Early Academic Career Award f….
0
11
0
@kenny__shaw
Kenny Shaw
2 months
Nice work from Xela robotics integrating their touch sensors into LEAP Hand! @RoboticsXela
2
1
43