Daehwa Kim

@talking_kim

Followers 1K · Following 2K · Media 12 · Statuses 88

PhD student @cmuhcii | Prev @Apple Robotics, @Meta @RealityLabs, @hcikaist. 🦾 Making sense of sensing, for people!

Joined April 2018
@talking_kim
Daehwa Kim
25 days
RT @ACMUIST: 📣 The Student Volunteer lottery for #UIST2025 is now open! Be part of the team that brings UIST to life and make valuable conn….
@talking_kim
Daehwa Kim
4 months
This research was done in amazing collaboration with @nneonneo @hciprof ✨ You can find more details in our recently published paper at #CHI2025. Here are the preprint, full video, and code:
figlab.com
PatternTrack: Multi-Device Tracking Using Infrared, Structured-Light Projections from Built-in LiDAR
@talking_kim
Daehwa Kim
4 months
This localization enables on-the-go shared AR without pre-registration!
@talking_kim
Daehwa Kim
4 months
To do this, we extract square regions from the captured dot pattern. Using the PnP algorithm, we can match each square against a known “template” that we reverse-engineered from the iPhone’s LiDAR. Once we find the best match, the resulting transformation matrix gives us the 6DOF pose.
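As a minimal sketch of that pose-recovery step: assuming OpenCV, a hypothetical reverse-engineered dot template, and illustrative camera intrinsics (none of these values are from the paper's released code), the PnP solve could look like this:

```python
import numpy as np
import cv2

# Four dot positions of one template square in the pattern's own frame
# (hypothetical coordinates in meters; the real template comes from
# reverse-engineering the iPhone LiDAR's projected pattern).
template_pts_3d = np.array([
    [-0.05, -0.05, 0.0],
    [ 0.05, -0.05, 0.0],
    [ 0.05,  0.05, 0.0],
    [-0.05,  0.05, 0.0],
], dtype=np.float32)

# Where those four dots were detected in the observing device's IR image (pixels).
square_pts_2d = np.array([
    [310.0, 240.0],
    [402.0, 238.0],
    [405.0, 330.0],
    [308.0, 333.0],
], dtype=np.float32)

# Pinhole intrinsics of the observing camera (illustrative values).
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]], dtype=np.float32)

# PnP solves for the rigid transform that maps the template points onto
# the observed pixels; rvec/tvec together form the 6DOF pose.
ok, rvec, tvec = cv2.solvePnP(template_pts_3d, square_pts_2d, K, None)
R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix
print("rotation:\n", R, "\ntranslation:", tvec.ravel())
```

With four (or more) matched dots, the returned rotation and translation are exactly the transformation matrix the tweet describes.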
@talking_kim
Daehwa Kim
4 months
As your device moves, the LiDAR dot patterns distort with angle and distance—basically encoding where it is. We decode that to find the device’s position and orientation in 3D. These LiDARs are already in your iPhone, iPad, Vision Pro, and Quest.
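A toy illustration of why the pattern encodes range (my own sketch, not from the paper): under a simple pinhole model, the pixel spacing between neighboring dots shrinks as depth grows, so observed spacing carries distance information.

```python
# Toy pinhole-projection example; focal length and dot spacing are
# illustrative, not measured from any real device.
f = 600.0  # focal length in pixels

def project(x: float, z: float) -> float:
    """Pinhole projection of a point at lateral offset x (m), depth z (m)."""
    return f * x / z

for z in (0.5, 1.0, 2.0):
    # Two dots 1 cm apart on the illuminated surface.
    spacing = project(0.01, z) - project(0.0, z)
    print(f"depth {z:.1f} m -> dot spacing {spacing:.1f} px")
```

Angle has an analogous effect: tilting the surface compresses the spacing anisotropically, which is what makes full orientation recoverable.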
@talking_kim
Daehwa Kim
4 months
Yes, your iPhone already has a projector—using invisible light! 📱✨ In my latest research at @acm_chi, we repurpose its LiDAR dot pattern as a visual marker, enabling paired AR experiences without any online communication or pre-registration.
@d_feldman
Daniel 🦋
1 year
Anyway, people never bought the "cell phones with projectors" idea. But your iPhone does have a laser in it, for producing a fixed dot pattern that it uses with a low-resolution LiDAR for Face ID. It's just infrared, so you don't see it.
@talking_kim
Daehwa Kim
6 months
RT @huang_peide: 🚀 New Research on Human-Robot Interaction! 🤖 How can humanoid robots communicate beyond words? Our framework, EMOTION, le….
@talking_kim
Daehwa Kim
7 months
RT @seeedstudio: How can humanoid robots move smarter in crowded spaces? 🦾 Meet ARMOR —an egocentric perception system developed by @talkin….
@talking_kim
Daehwa Kim
7 months
RT @lukas_m_ziegler: 🚨 Apple joins the robotics race! Researchers from @CarnegieMellon and @Apple have developed a robot collision avoidan….
@talking_kim
Daehwa Kim
7 months
RT @simonkalouche: Occlusion will be a big challenge for robots operating in dense, obstacle-rich environments common in manufacturing. U….
@talking_kim
Daehwa Kim
8 months
RT @fly51fly: [RO] ARMOR: Egocentric Perception for Humanoid Robot Collision Avoidance and Motion Planning. D Kim, M Srouji, C Chen, J Zhang….
@talking_kim
Daehwa Kim
8 months
RT @OWW: ARMOR: Egocentric Perception for Humanoid Robot Collision Avoidance and Motion Planning.
@talking_kim
Daehwa Kim
8 months
RT @CarnegieMellon: Say goodbye to dead batteries! ⚡️ New tech being developed at @SCSatCMU uses your body to charge wearable gadgets. The….
fastcompany.com
A new Power-over-Skin technology invented at Carnegie Mellon University could change the way we charge our wearables.
@talking_kim
Daehwa Kim
8 months
This work was done at Carnegie Mellon and Apple, partially during my internship at Apple Robotics. I deeply appreciate my internship manager Mario Srouji and my amazing collaborators Chen Chen and Jian Zhang. @SCSatCMU. Paper, videos, and code (coming soon!) at
@talking_kim
Daehwa Kim
8 months
We hope the robotics community continues to expand upon ARMOR's perception approach to further advance the capabilities of mobile humanoid robots.
@talking_kim
Daehwa Kim
8 months
We train a transformer-based imitation learning policy in simulation to perform collision-free motion planning, leveraging around 86 hours' worth of realistic human motions from the AMASS dataset.
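A hedged sketch of what such a policy could look like in PyTorch; the layer sizes and the observation/action interface here are assumptions for illustration, not the released ARMOR architecture:

```python
import torch
import torch.nn as nn

class ImitationPolicy(nn.Module):
    """Transformer encoder over a window of egocentric observations;
    predicts the next action (e.g., joint targets). Dimensions are
    placeholders, not ARMOR's actual configuration."""

    def __init__(self, obs_dim=128, act_dim=32, d_model=256, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(obs_dim, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, act_dim)

    def forward(self, obs_seq):
        # obs_seq: (batch, time, obs_dim) of depth features + robot state.
        h = self.encoder(self.embed(obs_seq))
        return self.head(h[:, -1])  # action for the latest timestep

# Behavior cloning against expert trajectories (e.g., derived from AMASS).
policy = ImitationPolicy()
obs = torch.randn(8, 16, 128)    # dummy batch of observation windows
expert_act = torch.randn(8, 32)  # dummy expert actions
loss = nn.functional.mse_loss(policy(obs), expert_act)
loss.backward()
```

Training on retargeted human motion is what lets the policy imitate collision-free behavior rather than learning it from scratch via reward.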
@talking_kim
Daehwa Kim
8 months
Humanoid robots have significant gaps in their sensing and perception, making it hard to perform motion planning in dense environments. Our distributed perception approach enhances the robot’s spatial awareness and facilitates more agile motion planning.
@talking_kim
Daehwa Kim
8 months
Can we make wearable sensors for humanoid robots and augment their perception? We are introducing ARMOR, a novel egocentric perception system that integrates both hardware and software, specifically incorporating wearable-like depth sensors.
@talking_kim
Daehwa Kim
9 months
RT @realkaranahuja: 🔬 My lab at @NorthwesternU has a new website! Visit to see our latest research from CHI, ECCV &….
spice-lab.org
Explore our latest research projects and news at SPICE Lab.
@talking_kim
Daehwa Kim
9 months
RT @ACMUIST: 📸 #UIST2024 albums are here! Huge thanks to our fantastic Photography SVs for capturing all the unforgettable moments. Check….