
Hyungjun Yoon
@hyung_jun_yoon
43 Followers · 23 Following · 4 Media · 6 Statuses
Ph.D. student @KAIST EE. Research interest in Applied Machine Learning and Mobile Computing. Currently working at @MSFTResearch as a research intern.
Joined February 2022
⚠️ Pre-trained models suffer from domain shift when fine-tuned on unseen users.
🎯 We investigate this challenge and propose a domain adaptation approach tailored for self-supervised models.
📄 Paper:
Let's chat if you're attending SenSys! (2/2)
arxiv.org
Self-supervised learning has emerged as a method for utilizing massive unlabeled data for pre-training models, providing an effective feature extractor for various mobile sensing applications…
Pleased to share that our paper, “SelfReplay: Adapting Self-Supervised Sensory Models via Adaptive Meta-Task Replay,” is accepted to #SenSys2025! Huge thanks to my collaborators, Jaehyun, Biniyam, @GaoleDai4, Prof. Mo Li, @Taesik_MobileAI, @kimin_le2, and @wewantsj! (1/2)
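For readers skimming the thread, here is a minimal sketch of the general idea the tweets describe: replaying a self-supervised pre-training task on an unseen user's unlabeled sensor windows to counter domain shift. This is not the paper's SelfReplay algorithm; it assumes a SimCLR-style contrastive objective, and the encoder, augmentations, and names (SensorEncoder, augment, adapt_to_user) are illustrative placeholders.

```python
# Hypothetical sketch: adapt a pre-trained self-supervised sensory encoder to an
# unseen user by replaying a contrastive pre-training task on that user's
# unlabeled windows. Architecture, augmentations, and hyperparameters are
# illustrative, not taken from the SelfReplay paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SensorEncoder(nn.Module):
    """Toy 1D-CNN encoder for IMU windows shaped (batch, channels, time)."""

    def __init__(self, in_channels=3, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)


def augment(x):
    """Simple sensor augmentations: random jitter plus channel-wise scaling."""
    noise = 0.05 * torch.randn_like(x)
    scale = 1.0 + 0.1 * torch.randn(x.size(0), x.size(1), 1, device=x.device)
    return (x + noise) * scale


def nt_xent(z1, z2, temperature=0.1):
    """NT-Xent (SimCLR) loss between two augmented views of the same windows."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / temperature
    n = z1.size(0)
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool, device=z.device), float("-inf"))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


def adapt_to_user(encoder, target_windows, steps=100, lr=1e-4):
    """Replay the self-supervised task on the target user's unlabeled windows."""
    opt = torch.optim.Adam(encoder.parameters(), lr=lr)
    encoder.train()
    for _ in range(steps):
        x = target_windows[torch.randint(len(target_windows), (64,))]
        loss = nt_xent(encoder(augment(x)), encoder(augment(x)))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return encoder


if __name__ == "__main__":
    encoder = SensorEncoder()             # in practice, loaded from pre-training
    unlabeled = torch.randn(512, 3, 128)  # stand-in for the unseen user's IMU windows
    adapt_to_user(encoder, unlabeled)
```

The adapted encoder would then be used or fine-tuned for the downstream sensing task; the paper's adaptive meta-task replay presumably goes beyond this naive loop, so treat the above only as a sketch of the setting.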
👋 ChatGPT can recognize your hand gestures from your smartwatch!
I'm on my way to #EMNLP2024! Excited to present our paper “By My Eyes: Grounding LLMs with Sensor Data via Visual Prompting” 🎉
Find me at Poster Session E (Riverfront) on 11/13, 16:00-17:30. Let's chat!
📽️ Demo: (1/2)
@Taesik_MobileAI @kimin_le2 @wewantsj 💻 Our code is available here. The camera-ready version will be released soon. See you at the conference! (2/2)
github.com/diamond264/ByMyEyes
🎉 Thrilled to share our paper “By My Eyes: Grounding Multimodal Large Language Models with Sensor Data via Visual Prompting” is accepted at #EMNLP2024 Main! 🙌 Huge thanks to Biniyam, Prof. @Taesik_MobileAI, @kimin_le2, and @wewantsj!
📄 Preprint: (1/2)
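As a rough illustration of the visual-prompting idea in the tweets above, the sketch below renders a smartwatch accelerometer window as a plot image and asks a multimodal chat model to name the gesture. The model name (gpt-4o), prompt wording, label set, and helpers (window_to_png, classify_gesture) are assumptions made for this sketch, not the pipeline from the paper or its code release.

```python
# Hypothetical sketch of visual prompting with sensor data: plot an IMU window as
# an image and ask a multimodal LLM to classify the gesture. Prompt, labels, and
# model choice are illustrative assumptions, not the By My Eyes pipeline.
import base64
import io

import matplotlib.pyplot as plt
import numpy as np
from openai import OpenAI


def window_to_png(window: np.ndarray) -> bytes:
    """Render a (time, channels) accelerometer window as a line-plot PNG."""
    fig, ax = plt.subplots(figsize=(4, 2))
    for channel, name in zip(window.T, ["acc_x", "acc_y", "acc_z"]):
        ax.plot(channel, label=name)
    ax.set_xlabel("sample")
    ax.set_ylabel("acceleration (g)")
    ax.legend(loc="upper right", fontsize=6)
    buf = io.BytesIO()
    fig.savefig(buf, format="png", dpi=150, bbox_inches="tight")
    plt.close(fig)
    return buf.getvalue()


def classify_gesture(window: np.ndarray, labels=("swipe", "tap", "shake")) -> str:
    """Send the plotted window to a multimodal LLM with a short text prompt."""
    image_b64 = base64.b64encode(window_to_png(window)).decode()
    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
    response = client.chat.completions.create(
        model="gpt-4o",  # any multimodal chat model would do for the sketch
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "This plot shows a 2-second smartwatch accelerometer window. "
                         f"Which gesture is it: {', '.join(labels)}? Answer with one word."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content.strip()


if __name__ == "__main__":
    fake_window = np.random.randn(100, 3) * 0.1  # stand-in for real sensor data
    print(classify_gesture(fake_window))
```

The intuition suggested by the title is that rendering the signal as an image lets the model apply visual reasoning to the waveform's shape rather than parsing long numeric sequences; the prompt designs the authors actually use are in the paper and the diamond264/ByMyEyes repository.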