
Eric Rosen
@_ericrosen
Followers: 1K · Following: 1K · Media: 216 · Statuses: 1K
Robotics Research Scientist @ Robotics and AI Institute (RAI) | Making robots smarter for everyone | CS PhD from @BrownUniversity 🤖
Boston, MA
Joined July 2019
RT @GeorgiaChal: 🚨 We're hiring a Postdoc in Robot Learning @ PEARL Lab, TU Darmstadt 🚨. Join our ERC-funded project SIREN (Structured Inte…
RT @NaveenManwani17: 🚨Paper Alert 🚨. ➡️Paper Title: Verifiably Following Complex Robot Instructions with Foundation Models. 🌟Few pointers f…
Excited to see more work combining foundation models with task and motion planning for robots! Great job @Benedict_Q!
🚨 What is the best way to use foundation models in robotics? Our new work shows that combining LLMs & VLMs with ideas from formal methods leads to robots that can verifiably follow complex, open-ended instructions in the real world. 🌍 We evaluate on over 150 tasks 🚀 🧵 (1/4)
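If it helps to make the idea concrete, here is a rough, heavily simplified sketch of the kind of pipeline described above: an LLM translates an open-ended instruction into a formal specification (e.g., LTL), a VLM grounds the specification's symbols in the scene, and a planner produces behavior that can be checked against the spec. The function names (translate_to_ltl, ground_propositions, plan_and_verify) are illustrative placeholders rather than the paper's API, and the LLM/VLM calls are stubbed with hard-coded values.

```python
# Illustrative sketch only: LLM/VLM calls are stubbed, names are hypothetical.
from dataclasses import dataclass


@dataclass
class GroundedProposition:
    name: str       # e.g. "near_mug"
    region: tuple   # (x, y, z) location a VLM grounded this symbol to


def translate_to_ltl(instruction: str) -> str:
    """Stand-in for an LLM call that maps an instruction to an LTL formula."""
    # A real system would query an LLM; hard-coded here for illustration.
    return "F (near_mug & F at_sink)"   # "eventually the mug, then eventually the sink"


def ground_propositions(formula: str) -> list:
    """Stand-in for a VLM that localizes each open-vocabulary symbol in the scene."""
    return [
        GroundedProposition("near_mug", (1.2, 0.4, 0.0)),
        GroundedProposition("at_sink", (3.0, -0.5, 0.0)),
    ]


def plan_and_verify(formula: str, props: list) -> list:
    """Toy 'planner': visit grounded regions in the order the sequential formula implies.
    A real system would compile the formula into an automaton and plan over it."""
    return [f"go_to {p.name} at {p.region}" for p in props]


if __name__ == "__main__":
    instruction = "Bring the mug to the sink"
    formula = translate_to_ltl(instruction)
    props = ground_propositions(formula)
    print("LTL spec:", formula)
    print("Plan:", plan_and_verify(formula, props))
```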
RT @lucacarlone1: If you are applying for #gradschool and have last-minute questions about your application, I'm willing to offer office ho…
🤖 🧠 If you’re interested in learning abstractions and planning, definitely check out the #LEAP2024 workshop @corl_conf and consider submitting! Looking forward to #CoRL2024!
If you're not familiar with NLMap, check out the thread below, but to quote the paper: "NLMap is an open-vocabulary, queryable semantic representation based on ViLD and CLIP". Detections from VLMs are back-projected into a 3D scene. 2/🧵
How can we ground large language models (LLMs) in the surrounding scene for real-world robotic planning? Our work NLMap-Saycan allows LLMs to see and query objects in the scene, enabling real robot operations unachievable by previous methods. Link: 1/6
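For anyone wondering what "back-projected into a 3D scene" looks like in practice, here is a minimal pinhole-camera sketch (generic geometry, not NLMap's actual code): the center pixel of a 2D detection is lifted to a 3D point using a depth image and the camera intrinsics. The intrinsics and depth values below are made up for the example.

```python
# Generic pinhole back-projection sketch; not NLMap's implementation.
import numpy as np


def backproject_detection(bbox, depth_image, K):
    """Back-project the center of a 2D bounding box into a 3D point in the camera frame.

    bbox:        (u_min, v_min, u_max, v_max) in pixels
    depth_image: HxW array of depth values in meters
    K:           3x3 camera intrinsics matrix
    """
    u = int((bbox[0] + bbox[2]) / 2)
    v = int((bbox[1] + bbox[3]) / 2)
    z = float(depth_image[v, u])               # depth at the detection center
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])


if __name__ == "__main__":
    K = np.array([[525.0, 0.0, 320.0],
                  [0.0, 525.0, 240.0],
                  [0.0, 0.0, 1.0]])
    depth = np.full((480, 640), 1.5)            # fake depth image: 1.5 m everywhere
    point = backproject_detection((300, 220, 340, 260), depth, K)
    # Transform by the camera pose to place the point in the map/scene frame.
    print("3D point in camera frame:", point)
```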
I love robot learning approaches that embrace modularity; skill libraries make me feel 🥰. Awesome job!
🚀 Google unveils "Achieving Human Level Competitive Robot Table Tennis"! 🤖 The robot won 100% of its matches vs. beginners and 55% vs. intermediate players, showcasing solid amateur human-level performance. Check out the details:
Survey paper on grounding language for robots. It looks at the spectrum from grounding to discrete symbols, to grounding to continuous embeddings, and everything in between!
How do robots understand natural language? #IJCAI2024 survey paper on robotic language grounding. We situated papers along a spectrum w/ two poles, grounding language to symbols and to high-dimensional embeddings. We discussed tradeoffs, open problems & exciting future directions!
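A toy illustration of the survey's two poles, with both groundings mocked by hand-written data (the lexicon and the vectors are invented for the example, not taken from the survey): on the symbolic end, words map to discrete predicates from a closed vocabulary; on the embedding end, a query is matched to candidates by cosine similarity in a shared continuous space, CLIP-style.

```python
# Toy contrast between symbolic and embedding-based language grounding.
import numpy as np

# Pole 1: grounding to discrete symbols (closed vocabulary, exact semantics).
SYMBOL_LEXICON = {"mug": "obj_mug", "sink": "loc_sink", "table": "loc_table"}


def ground_to_symbols(utterance: str):
    """Map known words to predicates; unknown words are simply dropped."""
    return [SYMBOL_LEXICON[w] for w in utterance.lower().split() if w in SYMBOL_LEXICON]


# Pole 2: grounding to continuous embeddings (open vocabulary, soft matching).
FAKE_EMBEDDINGS = {                 # stand-ins for text/image encoder outputs
    "mug":  np.array([0.9, 0.1, 0.0]),
    "cup":  np.array([0.85, 0.2, 0.0]),
    "sink": np.array([0.0, 0.9, 0.3]),
}


def ground_to_embeddings(query: str, candidates: dict, top_k: int = 2):
    """Rank candidates by cosine similarity to the query embedding."""
    q = FAKE_EMBEDDINGS[query]
    scores = {
        name: float(np.dot(q, emb) / (np.linalg.norm(q) * np.linalg.norm(emb)))
        for name, emb in candidates.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]


if __name__ == "__main__":
    print(ground_to_symbols("put the mug in the sink"))      # ['obj_mug', 'loc_sink']
    print(ground_to_embeddings("cup", FAKE_EMBEDDINGS))      # 'cup' first, 'mug' a close second
```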
RT @thomas_weng: The AI Institute is hiring! Check out the careers page and feel free to reach out to me :).
Sounds like a great PhD opportunity to me if you’re interested in spatial AI!
This is an opportunity to do a PhD with me at Imperial College, fully funded and starting in October this year. Apply via the link below by 12th June next week. On-sensor vision will be very important to the future of low power vision in robotics + AR/VR.
How to represent task specifications is an important problem in robotics, and it is especially interesting with the increased popularity of LLMs! Looking forward to this RSS workshop! 😊
Submit to our #RSS2024 workshop on “Robotic Tasks and How to Specify Them? Task Specification for General-Purpose Intelligent Robots” by June 12th. Join our discussion on what constitutes various task specifications for robots, in what scenarios they are most effective and more!