Dima Yanovsky Profile
Dima Yanovsky

@yanovskyd

Followers: 726 · Following: 247 · Media: 13 · Statuses: 51

cs @mit

SF
Joined March 2012
@yanovskyd
Dima Yanovsky
13 days
4/4 We currently have 5 Apple Vision Pros, which at roughly 8 hours of collection per headset gives us a throughput of about 40 hours of teleoperation data per day. That means we can collect more data in a single day than any existing open dataset in this category contains in total. Here's how we did it:
@yanovskyd
Dima Yanovsky
13 days
3/4 Not having those kinds of resources, we built an interface to teleoperate bimanual Shadow Hands in Apple Vision Pro. Robotics always starts with failure, so we spent weeks failing over and over until we got the setup running smoothly at 30 Hz.
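A 30 Hz teleoperation stream implies a fixed-rate control loop: read the latest hand pose from the headset, retarget it to joint targets for the simulated hands, and send those on. Here is a minimal sketch in Python; `read_hand_pose`, `retarget`, and `send_joint_targets` are hypothetical stand-ins, since the thread doesn't show the actual stack.

```python
import time

CONTROL_HZ = 30        # target teleoperation rate from the thread
DT = 1.0 / CONTROL_HZ  # ~33 ms per tick

def teleop_loop(read_hand_pose, retarget, send_joint_targets):
    """Fixed-rate teleoperation loop (hypothetical sketch).

    read_hand_pose()      -> latest wrist/finger transforms from the headset
    retarget(pose)        -> joint-angle targets for the simulated hands
    send_joint_targets(t) -> pushes targets to the physics simulation
    """
    next_tick = time.perf_counter()
    while True:
        pose = read_hand_pose()       # most recent tracking frame
        targets = retarget(pose)      # map human hand -> robot joints
        send_joint_targets(targets)
        # Sleep until the next tick so the loop holds ~30 Hz.
        next_tick += DT
        time.sleep(max(0.0, next_tick - time.perf_counter()))
```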
@yanovskyd
Dima Yanovsky
13 days
2/4 Many things in robotics are hard, but some are insanely hard. Collecting large datasets of dexterous manipulation falls in the latter category. If we wanted to collect a dataset using two real Shadow Hands (among the most dexterous robot hands in the world), we'd have to spend $200K+ on hardware.
@yanovskyd
Dima Yanovsky
13 days
1/4 We recreated a $200k teleoperation setup in VR for just ~$2k. Now we can collect more dexterous manipulation data in a single day (40 hrs/day) than any existing open dataset contains in total.
@yanovskyd
Dima Yanovsky
3 months
we outta here 🫡 @MIT
@yanovskyd
Dima Yanovsky
3 months
5/ App Store:
Project website:
The DART project was developed at the Improbable AI Lab at MIT CSAIL, under @younghyo_park and @pulkitology. DM me if you want to try in-person teleoperation in Vision Pro. I'll be in SF this summer.
@yanovskyd
Dima Yanovsky
3 months
4/ The platform runs beautifully on mobile devices as well, since it is powered by MuJoCo compiled to WebAssembly. And this isn't just a video recording! It's the actual physics engine running in your browser, replaying the demonstrations in real time.
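The browser build runs MuJoCo compiled to WebAssembly; for a sense of what deterministic replay looks like, here's a sketch with MuJoCo's Python bindings. The scene file and action log names are made-up placeholders, not the project's actual assets.

```python
import mujoco
import numpy as np

# Hypothetical paths: the thread doesn't publish the actual model or logs.
model = mujoco.MjModel.from_xml_path("shadow_hands_scene.xml")
data = mujoco.MjData(model)

# A recorded demonstration: one control vector per physics step.
actions = np.load("demo_actions.npy")  # shape (T, model.nu)

for ctrl in actions:
    data.ctrl[:] = ctrl          # apply the logged controls
    mujoco.mj_step(model, data)  # step the same deterministic physics
    # data.qpos now holds the replayed joint configuration
```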
@yanovskyd
Dima Yanovsky
3 months
3/ Datasets you record go to our platform, where you can curate your data and use it to train models. We're also releasing a small library of prerecorded teleoperations. Go play around with it!
@yanovskyd
Dima Yanovsky
3 months
2/ We built a Vision Pro app to teleoperate robots in AR. Vision Pro's SOTA hand detection/AR models deliver unmatched dexterity and user experience in teleoperation. We successfully ported the MuJoCo physics engine by @Deepmind to run locally on the Vision Pro's chip. The entire tech stack
@yanovskyd
Dima Yanovsky
3 months
1/ Robotics foundation models need internet-scale data to train, and we bet simulation datasets will help reach this scale. It's time to build tools enabling this. So we're releasing the first public app for robot data collection in Vision Pro and a platform to curate datasets.
@yanovskyd
Dima Yanovsky
5 months
🤝
@AI_Arav
Arav Kumar
5 months
i got to teleoperate robot hands with my hands using the vision pro. @yanovskyd is literally the coolest roboticist ever. crazy new robotics foundation models will be made with this data. we're so back.
@yanovskyd
Dima Yanovsky
5 months
RT @anzahorodnii: 1/ Built an open-source version of @Neuralink demos with @yanovskyd using the brain of a monkey named Jenkins! Leader mov…
@yanovskyd
Dima Yanovsky
6 months
7/ The challenge was that default solvers have no awareness of the robot's surroundings, sometimes creating trajectories through the table that caused the arm to slam into it. We used the ikpy library to create kinematic chains for both Koch V1.1 (@taurobotics) robotic arms.
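A sketch of that ikpy setup, assuming a URDF for the Koch V1.1 arm and a simple table-clearance guard (the thread doesn't show the actual check they used):

```python
import numpy as np
from ikpy.chain import Chain

TABLE_Z = 0.0  # assumed table height in the arm's base frame (metres)

# Hypothetical URDF path; the thread doesn't publish the actual file.
arm = Chain.from_urdf_file("koch_v1_1.urdf")

def solve_above_table(target_xyz):
    """Solve IK for a Cartesian target, rejecting poses that dip below the table."""
    angles = arm.inverse_kinematics(target_xyz)
    # full_kinematics=True returns one 4x4 transform per link in the chain.
    frames = arm.forward_kinematics(angles, full_kinematics=True)
    if any(frame[2, 3] < TABLE_Z for frame in frames):
        raise ValueError("solution passes below the table; pick a new waypoint")
    return angles

servo_angles = solve_above_table([0.15, 0.0, 0.10])
```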
@yanovskyd
Dima Yanovsky
6 months
6/ One big hurdle on the robotics side was working with inverse kinematics. The robots received target coordinates to move to, and inverse kinematics solvers then calculated the servo angles needed to reach that point.
@yanovskyd
Dima Yanovsky
6 months
5/ Check out @anzahorodnii for the breakdown of the neuroscience ML behind the project.
Here is the web game to generate brain data:
Here is the project page:
Arxiv:
[arxiv.org link card: "Project Jenkins explores how neural activity in the brain can be decoded into robotic movement and, conversely, how movement patterns can be used to generate synthetic neural data. Using real…"]
@yanovskyd
Dima Yanovsky
6 months
4/ We then decided to cook more. What if we could go the other way and train a model that takes robot movements and generates synthetic neural data corresponding to thinking about those movements? After many days of work, @anzahorodnii trained a model that can do it!
@yanovskyd
Dima Yanovsky
6 months
3/ It all started when we took a dataset of monkey brain activity during various hand movements. @anzahorodnii trained a model to decode this into velocities, and I configured a robot arm to recreate the movements this monkey was thinking about.
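A way to picture that pipeline: the decoded velocities get integrated into Cartesian waypoints that the arm follows. This sketch is an assumption about the interface, not the project's code; `send_target` and the 3-D velocity shape are hypothetical.

```python
import numpy as np

DT = 1.0 / 30.0  # assumed control-loop period

def follow_decoded_velocities(velocities, start_xyz, send_target):
    """Integrate decoded hand velocities into waypoints for the arm.

    velocities:  (T, 3) array of decoded hand velocities (assumed 3-D)
    start_xyz:   where the arm's end effector begins
    send_target: callable pushing an xyz waypoint to the arm's IK solver
    """
    pos = np.asarray(start_xyz, dtype=float)
    for v in velocities:
        pos = pos + v * DT  # dead-reckon the decoded motion
        send_target(pos)    # the arm recreates the movement
```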
@yanovskyd
Dima Yanovsky
6 months
2/ We made the brain generation model run entirely in-browser using @onnxruntime. We coded a game where you can move a joystick and generate synthetic brain data in real time. This is the first OSS brain-data generation model running in the browser! Try it:
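The game itself runs on onnxruntime-web in the browser; an equivalent offline inference call with the onnxruntime Python package would look roughly like this. The model filename, input interface, and shapes are invented placeholders, not the project's actual export.

```python
import numpy as np
import onnxruntime as ort

# Hypothetical export of the joystick -> synthetic-neural-data model.
session = ort.InferenceSession("brain_generator.onnx")

# Assumed interface: a 2-D joystick velocity in, simulated neural activity out.
joystick = np.array([[0.3, -0.8]], dtype=np.float32)
(neural_data,) = session.run(None, {session.get_inputs()[0].name: joystick})
print(neural_data.shape)  # e.g. one synthetic firing-rate vector per timestep
```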
@yanovskyd
Dima Yanovsky
6 months
1/ We figured out how to use monkey brain data to operate robotic arms. We then trained a model to generate synthetic brain data from robotic arm movements and made the first-ever synthetic brain data generation browser game. Project Jenkins (open-source) by @anzahorodnii & me.