
Dima Yanovsky
@yanovskyd
5/ App Store: Project website: The DART project was developed at the Improbable AI Lab at MIT CSAIL, under @younghyo_park and @pulkitology. DM me if you want to try in-person teleoperation in Vision Pro. I’ll be in SF this summer.
2/ We built a Vision Pro app to teleoperate robots in AR. Vision Pro's SOTA hand detection and AR models deliver unmatched dexterity and user experience in teleoperation. We successfully ported the MuJoCo physics engine by @Deepmind to run locally on the Vision Pro chip. The entire tech stack
i got to teleoperate robot hands with my hands using the vision pro. @yanovskyd is literally the coolest roboticist ever. crazy new robotics foundation models will be made with this data. we're so back.
RT @anzahorodnii: 1/ Built an open-source version of @Neuralink demos with @yanovskyd using the brain of a monkey named Jenkins! Leader mov…
7/ The challenge was that default solvers have no awareness of the robot's surroundings, sometimes producing trajectories that passed through the table and caused the arm to slam into it. We used the ikpy library to create kinematic chains for both Koch V1.1 (@taurobotics) robotic arms.
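A minimal sketch of the table-collision guard described above. This is not the project's actual code: ikpy solves the IK itself, and names like TABLE_Z and the waypoint format are illustrative assumptions.

```python
# Hypothetical guard around an IK solver that, by default, has no
# awareness of the table: reject any candidate trajectory whose
# end-effector dips below the table surface.

TABLE_Z = 0.0  # table surface height in the arm's base frame (assumed)

def trajectory_is_safe(waypoints, margin=0.02):
    """True if every (x, y, z) waypoint clears the table by `margin` meters."""
    return all(z >= TABLE_Z + margin for (_, _, z) in waypoints)

def pick_safe_trajectory(candidates, margin=0.02):
    """Return the first candidate trajectory that clears the table, else None."""
    for traj in candidates:
        if trajectory_is_safe(traj, margin):
            return traj
    return None
```

In practice one would run this filter over trajectories sampled from the IK solver and replan when every candidate intersects the table.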
5/ Check out @anzahorodnii for the breakdown of the neuroscience ML behind the project. Here is the web game to generate brain data: Here is the project page: Arxiv:
arxiv.org
Project Jenkins explores how neural activity in the brain can be decoded into robotic movement and, conversely, how movement patterns can be used to generate synthetic neural data. Using real...
4/ We then decided to cook more. What if we could go the other way and train a model that takes robot movements and generates the synthetic neural data that would correspond to thinking about those movements? After many days of work, @anzahorodnii trained a model that can do it!
3/ It all started when we took a dataset of monkey brain activity during various hand movements. @anzahorodnii trained a model to decode this into velocities, and I configured a robot arm to recreate the movements this monkey was thinking about.
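The decoding step above can be sketched as a linear readout from binned firing rates to 2-D hand velocity. This is a toy with synthetic data and an assumed 96-channel array; the actual model and the monkey dataset differ.

```python
import numpy as np

# Toy version of "decode brain activity into velocities": fit a ridge
# regression from per-bin firing rates to (vx, vy), on synthetic data.

rng = np.random.default_rng(0)
n_bins, n_neurons = 500, 96               # shapes are illustrative assumptions
true_W = rng.normal(size=(n_neurons, 2))  # ground-truth linear tuning

rates = rng.poisson(5.0, size=(n_bins, n_neurons)).astype(float)
vel = rates @ true_W + rng.normal(scale=0.5, size=(n_bins, 2))

lam = 1.0  # ridge penalty
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_neurons), rates.T @ vel)
pred = rates @ W

r = np.corrcoef(pred[:, 0], vel[:, 0])[0, 1]  # decoding quality on axis 0
```

The decoded velocities would then be integrated into positions and streamed to the robot arm as end-effector targets.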
2/ We made the brain generation model run entirely in-browser using @onnxruntime. We coded a game where you can move a joystick and generate synthetic brain data in real time. This is the first OSS brain data generation model running in the browser! Try it:
1/ We figured out how to use monkey brain data to operate robotic arms. We then trained a model to generate synthetic brain data from robotic arm movements and made the first-ever synthetic brain data generation browser game. Project Jenkins (open-source) by @anzahorodnii & me