
Andreas Tolias Lab @ Stanford University
@AToliasLab
Followers: 5K · Following: 4K · Media: 60 · Statuses: 1K
to understand intelligence and develop technologies by combining neuroscience and AI
Palo Alto, CA
Joined May 2017
RT @SebastianSeung: Taxpayers: you reap enormous rewards from your modest investment in scientific research. American science is in grave p….
nytimes.com
They hoped to make tomorrow’s medicines. Then came Trump.
RT @BernsteinNeuro: ❓ How does @AToliasLab use artificial intelligence to understand the brain? Join us at the #BernsteinConference to find….
RT @AndrewYNg: There is now a path for China to surpass the U.S. in AI. Even though the U.S. is still ahead, China has tremendous momentum….
RT @naturecomputes: Join us in San Diego this NeurIPS for a workshop on Foundation Models for the Brain and Body - bringing together resear….
RT @james_y_zou: 📢New conference where AI is the primary author and reviewer! Current venues don't allow AI-writte….
RT @SuryaGanguli: @vkhosla Maybe you and other tech VCs could join together and speak up in defense of science funding. At the rate of toda….
RT @KonstantinWille: For anyone making it this far in the thread: We're training frontier models on 1T++ tokens of brain data, and we're lo….
RT @KonstantinWille: Thanks to the enigma ml team: @AdrianoCardace @alexrgil14 @AtakanBedel @vedanglad 💪. Special thanks to our partner @ml….
Congrats everyone.
Thanks to the enigma ml team: @AdrianoCardace @alexrgil14 @AtakanBedel @vedanglad 💪 Special thanks to our partner @mlfoundry for the support and their affordable H100 nodes! 💙 [6/6]
Congrats @KonstantinWille and the Enigma Project team!!!
New NanoGPT training speed world record from the Enigma Project 🎉 (@AToliasLab, @naturecomputes, …). We improve the efficiency of gradient all_reduce. Short explainer of our method 👇 [1/6]
🚀 Congrats to @KonstantinWille and the Enigma Project team for being NanoGPT speed-run world-record champions for a glorious day 🏆🔥
New NanoGPT training speed record: 3.28 FineWeb val loss in 2.990 minutes on 8xH100. Previous record: 3.014 minutes (1.44 s slower). Changelog: accelerated gradient all-reduce. New record-holders: @KonstantinWille et al. of the Enigma Project.
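The record tweet above credits an accelerated gradient all-reduce. For context, here is a minimal pure-Python sketch of the standard ring all-reduce (reduce-scatter followed by all-gather) used to average gradients across data-parallel workers. This is an illustrative reconstruction of the textbook algorithm, not the Enigma Project's actual optimization; the function name `ring_allreduce_mean` is hypothetical, and in real multi-GPU training this step is handled by NCCL via `torch.distributed.all_reduce`.

```python
def ring_allreduce_mean(grads):
    """Average per-worker gradient vectors with a ring all-reduce.

    grads: list of n equal-length lists (one gradient vector per worker).
    Returns n identical averaged vectors, one per worker.
    """
    n = len(grads)
    length = len(grads[0])
    bounds = [i * length // n for i in range(n + 1)]
    # chunks[w][c] = worker w's copy of chunk c of the gradient vector.
    chunks = [[list(g[bounds[c]:bounds[c + 1]]) for c in range(n)]
              for g in grads]

    # Phase 1: reduce-scatter. Each step, worker w passes the chunk it just
    # accumulated to its ring neighbor; after n-1 steps, worker w holds the
    # fully summed chunk (w + 1) % n.
    for t in range(n - 1):
        for w in range(n):
            c = (w - t) % n                      # chunk worker w passes on
            nxt = (w + 1) % n                    # ring neighbor
            chunks[nxt][c] = [a + b for a, b in
                              zip(chunks[nxt][c], chunks[w][c])]

    # Phase 2: all-gather. Each worker forwards its completed chunk around
    # the ring until every worker has every fully summed chunk.
    for t in range(n - 1):
        for w in range(n):
            c = (w + 1 - t) % n                  # completed chunk to forward
            chunks[(w + 1) % n][c] = list(chunks[w][c])

    # Divide by n and reassemble each worker's full averaged gradient.
    return [[x / n for c in range(n) for x in chunks[w][c]]
            for w in range(n)]
```

Each worker sends and receives only 2·(n−1)/n of the gradient's size in total, which is why the ring variant scales well with worker count; speed-run-level gains typically come from overlapping this communication with computation and tuning chunk sizes.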
RT @demishassabis: My (brief) thoughts on the definition of AGI, why we aren't there yet, and what is missing. Will be writing a short and….
RT @SuryaGanguli: It makes no sense to train brilliant students and then make it difficult for them to join the American workforce, e….
RT @viajake: We will likely be looking for someone with expertise with optical methods in vivo, cranial window surgeries in mice, analysis….
RT @goyal__pramod: I want to write a blog talking about the most important & mind breaking ML equations. Anytime someone asks me about ML….
Curious how foundation models and brain “digital twins” could change the game in diagnosing and treating neurological disease? Check out this discussion between @dyamins and Nicholas Weiler (@StanfordBrain).
Can we simulate the human brain with AI? In today's podcast, Wu Tsai Neuro Faculty Scholar @dyamins discusses what it would take to build a simulation of the human brain and how it could help us understand core algorithms for perception and cognition.
RT @ashleevance: Looking to hire (full-time) an early career science writer who would also like to be on camera. Undergrad or masters in a….