
Adrian Scott | LLMs AI
@adrianscottcom
Followers: 4K
Following: 392K
Media: 983
Statuses: 28K
LLMs, ML, A.I.; Math PhD, nonlinear opt; I code. A social networking pioneer: Ryze; Napster; Longevity: Question assumptions - Opportunity. next?
Joined January 2009
Biodefense is dramatically more urgent than any climate issue. #biodefense #climatechange
npr.org
Labs are churning out more and more synthetic DNA for scientists who want to use it to reprogram cells. Some say the technology has outpaced government safety guidelines put in place a decade ago.
More importantly, this is a perfect time to #endtheTSA. The TSA is unconstitutional, @Sec_Noem; what they do is legally engage in sexual assault on a daily basis. There is no need for govt-run airport security. This would save so much money! I wish @DOGE would advocate for this.
The video, playing in airports across the country, is the latest effort by Trump administration officials to put blame for the shutdown on Democrats.
First results on "Potree-Next", a WebGPU-based rewrite of Potree. Still work to do before it is production-ready, but over the course of the next year it will catch up with the current Potree 1.8, and eventually replace it. https://t.co/N2EefILGvw
https://t.co/roGGJopRDO
The Kimi K2 vendor verifier (https://t.co/0hBnw8I3Pr) has been updated. You can visually see the difference in tool-call accuracy across providers. We've updated the provider count from 9 to 12, and open-sourced more data entries. We're preparing the next benchmark round and
Introducing linear scaling of reasoning: The Markovian Thinker. Reformulate RL so thinking scales O(n) compute, not O(n^2), with O(1) memory, architecture-agnostic. Train R1-1.5B into a Markovian thinker with a 96K thought budget, ~2X accuracy 🧵
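The claim above (O(n) compute, O(1) memory) follows from the chunked setup the tweet describes: reset the context at fixed intervals and carry only a bounded state forward, so per-chunk attention cost never grows. A minimal sketch of that idea, with `generate_chunk` as a hypothetical stand-in for the model call (not the actual training code):

```python
# Minimal sketch of Markovian (chunked) reasoning: generate thoughts in
# fixed-size chunks and reset the context at each boundary, carrying only
# a bounded state forward. `generate_chunk` is a hypothetical model call.

CHUNK = 8      # tokens generated per chunk (96K budget in the tweet, tiny here)
CARRY = 4      # bounded carryover state, independent of total trace length

def generate_chunk(context, n):
    """Hypothetical model call: returns n new 'thought tokens'."""
    return [f"t{len(context) + i}" for i in range(n)]

def markovian_think(query, budget):
    state = list(query)              # context starts from the query alone
    trace = []
    while len(trace) < budget:
        new = generate_chunk(state, CHUNK)
        trace.extend(new)
        # Markovian reset: keep only the query + last CARRY tokens, so
        # per-chunk attention cost is O(1) in total thought length.
        state = list(query) + new[-CARRY:]
        assert len(state) <= len(query) + CARRY
    return trace

trace = markovian_think(["q0", "q1"], budget=32)
print(len(trace))  # trace grows linearly; the context never does
```

Total compute is (budget / CHUNK) constant-cost chunks, i.e. linear in the thought budget, instead of quadratic attention over one ever-growing context.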
Introducing RND1, the most powerful base diffusion language model (DLM) to date. RND1 (Radical Numerics Diffusion) is an experimental DLM with 30B params (3B active) and a sparse MoE architecture. We are making it open source, releasing weights, training details, and code to
enjoying the new interview w/ @1517fund's @William_Blake. two things are missing from his model: - Keith Johnstone's message about the damage of bad education (see the intro of Impro) - Barabasi's work on the age-independence of innovation https://t.co/bSKiOaWiwD
Michael Gibson studied philosophy at NYU, UChicago, and Oxford. Now, he's waging war against elite universities. Michael was part of the founding team of the Thiel Fellowship, which funded talented youngsters to drop out. He considers universities to be part of what he calls
Player B: Sam Altman https://t.co/AXGDYSEKhM
Sam Altman on Sora, Energy, and Building an AI Empire From GPT-5 to Sora, OpenAI has been making a dizzying amount of bets recently. It now acts as a frontier AGI research lab, a big tech product company with nearly a billion users, and a driver of the largest infrastructure
Player A: Greg Brockman https://t.co/ojHgFWIMKl
My interview with Greg Brockman (@gdb), Cofounder and President of @OpenAI. 0:00 Intro 1:06 Scaling Sora 2:18 Are transformer models the future? 4:29 Building with AMD 5:50 Other kinds of compute 7:28 Bottlenecks 9:17 Deciding where to invest 11:38 Decoupling of the internet
New pods just dropped. Choose your player: A) Greg Brockman B) Sam Altman
A danger of the memory feature in ChatGPT. Don't fall for this, don't lock yourself into an echo chamber with only you and your AI. Context: You asked whether certain sleeping positions cause scoliosis in babies. The AI began with a cautious, public-health-style answer
LFM2-8B-A1B Liquid AI's first on-device MoE, with 8.3B total parameters and 1.5B active per token. It matches 3-4B dense model quality while running faster than Qwen3-1.7B. Architecture - 18 gated short-conv blocks, 6 GQA blocks (LFM2 backbone) - Sparse MoE feed-forward layers
LFM2-Audio-1.5B Liquid AI's first end-to-end audio foundation model, built for real-time conversation at only 1.5B parameters. Competitive with much larger models, it unifies speech and text without separate ASR or TTS. Architecture - LFM2 multimodal backbone - FastConformer
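The "8.3B total, 1.5B active" split in the first announcement comes from sparse MoE routing: each token is sent to only a few experts, so most parameters sit idle per forward pass. A toy illustration of top-k routing (random weights, tiny shapes; not Liquid AI's actual implementation):

```python
import numpy as np

# Illustrative sparse-MoE feed-forward with top-k routing. Only k of
# n_experts weight matrices are touched per token, which is why an
# 8.3B-total model can have ~1.5B "active" parameters.

rng = np.random.default_rng(0)
d, n_experts, k = 16, 8, 2          # hidden dim, expert count, experts/token

W_router = rng.standard_normal((d, n_experts))
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]

def moe_ffn(x):
    logits = x @ W_router                      # route this token
    topk = np.argsort(logits)[-k:]             # pick the k highest-scoring experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()                   # softmax over the chosen k
    # Only k expert matrices participate in this token's forward pass:
    return sum(w * (x @ experts[i]) for w, i in zip(weights, topk))

y = moe_ffn(rng.standard_normal(d))
print(y.shape, f"active fraction ~ {k / n_experts:.2f}")
```

With k=2 of 8 experts, only a quarter of the expert parameters are exercised per token, while the full set remains available across tokens.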
UNLIMITED Sora 2 & Sora 2 PRO are LIVE on Higgsfield! Unlimited, unrestricted, & available worldwide. Audio-synchronization, 1080p quality & multi-scene reasoning. World's most wanted model UNLIMITED exclusively on Higgsfield. For 9h: retweet & reply - get 150 creds in DM.
OpenAI says that Codex wrote most of the UI code for their new agent builder. the ui is a hot garbage fire. it makes sense now.
it's difficult to overstate how important Codex has been to our team's ability to ship new products. for example: the drag and drop agent builder we launched today was built end to end in under 6 weeks, thanks to Codex writing 80% of the PRs
Text diffusion
Today my team at @SFResearch drops CoDA-1.7B: a text diffusion coding model that outputs tokens bidirectionally in parallel. Faster inference, 1.7B rivaling 7B. 54.3% HumanEval | 47.6% HumanEval+ | 55.4% EvalPlus. HF: https://t.co/rTwknFtFMN Any questions, lmk!
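"Outputs tokens bidirectionally in parallel" is the key contrast with autoregressive decoding. A toy sketch of how diffusion-style decoding typically works (an assumption from the tweet, not CoDA's actual sampler): start fully masked, and at each step commit several positions anywhere in the sequence at once.

```python
import random

# Toy sketch of bidirectional parallel decoding for a text diffusion model:
# begin from an all-masked sequence and, each denoising step, commit the
# model's most confident positions in parallel. `model_scores` is a
# hypothetical stand-in for the real model.

random.seed(0)
VOCAB = list("abcdef")
LENGTH, PER_STEP = 12, 4          # sequence length, positions committed per step

def model_scores(tokens):
    """Hypothetical model: (position, token, confidence) per masked slot."""
    return [(i, random.choice(VOCAB), random.random())
            for i, t in enumerate(tokens) if t is None]

tokens = [None] * LENGTH          # fully masked start
steps = 0
while None in tokens:
    best = sorted(model_scores(tokens), key=lambda s: -s[2])[:PER_STEP]
    for i, tok, _ in best:        # parallel commit, to the left AND right
        tokens[i] = tok           # of already-decoded tokens alike
    steps += 1
print(steps)  # LENGTH / PER_STEP = 3 steps, vs 12 for left-to-right decoding
```

That step-count gap is where the "faster inference" claim comes from: decoding cost scales with the number of denoising steps, not the number of tokens.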
DataLoader Dispatching: when you're constrained such that you can't keep multiple copies (or mmaps) of a dataset in memory, be it too many concurrent streams, low resource availability, or a slow CPU, dispatching is here to help. Dispatching works by keeping
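The post cuts off, but the stated idea is that a single process keeps the one copy of the dataset and hands batches out to the other ranks. A minimal in-process sketch of that dispatch pattern (a real setup would send batches over torch.distributed or similar, not a local dict):

```python
# Sketch of dataloader dispatching: only the dispatcher holds the dataset
# in memory and slices batches for every rank, instead of each worker
# keeping its own copy or mmap. Ranks are simulated in-process here.

DATASET = list(range(20))          # held once, by the dispatcher only
BATCH, WORLD = 2, 2                # batch size, number of ranks

def dispatch(dataset, batch, world):
    """Yield (rank, batch) pairs round-robin from the single copy."""
    for step, start in enumerate(range(0, len(dataset), batch)):
        yield step % world, dataset[start:start + batch]

received = {r: [] for r in range(WORLD)}
for rank, batch in dispatch(DATASET, BATCH, WORLD):
    received[rank].append(batch)   # in reality: a network send to `rank`

print(received[0])  # [[0, 1], [4, 5], [8, 9], [12, 13], [16, 17]]
```

The trade-off is the usual one: memory drops to a single dataset copy, at the cost of serializing reads through the dispatcher and paying a communication hop per batch.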
I used to work 10-12 hours a day, 6 days a week, writing software for some soulless SaaS company in my early 20s. Honestly? It sucking fucked, dude.
I used to work 6am - 6pm, 6 days a week, on a construction site in my early 20s. Honestly? It fucking sucked, dude. I would sit in my car outside the site at 530am, desperately drinking a coffee, telling myself over and over again, "god I wish I was in sciences" Because every
LLMs are the bootloader for the forbidden physics
Are you really an Asian tech founder if you don't do a launch video?
good evening
@PalmerLuckey we need to be working on sphincter-controlled devices for quad amputees