
Michael Bommarito (@mjbommar) · 1K Followers · 3K Following · 535 Media · 4K Statuses
RT @LSCtweets: Great news! The Senate Appropriations Committee just approved increased funding for LSC—$566M for FY 2025, a $6M boost from…
you need to spend more time watching C-SPAN2 and less time reading tpot.
The AI challenge is a moral one as much as anything. Are we going to protect individual rights and the working person, or are we going to allow the big corporations to run roughshod over the little guy and take whatever they want? We need to protect We the People.
mark my words - a bipartisan coalition of strange bedfellows will make this a first-order campaign topic into the midterms next year. if you talk to real people or pay attention to polling, you should not be surprised.
The Senate Judiciary Committee is gearing up to grill Big Tech. About time. These companies, and their AI arms, didn’t innovate. They stole. That’s not progress. It’s piracy. They knew it was illegal, and they did it anyway. Why? Because they thought they were untouchable.
we've been mostly focused on allowing "larger" tokens in domain-specific applications, but it's interesting to see work that goes the opposite direction. lots of questions re: token->embedding, but the task results with ensemble are all that really matter in some sense (ignoring…
Do you ever wish all LLMs used the same tokenizer? 🧑‍🤝‍🧑 We present an *efficient, lossless* method to convert any LM into a byte-level model at inference time. This fixes weird tokenization artifacts at the prompt boundary and enables ensembles of LMs with mismatched tokenizers! 🧵
in one of our for-profit co's, we spent ~$11k on opus from may to june before switching to the $200/mo max plan for 3 people.
Premium newsletter: Based on user data and their gross profit margins, I believe that Anthropic's most popular product, Claude Code, is haemorrhaging money on every user, with some costing the company hundreds - even thousands - of dollars in a few days.
these latest tabular foundation models don't get enough attention.
👨‍🎓🧾✨ #icml2025 Paper: TabICL, a Tabular Foundation Model for In-Context Learning on Large Data. With @JingangQu, @DHolzmueller and @MarineLeMorvan. TL;DR: a well-designed architecture and pretraining give the best tabular learner, and a more scalable one. 1/9