Duarte Alves Profile
Duarte Alves

@DuarteMRAlves

Followers: 53 · Following: 9 · Media: 2 · Statuses: 31

Joined November 2023
@DuarteMRAlves
Duarte Alves
11 days
RT @N1colAIs: EuroBERT is going to @COLM_conf 2025! Can’t wait to be in Montreal with @gisship and @DuarteMRAlves to see all the great res…
0
4
0
@DuarteMRAlves
Duarte Alves
18 days
RT @N1colAIs: 🚨 Should you only pretrain encoder models with Masked Language Modeling (MLM)? Spoiler: definitely not! Let’s revisit a fou…
0
2
0
@DuarteMRAlves
Duarte Alves
18 days
RT @gisship: 🚨 New paper drop: Should We Still Pretrain Encoders with Masked Language Modeling? We revisit a foundational question in NLP: …
0
4
0
@DuarteMRAlves
Duarte Alves
1 month
RT @UTTERProject: 🚀 Proud moment! Prof. @andre_t_martins represented @UTTERProject & #EuroLLM at #GTCParis + #VivaTech2025, showcasing t…
0
3
0
@DuarteMRAlves
Duarte Alves
4 months
RT @N1colAIs: The EuroBERT training library is live! 🚀 Additionally, as weekends are perfect for experimentation, we’ve released a tutorial…
0
1
0
@DuarteMRAlves
Duarte Alves
4 months
RT @tomaarsen: An assembly of 18 European companies, labs, and universities have banded together to launch 🇪🇺 EuroBERT! It's a state-of-t…
0
16
0
@DuarteMRAlves
Duarte Alves
4 months
RT @N1colAIs: 🇪🇺 One month after the AI Action Summit 2025 in Paris, I am thrilled to announce EuroBERT, a family of multilingual encoder e…
0
48
0
@DuarteMRAlves
Duarte Alves
4 months
🧵 (7/7) 📖 Check out our blog post for more insights: 📄 Read more in our paper:
0
1
6
@DuarteMRAlves
Duarte Alves
4 months
🧵 (6/7) 🙏 Huge thanks also to all our collaborators: @CentraleSupelec @Diabolocom @artefact @sardine_lab_it @istecnico @itnewspt @Lisbon_ELLIS @Unbabel @AMD @CINESFrance.
1
0
5
@DuarteMRAlves
Duarte Alves
4 months
🧵 (5/7) @N1colAIs @gisship @andre_t_martins @AyoubHammal @UndefBehavior Céline Hudelot, Emmanuel Malherbe, Etienne Malaboeuf @Fannyjrd_ Gabriel Hautreux @joao97_alves Kevin El-Haddad @ManuelFaysse @peyrardMax Nuno M. Guerreiro @psanfernandes @RicardoRei7 @PierreColombo6.
1
1
8
@DuarteMRAlves
Duarte Alves
4 months
🧵 (4/7) 🤝 This work is the result of an incredible joint effort by a talented team from multiple institutions, props to everyone!
1
0
3
@DuarteMRAlves
Duarte Alves
4 months
🧵 (3/7) 🌐 EuroBERT is open-source: 👉 Models (210M, 610M, 2.1B params) 👉 Training snapshots 👉 Full training framework. Explore here: Code coming soon!
1
2
7
@DuarteMRAlves
Duarte Alves
4 months
🧵 (2/7) 📊 EuroBERT shines across benchmarks: ✔️ Retrieval (MIRACL, MLDR) ✔️ Classification (XNLI, PAWS-X) ✔️ Regression (SeaHorse) ✔️ Strong in code/math understanding (CodeSearchNet)
[Two media attachments]
1
0
4
@DuarteMRAlves
Duarte Alves
4 months
🧵 (1/7) 📚 Why EuroBERT? ✅ Extensive multilingual coverage ✅ Longer context handling (up to 8,192 tokens) ✅ Improved architecture ✅ Specialized for math and coding. Ideal for retrieval, classification, and regression tasks!
1
2
5
@DuarteMRAlves
Duarte Alves
4 months
🚀 Excited to announce EuroBERT: a new multilingual encoder model family for European & global languages! 🌍 🔹 EuroBERT is trained on a massive 5 trillion-token dataset across 15 languages and includes recent architecture advances such as GQA, RoPE & RMSNorm. 🔹
[Media attachment]
1
12
59
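The announcement thread above describes EuroBERT as an open multilingual encoder suited for retrieval, classification, and regression. As a minimal, untested sketch of what using it could look like, the snippet below loads a checkpoint with Hugging Face transformers and mean-pools the hidden states into sentence embeddings; the repository id EuroBERT/EuroBERT-210m and the need for trust_remote_code=True are assumptions not stated in these tweets.

```python
# Minimal sketch (untested), based on the EuroBERT announcement thread.
# Assumptions not confirmed by the tweets: the Hugging Face repo id
# "EuroBERT/EuroBERT-210m" and the trust_remote_code=True requirement.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "EuroBERT/EuroBERT-210m"  # assumed repo id for the 210M checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
model.eval()

sentences = [
    "EuroBERT handles contexts of up to 8,192 tokens.",
    "EuroBERT est une famille d'encodeurs multilingues.",
]
inputs = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, seq_len, hidden_dim)

# Mean-pool over non-padding tokens to get one embedding per sentence,
# e.g. for the retrieval use case mentioned in the thread.
mask = inputs["attention_mask"].unsqueeze(-1).float()
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # (2, hidden_dim)
```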
@DuarteMRAlves
Duarte Alves
6 months
RT @andre_t_martins: Good to see @EU_Commission promoting OS LLMs in Europe. However (1) "OpenEuroLLM" is appropriating a name (#EuroLLM) w…
0
13
0
@DuarteMRAlves
Duarte Alves
7 months
0
4
0
@DuarteMRAlves
Duarte Alves
7 months
RT @PontiEdoardo: Is sparsity the key to conditional computation, interpretability, long context/generation, and more in foundation models? …
0
25
0
@DuarteMRAlves
Duarte Alves
7 months
RT @andre_t_martins: Come to our #NeurIPS2024 tutorial on Tuesday Dec 10 9:30 AM to find the answers to these questions (or to ask new ques…
0
11
0
@DuarteMRAlves
Duarte Alves
8 months
RT @sardine_lab_it: Today the EuroLLM-9B was released. This work had the participation of Sardine Lab members @andre_t_martins @psanfernand…
0
7
0