Anya Belz
@anyabelz
Followers
890
Following
2K
Media
72
Statuses
843
Professor of Computer Science, @adaptcentre @dcucomputing @dcu. Working in #AI, #NLProc, #NLG. #evaluationmatters
Joined May 2019
a preprint with @anyabelz: Does Explicitly Exercising the Induction Circuit Improve ICL Performance? We investigate whether explicitly exercising the induction circuit during pretraining improves in-context learning (ICL), or whether natural text alone is sufficient when compute is…
1
1
3
Exciting news: international applications to DCU for Sept 2025 are open! Apply today to join a diverse community at Ireland’s Top Young University🏆 Follow our application portal guide https://t.co/n4W62R0LWH If you have any questions, contact us at dcuglobalrecruitment@dcu.ie
1
8
14
Happy to report that my Karen Spärck Jones number is 3 https://t.co/nBBc1JsQv2
0
0
0
At #AMTA24 this week @InacioVieira and Will Allred from our first #NLProc MSc cohort @DCU @AdaptCentre present their work on finetuning #LLMs with translation memories for organisation-specific machine translation #MT with improvements across all metrics.
aclanthology.org
Inacio Vieira, Will Allred, Séamus Lankford, Sheila Castilho, Andy Way. Proceedings of the 16th Conference of the Association for Machine Translation in the Americas (Volume 1: Research Track). 2024.
0
0
4
11yo and me are loving Series 2 of Uncharted with @FryRSquared. Bedtime stories for #STEM people, the kid calls it. Favourite episode so far: Whispers from the Cosmos, about Jocelyn Bell Burnell's discovery of pulsars👇 https://t.co/1W1aGoHgxU
bbc.co.uk
A few odd squiggles on a graph lead to a Nobel prize-winning discovery.
0
3
22
Very pleased to say DCU-NLG group @DCU @dcucomputing @AdaptCentre won the #INLG2024 @inlgmeeting best demo prize for our Wikipedia Gap Filler tool🥳 With options to generate in Irish #gaeilge and English, as well as articles from @WikiWomenInRed lists👇
At #INLG2024 tomorrow we have 3 DCU-NLG group papers 1 Filling Gaps in Wikipedia: Leveraging Data-to-Text Generation to Improve Encyclopedic Coverage of Underrepresented Groups @simonmille @ThomsonSoftware @M___Sabry @HuidromRudali @MichelaLorandi et al.👉 https://t.co/Rv01MGbbrB
1
4
11
Last day of the #INLG2024 conference in Tokyo🗼 with 4 more papers from the DCU-NLG @adaptcentre crew, all in the #GEM2024 shared task competition: the results report by @simon_mille @yufanghou et al. and 3 team reports. And that's a wrap for @inlgmeeting from us, v proud 🎉🥳
0
1
1
@SimonMille @ThomsonSoftware @HuidromRudali and 3 (Mostly) Automatic Experiment Execution for Human Evaluations of NLP Systems with @ThomsonSoftware in the second pm oral session at 16:15 #INLG2024 👉 https://t.co/ylov03iwke
aclanthology.org
Craig Thomson, Anya Belz. Proceedings of the 17th International Natural Language Generation Conference. 2024.
1
1
1
2 QCET: An Interactive Taxonomy of Quality Criteria for Comparable and Repeatable Evaluation of NLP Systems with @simonmille @ThomsonSoftware @HuidromRudali👉 https://t.co/j4TXvZcnb0
aclanthology.org
Anya Belz, Simon Mille, Craig Thomson, Rudali Huidrom. Proceedings of the 17th International Natural Language Generation Conference: System Demonstrations. 2024.
1
1
1
At #INLG2024 tomorrow we have 3 DCU-NLG group papers 1 Filling Gaps in Wikipedia: Leveraging Data-to-Text Generation to Improve Encyclopedic Coverage of Underrepresented Groups @simonmille @ThomsonSoftware @M___Sabry @HuidromRudali @MichelaLorandi et al.👉 https://t.co/Rv01MGbbrB
aclanthology.org
Simon Mille, Massimiliano Pronesti, Craig Thomson, Michela Lorandi, Sophie Fitzpatrick, Rudali Huidrom, Mohammed Sabry, Amy O’Riordan, Anya Belz. Proceedings of the 17th International Natural Language Generation Conference. 2024.
1
1
4
@HuidromRudali @ThomsonSoftware @simon_mille Next in the @DCU-NLG group #INLG2024 line-up: our amazing @HuidromRudali kicks off the first oral session of the main @inlgmeeting conference, presenting her work on Differences in Semantic Errors Made by Different Types of Data-to-text Systems😍
0
2
2
Tomorrow from 9.30am the DCU-NLG group are putting on the #INLG2024 Tutorial on Human Evaluation of #NLProc System Quality with Joao Sedoc, @HuidromRudali, @ThomsonSoftware, and @simon_mille. Summary, paper, resources here:
2
2
3
Next up in the DCU-NLG research group #INLG2024 line up, @MichelaLorandi and @ThomsonSoftware take part in a panel on #LLMs in #NLG, evaluation, life, the universe and everything.
1
0
8
Wow do we have a line up at #INLG2024 from the DCU-NLG research group this year🤩 First up is @ThomsonSoftware with a keynote on Remaining Challenges in Complex Data-to-Text Generation at the Practical D2T Workshop on Monday at 9:40am
practicald2t.github.io
Practical D2T at INLG 2024 Tokyo, Japan, 23 Sept, 2024
1
2
8
Come join us for the #INLG2024 Tutorial on Human Evaluation of #NLProc System Quality on 24 Sep in Tokyo🗼🇯🇵!! Programme and summary paper available👇@JoaoSedoc @HuidromRudali @ThomsonSoftware @simon_mille @AdaptCentre @dcucomputing @inlgmeeting
0
3
6
Another brilliant thing students have achieved on our MSc in #NLProc at @DCU @dcucomputing 😍 Well done James O'Doherty and Cian Nolan!! @yufanghou
https://t.co/Zho0ecDSnJ
0
3
14
This is what can happen if you do an MSc in #NLProc with us at @DCU @dcucomputing 😍 Well done @InacioVieira and Will Allred from our first NLP cohort on
dcu.ie
.@alpha_crc and @DCU researchers demonstrate that fine-tuning #LLMs with #TMs improves 💫 #translation quality, cuts down turnaround times, ⏳ and provides significant cost savings 💰 @InacioVieira @seamuslankford @_SheilaCastilho @tarfandy @AdaptCentre
https://t.co/dkmW5fR6Ss
0
8
8
Applications for our Semester 2 Micro-Credentials are open! HCI Funding up to 80% available on selected courses. Learn more and register now - https://t.co/Yosgsiwy6R
#microcreds #furthereducation #cpd #microcredentials
@DCU @MicroCreds @IUAofficial
0
5
8
Looking forward to welcoming our second cohort of students on the MSc in Natural Language Processing this autumn! Focus on Language Models and strong industry engagement. It's not too late to join us: #NLProc @DCU
https://t.co/teTE15LsuL
0
0
5
My #PhD student @M___Sabry is working on very exciting stuff: our #ACL2024nlp Findings paper shows, for the first time, that PEFT-tuned parameters are structurally and functionally sufficiently modular to be portable from one host model to another👇 https://t.co/8DnlSjUbmT
0
1
8