UKRI CDT in Natural Language Processing
@Edin_CDT_NLP
Followers: 1K · Following: 312 · Media: 18 · Statuses: 277
We bring together researchers in NLP, speech, linguistics, cognitive science, and design informatics from across the University of Edinburgh.
Edinburgh
Joined October 2019
🥳🌟Bonus photo-grab on their way to collect well-deserved degrees! Congrats from all @Edin_CDT_NLP to Drs Conklin, Winther, Carter, Hosking and Lindemann as well as, in absentia: Drs Chi and Rubavicius 🎓🙌🥳 @InfAtEd
1 reply · 0 reposts · 8 likes
TL;DR: We kindly invite you to participate in our survey, running until 9 July, on Arabizi (sometimes referred to as Arabish or Franco-Arabic): https://t.co/yJ90lFvxMQ
@Amrkeleg Ahmed Amine @taha_yssne Imane Guellil @Chadi_Helwe
@nedjmaou (1/3) 🧵
4 replies · 13 reposts · 17 likes
📢 CDT NLP @InfAtEd student Eddie Ungless has created a survey on ethics resources. Anyone in the UK who does research with LLMs, in academia or industry, building models or using them as tools, is eligible. Takes <10 mins. Chance to win a £100 voucher. Survey:
0 replies · 0 reposts · 1 like
🏆🥳🙌Congratulations to (l-r) Dr Burchell @very_laurie, Dr Cardenas and Dr Moghe @nikita_moghe on their recent graduation from the UKRI CDT in NLP @InfAtEd! All the very best in your respective careers! 🌠 🏆
1 reply · 7 reposts · 31 likes
This paper is finally published in TACL! Go check it out: https://t.co/x0pgkOnuvB
direct.mit.edu
Abstract. We propose a method for unsupervised abstractive opinion summarization that combines the attributability and scalability of extractive approaches with the coherence and fluency of Large...
📣 New paper! 📣 Hierarchical Indexing for Retrieval-Augmented Opinion Summarization (to appear in TACL) Discrete latent variable models are scalable and attributable, but LLMs are wayyy more fluent/coherent. Can we get the best of both worlds? (yes) 👀 https://t.co/kK5OhqTwlw
1 reply · 3 reposts · 20 likes
If you have an MS or PhD in the #AI field, share your expertise, insights, and guidance with the next generation of AI leaders by volunteering with our Emerging Leaders in AI program! 🔗Register here: https://t.co/e8g09xvlrK Questions? Email academic@blackinai.org
0 replies · 14 reposts · 34 likes
I'm trying to get a feel for which sectors or companies cognitive and computational psych grads work in after their PhD. If you are or know someone in that category, could we have a half-hour chat?
0 replies · 3 reposts · 3 likes
Ready to boost your ML/AI career or share your expertise? Join our Women in Machine Learning Mentoring Program! 🤖✨ Sign up now! Get guidance from experts and advance your career. 👩‍🏫 Mentors: Empower the next generation of ML/AI researchers. 🔗 Apply by Sept 15: Register here
0 replies · 11 reposts · 26 likes
We had a great time presenting our work, and receiving an outstanding paper award made it even better. Special thanks to Sharon Goldwater for all her efforts and help, and to @Walid_Magdy for his continuous mentorship! 🎉🎉 📜 https://t.co/bR04ynyseb
#ACL2024NLP
@Edin_CDT_NLP
0 replies · 0 reposts · 10 likes
This way, we get the best of both worlds! The encoder/indexer can focus on learning a high-quality index, and the LLM deals with the generation. HIRO learns a higher-quality embedding space than previous work, AND generates higher-quality summaries! See the paper for many more experiments.
1 reply · 1 repost · 3 likes
We propose HIRO - our method learns a discrete hierarchical index over sentences, uses the index to 'retrieve' clusters of sentences containing related and popular opinions, and passes these clusters to an LLM to do the generation work.
1 reply · 1 repost · 5 likes
📣 New paper! 📣 Hierarchical Indexing for Retrieval-Augmented Opinion Summarization (to appear in TACL) Discrete latent variable models are scalable and attributable, but LLMs are wayyy more fluent/coherent. Can we get the best of both worlds? (yes) 👀 https://t.co/kK5OhqTwlw
arxiv.org
We propose a method for unsupervised abstractive opinion summarization that combines the attributability and scalability of extractive approaches with the coherence and fluency of Large Language...
2 replies · 10 reposts · 51 likes
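The pipeline described in this thread reads naturally as pseudocode. Below is a minimal sketch under stated assumptions: a generic sentence encoder and flat agglomerative clustering stand in for HIRO's learned discrete hierarchical index, and `llm` is a hypothetical callable wrapping whatever model does the generation; this is an illustration of the retrieve-then-generate flow, not the paper's actual implementation.

```python
# Toy sketch of a HIRO-style pipeline: (1) index review sentences,
# (2) retrieve clusters of related, popular opinions, (3) hand the
# clusters to an LLM for fluent generation. The real method learns a
# discrete hierarchical index; simple embedding + clustering stands
# in for it here (an assumption, not the paper's model).

from collections import Counter
from sklearn.cluster import AgglomerativeClustering
from sentence_transformers import SentenceTransformer

def summarize_opinions(sentences, llm, n_clusters=5, top_k=3):
    """Group related opinion sentences, keep the most popular clusters,
    and let an LLM write the summary from the retrieved evidence."""
    encoder = SentenceTransformer("all-MiniLM-L6-v2")
    embeddings = encoder.encode(sentences)

    # Stand-in for the learned hierarchical index: flat clustering.
    labels = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(embeddings)

    # "Popularity" = cluster size; keep the top_k most common opinions.
    popular = [label for label, _ in Counter(labels).most_common(top_k)]
    clusters = [
        [s for s, l in zip(sentences, labels) if l == label]
        for label in popular
    ]

    # The LLM only sees retrieved sentences, so every summary claim
    # can be attributed back to its source cluster.
    prompt = "Summarize the main opinions below, one sentence each:\n\n"
    prompt += "\n\n".join("- " + "\n- ".join(c) for c in clusters)
    return llm(prompt)
```

Because generation is conditioned only on retrieved clusters, attributability comes for free: each output sentence maps back to the cluster it was written from.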
Happy to share that our paper won the @cogsci_soc computational modeling award for perception and action! #CogSci2024 @Edin_CDT_NLP
Paper with @larryniven4, Naomi Feldman, and Sharon Goldwater to appear in CogSci 2024: "A predictive learning model can simulate temporal dynamics and context effects found in neural representations of continuous speech" https://t.co/Nz4y1Rhrsh (1/7)
4 replies · 8 reposts · 55 likes
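For readers curious what a "predictive learning model" of speech looks like operationally, here is a toy sketch. The feature type, model size, and prediction horizon are illustrative assumptions of mine, not the paper's configuration: a recurrent network is trained to predict upcoming speech frames, and its hidden states are the learned representations one could then compare with neural recordings.

```python
# Sketch of a predictive learning model for continuous speech: a GRU
# trained to predict frames a few steps ahead of the current input.
# All hyperparameters here (80 mel bins, 256 hidden units, 3-frame
# horizon) are illustrative assumptions.

import torch
import torch.nn as nn

class PredictiveSpeechModel(nn.Module):
    def __init__(self, n_mels=80, hidden=256, horizon=3):
        super().__init__()
        self.horizon = horizon                 # frames ahead to predict
        self.rnn = nn.GRU(n_mels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_mels)

    def forward(self, mels):                   # mels: (batch, time, n_mels)
        states, _ = self.rnn(mels)             # contextual hidden states
        return self.head(states), states

model = PredictiveSpeechModel()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(mels):
    """One gradient step: predict frame t+h from context up to frame t."""
    pred, _ = model(mels)
    h = model.horizon
    loss = nn.functional.mse_loss(pred[:, :-h], mels[:, h:])
    loss.backward()
    optim.step()
    optim.zero_grad()
    return loss.item()
```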
The calm before the storm - launch of the Joint Conference between the UKRI CDTs in NLP & @sltcdt! Thanks to @colemanhaley22, Nick Ferguson + team (NLP & SLT) for organising the two-day event - and WELCOME to all SLT students/staff! @InfAtEd
0 replies · 5 reposts · 16 likes
@MrinankSharma Our paper on the limits of human feedback was accepted too 😁 ( https://t.co/YVZBpdr2LM) We should chat in Vienna! (cc @max_nlp )
arxiv.org
Human feedback has become the de facto standard for evaluating the performance of Large Language Models, and is increasingly being used as a training objective. However, it is not clear which...
0 replies · 2 reposts · 8 likes
Using GPT-4 but the calls are expensive? Distilling your past queries into a student model may help. Introducing: 'Cache & Distil: Optimising API Calls to Large Language Models'
arxiv.org
Large-scale deployment of generative AI tools often depends on costly API calls to a Large Language Model (LLM) to fulfil user queries. To curtail the frequency of these calls, one can employ a...
1 reply · 19 reposts · 38 likes
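A minimal sketch of the routing logic the tweet describes, with hypothetical pieces: `student` is any cheap model exposing `predict`/`fit`, `call_llm_api` is a stand-in wrapper for the expensive LLM, and the confidence threshold is illustrative.

```python
# Cache-and-distil sketch: answer with a cheap student model when it
# is confident, fall back to the expensive LLM API otherwise, and log
# every API answer as future training data for the student.

import json

CACHE_PATH = "llm_cache.jsonl"  # distillation data collected over time

def answer(query, student, call_llm_api, threshold=0.9):
    """Route a query to the student model or the LLM API."""
    label, confidence = student.predict(query)
    if confidence >= threshold:
        return label                      # cheap path: no API call

    response = call_llm_api(query)        # expensive path
    with open(CACHE_PATH, "a") as f:      # cache for later distillation
        f.write(json.dumps({"query": query, "label": response}) + "\n")
    return response

def distil(student):
    """Periodically fine-tune the student on accumulated API answers."""
    with open(CACHE_PATH) as f:
        pairs = [json.loads(line) for line in f]
    student.fit([p["query"] for p in pairs], [p["label"] for p in pairs])
```

As the cache grows and the student improves, more queries clear the confidence threshold and the API call rate drops.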
🗣️Well done @very_laurie & co, whose research is now being used by Wikimedia to improve their language detection: https://t.co/7Nt9bMEYSa Great outcome! 👏👍 @Edin_CDT_NLP @InfAtEd
0 replies · 0 reposts · 10 likes
I cannot express how grateful I am for having our paper “ALDi: Quantifying the Arabic Level of Dialectness of Text” accepted to #EMNLP2023! Special thanks to my supervisors (@Walid_Magdy and Sharon Goldwater) at the University of Edinburgh @EdinburghNLP @Edin_CDT_NLP 🧵 (1/6)
9 replies · 10 reposts · 120 likes
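Quantifying dialectness on a continuous scale suggests a sentence-level regressor; here is a hedged sketch of that framing. Assumptions of mine: the MARBERT checkpoint as the Arabic encoder and an untrained regression head, so the model must first be fine-tuned on dialectness-annotated sentences before the scores mean anything.

```python
# Sketch of a dialectness regressor in the spirit of ALDi: a BERT-style
# encoder with a single regression output, scoring each sentence from
# 0 (Modern Standard Arabic) to 1 (highly dialectal). The checkpoint
# below is a placeholder assumption, not the paper's released model,
# and the head must be fine-tuned on annotated data before use.

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "UBC-NLP/MARBERT"  # assumption: any Arabic encoder would do
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=1, problem_type="regression"
)

def dialectness(sentence):
    """Return a dialectness score in [0, 1] for one sentence."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        score = model(**inputs).logits.squeeze().item()
    return min(max(score, 0.0), 1.0)   # clamp: the head is unbounded
```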
😀Happy to share that work done with advisor @mlapata on "Conversational Semantic Parsing using Dynamic Context Graphs" has been accepted at #EMNLP2023 Main conference. Code and updated paper will be released soon. @EdinburghNLP @Edin_CDT_NLP @emnlpmeeting
1 reply · 5 reposts · 55 likes
Nice one @p_nawrot! 👏
No Train No Gain: Revisiting Efficient Training Algorithms For Transformer-based Language Models ( https://t.co/jHBmI2jlXw) accepted to @NeurIPSConf! I'm very proud of this work : ) Big congrats to @jeankaddour, @oscar__key, @PMinervini , and Matt J. Kusner!
0 replies · 1 repost · 2 likes