CDT in Data Science
@EdiDataScience
1K Followers · 230 Following · 64 Media · 1K Statuses
EPSRC Centre for Doctoral Training in Data Science. Training PhDs in machine learning, databases, stats+opt, & analysis of unstructured data at Edinburgh Uni
Edinburgh, Scotland
Joined November 2013
Congratulations Tom! The hard work pays off!
My PhD thesis "Modelling Cross-lingual Transfer For Semantic Parsing" is finally submitted! 🎉🎉🎉 #NLProc
Poster sessions are happening today! 11:40-1:20 and 5:00-5:30 in rooms 208-210. Come chat transfer learning for Gaussian processes and Bayesian optimisation 🌸
Excited to be in 🎊New Orleans🎊 this week for #NeurIPS2023! Ping me to chat Bayesian optimisation, transfer learning or ML and climate. Or if you know any good runs nearby! And if you're interested in the above topics you should stop by our poster on Saturday!
Do you love cross-lingual transfer? Are you interested in putting latent variables everywhere you can? Desperately searching for applied optimal transport research? Come to my poster 16:00 in East Foyer! #EMNLP2023
https://t.co/ESNFGjJJVk w/ @tomhosking + Mirella Lapata
🚨New TACL paper 🚨 Can we use explicit latent variable alignment for cross-lingual transfer? Minotaur uses optimal transport for explicit alignment across languages. w/ @tomhosking, Mirella Lapata. To be presented @emnlpmeeting + @mrl2023 in-person 🇸🇬 https://t.co/PdoDPEIP97
Excited for the poster session this afternoon at @automl_conf! I'll be talking about hyperparameter optimisation for increasing data set sizes and learning between tasks. Work with Huibin Shen, @FrancoisAubet, David Salinas and @kleiaaro #AutoML23
Finding yourself repeatedly training your ML model because you’ve collected more data? Not sure what to do with the hyperparameters? We investigated for you! Check out the paper https://t.co/D4AOSdK1g4 with the wonderful Huibin Shen, @FrancoisAubet, David Salinas and @kleiaaro
I'm really happy to share the news that Meta-Calibration has been accepted to TMLR! Meta-Calibration uses meta-learning as a new way to optimise for uncertainty calibration of neural networks. I've had a very positive experience with TMLR and certainly recommend submitting there!
Meta-Calibration: Learning of Model Calibration Using Differentiable Expected Calibration Error Ondrej Bohdal, Yongxin Yang, Timothy Hospedales. Action editor: Yingzhen Li. https://t.co/BztOt8NvLB
#calibration #prediction #optimise
Life update: I have passed my PhD defense (@alexandrabirch1 and @licwu were excellent examiners), and moved to New York City! Many thanks to everyone who made the PhD experience so fun, especially my CDT cohort, collaborators, and supervisor @driainmurray
I'm very excited to be in Seoul to give a talk and present a poster about our CVPR'23 Meta Omnium paper at the Hyundai Vision Conference! Feel free to take a look at the poster and also a few photos from this great city! Meta Omnium website: https://t.co/EH0tpxR1yh
Interested in what could be the long-term implications of deploying too many AI systems that are not fair? You can learn more in our paper - we've updated it on arXiv with the version that will appear soon in the proceedings of the Stanford Existential Risk Conference!
📢✨ New journal publication alert! ✨📢 @OBohdal, @tmh31, Phil @OxfordTVG, and I @FazlBarez have dissected a crucial issue: lack of #AIFairness. Our analysis reveals how unfair AI can fuel social stress and unrest over time. 🧵 Here are the main insights: [1/4]
- @sillinuss: Why Do Self-Supervised Models Transfer? On Data Augmentation and Feature Properties https://t.co/QYxO0tiKHh - @sighellan: Bayesian Optimisation Against Climate Change: Applications and Benchmarks https://t.co/syIFsrq8jo 2/2
We've also got papers at ICML workshops 🌺🌺🌺 - Will Toner: Label Noise: Correcting a Correction Loss https://t.co/49eQ33ibDc - @OBohdal: Impact of Noise on Calibration and Generalisation of Neural Networks https://t.co/jKi58HczE2 1/2
Curious about how noise influences uncertainty calibration and generalisation of neural networks? We'll present our new work on this topic at the #ICML2023 SCIS workshop this Saturday! Collaboration with @MartinFerianc, @tmh31, @mrd_rodrigues
Hello from ICML'23! We will be presenting our joint work with @OBohdal, @tmh31, @mrd_rodrigues: Impact of Noise on Calibration and Generalisation of Neural Networks at the Spurious Correlations, Invariance, and Stability Workshop: https://t.co/5WvGkujc4G
The CDT has two papers at #ACL2023 this week thanks to @tomsherborne: - Extrinsic Evaluation of Machine Translation Metrics: An outstanding paper! 🎉🎉🎉 - Meta-Learning a Cross-lingual Manifold for Semantic Parsing
🎉🎉🎉
Congratulations to CDT student Tom! 🎉🎉🎉
We won outstanding paper at @aclmeeting! Major thanks to my co-authors @nikita_moghe @alexandrabirch1 and Mark Steedman #ACL2023NLP #NLProc
I'll be at #ACL2023 next week with 2 papers: ⭐️Extrinsic Evaluation of Machine Translation Metrics (Main conf - Monday Session 2: MT Oral) ⭐️Meta-Learning a Cross-lingual Manifold for Semantic Parsing (TACL - Wednesday Session 7: Semantics) Reach out if you want to say hello!