Linus Ericsson
@sillinuss
Followers: 144 · Following: 88 · Media: 5 · Statuses: 37
Postdoctoral researcher at the University of Edinburgh
Edinburgh, Scotland
Joined October 2013
Check out our new work on NAS search spaces! We have created einspace, a new expressive search space that unifies many architectural families, including ConvNets, Transformers and MLP-only architectures. paper: https://t.co/OINqiaYo8y project page: https://t.co/hEvDjIKhsf
1
19
41
Thanks in particular to Elliot, Steven and Antreas for shaping the early stages of the idea, and to Shay for help formulating the CFG!
0
0
2
This work was done at the University of Edinburgh with collaborators Miguel Espinosa, @CYang51746374, @AntreasAntonio, @AmosStorkey, Shay B. Cohen, @steve_mcdonagh and Elliot J. Crowley. @SchoolOfEng_UoE, @InfAtEd
1
2
13
In our experiments, we find cool new architectures, as well as improvements on existing ones, across a diverse range of datasets from the Unseen NAS suite. Here is an example of the wild (and high-performing) things einspace can generate. code: https://t.co/L91ikzuPn4
0
0
3
Our space is based on a parameterised probabilistic context-free grammar. Through its recursively defined production rules, it can generate architectures that are arbitrarily wide and deep, containing branching and routing operations in addition to parameterised layers (see the toy sketch below).
1
0
4
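To make the grammar idea above concrete, here is a toy sketch of PCFG-based architecture sampling. The grammar, symbols and probabilities are invented for illustration and are not the actual einspace grammar.

```python
import random

# Hypothetical toy grammar: each non-terminal maps to a list of
# (production, probability) pairs. Recursive rules ("seq", "branch")
# let sampled architectures grow arbitrarily wide and deep.
GRAMMAR = {
    "module": [
        (["seq"], 0.4),
        (["branch"], 0.2),
        (["layer"], 0.4),
    ],
    "seq": [
        (["module", "module"], 1.0),  # sequential composition
    ],
    "branch": [
        (["module", "module", "join"], 1.0),  # parallel branches merged by a join op
    ],
    "layer": [
        (["conv3x3"], 0.4),
        (["attention"], 0.3),
        (["mlp"], 0.3),
    ],
}

TERMINALS = {"conv3x3", "attention", "mlp", "join"}

def sample(symbol="module", depth=0, max_depth=8):
    """Recursively expand `symbol`, returning a nested architecture spec."""
    if symbol in TERMINALS:
        return symbol
    # Past the depth budget, force the non-recursive "layer" rule so
    # sampling always terminates.
    if depth >= max_depth and symbol == "module":
        symbol = "layer"
    productions, weights = zip(*GRAMMAR[symbol])
    chosen = random.choices(productions, weights=weights, k=1)[0]
    children = [sample(s, depth + 1, max_depth) for s in chosen]
    return children[0] if len(children) == 1 else (symbol, children)

print(sample())
```

Because the "seq" and "branch" rules re-derive "module", one grammar covers everything from a single layer to deeply nested, multi-branch networks; the depth budget is just a safeguard for the toy sampler.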
🚨 MemControl: Mitigating Memorization in Medical Diffusion Models via Automated Parameter Selection. A new strategy to mitigate memorization in diffusion models. arXiv: https://t.co/EIGrSOk8DL Work done with @SnchzPedro_ @OBohdal @STsaftaris @tmh31 @BioMedAI_CDT 🧵👇
1
7
19
Delighted to share that my work establishing the first extensive benchmark for PEFT in medical imaging has been accepted at @midl_conference with scores 4, 4, 5 🥳 Link: https://t.co/EoE0bSK9gn cc: @tmh31 @STsaftaris @SnchzPedro_ @sillinuss @BioMedAI_CDT
3
7
36
I'm happy to announce the first competition of the #AutoML conference 2024 (https://t.co/WNvxJORDVX). Neural architecture search (#NAS) is one of the best-known subfields of AutoML and has enabled fantastic achievements in the last few years.
1
3
10
🚨FairTune: Optimizing PEFT for Fairness in Medical Image Analysis. A new framework for fine-tuning large vision models that improves downstream fairness. Accepted at #ICLR2024 ✨ With: @OBohdal @STsaftaris @tmh31 CC: @vivnat @alvarezvalle @fepegar_ @BoWang87 @BioMedAI_CDT
1
20
83
Excited to have been part of DemoFusion, bringing UHD generation to SDXL on your desktop with no training! With @RuoyiDu @yizhe_song @DL_Chang Project: https://t.co/ygOnvdSJtN, paper: https://t.co/OR77mOvHjD #GenerativeAI
4
7
27
🚨New TACL paper 🚨 Can we use explicit latent variable alignment for cross-lingual transfer? Minotaur uses optimal transport for explicit alignment across languages (see the sketch below). w/ @tomhosking, Mirella Lapata. To be presented at @emnlpmeeting + @mrl2023 in person 🇸🇬 https://t.co/PdoDPEIP97
1
14
57
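For readers unfamiliar with optimal transport, below is a minimal Sinkhorn sketch that soft-aligns two sets of token embeddings. It is a generic illustration of entropy-regularised OT, not the Minotaur implementation, and all data in it is synthetic.

```python
import numpy as np

def sinkhorn(cost, reg=0.1, n_iters=200):
    """Entropy-regularised optimal transport (Sinkhorn iterations)
    between two uniform distributions, given a pairwise cost matrix."""
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / reg)             # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iters):
        v = b / (K.T @ u)               # alternate marginal scalings
        u = a / (K @ v)
    return np.diag(u) @ K @ np.diag(v)  # transport plan

# Illustrative example: align 4 source-language token embeddings
# with 5 target-language token embeddings via cosine cost.
rng = np.random.default_rng(0)
src = rng.normal(size=(4, 16))
tgt = rng.normal(size=(5, 16))
src /= np.linalg.norm(src, axis=1, keepdims=True)
tgt /= np.linalg.norm(tgt, axis=1, keepdims=True)
cost = 1.0 - src @ tgt.T                # cosine distance
plan = sinkhorn(cost)
print(plan.round(3))                    # soft alignment; rows sum to 1/4
```

The resulting plan is a differentiable soft alignment matrix, which is what makes OT attractive as an explicit alignment component inside a trained model.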
📢New work on Self-Supervised Representation Learning📢 🎯Rather than using data augmentations to induce invariance, we use them to separate/disentangle (see the sketch below). ➡️Improves performance* on downstream tasks, since one task's "style" may be another's "content". https://t.co/MIOH1QcHYX
2
20
87
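A minimal sketch of the separate-rather-than-invariant idea, assuming a simple split into a "content" head (pulled together across augmented views) and a "style" head (trained to predict the augmentation parameters, so it must retain them). The architecture, loss weighting and parameter encoding here are illustrative, not the paper's.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DisentangledEncoder(nn.Module):
    """Toy encoder whose representation splits into content and style."""
    def __init__(self, dim=128, n_aug_params=4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU()
        )
        self.content_head = nn.Linear(256, dim)
        self.style_head = nn.Linear(256, dim)
        self.aug_predictor = nn.Linear(2 * dim, n_aug_params)

    def forward(self, x):
        h = self.backbone(x)
        return self.content_head(h), self.style_head(h)

def loss_fn(model, view1, view2, aug_params):
    c1, s1 = model(view1)
    c2, s2 = model(view2)
    # Content: invariant across the two augmented views.
    invariance = -F.cosine_similarity(c1, c2, dim=-1).mean()
    # Style: must recover the augmentation parameters relating the views.
    pred = model.aug_predictor(torch.cat([s1, s2], dim=-1))
    equivariance = F.mse_loss(pred, aug_params)
    return invariance + equivariance

model = DisentangledEncoder()
v1, v2 = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
params = torch.randn(8, 4)  # e.g. crop offsets, colour-jitter strengths
print(loss_fn(model, v1, v2, params).item())
```

A downstream task can then pick whichever subspace it needs: the content head for augmentation-invariant tasks, the style head when the "nuisance" factors are actually the signal.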
Is domain adaptation ready for automated machine learning methods? I'm presenting our paper "Better Practices for Domain Adaptation" at @automl_conf in poster sessions today at 16:00 and tomorrow at 10:45, and giving a best-paper talk today at 17:30! #AutoML2023
2
3
20
Finding yourself repeatedly training your ML model because you’ve collected more data? Not sure what to do with the hyperparameters? We investigated for you! Check out the paper https://t.co/D4AOSdK1g4 with the wonderful Huibin Shen, @FrancoisAubet, David Salinas and @kleiaaro
1
7
23
Interested in Bayesian optimisation and how it can be applied to the fight against climate change? Check out our new short survey. We look at materials discovery, wind-farm layout planning, optimal renewable control and environmental monitoring, and for each we suggest a benchmark to get started (see the sketch below).
2
5
12
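For those new to Bayesian optimisation, here is a minimal loop on a toy 1-D objective: fit a Gaussian-process surrogate, then pick the next evaluation by expected improvement. The objective and settings are invented for illustration and are unrelated to the survey's benchmarks.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy stand-in for an expensive score, e.g. a wind-farm layout metric.
    return -np.sin(3 * x) - x**2 + 0.7 * x

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    """EI acquisition for maximisation under the GP posterior."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(3, 1))          # initial random design
y = objective(X).ravel()
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(15):
    gp.fit(X, y)                              # refit surrogate
    cand = np.linspace(-2, 2, 200).reshape(-1, 1)
    x_next = cand[np.argmax(expected_improvement(cand, gp, y.max()))]
    X = np.vstack([X, [x_next]])              # evaluate and append
    y = np.append(y, objective(x_next))

print(f"best x={X[np.argmax(y)][0]:.3f}, f(x)={y.max():.3f}")
```

The appeal for climate applications is sample efficiency: when each evaluation is a simulation or a field experiment, a surrogate-guided search finds good solutions in far fewer trials than grid or random search.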
🚨Parameter-Efficient Fine-Tuning (PEFT) has been well researched for NLP, vision and cross-modal tasks, so why should medical AI be left behind? Presenting the first evaluation of PEFT for medical AI - https://t.co/tYIh7DLNN7 16 PEFT methods, 5 datasets, including a text-to-image task 🔥 (LoRA sketch below)
3
11
40
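As background, here is a minimal sketch of one canonical PEFT method, LoRA: freeze the pretrained weights and train only a low-rank additive update. It illustrates the general PEFT idea rather than any specific method from the benchmark.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a pretrained linear layer; train only a low-rank update A @ B."""
    def __init__(self, base: nn.Linear, rank=8, alpha=16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)   # freeze pretrained weights
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        # B starts at zero so the wrapped layer initially matches the base.
        self.A = nn.Parameter(torch.randn(base.in_features, rank) * 0.01)
        self.B = nn.Parameter(torch.zeros(rank, base.out_features))
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.A @ self.B) * self.scale

layer = LoRALinear(nn.Linear(768, 768))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable {trainable} / {total} params")  # only low-rank factors train
```

With rank 8 on a 768x768 layer, only about 2% of the parameters are updated, which is exactly the trade-off PEFT benchmarks like the one above set out to measure across methods and datasets.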
Just one week till our @NeurIPSConf social on ML and climate change! We’ll have icebreaker bingo, roundtable discussions and a friendly atmosphere.
1
2
7