Sacha Lerch
@LerchSacha
Followers
63
Following
63
Media
0
Statuses
20
PhD student @EPFL in physics / Quantum Information and Quantum Computing
Joined September 2022
Have you ever wondered how much work can be offloaded from quantum computers when simulating an expectation landscape of a parametrized quantum circuit🤔? In our new work “Efficient quantum-enhanced classical simulation for patches of quantum landscapes” we tackle this question.
1
9
51
Our group's work on the trainability of quantum generative models was published today in @npj Quantum Information! We establish a framework that categorizes types of generative loss functions, allowing us to prove surprisingly general trainability results. 🧵👇
1
4
31
Link: https://t.co/7m9xOvhCwp. Thanks to my lovely co-authors @LerchSacha, @s_thanasilp, @oriel_kiss, @GrosQmichi, Sofia Vallecorsa, and @qZoeHolmes. Special shout-out to our Master's student Oxana Shaya who proved cool new results during the review stage.
nature.com
npj Quantum Information - Trainability barriers and opportunities in quantum generative modeling
0
3
9
So happy to see the group's very first work finally being published 🥰 Indeed, it's been a long, fun run. Big congrats to @LerchSacha and Oxana on their first publication 🎉🥳
Our group's work on the trainability of quantum generative models was published today in @npj Quantum Information! We establish a framework that categorizes types of generative loss functions, allowing us to prove surprisingly general trainability results. 🧵👇
0
3
10
Super fun discussions today! I love traveling to places to meet and talk to great people 🔥
Great pleasure having an entire crowd from the team around @qZoeHolmes visiting us here in Berlin, discussing notions of variational #quantumalgorithms.
2
3
27
A new classical simulation kid is on the block 👲 And we are benchmarking it against everyone's favorite example. Via classical surrogate simulation of 127-qubit systems, we can evaluate new circuit angles in fractions of a second on a laptop. https://t.co/GKghcO8nMU 🧵1/7
8
24
109
Summer in Lausanne 🧡
0
3
69
Aaaand the final day of the #QCQS workshop with Michele Grossi (@GrosQmichi) from @CERN. After a great dinner last night, we are happy to see so many (awake) guests in our morning session.
0
3
10
Such a great collaboration between #EPFL and #CERN on #quantumcomputing. I really enjoyed working with this research group, looking forward to future steps! @CERNquantum
What makes quantum generative models trainable? And can trainable loss functions be used to reliably learn complicated data distributions? Find out in our new group's first push into the field of generative QML. https://t.co/4FJFsGMjva
https://t.co/wxYJauE3ET 1/8 🧵
0
2
8
What's the best loss to train a quantum generative model? Take a look at our new work: https://t.co/Vmgf8FHb1P
Punchline:
⛔️ Don't use the KL divergence.
⛔️ Don't use the MMD with a constant bandwidth.
☑️ Use the MMD with a range of O(1) to O(n) bandwidths.
See 🧵👇
What makes quantum generative models trainable? And can trainable loss functions be used to reliably learn complicated data distributions? Find out in our new group's first push into the field of generative QML. https://t.co/4FJFsGMjva
https://t.co/wxYJauE3ET 1/8 🧵
4
4
57
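For readers curious what the multi-bandwidth MMD recommendation looks like in practice, here is a minimal NumPy sketch. The Gaussian kernel and the particular bandwidth set are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def gaussian_kernel(x, y, bandwidths):
    """Sum of Gaussian kernels over a range of bandwidths sigma."""
    # Pairwise squared distances between rows of x and rows of y.
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return sum(np.exp(-d2 / (2.0 * s**2)) for s in bandwidths)

def mmd2(X, Y, bandwidths):
    """Biased estimator of the squared MMD between samples X and Y."""
    kxx = gaussian_kernel(X, X, bandwidths).mean()
    kyy = gaussian_kernel(Y, Y, bandwidths).mean()
    kxy = gaussian_kernel(X, Y, bandwidths).mean()
    return kxx + kyy - 2.0 * kxy
```

With a range of bandwidths, no single scale has to match the data: samples from the same distribution give an MMD near zero, while well-separated distributions give a clearly positive value.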
Blackboards with Mont Blanc in the background. This can only lead to interesting discussions next week! Quantum-Classical Quantum Simulation (#QCQS workshop) @EPFL_en @BernoulliCenter
Looking forward to learning and exchanging ideas with our international guests at the #QCQS workshop next week in Lausanne. Blackboards this clean need to be used ☀️📓🖋️ Find out more on https://t.co/Hlu67i0KUf
0
3
8
I'm so happy for my first paper! Thank you to @QuantumManuel, @s_thanasilp and @qZoeHolmes for these beautiful moments. I've learned so much and had a lot of fun with this dream team, and it's just the beginning... 😶🌫️❤⚛️🥳🔥
What makes quantum generative models trainable? And can trainable loss functions be used to reliably learn complicated data distributions? Find out in our new group's first push into the field of generative QML. https://t.co/4FJFsGMjva
https://t.co/wxYJauE3ET 1/8 🧵
0
0
6
What makes quantum generative models trainable? And can trainable loss functions be used to reliably learn complicated data distributions? Find out in our new group's first push into the field of generative QML. https://t.co/4FJFsGMjva
https://t.co/wxYJauE3ET 1/8 🧵
1
12
53
It was a blast working on this with @LerchSacha @s_thanasilp @oriel_kiss @GrosQmichi, Sofia Vallecorsa and of course @qZoeHolmes! We learned so much and had great fun along the way. What a way to live 🧡 8/8
1
1
7
Eager to meet you all in Lausanne. Let's definitely fill those clean blackboards with innovative ideas! 🖊️#QCQS
Looking forward to learning and exchanging ideas with our international guests at the #QCQS workshop next week in Lausanne. Blackboards this clean need to be used ☀️📓🖋️ Find out more on https://t.co/Hlu67i0KUf
0
4
7
CALL FOR PAPERS QTML 2023: 7th International Conference on Quantum Techniques in Machine Learning https://t.co/6hPgHk5OCA 🇨🇭CERN, Nov. 19-24, 2023. Local organizers: @GrosQmichi, Sofia Vallecorsa & CERN Quantum Technology Initiative. Program chairs: @Alessdip & myself.
0
27
70
🚨EXCLUSIVE WEBINAR🚨 THURSDAY 13th April 2023 / 4pm - 5pm+ (CEST) 👉Registration https://t.co/LC7XGTwp9M
0
3
3
Suppose you have an unknown target unitary you want to learn a model of… Suppose you have no way of coherently interacting/performing joint quantum measurements on your target and model systems… How well can you do? We address this here https://t.co/50gr8tZrRb See 🧵👇
1
12
106