marcel ⊙

@marceldotsci

Followers: 468 · Following: 1K · Media: 57 · Statuses: 1K

Trying to build better machine-learning surrogate models for DFT. Postdoc @lab_cosmo 👨🏻‍🚀.

Lausanne, Switzerland
Joined November 2019
@marceldotsci
marcel ⊙
9 months
Our (@spozdn & @MicheleCeriotti) recent work on testing the impacts of using approximate, rather than exact, rotational invariance in a machine-learning interatomic potential of water has been published by @MLSTjournal! We find that, for bulk water, approximate invariance is ok.
@MLSTjournal
Machine Learning: Science and Technology
9 months
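A minimal sketch of the kind of invariance check described above: rigidly rotate a configuration many times and measure the spread of a model's energy predictions. The predict_energy callable is a hypothetical stand-in for whatever potential is being tested, not the paper's code.

import numpy as np

def random_rotation(rng):
    # Draw a random proper rotation matrix via QR decomposition.
    A = rng.normal(size=(3, 3))
    Q, R = np.linalg.qr(A)
    Q = Q * np.sign(np.diag(R))   # make the distribution uniform over orthogonal matrices
    if np.linalg.det(Q) < 0:      # ensure det(Q) = +1, i.e. no reflection
        Q[:, 0] *= -1
    return Q

def rotational_invariance_error(predict_energy, positions, n_rotations=64, seed=0):
    # Standard deviation of predicted energies over random rigid rotations.
    # An exactly invariant model returns ~0 (up to floating point); an
    # approximately invariant one returns a small but nonzero spread.
    rng = np.random.default_rng(seed)
    energies = [predict_energy(positions @ random_rotation(rng).T)
                for _ in range(n_rotations)]
    return float(np.std(energies))

The paper's question, as summarized in the tweet, is whether that residual spread is small enough not to matter for bulk-water simulations.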
@marceldotsci
marcel ⊙
1 month
happy to report that our paper is accepted for an oral presentation at ICML. amazing work by Filippo, who will present it in vancouver! final version, with some extra content, here:
openreview.net
The use of machine learning to estimate the energy of a group of atoms, and the forces that drive them to more stable configurations, has revolutionized the fields of computational chemistry and...
@marceldotsci
marcel ⊙
7 months
new work! we follow up on the topic of testing which physical priors matter in practice. this time, it seems that predicting non-conservative forces, which has a 2x-3x speedup, leads to serious problems in simulation. we run some tests and discuss mitigations!
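A toy illustration (not the paper's setup) of why directly predicted, non-conservative forces cause trouble: forces obtained as the negative gradient of an energy conserve total energy in NVE dynamics, whereas a force field with a non-gradient (curl) component does net work on the system. Here a Lennard-Jones trimer stands in for an MLIP energy head, and a small solenoidal term mimics the inconsistency a direct force head can have.

import numpy as np

def lj_energy(pos, eps=1.0, sigma=1.0):
    # Lennard-Jones energy of a small cluster (toy stand-in for a learned energy).
    d = pos[:, None, :] - pos[None, :, :]
    r = np.sqrt((d ** 2).sum(-1) + np.eye(len(pos)))   # eye() avoids r = 0 on the diagonal
    inv6 = (sigma / r) ** 6
    pair = 4.0 * eps * (inv6 ** 2 - inv6)
    np.fill_diagonal(pair, 0.0)
    return 0.5 * pair.sum()

def conservative_forces(pos, h=1e-5):
    # F = -dE/dx by central finite differences (a real model would use autograd).
    f = np.zeros_like(pos)
    for i in range(pos.shape[0]):
        for k in range(3):
            plus, minus = pos.copy(), pos.copy()
            plus[i, k] += h
            minus[i, k] -= h
            f[i, k] = -(lj_energy(plus) - lj_energy(minus)) / (2.0 * h)
    return f

def direct_forces(pos):
    # Toy "non-conservative" prediction: the true forces plus a small solenoidal
    # term with nonzero curl, so the result is not the gradient of any energy.
    swirl = 0.05 * np.stack([-pos[:, 1], pos[:, 0], np.zeros(len(pos))], axis=1)
    return conservative_forces(pos) + swirl

def nve_energy_drift(force_fn, pos, steps=2000, dt=1e-3):
    # Velocity-Verlet with unit masses; returns |E_total(end) - E_total(start)|.
    vel = np.zeros_like(pos)
    f = force_fn(pos)
    e0 = lj_energy(pos) + 0.5 * (vel ** 2).sum()
    for _ in range(steps):
        vel += 0.5 * dt * f
        pos = pos + dt * vel
        f = force_fn(pos)
        vel += 0.5 * dt * f
    return abs(lj_energy(pos) + 0.5 * (vel ** 2).sum() - e0)

trimer = np.array([[0.00, 0.00, 0.0],
                   [1.12, 0.00, 0.0],
                   [0.56, 0.97, 0.0]])   # near-equilibrium LJ trimer
print("energy drift, conservative forces:", nve_energy_drift(conservative_forces, trimer))
print("energy drift, direct forces:      ", nve_energy_drift(direct_forces, trimer))

With these settings the conservative run drifts only at the level of integrator error, while the solenoidal component steadily pumps energy into the cluster; the tweet's concern is that a learned direct-force head can carry this kind of non-gradient part.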
@marceldotsci
marcel ⊙
1 month
👀
@vdbergrianne
Rianne van den Berg
1 month
🚀 After two+ years of intense research, we're thrilled to introduce Skala, a scalable deep learning density functional that hits chemical accuracy on atomization energies and matches hybrid-level accuracy on main group chemistry, all at the cost of semi-local DFT. ⚛️🔥🧪🧬
@marceldotsci
marcel ⊙
1 month
obvious in hindsight but news to me: accessing np.load-ed arrays by index is orders of magnitude slower than forcing them to RAM.
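The tweet does not say which loading mode is involved; a plausible reading is a memory-mapped .npy (np.load with mmap_mode="r") or a lazily decompressed .npz member. A rough timing sketch with an arbitrary file name and array size:

import time
import numpy as np

path = "big_array.npy"                        # placeholder file for the demo
np.save(path, np.random.rand(200_000, 128))   # ~200 MB of float64

lazy = np.load(path, mmap_mode="r")           # stays on disk, paged in on access
idx = np.random.randint(0, lazy.shape[0], size=50_000)

t0 = time.perf_counter()
_ = lazy[idx]                                 # fancy indexing straight on the memmap
t_lazy = time.perf_counter() - t0

in_ram = np.array(lazy)                       # force a full copy into RAM
t0 = time.perf_counter()
_ = in_ram[idx]                               # same indexing on an in-memory array
t_ram = time.perf_counter() - t0

print(f"memmap indexing: {t_lazy:.3f} s   in-RAM indexing: {t_ram:.3f} s")

The gap is largest when the file is not already in the OS page cache; copying the array into memory pays the I/O cost once, after which indexing is a pure in-memory operation.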
@marceldotsci
marcel ⊙
1 month
ok maybe i just decline further requests to provide reviews then
@marceldotsci
marcel ⊙
4 months
this is rather sick.
@n_gao96
Nicholas Gao
4 months
I am truly excited to share our latest work with @MScherbela, @GrohsPhilipp, and @guennemann on "Accurate Ab-initio Neural-network Solutions to Large-Scale Electronic Structure Problems"!
@marceldotsci
marcel ⊙
4 months
👀
@MarkNeumannnn
Mark Neumann
4 months
Excited to announce Orb-v3, a new family of universal Neural Network Potentials from me and my team at @OrbMaterials, led by Ben Rhodes and @sanderhaute! These new potentials span the Pareto frontier of models for computational chemistry.
@marceldotsci
marcel ⊙
4 months
👀
@ChengBingqing
Bingqing Cheng
4 months
Guess what? By learning from energies and forces, machine learning interatomic potentials can now infer electrical responses like polarization and BECs! This means we can perform MLIP MD simulations under electric fields!
@marceldotsci
marcel ⊙
4 months
👀
@FrankNoeBerlin
Frank Noe
4 months
Simulating full quantum mechanical ground- and excited state surfaces with deep quantum Monte Carlo by Zeno Schätzle, Bernat Szabo and Alice Cuzzocrea. 🧵⬇️
@marceldotsci
marcel ⊙
4 months
RT @lab_COSMO: 🤫 you can get a better universal #machinelearning potential by training on fewer than 100k structures. too good to be true?…
@marceldotsci
marcel ⊙
5 months
Chatty G seems very uninformed about JAX. Or it knows something I don't.
@marceldotsci
marcel ⊙
5 months
👀
@ShuiwangJi
Shuiwang Ji
5 months
1/5: Our new materials foundation model, HIENet, combines invariant and equivariant message passing layers to achieve SOTA performance and efficiency on materials benchmarks and downstream tasks.
@marceldotsci
marcel ⊙
5 months
can stefan be stopped? it's unclear. congrats!
@stevain
Dr. St❦fan Gugler
5 months
We have a new paper on diffusion! 📄 Faster diffusion models with total variance/signal-to-noise ratio disentanglement! ⚡️ Our new work shows how to generate stable molecules in sometimes as little as 8 steps and match EDM's image quality with a uniform time grid. 🧵
@marceldotsci
marcel ⊙
5 months
👀
@ask1729
Aditi Krishnapriyan
5 months
1/ Machine learning force fields are hot right now 🔥: models are getting bigger + being trained on more data. But how do we balance size, speed, and specificity? We introduce a method for distilling large-scale MLFFs into fast, specialized MLFFs!
@marceldotsci
marcel ⊙
5 months
👀
@acaruso0
Alessandro Caruso
5 months
We just shared our latest work with @CecClementi and @FrankNoeBerlin! RANGE addresses the "short-sightedness" of MPNNs via virtual aggregations, boosting the accuracy of MLFFs at long range with linear time-scaling. Next-gen force fields are here! 🧠🚀
@marceldotsci
marcel ⊙
6 months
anyone at the lausanne applied ML days?
@marceldotsci
marcel ⊙
6 months
the recording of the talk i gave a while back about the "invariance is easy to learn" paper with @spozdn and @MicheleCeriotti is out!
@OrbMaterials
Orbital
6 months
We hosted a webinar series all about unlocking the future of #AI in materials science, and we've added the recordings to YouTube for anyone who wasn't able to attend. Check out episode 1 here, hosted by @marceldotsci and Jonathan Schmidt:
@marceldotsci
marcel ⊙
6 months
apparently there is a race condition in the shutdown procedure 😔