I'll present Manifold Diffusion Fields as an oral at the Diffusion Models workshop @NeurIPSConf, today 2:30 pm New Orleans time!! Work with @YuyangW95, @jsusskin, and @itsbautistam. Don't forget to join, and let us know your questions and feedback!
📢 Very excited to share our work Graph Anisotropic Diffusion, which I will present today at two #ICLR2022 workshops (GTRL and MLDD)! 🎉🥳🥳🥳 Extremely grateful to my advisor @mmbronstein and my collaborators @GabriCorso and @HannesStaerk 🤗
Personal news: Happy to announce that I've joined @Apple MLR as a research intern! 🎉🥳 I'll be in the #Cupertino office, joining the team of @jsusskin. So excited about this, and looking forward to working with an awesome team at Apple!
Very excited to present Manifold Diffusion Fields (MDF): a generalization of diffusion generative models to continuous functions defined on manifolds!! Work done during my internship with a wonderful team, @jsusskin and @itsbautistam, at @Apple MLR! 🥳
Introducing Manifold Diffusion Fields (MDF), our new work on learning generative models over fields defined on curved geometries. This is joint work with our intern @Ahmed_AI035 (who hasn't even started his PhD yet!) and @jsusskin at @Apple MLR 🧵
Following our #GDL series, in a new post at @TDataScience coauthored with @Mohamed87290109, we show that symmetry alone is not sufficient to break the curse of dimensionality.
A new post at @TDataScience discusses the concept of symmetry in ML and GDL, covering mathematical ideas such as abstract groups, group actions, and group representations, and ending with invariant and equivariant networks in deep learning.
Happy to present Manifold Diffusion Fields at @HannesStaerk's reading group today at 4 pm UK time! If you are interested in generative models for 3D meshes and graphs, join us in ~2h!
I won't say it's the end, but rather a new start! I had a great time with wonderful people at @Apple MLR; I really enjoyed it! Special thanks to @jsusskin and @itsbautistam!
Exciting news coming soon!
In this paper, we propose a linear diffusion layer with a learnable kernel size combined with an anisotropic filter, in a novel GNN architecture that we call Graph Anisotropic Diffusion (GAD).
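To make the idea concrete, here is a minimal PyTorch sketch of my own (not the paper's code): a diffusion step exp(-tL) with a learnable diffusion time t playing the role of the "kernel size", approximated by a truncated Taylor series, followed by a toy anisotropic aggregation that combines mean- and max-pooled neighbour features. The class names and the truncation order K are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GraphDiffusionLayer(nn.Module):
    """Sketch (not the official GAD code): diffuses node features with
    exp(-t L) for a learnable diffusion time t, approximated by a
    K-term truncated Taylor series."""

    def __init__(self, num_terms: int = 4):
        super().__init__()
        self.log_t = nn.Parameter(torch.zeros(1))  # learnable "kernel size" t > 0
        self.num_terms = num_terms

    def forward(self, x: torch.Tensor, lap: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, channels), lap: dense graph Laplacian (num_nodes, num_nodes)
        t = torch.exp(self.log_t)
        out, term = x, x
        for k in range(1, self.num_terms + 1):
            term = (-t / k) * (lap @ term)  # next Taylor term of exp(-t L) x
            out = out + term
        return out

class AnisotropicAggregation(nn.Module):
    """Toy directional aggregation: concatenates mean- and max-aggregated
    neighbour features (a stand-in for the anisotropic filter in the paper).
    Assumes adj includes self-loops so every node has >= 1 neighbour."""

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
        mean_agg = (adj @ x) / deg
        masked = x.unsqueeze(0).expand(adj.size(0), -1, -1).clone()
        masked[adj == 0] = float("-inf")       # ignore non-neighbours in the max
        max_agg = masked.max(dim=1).values
        return torch.cat([mean_agg, max_agg], dim=-1)

# Toy usage on a 5-node random graph (illustrative only).
N, C = 5, 8
adj = (torch.rand(N, N) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()
adj.fill_diagonal_(0)                          # no self-loops in the Laplacian
lap = torch.diag(adj.sum(-1)) - adj            # L = D - A
x = torch.randn(N, C)
h = GraphDiffusionLayer()(x, lap)
agg = AnisotropicAggregation()(h, adj + torch.eye(N))  # self-loops for max-agg
print(agg.shape)                               # torch.Size([5, 16])
```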
1/n New preprint alert! Introducing Generative Molecular Conformer Fields (MCF), a generative model for molecular conformer generation that obtains state-of-the-art results without using any domain-specific inductive biases!
Very sad to see war breaking out in my country Sudan between the Rapid Support Forces and the Sudanese Armed Forces. Even worse, this is happening within cities and states, starting with Khartoum.
MDF achieves superior results and is able to learn different distributions of functions over diverse geometries with high-quality samples. Example: MDF can generate a moving Gaussian mixture at different locations on the paw and tail of the cat geometry.
Its victims are innocent, defenseless residents who have nothing to do with power or politics. May Allah stop the bloodshed. We fully support the Sudanese Armed Forces in maintaining security and stability in the country.
These posts are based on the project "Implicit Neural Representation (INR) based on the Geometric Information of Shapes" during SGI 2022, under the mentorship of Dena Bazazian and Shaimaa Abdelhafez.
In this post, we review some of the basics of statistical learning tasks and the curse of dimensionality, and lastly we introduce geometric domains and their assumptions on the input data.
In MDF, we represent training samples as functions that map from a manifold M to a signal space Y. We then use the eigenfunctions of the Laplacian as a local coordinate system for points of M, which constitutes an intrinsic representation.
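A minimal sketch of that coordinate system (my own illustration, not the paper's implementation): build the graph Laplacian of a discretized manifold and represent each point by the values of the first k eigenfunctions at that point. The function name and the cycle-graph toy manifold are assumptions for the demo.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def laplacian_eigenfunction_coords(adj: sp.spmatrix, k: int = 8) -> np.ndarray:
    """Return a (num_nodes, k) array: node i is represented by the values
    of the first k Laplacian eigenfunctions at i, i.e. an intrinsic
    coordinate system on the discretized manifold (sketch only)."""
    deg = np.asarray(adj.sum(axis=1)).ravel()
    lap = sp.diags(deg) - adj                  # combinatorial graph Laplacian L = D - A
    # Shift-invert around a small negative sigma robustly finds the
    # smallest (smoothest) eigenpairs of the singular Laplacian.
    vals, vecs = eigsh(lap, k=k, sigma=-0.01, which="LM")
    return vecs

# Toy manifold: a cycle graph (a discretized circle).
n = 100
rows = np.arange(n)
adj = sp.coo_matrix((np.ones(n), (rows, (rows + 1) % n)), shape=(n, n))
adj = (adj + adj.T).tocsr()
coords = laplacian_eigenfunction_coords(adj, k=4)
print(coords.shape)  # (100, 4); the columns look like Fourier modes on the circle
```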
Based on the INR concept, we then discuss and compare the Sinusoidal Representation Network (SIREN) and its enhanced version, Divergence-Guided Shape Implicit Neural Representation (DiGS), from their respective papers.
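For context, here is a minimal SIREN layer following the original SIREN paper (Sitzmann et al., 2020); this is a sketch of the published technique, not the SGI project's code: a linear layer followed by sin(omega_0 · x), with the initialization scheme the paper prescribes.

```python
import math
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """One SIREN layer: y = sin(omega_0 * (W x + b)).
    Weight initialization follows the SIREN paper."""

    def __init__(self, in_features, out_features, omega_0=30.0, is_first=False):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)
        with torch.no_grad():
            if is_first:
                bound = 1.0 / in_features                      # first-layer init
            else:
                bound = math.sqrt(6.0 / in_features) / omega_0  # hidden-layer init
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))

# Tiny SIREN mapping 3D coordinates to a scalar field value.
siren = nn.Sequential(
    SineLayer(3, 128, is_first=True),
    SineLayer(128, 128),
    nn.Linear(128, 1),
)
print(siren(torch.rand(10, 3)).shape)  # torch.Size([10, 1])
```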
In later posts, we will explain in detail how the two properties, Symmetry and Scale Separation, allow us to develop a GDL Blueprint that can serve as a framework for current state-of-the-art architectures.
We will discuss the so-called Geometric Domains, or the 5 Gs, which include Grids, Groups, Graphs, Geodesics, and Gauges, and their appropriate structure in the pipeline of the GDL Blueprint.
Then we discuss the scale separation prior, how it arises from multiscale structure, and how it is crucial to breaking the curse. Finally, we conclude with the GDL Blueprint as a general framework that can be applied to various geometric domains.
In this post, we review the definition of the word symmetry among various mathematicians, touching on some historical context, the emergence of the Erlangen Programme, and how it came to deep learning under the name Geometric Deep Learning.
Happy for any comments!