🥳Lecture 2 is out!
🤓Steerable G-CNNs
I did my best to keep it as intuitive as possible; went all out on colorful figs/gifs/eqs, even went through the hassle of rewriting harmonic nets
@danielewworrall
as regular G-CNNs to make a case!😅Hope y'all like it!
@erikjbekkers
@danielewworrall
In my estimation, DL became widely accessible only after the teaching materials (like CS231n in 2016) caught up. This series of lectures is going to do much the same for equivariant learning :)
@neeldey
@danielewworrall
I hope it lives up to these expectations😅 But on a serious note, I do find it important to share knowledge and understanding, not just in condensed (conference) paper format, which often expects the reader to already be an expert, and I am happy to see that this is appreciated😃
@erikjbekkers
@erikjbekkers
I finally got round to watching this and what a wonderful set of lectures they are. It's quite the honour to have H-Nets showcased at the end too! For me this is now THE reference on G-convs and equivariance
@erikjbekkers
@danielewworrall
Awesome! I wasn't super aware or conscious of the perspective that "everything's a special case of regular group convolution", so I definitely learned something here :-)
@erikjbekkers
@danielewworrall
And harmonic networks are always great as a first step. When I wanted to find a general method to solve the kernel constraint, I also started with harmonic networks --- everything else was literally just "scaling up the ideas".
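The "everything is a special case of regular group convolution" perspective mentioned above can be illustrated with a tiny NumPy sketch (a lifting convolution over the 4-fold rotation group C4; the names `corr2d` and `lift_conv_c4` are mine, not from the lectures): correlating the input with every rotated copy of the filter yields one feature plane per group element, and rotating the input then rotates each plane while cyclically shifting the group axis.

```python
import numpy as np

def corr2d(x, w):
    """Valid-mode 2D cross-correlation (plain loops for clarity)."""
    H, W = x.shape
    k = w.shape[0]
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

def lift_conv_c4(x, w):
    """Lifting G-conv over C4: one output channel per 90-degree
    rotation of the filter, stacked along a new group axis."""
    return np.stack([corr2d(x, np.rot90(w, r)) for r in range(4)])

# Equivariance check: rotating the input rotates each output plane
# and shifts the group axis by one step.
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 6))   # square input so rot90 keeps its shape
w = rng.standard_normal((3, 3))
out = lift_conv_c4(x, w)
out_rot = lift_conv_c4(np.rot90(x), w)
for r in range(4):
    assert np.allclose(out_rot[r], np.rot90(out[(r - 1) % 4]))
```

This is only a sketch of the regular-representation viewpoint; steerable G-CNNs replace the explicit stack of rotated filters with basis filters satisfying a kernel constraint.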