
Simon Prince
@SimonPrinceAI
Professor of Computer Science, University of Bath
Toronto, Ontario · Joined July 2019
10K Followers · 289 Following · 50 Media · 369 Statuses
Last year I wrote seven tutorials for @RBCBorealis on infinite-width neural networks. Topics included the neural tangent kernel, Bayesian neural networks, and neural network Gaussian processes. Includes working code and many novel figures.
My friend @TylerJohnMills is looking for collaborators to work on the ARC-AGI competition. This benchmark is interesting and encourages creative approaches to AI. Tyler helped me with my book and would be a fun person to work with. Get in touch directly if you are interested.
RT @RBCBorealis: 👋 CALLING ALL STUDENTS! @RBCBorealis is excited to announce our 2025 Fall Technical #Coop Program. This is your chance to…
Exciting news! @TravisLacroix (who co-wrote the chapter on ethics in Understanding Deep Learning) has a new book out, "AI and Value Alignment". Recommended reading for anyone serious about ethics and AI. Details at: . Buy it here: .
RT @RBCBorealis: 👋 Work with us! At @RBCBorealis, we are at the forefront of #AI and #data. We build products and #technologies that shape…
Here is part III of my series for @RBCBorealis on ODEs and SDEs in machine learning. This article develops methods for solving first-order ODEs in closed form; it divides these ODEs into families and derives a solution approach for each family.
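To give a flavour of what "closed form" means here, a minimal sketch with sympy; the separable ODE below is my own illustrative example, not one taken from the article.

import sympy as sp

# A separable first-order ODE, one of the families such methods cover:
# dy/dt = -2*t*y(t) has the closed-form solution y(t) = C1 * exp(-t**2).
t = sp.symbols('t')
y = sp.Function('y')
ode = sp.Eq(y(t).diff(t), -2 * t * y(t))
print(sp.dsolve(ode, y(t)))  # Eq(y(t), C1*exp(-t**2))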
Here's the 2nd part of my series of articles on ODEs and SDEs in ML for @RBCBorealis. The article describes ODEs, vector ODEs, and PDEs and categorizes ODEs by how their solutions are related. It also describes conditions for an ODE to have a solution.
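As a small illustration of why such conditions matter (my own example, not one from the article): when the right-hand side of an ODE is not Lipschitz, the solution need not be unique.

import sympy as sp

# For t >= 0, both y(t) = 0 and y(t) = t**2/4 satisfy y' = sqrt(y) with
# y(0) = 0: sqrt(y) is not Lipschitz at y = 0, so uniqueness fails there.
t = sp.symbols('t', nonnegative=True)
for y in (sp.Integer(0), t**2 / 4):
    residual = sp.simplify(y.diff(t) - sp.sqrt(y))
    print(f"y = {y}: residual = {residual}, y(0) = {y.subs(t, 0)}")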
I'm starting a series of articles on ODEs and SDEs in ML for @RBCBorealis. I'll describe ODEs and SDEs without assuming prior knowledge and present applications including neural ODEs and diffusion models. Part I: Follow for parts II and III.
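For readers new to the topic, the core idea of a neural ODE fits in a few lines of numpy; the random dynamics network below is a stand-in for a trained one.

import numpy as np

# Neural-ODE idea in miniature: a state z evolves as dz/dt = f(z; theta),
# where f is a neural network. Here f is a single untrained tanh layer and
# we integrate with crude fixed-step Euler updates from t = 0 to t = 1.
rng = np.random.default_rng(0)
W = 0.5 * rng.standard_normal((2, 2))
b = 0.1 * rng.standard_normal(2)

def f(z):
    return np.tanh(W @ z + b)  # the "dynamics" network (random here)

z = np.array([1.0, 0.0])       # initial state z(0)
dt, steps = 0.01, 100
for _ in range(steps):
    z = z + dt * f(z)          # Euler step: z(t + dt) ≈ z(t) + dt * f(z(t))
print(z)                       # approximate z(1)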
These blogs for @RBCBorealis consider infinite-width networks from four viewpoints: we use either gradient descent or a Bayesian approach, and focus on either the weights or the output function. This leads to the neural tangent kernel, Bayesian NNs, and NNGPs. Enjoy!
Tutorial 4 of 4 on Bayesian methods in ML for @RBCBorealis concerns Neural Network Gaussian Processes (links in comments). Think your network might perform better if you increased the width? NNGPs are networks with INFINITE width! Includes code to train and run them.
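The closed form is striking enough to show in miniature. Assuming a standard setup (one hidden ReLU layer, suitable weight scaling, bias terms omitted), the infinite-width prior is a GP with the arc-cosine kernel of Cho & Saul, and "running" the network amounts to GP regression; the toy data below is mine, not from the tutorial.

import numpy as np

# NNGP kernel for an infinitely wide one-hidden-layer ReLU network
# (arc-cosine kernel of degree 1; bias and variance scaling omitted).
def nngp_kernel(X, Y):
    nx = np.linalg.norm(X, axis=1)[:, None]
    ny = np.linalg.norm(Y, axis=1)[None, :]
    cos = np.clip((X @ Y.T) / (nx * ny), -1.0, 1.0)
    theta = np.arccos(cos)
    return (nx * ny / np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

X = np.array([[1.0, 0.2], [0.5, -1.0], [-0.3, 0.8]])  # toy training inputs
y = np.array([0.5, -0.1, 0.9])                        # toy targets
Xs = np.array([[0.4, 0.4]])                           # toy test input

K = nngp_kernel(X, X) + 1e-6 * np.eye(len(X))         # jitter for stability
print(nngp_kernel(Xs, X) @ np.linalg.solve(K, y))     # GP posterior mean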
Extremely kind words from @justinskycak about "Understanding Deep Learning". Justin himself has a host of useful resources for learning math for ML (see the links in his post) and an interesting summary of the science of learning. See
Blog 3 of 4 on Bayesian methods in ML for @RBCBorealis concerns Bayesian neural networks (i.e., Bayesian methods for NNs from a parameter-space perspective). Parts 1 and 2 (linked in the article) introduced Bayesian methods. Coming soon in part 4: NNGPs.
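The parameter-space perspective is easiest to see in the one model where it is exactly tractable: Bayesian linear regression, where the posterior over the weights is Gaussian in closed form. BNNs pursue the same object for network weights, where exact computation is no longer possible. A minimal sketch with my own toy data, not code from the blog:

import numpy as np

# Bayesian linear regression: prior w ~ N(0, I/alpha), noise ~ N(0, 1/beta).
# The posterior over weights is N(w_mean, inv(A)) with A and w_mean below.
rng = np.random.default_rng(1)
X = rng.standard_normal((20, 3))             # toy design matrix
w_true = np.array([0.5, -1.0, 2.0])
y = X @ w_true + 0.1 * rng.standard_normal(20)

alpha, beta = 1.0, 100.0                     # prior and noise precisions
A = alpha * np.eye(3) + beta * X.T @ X       # posterior precision matrix
w_mean = beta * np.linalg.solve(A, X.T @ y)  # posterior mean over weights
print(w_mean)                                # close to w_true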
This is an interesting idea: reprints of the most important AI papers, together with a discussion and sometimes even comments from the original authors. If your work isn't being cited much, you might want to consider what these papers all have in common.
NEW BOOK: The Artificial Intelligence Papers: Original Research Papers With Tutorial Commentaries. Table of Contents and Chapter 1: "An intellectual string of pearls." (Karl Friston, FRS)