Explore tweets tagged as #gradientdescent
At #Computing2025, Thomas Nowotny explained how spiking neural networks mimic the brain 🧠—from input to output layers where spikes and dynamics interact. #ThomasNowotny #Computing2025 #GradientDescent #AIResearch #DeepLearning #MachineLearning #NeuralNetworks #BrainInspiredAI
https://t.co/tsglnXD5uJ A song about a toxic and traumatic relationship 🎵 MNEMONIKK - Gradient Descent Baby #electroclash #electropop #gradientdescent #toxic #traumatic
https://t.co/HAexhuh5al
https://t.co/tsglnXD5uJ
#electropop 🎶 About a Traumatic Relationship 🎵 MNEMONIKK - Gradient Descent Baby #gradientdescent #machinelearning #ArtificialIntelligence
#creatorsearchinsights
https://t.co/HAexhuh5al
🚀 Unravel the nuances of Gradient Descent in our latest video lecture focused on Linear Regression! Learn how its variants—Batch, Stochastic, and MiniBatch—affect model training efficiency. Essential for optimizing machine learning models. 🧠📈 #GradientDescent #MachineLearning
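The three variants named in the lecture differ only in how many samples feed each update. A minimal sketch in Python (the toy data and hyperparameters are illustrative assumptions, not taken from the video):

```python
import numpy as np

# Toy data for y = 2x + 1 (illustrative, not from the lecture)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 2 * X + 1 + rng.normal(0, 0.01, size=100)

def train(batch_size, lr=0.1, epochs=200):
    """One loop covers all three variants: batch size 100 = Batch GD,
    1 = Stochastic GD, anything in between = Mini-Batch GD."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        idx = rng.permutation(len(y))
        for start in range(0, len(y), batch_size):
            sl = idx[start:start + batch_size]
            err = w * X[sl] + b - y[sl]          # residuals on this batch
            w -= lr * (err @ X[sl]) / len(sl)    # MSE gradient w.r.t. w
            b -= lr * err.mean()                 # MSE gradient w.r.t. b
    return w, b

for bs in (100, 1, 16):                          # BGD, SGD, MBGD
    w, b = train(bs)
    print(f"batch_size={bs:3d} -> w={w:.3f}, b={b:.3f}")
```

All three recover parameters near (2, 1); the batch size only changes the cost per step and the noise in the trajectory.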
I've had an idea for an IK solver for a long while. Start with a simple curve like a quadratic Bezier. Have the user choose the end pos & control pos, then move the ctrl so the curve matches the arc length of a bone chain, then FABRIK it. Need to learn error calc, Newton's method & gradient descent #math
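The arc-length-matching step in the idea above can be prototyped with a numerical (finite-difference) gradient before learning the analytic error calculus. A rough sketch, assuming the control point slides along a single fixed direction (a hypothetical simplification of the general 2-D search):

```python
import numpy as np

def bezier_arc_length(p0, p1, p2, samples=200):
    """Approximate quadratic-Bezier arc length by polyline sampling."""
    t = np.linspace(0.0, 1.0, samples)[:, None]
    pts = (1 - t)**2 * p0 + 2 * (1 - t) * t * p1 + t**2 * p2
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

def match_arc_length(p0, p2, ctrl_dir, target_len, lr=0.5, steps=100):
    """Slide the control point along ctrl_dir by gradient descent on the
    squared arc-length error, using a finite-difference gradient."""
    mid = (p0 + p2) / 2
    s, h = 1.0, 1e-4   # start off-axis: s = 0 is a symmetric stationary point
    err = lambda sv: (bezier_arc_length(p0, mid + sv * ctrl_dir, p2) - target_len)**2
    for _ in range(steps):
        g = (err(s + h) - err(s - h)) / (2 * h)
        s -= lr * g
    return mid + s * ctrl_dir

# Bone chain of total length 4.0 (hypothetical), endpoints 3.0 apart
p0, p2 = np.array([0.0, 0.0]), np.array([3.0, 0.0])
ctrl = match_arc_length(p0, p2, np.array([0.0, 1.0]), target_len=4.0)
print(bezier_arc_length(p0, ctrl, p2))   # close to 4.0
```

Once the lengths match, FABRIK can fit the bone chain to the sampled curve points.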
The only winner in an AI arms race is the AI. #AI #ArtificialIntelligence #intelligenceexplosion #GradientDescent
The extremely healthy dynamic between Era, my psionic character, and Dodger, my friend's space-soldier character. The TTRPG is #gradientdescent
Day 15 of #100DaysOfCode ↳ DSA progress: child sum property in a binary tree ↳ Learned about gradient descent ↳ Built the auth of Amnesecure #dsa #java #LearnInPublic #BuildInPublic #AtoZDSA #ML #GradientDescent
If you believe that tarot cards “work”, then you should also believe that artificial intelligence is already proto-conscious because gradient descent is a parallel panpsychic “mechanism.” #tarot #AI #gradientdescent
🚀 Ever wondered how AI actually learns? Gradient Descent is the key! 🧠⚡ Watch this quick guide to understand how machines optimize and get smarter. 🎥 Watch the full video here: https://t.co/Xd4RzzxZoe
#AI #MachineLearning #GradientDescent #DeepLearning #DataScience
21 September 2025, 17:04 GMT+2. Log loss would amount to favoring doubt about the art and the method over an experiment that has already reached a conclusion but does not satisfy everyone. Here is the brief explanation and its optimization constraints #LBFGS #newton #gradientDescent
#paperfromEBM Estimation of Foot Trajectory and Stride Length during Level Ground Running Using Foot-Mounted Inertial Measurement Units https://t.co/zkq5706iDS
#runningspeed #wearablesensors #gradientdescent
#Optimization: Mini-Batch Gradient Descent 🎉📊 ✅Mini-Batch #GradientDescent (MBGD) is a powerful technique that strikes a balance between the computational efficiency of Stochastic Gradient Descent (SGD) & the stability of Batch Gradient Descent (BGD) 🎯💡 Source - Prudhvi Vardhan 🧵
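One way to see the SGD-vs-BGD trade-off this thread describes: mini-batch gradients are unbiased estimates of the full-batch gradient, with noise that shrinks as the batch grows. A small sketch with made-up data (all names and sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
w_true = np.arange(1.0, 6.0)
y = X @ w_true + rng.normal(0, 0.1, size=1000)
w = np.zeros(5)                       # gradients evaluated at the start point

def grad(idx):
    """MSE gradient of y_hat = X @ w over the rows in idx."""
    err = X[idx] @ w - y[idx]
    return X[idx].T @ err / len(idx)

full = grad(np.arange(1000))          # BGD: exact, but touches all 1000 rows
for bs in (1, 32, 1000):
    est = np.stack([grad(rng.choice(1000, bs, replace=False))
                    for _ in range(300)])
    # The estimates are unbiased (mean near the full gradient) and their
    # spread shrinks as the batch size grows: SGD is cheap but noisy.
    print(f"bs={bs:4d}  bias={np.linalg.norm(est.mean(axis=0) - full):.3f}"
          f"  spread={est.std(axis=0).mean():.3f}")
```

MBGD sits in the middle: most of BGD's stability at a fraction of its per-step cost.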
#Optimizers & the challenges of using #GradientDescent in #DeepLearning! Key issues: learning-rate selection, local minima, vanishing/exploding gradients, convergence speed. Understanding these is crucial for robust models! 🚀#AI #MachineLearning #DataScience #TechInsights
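The first issue listed, learning-rate selection, shows up even on the simplest possible objective. A toy sketch minimizing f(x) = x², where the update multiplies x by (1 - 2·lr) each step and is therefore stable only while |1 - 2·lr| < 1:

```python
# Minimizing f(x) = x**2 (gradient 2x). The update x -= lr * 2 * x multiplies
# x by (1 - 2*lr) each step, so it converges only while |1 - 2*lr| < 1,
# i.e. 0 < lr < 1; beyond that the iterates blow up.
def descend(lr, steps=50, x=1.0):
    for _ in range(steps):
        x -= lr * 2 * x
    return x

for lr in (0.1, 0.45, 0.9, 1.1):
    print(f"lr={lr}: x after 50 steps = {descend(lr):.3g}")
```

At lr=0.9 the iterates still converge but oscillate in sign; at lr=1.1 they diverge, which is the one-dimensional version of the exploding-gradient failure mode.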
Day 123 – Data Science Journey -> IG (like SGD) -> subsets for gradient approx -> CRAIG: coreset-based IG, speeds convergence -> Error bound: subset ≈ full data -> CRAIG algo: greedy submodular pick + weights = efficient gradient est. #100DaysOfCode #ML #GradientDescent
2D Gaussian Splatting Training https://t.co/eADsjxoVb1
#shaderlanguage #computergraphics #vulkan #machinelearning #webgpu #neuralgraphics #neuraltechniques #gradientdescent #cudaeducation
Day 130 - Data Science Journey: -> Practiced Chain Rule: dL/dW = dL/dA * dA/dZ * dZ/dW – Multiply Layers to Backprop Errors -> Sigmoid Gradient: ∇σ(z) = σ(z)(1 - σ(z)): Fine-Tune Activations for Deeper Nets #MachineLearning #NeuralNetworks #GradientDescent #DataScience
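The two formulas in that day's notes combine into a one-neuron backward pass. A minimal sketch, checked against a finite difference (the squared-error loss L = (a - t)²/2 and all numeric values are illustrative choices, not from the thread):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One neuron: z = w*x + b, a = sigmoid(z), L = (a - t)**2 / 2
x, t = 1.5, 1.0            # input and target (made-up values)
w, b = 0.2, 0.0

z = w * x + b
a = sigmoid(z)
dL_da = a - t              # dL/dA for L = (a - t)**2 / 2
da_dz = a * (1 - a)        # sigmoid gradient: sigma(z) * (1 - sigma(z))
dL_dw = dL_da * da_dz * x  # chain rule: dL/dW = dL/dA * dA/dZ * dZ/dW
dL_db = dL_da * da_dz

# Finite-difference check on dL/dW
h = 1e-6
L = lambda wv: (sigmoid(wv * x + b) - t)**2 / 2
num = (L(w + h) - L(w - h)) / (2 * h)
print(dL_dw, num)          # analytic and numeric gradients should match
```

Stacking more layers just multiplies more local derivatives into the same chain.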
🗓️Day 14 of #Deeplearning ▫️ Topic - Gradient Descent & #Vectorization ✅#GradientDescent is a first-order optimization technique used to find a local minimum of a loss function. It is also known as a parameter-optimization technique. A complete 🧵
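Vectorization, the second topic in that thread, replaces the per-sample Python loop with one matrix product; both compute the same MSE gradient. A small sketch with synthetic data (shapes and timings are illustrative):

```python
import numpy as np
from time import perf_counter

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))
y = rng.normal(size=5000)
w = np.zeros(20)

def grad_loop():
    """MSE gradient accumulated one sample at a time."""
    g = np.zeros(20)
    for i in range(len(y)):
        err = X[i] @ w - y[i]
        g += err * X[i]
    return g / len(y)

def grad_vec():
    """The same gradient as a single matrix expression."""
    return X.T @ (X @ w - y) / len(y)

t0 = perf_counter(); g1 = grad_loop(); t1 = perf_counter()
g2 = grad_vec(); t2 = perf_counter()
print(np.allclose(g1, g2),
      f"loop {t1 - t0:.4f}s vs vectorized {t2 - t1:.4f}s")
```

The vectorized form is what makes each gradient-descent step cheap enough to run thousands of times.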