feather
@feather_ai
Join our community for ML news, talks and tutorials: https://t.co/im4B6qjIxt
Joined January 2021
How positional encoding looks once it has been applied to an embedding
Positional Encoding allows us to inject information about the order of the sequence into an otherwise position-indifferent architecture. The original Transformer does this by injecting sinusoids into the word embeddings. https://t.co/toAhkzsnAj
#ArtificialIntelligence Info:👇
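The sinusoidal scheme described above can be sketched in a few lines of NumPy; the function name and shapes here are illustrative, not taken from the series' own code:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoids from "Attention Is All You Need":
    #   PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    #   PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(0, d_model, 2)[None, :]      # even dimension indices
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)               # even dims get sine
    pe[:, 1::2] = np.cos(angles)               # odd dims get cosine
    return pe

# The encoding is injected by simple addition:
#   x = word_embeddings + positional_encoding(seq_len, d_model)
```

Because each dimension oscillates at a different frequency, every position gets a distinct pattern that the otherwise order-blind attention layers can read.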
In this Transformers from scratch episode, we build and code up the Encoder Layer in its entirety: https://t.co/Tgw0MXVwRr
#ArtificialIntelligence #NLProc #MachineLearning #DataScience
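A full encoder layer is just the two sublayers composed, each wrapped in a residual connection and layer normalization. A minimal single-head sketch (simplified from the multi-head version, with learnable scale/shift omitted; names are illustrative):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def layer_norm(x, eps=1e-5):
    # Normalize each position's vector to zero mean / unit variance
    return (x - x.mean(-1, keepdims=True)) / np.sqrt(x.var(-1, keepdims=True) + eps)

def encoder_layer(x, p):
    # Sublayer 1: self-attention, then residual + layer norm
    Q, K, V = x @ p["Wq"], x @ p["Wk"], x @ p["Wv"]
    attn = softmax(Q @ K.T / np.sqrt(Q.shape[-1])) @ V
    x = layer_norm(x + attn)
    # Sublayer 2: position-wise feed-forward, then residual + layer norm
    h = np.maximum(0, x @ p["W1"] + p["b1"]) @ p["W2"] + p["b2"]
    return layer_norm(x + h)
```

Stacking N copies of this layer (with separate weights) gives the full encoder.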
Transformers: Residual Connections, Layer Normalization, and Position-Wise Feedforward Networks. In this episode, we take a look at the components shared across all transformer sublayers: https://t.co/Ic8xtqJC72
#ArtificialIntelligence #nlproc
Transformers: Multi-Head Attention. In this episode, we extend the self-attention mechanism from the previous lesson to the multi-head variant. Includes the theory and code. https://t.co/j9jdQPUu1O
#NLProc #ArtificialIntelligence #MachineLearning #DataScience
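The multi-head variant runs scaled dot-product attention in several lower-dimensional subspaces in parallel, then concatenates the results. A hedged NumPy sketch (single sequence, no masking or batching; names are illustrative):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo, num_heads):
    seq_len, d_model = X.shape
    d_k = d_model // num_heads

    # Project, then split the model dimension into num_heads subspaces
    def project_and_split(W):
        return (X @ W).reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)

    Q, K, V = project_and_split(Wq), project_and_split(Wk), project_and_split(Wv)
    # Scaled dot-product attention in every head independently
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (heads, seq, seq)
    heads = softmax(scores) @ V                        # (heads, seq, d_k)
    # Concatenate heads back to d_model and apply the output projection
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo
```

Each head can attend to different relationships (e.g. syntax vs. proximity), which a single attention map cannot capture at once.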
Transformers: Self-Attention. In this video tutorial series, we take a look at the heart of the Transformer: the self-attention mechanism. After an intuitive breakdown of how it works, we code up the mechanism. https://t.co/Q7ygAhBjy3
#NLProc #ArtificialIntelligence #ML #Data
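The core of the mechanism fits in a few lines: every position builds a query, key, and value, and the output at each position is a softmax-weighted mix of all the values. An illustrative NumPy sketch (single head, no masking; names are not from the series' code):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project the same input into queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)        # (seq, seq)
    weights = softmax(scores)              # each row sums to 1
    # Each output is a weighted average of the value vectors
    return weights @ V, weights
```

The sqrt(d_k) scaling keeps the dot products from saturating the softmax as the key dimension grows.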