feather

@feather_ai

Followers
222
Following
36
Media
30
Statuses
69

Join our community for ML news, talks and tutorials: https://t.co/im4B6qjIxt

Joined January 2021
@feather_ai
feather
4 years
Tired of spending time and money on copywriting and editing images? https://t.co/uLQIJ14Wb0
0
1
5
@feather_ai
feather
4 years
Code:
1
0
0
@feather_ai
feather
4 years
How positional encoding looks once it has been applied to an embedding
1
0
0
@feather_ai
feather
4 years
Theory:
1
0
0
@feather_ai
feather
4 years
Positional Encoding allows us to inject information about the order of the sequence into an otherwise position-indifferent architecture. The original Transformer does this by injecting sinusoids into the word embeddings. https://t.co/toAhkzsnAj #ArtificialIntelligence Info:👇
1
2
2
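A minimal sketch of the sinusoidal encoding described in the tweet above, written here in PyTorch (the function name, the `max_len`/`d_model` parameters, and the toy embedding shapes are illustrative assumptions, not the series' exact code):

import math
import torch

def sinusoidal_positional_encoding(max_len, d_model):
    # Column of positions (max_len, 1) and row of frequencies (d_model/2,)
    position = torch.arange(max_len, dtype=torch.float).unsqueeze(1)
    div_term = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float) * (-math.log(10000.0) / d_model))
    pe = torch.zeros(max_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)  # sine on even dimensions
    pe[:, 1::2] = torch.cos(position * div_term)  # cosine on odd dimensions
    return pe

# Inject order information by adding the encoding to the word embeddings.
embeddings = torch.randn(50, 512)                 # (seq_len, d_model) toy embeddings
embeddings = embeddings + sinusoidal_positional_encoding(50, 512)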
@feather_ai
feather
4 years
Code:
0
0
0
@feather_ai
feather
4 years
Theory:
1
0
0
@feather_ai
feather
4 years
In this Transformers from scratch episode, we look at how we can build and code up the Encoder Layer in its entirety: https://t.co/Tgw0MXVwRr #ArtificialIntelligence #NLProc #MachineLearning #DataScience
1
1
2
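For reference, a compact sketch of a complete encoder layer in PyTorch. It uses torch's built-in nn.MultiheadAttention as a stand-in for the attention module built earlier in the series, and the default sizes (d_model=512, 8 heads, d_ff=2048) are assumptions:

import torch
import torch.nn as nn

class EncoderLayer(nn.Module):
    """One encoder layer: self-attention and a feed-forward block, each wrapped in residual + layer norm."""
    def __init__(self, d_model=512, n_heads=8, d_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.feed_forward = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # Sublayer 1: multi-head self-attention, residual connection, layer norm
        attn_out, _ = self.self_attn(x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm1(x + self.dropout(attn_out))
        # Sublayer 2: position-wise feed-forward, residual connection, layer norm
        x = self.norm2(x + self.dropout(self.feed_forward(x)))
        return x

layer = EncoderLayer()
out = layer(torch.randn(2, 10, 512))   # (batch, seq_len, d_model)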
@feather_ai
feather
4 years
Code:
0
0
0
@feather_ai
feather
4 years
Theory:
0
0
0
@feather_ai
feather
4 years
Transformers: Residual Connections, Layer Normalization, and Position-Wise Feedforward Networks. In this episode, we take a look at the components shared across all Transformer sublayers: https://t.co/Ic8xtqJC72 #ArtificialIntelligence #nlproc
2
0
0
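A hedged sketch of those shared components in PyTorch; the class names and default sizes are assumptions used for illustration, and the residual wrapper follows the original paper's post-norm ordering:

import torch
import torch.nn as nn
import torch.nn.functional as F

class PositionWiseFeedForward(nn.Module):
    """Two linear maps applied identically and independently at every position."""
    def __init__(self, d_model=512, d_ff=2048, dropout=0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_ff)
        self.fc2 = nn.Linear(d_ff, d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        return self.fc2(self.dropout(F.relu(self.fc1(x))))

class ResidualLayerNorm(nn.Module):
    """Residual connection followed by layer normalization, wrapping any sublayer."""
    def __init__(self, d_model=512, dropout=0.1):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, sublayer):
        # Post-norm: add the sublayer output to its input, then normalize
        return self.norm(x + self.dropout(sublayer(x)))

ffn = PositionWiseFeedForward()
wrap = ResidualLayerNorm()
out = wrap(torch.randn(2, 10, 512), ffn)   # shape (batch, seq_len, d_model) is preserved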
@feather_ai
feather
4 years
Coding up the layers:
0
0
0
@feather_ai
feather
4 years
Theory:
1
0
0
@feather_ai
feather
4 years
Transformers: Multi-head attention. In this episode, we extend the self-attention mechanism from the previous lesson to the multi-head variant. Includes the theory and code. https://t.co/j9jdQPUu1O #NLProc #ArtificialIntelligence #MachineLearning #DataScience
1
0
2
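A minimal sketch of multi-head attention in PyTorch, assuming the standard d_model=512 with 8 heads; the class and parameter names are illustrative, not the series' exact code:

import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Run scaled dot-product attention in n_heads parallel subspaces, then recombine."""
    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        batch = query.size(0)
        # Project and split: (batch, seq_len, d_model) -> (batch, n_heads, seq_len, d_head)
        def split(x, proj):
            return proj(x).view(batch, -1, self.n_heads, self.d_head).transpose(1, 2)
        q, k, v = split(query, self.w_q), split(key, self.w_k), split(value, self.w_v)
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        heads = torch.softmax(scores, dim=-1) @ v
        # Concatenate the heads and project back to d_model
        heads = heads.transpose(1, 2).contiguous().view(batch, -1, self.n_heads * self.d_head)
        return self.w_o(heads)

mha = MultiHeadAttention()
x = torch.randn(2, 10, 512)
out = mha(x, x, x)   # self-attention: query, key and value are the same tensor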
@feather_ai
feather
4 years
Transformers: Self-attention. In this video tutorial series, we take a look at the heart of the Transformer, the self-attention mechanism. After an intuitive breakdown of how it works, we code up the mechanism. https://t.co/Q7ygAhBjy3 #NLProc #ArtificialIntelligence #ML #Data
1
2
4
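The mechanism in a few lines of PyTorch, as a hedged sketch; the function name, projection matrices, and toy dimensions are assumptions for illustration:

import torch

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a (seq_len, d_model) input."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v        # project the input into queries, keys, values
    scores = q @ k.T / k.size(-1) ** 0.5       # how strongly each position attends to every other
    weights = torch.softmax(scores, dim=-1)    # each row of attention weights sums to 1
    return weights @ v                         # weighted mix of the value vectors

x = torch.randn(10, 64)                        # 10 tokens, 64-dimensional embeddings
w_q, w_k, w_v = (torch.randn(64, 64) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)         # (10, 64)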
@feather_ai
feather
4 years
Code:
0
0
0
@feather_ai
feather
4 years
Step-by-step run-through
1
0
0
@feather_ai
feather
4 years
Visual mathematical breakdown
1
0
0
@feather_ai
feather
4 years
Snippets: intuition behind self-attention
1
0
0