Linnea Evanson, PhD Profile

Linnea Evanson, PhD
@EvansonLinnea

Followers: 361 · Following: 23 · Media: 12 · Statuses: 27

Postdoc | Former PhD Student @ École Normale Supérieure | Former Research Scientist Intern @ Meta AI | Studying how AI and our brains learn language!

Joined January 2021
@EvansonLinnea
Linnea Evanson, PhD
2 months
RT @JeanRemiKing: Very happy to see our latest study featured by @AIatMeta: 'Emergence of Language in the Developing Brain', by @Evanson…
@EvansonLinnea
Linnea Evanson, PhD
2 months
RT @JeanRemiKing: Very happy to see our latest study featured as a highlight by @AIatMeta: 'Emergence of Language in the Developing Brain'…
@EvansonLinnea
Linnea Evanson, PhD
2 months
🙏 We would like to thank all the participants and their families for taking part in this study, as well as the doctors, technicians and nurses at the Rothschild Foundation Hospital, and the team at MetaAI for their support.
@EvansonLinnea
Linnea Evanson, PhD
2 months
Together, these findings reveal the maturation of language representations in the developing brain and show that modern AI systems provide a promising tool to model the neural bases of language acquisition, and thus help both fundamental and clinical neuroscience.
@EvansonLinnea
Linnea Evanson, PhD
2 months
Remarkably, this neurodevelopmental trajectory is spontaneously captured by large language models: with training, these AI models learned representations that can only be identified in the adult human brain.
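To make that comparison concrete: one common way to ask whether a model's representations resemble neural ones, and how that changes with training, is representational similarity analysis. The sketch below is purely illustrative, with random placeholder arrays standing in for electrode responses and for model hidden states at hypothetical checkpoints; it is not the study's pipeline.

```python
# A minimal sketch (not the authors' pipeline): representational similarity
# analysis comparing hypothetical model activations at several training
# checkpoints against hypothetical neural responses to the same words.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_words, n_electrodes, model_dim = 200, 64, 768

# Placeholder data: electrode responses (words x electrodes) and
# model hidden states (words x dim) at three hypothetical checkpoints.
neural = rng.standard_normal((n_words, n_electrodes))
checkpoints = {step: rng.standard_normal((n_words, model_dim))
               for step in (1_000, 10_000, 100_000)}

neural_rdm = pdist(neural, metric="correlation")  # pairwise word dissimilarities
for step, acts in checkpoints.items():
    model_rdm = pdist(acts, metric="correlation")
    rho, _ = spearmanr(neural_rdm, model_rdm)
    print(f"checkpoint {step}: brain-model RDM similarity rho = {rho:.3f}")
```

With real data, a rising similarity across checkpoints would indicate that training moves the model's representational geometry toward the neural one.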
@EvansonLinnea
Linnea Evanson, PhD
2 months
Crucially, these language representations evolve with age: while fast phonetic features are already present in the superior temporal gyrus of the youngest individuals, slower word-level representations only emerge in the associative cortices of older individuals.
@EvansonLinnea
Linnea Evanson, PhD
2 months
We find that a hierarchy of linguistic features (phonemes, words) is robustly represented across the cortex, even in 2–5-year-olds.
@EvansonLinnea
Linnea Evanson, PhD
2 months
Here, we study neural activity recorded from over 7,400 electrodes clinically implanted in the brains of 46 patients, aged from 2 years old to adulthood, as they listened to an audiobook version of “The Little Prince”. We then use neural encoding and decoding models to map the
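As a rough illustration of what neural encoding and decoding models do, here is a minimal cross-validated ridge-regression sketch: the encoding direction predicts each electrode's response from stimulus features, and the decoding direction predicts a stimulus feature from all electrodes. All arrays are simulated placeholders, not the study's data or its exact method.

```python
# A minimal sketch of encoding/decoding models with cross-validated ridge
# regression; the data are simulated stand-ins, not real recordings.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features, n_electrodes = 500, 50, 8

features = rng.standard_normal((n_samples, n_features))    # e.g. phoneme/word features
brain = features @ rng.standard_normal((n_features, n_electrodes)) \
        + rng.standard_normal((n_samples, n_electrodes))   # simulated electrode activity

cv = KFold(n_splits=5, shuffle=True, random_state=0)
alphas = np.logspace(-2, 4, 7)

# Encoding: stimulus features -> each electrode's response.
for e in range(n_electrodes):
    r2 = cross_val_score(RidgeCV(alphas=alphas), features, brain[:, e],
                         cv=cv, scoring="r2").mean()
    print(f"electrode {e}: encoding R^2 = {r2:.2f}")

# Decoding: all electrodes -> one stimulus feature.
r2 = cross_val_score(RidgeCV(alphas=alphas), brain, features[:, 0],
                     cv=cv, scoring="r2").mean()
print(f"decoding feature 0 from electrodes: R^2 = {r2:.2f}")
```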
@EvansonLinnea
Linnea Evanson, PhD
2 months
The human brain is a remarkable learner: A few million words suffice for children to acquire language. Yet, the brain architecture underlying this unique ability remains poorly understood.
@EvansonLinnea
Linnea Evanson, PhD
2 months
We’re very pleased to release our latest study ‘Emergence of Language in the Developing Brain’. Paper: Blog: The first systematic investigation of how the neural representations of language evolve as the brain develops. A
@EvansonLinnea
Linnea Evanson, PhD
2 years
RT @JeanRemiKing: 'Do children and language models follow similar learning stages?' by @EvansonLinnea et al. Thread:
@EvansonLinnea
Linnea Evanson, PhD
2 years
RT @OleJensenCHBH: Join our online workshop: Neuro(AI) for Developmental Research (Friday Dec 8 at 2:45 GMT)!!! Ke…
@EvansonLinnea
Linnea Evanson, PhD
2 years
I was thrilled to present my work on "How Language Representations Change Between 3 and 19 years old" at the NeuroAI Developmental Research workshop in Birmingham. Thank you very much to the organisers for the invitation!
@B_Pomiechowska
Barbara Pomiechowska 🇪🇺
2 years
OPM MEG & neuro(AI) for Developmental Research workshop at @UoB_SoP @UoB_CDS @TheCHBH truly was a blast! Huge thanks to our speakers @CarolineWitton Ania Kowalczyk @QN_lab Margot Taylor Jianbo Jiao @EvansonLinnea @vayzenberg90 @JeanRemiKing for beautiful inspiring talks 💛 1/2
@EvansonLinnea
Linnea Evanson, PhD
2 years
RT @B_Pomiechowska: OPM MEG & neuro(AI) for Developmental Research workshop at @UoB_SoP @UoB_CDS @TheCHBH truly was a blast! Huge thanks t…
@EvansonLinnea
Linnea Evanson, PhD
2 years
Presenting a poster on this paper @aclmeeting today! If you're registered to ACL you can join virtually here at 11am EDT:
@EvansonLinnea
Linnea Evanson, PhD
2 years
🔥 New paper accepted to ACL 2023! "Language acquisition: do children and language models follow similar learning stages?" With @lakretz and @JeanRemiKing. Very happy to share this work from my internship at @MetaAI! Three key results below 👇 1/8
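For context, linguistic probes of this kind are typically zero-shot minimal-pair tests: a model passes an item if it assigns higher probability to the grammatical sentence than to its ungrammatical twin. The sketch below uses the publicly released gpt2 checkpoint and two made-up example pairs as stand-ins; the paper's own models, checkpoints, and test items differ.

```python
# A minimal sketch of a zero-shot minimal-pair probe; the sentence pairs are
# illustrative, not the paper's items, and "gpt2" stands in for one of the
# 48 models trained in the study.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def sentence_logprob(sentence: str) -> float:
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)
    # loss is the mean negative log-likelihood per predicted token;
    # multiply by the number of predicted tokens to get a total log-prob.
    return -out.loss.item() * (ids.shape[1] - 1)

pairs = [  # (grammatical, ungrammatical) -- illustrative items only
    ("The cat chases the dog.", "The cat chase the dog."),
    ("Who did the boy see?", "Who the boy did see?"),
]
accuracy = sum(sentence_logprob(g) > sentence_logprob(b) for g, b in pairs) / len(pairs)
print(f"probe accuracy: {accuracy:.2f}  (chance = 0.50)")
```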
@EvansonLinnea
Linnea Evanson, PhD
2 years
Overall, the systematic order of the learning trajectory of deep nets, and the similarity of the stages learned by the models and children, suggest an intrinsic hierarchy of linguistic structures that both machines and humans must climb to master the faculty of language. 8/8
@EvansonLinnea
Linnea Evanson, PhD
2 years
💡3⃣: We saw that for this particular subset of three linguistic phenomena, 46 of our 48 GPT-2 models learned these skills in the same order as children: simple subject-verb order sentences were learned first, followed by wh-questions, and finally relative clauses. 7/8
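One simple way to read an acquisition order off training curves is to define, for each phenomenon, the first checkpoint at which probe accuracy clears chance, then sort the phenomena by that onset and compare with the child order. The accuracy curves and the 5% margin below are made up for illustration; the paper's exact criterion may differ.

```python
# A minimal sketch (made-up accuracy curves) of reading off an acquisition
# order and comparing it with the child order reported by Friedmann et al.
import numpy as np

steps = np.array([1, 2, 5, 10, 20, 50, 100])  # hypothetical checkpoints (x1000 steps)
accuracy = {                                   # hypothetical probe accuracies per checkpoint
    "simple_sv":       np.array([0.50, 0.62, 0.75, 0.83, 0.88, 0.90, 0.91]),
    "wh_question":     np.array([0.49, 0.51, 0.58, 0.70, 0.80, 0.85, 0.88]),
    "relative_clause": np.array([0.50, 0.50, 0.52, 0.55, 0.65, 0.74, 0.80]),
}

def acquisition_step(acc, chance=0.5, margin=0.05):
    """First checkpoint where accuracy exceeds chance by a fixed margin."""
    above = np.flatnonzero(acc > chance + margin)
    return steps[above[0]] if above.size else np.inf

onsets = {probe: acquisition_step(acc) for probe, acc in accuracy.items()}
model_order = sorted(onsets, key=onsets.get)
child_order = ["simple_sv", "wh_question", "relative_clause"]
print("model order:", model_order, "| matches child order:", model_order == child_order)
```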
@EvansonLinnea
Linnea Evanson, PhD
2 years
Friedmann et al. (2021) described 3 distinct stages of language acquisition in 2–6-year-old children, based on acquisition of the left periphery. We select one test from each stage to help us answer: ❔3⃣: Is the learning trajectory of LLMs and children similar? 6/8
@EvansonLinnea
Linnea Evanson, PhD
2 years
💡2⃣: We grouped the linguistic probes that were learned above chance into 3 empirical groups, and plotted their learning trajectories. We found that the models learned linguistic skills in parallel, not in sequence. 5/8
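A sketch of how "learned above chance" might be decided for each probe: a binomial test of final-checkpoint item accuracy against the 50% chance level of a two-choice minimal-pair task. The item counts and the 0.01 threshold below are illustrative assumptions, not the paper's numbers; once flagged, the above-chance probes are the ones whose trajectories get grouped and overlaid.

```python
# A minimal sketch (illustrative counts) of flagging probes learned above chance
# with a one-sided binomial test against the 50% chance level.
from scipy.stats import binomtest

n_items = 200  # hypothetical number of minimal pairs per probe
final_correct = {"simple_sv": 182, "wh_question": 176,
                 "relative_clause": 158, "hard_probe_x": 104}  # hypothetical correct counts

learned = []
for probe, k in final_correct.items():
    p = binomtest(k, n_items, p=0.5, alternative="greater").pvalue
    if p < 0.01:
        learned.append(probe)
    print(f"{probe}: {k}/{n_items} correct, p = {p:.1e}")

print("probes learned above chance:", learned)
```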
@EvansonLinnea
Linnea Evanson, PhD
2 years
The discrete stages of acquisition in children could be explained by "sequential" (a complex skill does not start to be learned before simpler skills are mastered) or "parallel" (all skills learned simultaneously) dynamics. ❔2⃣: Do LLMs learn sequentially or in parallel? 4/8