lpjiang97 Profile Banner
Linxing Preston Jiang Profile
Linxing Preston Jiang

@lpjiang97

Followers
328
Following
1K
Media
21
Statuses
323

PhD student @uwcse interested in theoretical neuroscience. Also @lpjiang97.bsky.social

Seattle, WA
Joined March 2015
@lpjiang97
Linxing Preston Jiang
2 months
I'm excited to share our latest work, "Data Heterogeneity Limits the Scaling Effect of Pretraining in Neural Data Transformers", where we carefully examined the effect of scaling up pretraining data in neural foundation models. 🧠 (1/9). Preprint:
1
13
32
@lpjiang97
Linxing Preston Jiang
2 months
This is joint work with @ChinSengi, Iman Tanumihardja, @XiaochuangHan, @WeijiaShi2, Eric Shea-Brown, and @RajeshPNRao. Please check out the preprint for more details. Any feedback is appreciated! (9/9).
0
0
2
@lpjiang97
Linxing Preston Jiang
2 months
Together, our results show that pretraining with more sessions does not naturally lead to improved downstream performance. We advocate for rigorous scaling analyses in future work on neural foundation models to account for data heterogeneity effects. (8/9).
1
0
1
@lpjiang97
Linxing Preston Jiang
2 months
We note that similar results have been found in NDT3 by @_JoelYe, where several downstream datasets enjoyed little benefit from scale with 100 minutes of finetuning data. (7/9).
1
0
2
@lpjiang97
Linxing Preston Jiang
2 months
We found that models trained on as few as five top-ranked sessions outperformed those trained on randomly chosen sessions, even when the latter used the full dataset, demonstrating the impact of session-to-session variability on performance scaling. (6/9)
Tweet media one
1
0
2
@lpjiang97
Linxing Preston Jiang
2 months
For the forward-prediction task that did exhibit consistent scaling, we identified implicit data heterogeneity arising from cross-session variability. We proposed a session-selection procedure based on single-session finetuning performance. (5/9)
Tweet media one
1
1
1
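The session-selection idea above can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual implementation: session IDs, scores, and the function name are made up, and the finetuning scores would come from evaluating each session individually.

```python
def select_top_sessions(finetune_scores: dict[str, float], k: int = 5) -> list[str]:
    """Rank sessions by single-session finetuning score and keep the top k.

    finetune_scores maps a session ID to its downstream performance after
    finetuning on that session alone (higher is better).
    """
    ranked = sorted(finetune_scores, key=finetune_scores.get, reverse=True)
    return ranked[:k]


# Illustrative scores only; real values come from per-session finetuning runs.
scores = {
    "sess_a": 0.71,
    "sess_b": 0.64,
    "sess_c": 0.82,
    "sess_d": 0.58,
    "sess_e": 0.77,
}
print(select_top_sessions(scores, k=3))  # best-scoring sessions first
```

Pretraining would then proceed on only the selected sessions, rather than on all available data.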
@lpjiang97
Linxing Preston Jiang
2 months
In this work, we systematically investigate how data heterogeneity impacts the scaling behavior of neural data transformers. We first found that brain region mismatches across sessions reduced the scaling benefits for neuron-level and region-level activity prediction. (4/9)
Tweet media one
1
1
1
@lpjiang97
Linxing Preston Jiang
2 months
Yet, previous studies typically lack fine-grained data scaling analyses. It remains unclear whether all sessions contribute equally to downstream performance gains. This is especially important to understand as pretraining scales to thousands of sessions and hours of data. (3/9).
1
0
1
@lpjiang97
Linxing Preston Jiang
2 months
Neural foundation models are gaining increasing attention these days, with the potential to learn cross-session/animal/species representations and benefit from multi-session pretraining. (2/9).
1
0
1
@lpjiang97
Linxing Preston Jiang
3 months
RT @WeijiaShi2: Our previous work showed that creating visual chain-of-thoughts via tool use significantly boosts GPT-4o's visual reasoning….
0
40
0
@lpjiang97
Linxing Preston Jiang
4 months
RT @rockpang6: I'm honored and humbled to receive the @IBM Ph.D. fellowship! Thank you to my advisor @_doctor_kat, all my friends, mentors….
0
2
0
@lpjiang97
Linxing Preston Jiang
6 months
RT @JieyuZhang20: Excited to share my intern project at Salesforce Research! Huge thanks to everyone on the team!!.
0
15
0
@lpjiang97
Linxing Preston Jiang
7 months
RT @WeijiaShi2: Introducing LlamaFusion: empowering Llama 🦙 with diffusion 🎨 to understand and generate text and images in arbitrary sequen….
0
175
0
@lpjiang97
Linxing Preston Jiang
7 months
RT @cloudwaysX: I will be at #NeurIPS2024! 🚨 On the academic/industry job market this year 🚨 and excited to catch up in person! My research foc….
0
9
0
@lpjiang97
Linxing Preston Jiang
11 months
Bring attention to the communications committee listed on the website: @neuro_kim @smfleming @somnirons.
0
0
0
@lpjiang97
Linxing Preston Jiang
1 year
RT @RajeshPNRao: Excited to announce that Nature Neuroscience @NatureNeuro has published my article "A sensory-motor theory of the neocorte….
0
90
0
@lpjiang97
Linxing Preston Jiang
1 year
RT @RajeshPNRao: Final version of our Dynamic Predictive Coding paper with @lpjiang97 is now out in @PLOSCompBiol - explains V1 space-time….
0
23
0
@lpjiang97
Linxing Preston Jiang
1 year
RT @ylecun: The 2024 Brain Prize goes to pioneers of computational and theoretical neuroscience: Larry Abbott, @HSompolinsky, and Terry Sej….
0
39
0
@lpjiang97
Linxing Preston Jiang
1 year
A huge thanks to @RajeshPNRao for supporting me in this process, and to @sathish_vishwas, Dimi Gklezakos, Ares Fisher, and @DaogaoLiu for discussions! Any feedback is welcome!
0
0
1
@lpjiang97
Linxing Preston Jiang
1 year
A deeper three-level model shows the capability of DPC to capture progressively longer temporal regularities in the data. Coincidentally, it offers a neural solution to the exact question posed by MacKay in his classic "The Epistemological Problem for Automata" (1954)!
1
0
2