Zirui Chen Profile

Zirui Chen (@ziruichen44)

Followers 53 · Following 10 · Media 2 · Statuses 10

PhD student @jhucogsci

Baltimore, MD
Joined July 2021
Zirui Chen @ziruichen44 · 11 months
RT @michaelfbonner: Can we gain a deep understanding of neural representations through dimensionality reduction? Our new work shows that th….
arxiv.org
How does the human brain encode complex visual information? While previous research has characterized individual dimensions of visual representation in cortex, we still lack a comprehensive...
0 replies · 35 reposts · 0 likes
Zirui Chen @ziruichen44 · 1 year
There’s much more in the paper, including the large influence of universal dimensions on conventional similarity measures like RSA. Check it out!
0 replies · 0 reposts · 3 likes
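The RSA mentioned above (representational similarity analysis) compares two systems by correlating their representational dissimilarity matrices (RDMs). The following is a minimal numpy sketch of that idea on synthetic data, not the paper's actual analysis code; all names and the toy setup are hypothetical.

```python
import numpy as np

def rdm(features):
    """Correlation-distance RDM, returned as the flattened upper triangle.

    features: (n_stimuli, n_units) activation matrix.
    """
    z = features - features.mean(axis=1, keepdims=True)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    corr = z @ z.T                           # pairwise Pearson correlations
    iu = np.triu_indices(features.shape[0], k=1)
    return (1.0 - corr)[iu]                  # dissimilarity = 1 - r

def rsa_similarity(features_a, features_b):
    """Pearson correlation between the two systems' RDMs."""
    return np.corrcoef(rdm(features_a), rdm(features_b))[0, 1]

# Toy demo: two "models" built from the same latent structure should
# produce similar RDMs; unrelated features should not.
rng = np.random.default_rng(0)
shared = rng.normal(size=(20, 5))            # shared latent structure
a = shared @ rng.normal(size=(5, 50))        # "model A" features
b = shared @ rng.normal(size=(5, 80))        # "model B" features
c = rng.normal(size=(20, 80))                # unrelated features
print(rsa_similarity(a, b))                  # high: same underlying geometry
print(rsa_similarity(a, c))                  # low: no shared structure
```

Because RDMs summarize pairwise geometry, a handful of strongly shared (universal) dimensions can dominate this score, which is the influence the tweet refers to.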
Zirui Chen @ziruichen44 · 1 year
The invariance of these representations implies that they are not primarily governed by the details of a DNN’s design but instead by more general principles of natural image representation in vision systems.
1 reply · 0 reposts · 4 likes
Zirui Chen @ziruichen44 · 1 year
This trend was consistently observed across all levels of DNN hierarchies. It was thus not restricted to low-level features in early layers—even higher-level semantic representations appear to be shared across DNNs.
1 reply · 0 reposts · 1 like
Zirui Chen @ziruichen44 · 1 year
While most DNN dimensions are model-specific, a subset of dimensions consistently emerges across DNNs despite variations in initializations, architectures, and task objectives. These universal dimensions of DNN image representation are also strongly shared with the human brain.
1 reply · 1 repost · 4 likes
Zirui Chen @ziruichen44 · 1 year
We characterized the universality of latent dimensions in DNN activations by measuring their average decodability across other DNNs. We also computed their decodability from human visual cortex representations measured with fMRI.
1 reply · 0 reposts · 2 likes
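The decodability measure described above can be sketched as ridge regression: predict one system's latent dimension from another system's features and score held-out R². This is a toy numpy illustration on synthetic data under assumed details (closed-form ridge, a single train/test split); the function name and setup are hypothetical, not the paper's pipeline.

```python
import numpy as np

def dimension_decodability(source, target, alpha=1.0, train_frac=0.8, seed=0):
    """Held-out R^2 for linearly decoding a target latent dimension
    from another system's features via closed-form ridge regression."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(source.shape[0])
    n_train = int(train_frac * source.shape[0])
    tr, te = idx[:n_train], idx[n_train:]
    X_tr, X_te, y_tr, y_te = source[tr], source[te], target[tr], target[te]
    # Center using training statistics only.
    x_mean, y_mean = X_tr.mean(axis=0), y_tr.mean()
    X_tr, X_te = X_tr - x_mean, X_te - x_mean
    y_tr = y_tr - y_mean
    # Closed-form ridge: w = (X'X + alpha*I)^-1 X'y
    d = X_tr.shape[1]
    w = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(d), X_tr.T @ y_tr)
    pred = X_te @ w + y_mean
    ss_res = np.sum((y_te - pred) ** 2)
    ss_tot = np.sum((y_te - y_te.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Toy demo: a dimension shared by both systems is decodable from the
# source features; a dimension private to the target is not.
rng = np.random.default_rng(1)
n = 200
universal = rng.normal(size=n)               # signal present in both systems
specific = rng.normal(size=n)                # signal private to the target
source = universal[:, None] + 0.1 * rng.normal(size=(n, 20))  # source features
print(dimension_decodability(source, universal))  # high R^2
print(dimension_decodability(source, specific))   # near zero
```

Averaging this score across many other DNNs (or computing it from fMRI responses) gives a per-dimension universality or brain-decodability estimate of the kind the tweet describes.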
Zirui Chen @ziruichen44 · 1 year
Many efforts seek to explain model-brain alignment in terms of the DNN’s architecture and task, but an alternative theory is that brain-like representations reflect universal aspects of natural image representation that emerge in systems with diverse optimization constraints.
1 reply · 0 reposts · 3 likes
Zirui Chen @ziruichen44 · 1 year
Why do varied DNN designs yield equally good models of human vision? Our preprint with @michaelfbonner shows that diverse DNNs represent images with a shared set of latent dimensions, and these shared dimensions turn out to also be the most brain-aligned.
arxiv.org
Do neural network models of vision learn brain-aligned representations because they share architectural constraints and task objectives with biological vision or because they learn universal...
3 replies · 43 reposts · 127 likes