Lixiang Chen
@lixiangchen_
Followers: 164 · Following: 53 · Media: 6 · Statuses: 26
PhD, cognitive neuroscience, EEG/MRI, visual perception, affective disorders.
Berlin, Germany
Joined March 2018
Now out in @RSocPublishing Proceedings B! Check out @Gongting_Wang's EEG work on individual differences in scene perception: Scenes that are more typical for individual observers are represented in an enhanced yet more idiosyncratic way. Link:
royalsocietypublishing.org
Previous research shows that the typicality of visual scenes (i.e., whether they are good examples of a category) determines how easily they can be perceived and represented in the brain. However, the...
1 reply · 4 retweets · 12 likes
Hooray! 🥳 @lixiangchen_ successfully defended his PhD. Congratulations, Dr. Chen! We're looking forward to hearing about all the great stuff you'll achieve in the future.
0 replies · 3 retweets · 31 likes
Check out @lixiangchen_'s new preprint! We show that the balance between feedforward gamma and feedback alpha rhythms mediates the perception of spatiotemporal coherence in dynamic natural videos. Work done at @jlugiessen & @FU_Berlin. 🎬👁️👁️ Link:
biorxiv.org
How does the brain integrate complex and dynamic visual inputs into phenomenologically seamless percepts? Previous results demonstrate that when visual inputs are organized coherently across space...
0 replies · 1 retweet · 10 likes
Apply now for a 3-year postdoc position in the lab! Please retweet and share with potential candidates. If you're interested in #vision, #EEG, #fMRI, and working with a fun team at the @jlugiessen, feel free to get in touch before applying. Link:
We will soon advertise a 3-year postdoc position in the lab. We're looking for a person who is interested in understanding natural vision and has experience with EEG/fMRI and with state-of-the-art analysis methods. Please share and get in touch if you're interested! 🧠
1 reply · 36 retweets · 50 likes
New preprint is out with Lixiang Chen @lixiangchen_, Radek Cichy, and Daniel Kaiser @DKaiserlab. Our paper is titled "Enhanced and idiosyncratic neural representations of personally typical scenes". https://t.co/vNcrYTSa2g
biorxiv.org
Previous research shows that the typicality of visual scenes (i.e., whether they are good examples of a category) determines how easily they can be perceived and represented in the brain. However, the...
1 reply · 3 retweets · 7 likes
Preprint alert 🚨 I am excited about our new paper titled "The representational nature of spatio-temporal recurrent processing in visual object recognition." 🥳 https://t.co/5wJJDxUl7V
1 reply · 17 retweets · 57 likes
We will soon advertise a 3-year postdoc position in the lab. We're looking for a person who is interested in understanding natural vision and has experience with EEG/fMRI and with state-of-the-art analysis methods. Please share and get in touch if you're interested! 🧠
1 reply · 11 retweets · 34 likes
Our official ad for a PhD position is out. We're looking for someone to conduct fMRI and EEG work on the neural representation of visual beauty / visual preferences. Link:
We'll soon advertise a 3-year PhD position on a project that explores the brain processes underlying visual aesthetics using EEG, fMRI, and computational models. Please spread the word if you know someone who could be interested or get in touch if you are! 👁️🧠
1 reply · 18 retweets · 43 likes
We'll soon advertise a 3-year PhD position on a project that explores the brain processes underlying visual aesthetics using EEG, fMRI, and computational models. Please spread the word if you know someone who could be interested or get in touch if you are! 👁️🧠
2 replies · 20 retweets · 53 likes
In this study, integration-related alpha activity was observed not only when snippets from the same video were presented, but also when different snippets from the same basic-level category were shown, highlighting the flexibility of neural integration processes.
0 replies · 0 retweets · 0 likes
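The cross-category generalization described above can be pictured with a minimal cross-decoding sketch: train a classifier on alpha-band EEG patterns evoked by one set of video snippets, then test it on patterns from different snippets of the same basic-level categories. Everything below (array names, simulated data) is a hypothetical illustration, not the study's actual pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical alpha-band EEG patterns: trials x features (channels).
# X_video_a: trials with snippets from one video per category;
# X_video_b: trials with different snippets from the same categories.
rng = np.random.default_rng(0)
n_trials, n_features = 200, 64
X_video_a = rng.normal(size=(n_trials, n_features))
X_video_b = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)  # basic-level category labels

# Train on one snippet set, test on unseen snippets from the other set:
# above-chance accuracy would indicate that the alpha-band code
# generalizes across exemplars of the same category.
clf = LinearDiscriminantAnalysis()
clf.fit(X_video_a, y)
print(f"cross-decoding accuracy: {clf.score(X_video_b, y):.3f}")
```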
Our brain integrates dynamic inputs across the visual field to create coherent visual experiences. Such integration processes have previously been linked to cortical alpha dynamics (Chen et al., Sci Adv, 2023).
1 reply · 0 retweets · 0 likes
New preprint is out with Radek Cichy and Daniel Kaiser @DKaiserlab: "Coherent categorical information triggers integration-related alpha dynamics" https://t.co/H7mtCVK3IS
biorxiv.org
To create coherent visual experiences, the brain spatially integrates the complex and dynamic information it receives from the environment. We previously demonstrated that feedback-related alpha...
1 reply · 2 retweets · 12 likes
Coherent categorical information triggers integration-related alpha dynamics https://t.co/kHxLs0Cikw
#biorxiv_neursci
0 replies · 1 retweet · 0 likes
Now it's out in @ScienceAdvances: https://t.co/cCr7tMMSAx. Using EEG and fMRI data, we found that top-down feedback mediated by cortical alpha dynamics plays a key role in the construction of coherent visual experiences. With Daniel Kaiser and Radek Cichy. @CCNBerlin
Happy to share our new preprint https://t.co/pEM6mskkAt. Together with Radek Cichy and Daniel Kaiser, we identified feedback in the alpha frequency across the visual hierarchy as a key mechanism for integrating spatiotemporally consistent information in natural vision.
0 replies · 22 retweets · 84 likes
We are looking for a new postdoc for Kaiser Lab (https://t.co/ZqcSt7fJlG). This is a three-year position investigating how brain mechanisms in frontal cortex contribute to the analysis of categorical inputs in visual cortex. Details and application:
danielkaiser.net
Webpage of Prof. Dr. Daniel Kaiser, psychologist and researcher in cognitive neuroscience. Mainly working on real-world vision and the brain processes underlying efficient naturalistic perception. My...
0 replies · 14 retweets · 20 likes
Together this demonstrates how the human brain orchestrates coherent visual experience across space: it uses feedback to integrate information from high-level to early visual cortex through a dedicated rhythmic code in the alpha frequency range.
0 replies · 0 retweets · 1 like
Using an EEG/fMRI fusion analysis to link the spectral representations in the EEG with spatial representations in the fMRI, we demonstrate that alpha-frequency feedback is directly associated with representations in early visual cortex.
1 reply · 0 retweets · 1 like
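The fusion logic can be illustrated with a minimal representational similarity sketch: build condition-by-condition dissimilarity matrices (RDMs) from frequency-resolved EEG patterns and from ROI-specific fMRI voxel patterns, then correlate them. The array names and simulated data below are assumptions for illustration only, not the published pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Condition x feature patterns -> condensed dissimilarity vector."""
    return pdist(patterns, metric="correlation")

rng = np.random.default_rng(1)
n_conditions = 24

# Hypothetical condition-averaged patterns.
eeg_alpha = rng.normal(size=(n_conditions, 64))  # alpha-band EEG features
fmri_v1 = rng.normal(size=(n_conditions, 500))   # early visual cortex voxels

# Fusion: a reliable correlation between the EEG and fMRI RDMs would
# suggest that alpha-band representations are mirrored in early visual cortex.
rho, p = spearmanr(rdm(eeg_alpha), rdm(fmri_v1))
print(f"EEG(alpha) x fMRI(V1) RDM correlation: rho={rho:.3f}, p={p:.3f}")
```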
Decoding stimuli from fMRI multi-voxel patterns, we found that scene-selective cortex aggregates spatiotemporally consistent information across hemifields, suggesting PPA and MPA as likely generators of the feedback signals guiding visual integration.
1 reply · 0 retweets · 1 like
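A multi-voxel decoding step of this kind is commonly run as cross-validated classification of stimulus identity from ROI voxel patterns, e.g. with scikit-learn. The sketch below uses simulated data and hypothetical names; it shows the general technique, not the paper's exact analysis.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
n_trials, n_voxels = 120, 300

# Hypothetical multi-voxel patterns from a scene-selective ROI (e.g. PPA),
# one row per trial, and the video identity shown on each trial.
X = rng.normal(size=(n_trials, n_voxels))
y = rng.integers(0, 4, size=n_trials)

# Cross-validated decoding: above-chance accuracy indicates that the ROI
# carries stimulus-specific information.
clf = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(clf, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.3f} (chance = 0.25)")
```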
Decoding stimulus information from frequency-specific EEG patterns, we found a shift from representations in feedforward-related gamma activity for spatiotemporally inconsistent videos to representations in feedback-related alpha activity for spatiotemporally consistent videos.
1 reply · 1 retweet · 2 likes
In EEG and fMRI experiments, we mimicked the spatially distributed nature of visual inputs by presenting short natural videos through two circular apertures to the right and left of fixation. Critically, we manipulated the spatiotemporal congruency of the videos.
1 reply · 1 retweet · 1 like
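The congruency manipulation can be summarized as a simple condition-assignment scheme: on congruent trials, both apertures show halves of the same video; on incongruent trials, halves from two different videos are paired. The snippet below is a schematic with made-up video IDs, not the actual experiment code.

```python
import random

videos = [f"video_{i:02d}" for i in range(8)]  # hypothetical video IDs
random.seed(0)

trials = []
for video in videos:
    # Congruent: both apertures sample the same video.
    trials.append({"left": video, "right": video, "congruent": True})
    # Incongruent: pair with a half drawn from a different video.
    other = random.choice([v for v in videos if v != video])
    trials.append({"left": video, "right": other, "congruent": False})

random.shuffle(trials)
for t in trials[:4]:
    print(t)
```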