Lixiang Chen Profile
Lixiang Chen

@lixiangchen_

Followers
164
Following
53
Media
6
Statuses
26

PhD, cognitive neuroscience, EEG/MRI, visual perception, affective disorders.

Berlin, Germany
Joined March 2018
@DKaiserlab
Kaiser Lab
8 months
Now out in @RSocPublishing Proceedings B! Check out @Gongting_Wang's EEG work on individual differences in scene perception: Scenes that are more typical for individual observers are represented in an enhanced yet more idiosyncratic way. Link:
royalsocietypublishing.org
Previous research shows that the typicality of visual scenes (i.e. if they are good examples of a category) determines how easily they can be perceived and represented in the brain. However, the...
1
4
12
@DKaiserlab
Kaiser Lab
1 year
Hooray! 🥳 @lixiangchen_ successfully defended his PhD. Congratulations Dr. Chen! We're looking forward to hearing about all the great stuff you'll achieve in the future.
0
3
31
@DKaiserlab
Kaiser Lab
1 year
Check out @lixiangchen_'s new preprint! We show that the balance between feedforward gamma and feedback alpha rhythms mediates the perception of spatiotemporal coherence in dynamic natural videos. Work done at @jlugiessen & @FU_Berlin. 🎬🎞️👁️ Link:
biorxiv.org
How does the brain integrate complex and dynamic visual inputs into phenomenologically seamless percepts? Previous results demonstrate that when visual inputs are organized coherently across space...
0
1
10
@DKaiserlab
Kaiser Lab
1 year
Apply now for a 3-year postdoc position in the lab! Please retweet and share with potential candidates. If you're interested in #vision, #EEG, #fMRI, and working with a fun team at the @jlugiessen, feel free to get in touch before applying. Link:
@DKaiserlab
Kaiser Lab
1 year
We will soon advertise a 3-year postdoc position in the lab. We're looking for someone who is interested in understanding natural vision and has experience with EEG/fMRI and state-of-the-art analysis methods. Please share and get in touch if you're interested! 🧠💜
1
36
50
@Gongting_Wang
Gongting
1 year
New preprint out with Lixiang Chen @lixiangchen_, Radek Cichy, and Daniel Kaiser @DKaiserlab: "Enhanced and idiosyncratic neural representations of personally typical scenes". https://t.co/vNcrYTSa2g
biorxiv.org
Previous research shows that the typicality of visual scenes (i.e., if they are good examples of a category) determines how easily they can be perceived and represented in the brain. However, the...
1
3
7
@seeingxie
Siying Xie
1 year
Preprint alert 🚨 I am excited about our new paper titled "The representational nature of spatio-temporal recurrent processing in visual object recognition." 🥳🌟 https://t.co/5wJJDxUl7V
1
17
57
@DKaiserlab
Kaiser Lab
1 year
We will soon advertise a 3-year postdoc position in the lab. We're looking for someone who is interested in understanding natural vision and has experience with EEG/fMRI and state-of-the-art analysis methods. Please share and get in touch if you're interested! 🧠💜
1
11
34
@DKaiserlab
Kaiser Lab
2 years
Our official ad for a PhD position is out. We're looking for someone to conduct fMRI and EEG work on the neural representation of visual beauty / visual preferences. Link:
@DKaiserlab
Kaiser Lab
2 years
We'll soon advertise a 3-year PhD position on a project that explores the brain processes underlying visual aesthetics using EEG, fMRI, and computational models. Please spread the word if you know someone who could be interested or get in touch if you are! 🌌🌇🏞️🧠
1
18
43
@DKaiserlab
Kaiser Lab
2 years
We'll soon advertise a 3-year PhD position on a project that explores the brain processes underlying visual aesthetics using EEG, fMRI, and computational models. Please spread the word if you know someone who could be interested or get in touch if you are! 🌌🌇🏞️🧠
2
20
53
@lixiangchen_
Lixiang Chen
2 years
In this study, the integration-related alpha activity was observed not only when snippets from the same video were presented, but also when different video snippets from the same basic-level category were presented, highlighting the flexibility of neural integration processes.
0
0
0
@lixiangchen_
Lixiang Chen
2 years
Our brain integrates dynamic inputs across the visual field to create coherent visual experiences. Such integration processes have previously been linked to cortical alpha dynamics (Chen et al. Sci Adv, 2023).
1
0
0
@biorxiv_neursci
bioRxiv Neuroscience
2 years
Coherent categorical information triggers integration-related alpha dynamics https://t.co/kHxLs0Cikw #biorxiv_neursci
0
1
0
@lixiangchen_
Lixiang Chen
2 years
Now it's out in @ScienceAdvances https://t.co/cCr7tMMSAx. Using EEG and fMRI data, we found that top-down feedback mediated by cortical alpha dynamics plays a key role in the construction of coherent visual experiences. With Daniel Kaiser and Radek Cichy. @CCNBerlin
@lixiangchen_
Lixiang Chen
3 years
Happy to share our new preprint https://t.co/pEM6mskkAt. Together with Radek Cichy and Daniel Kaiser, we identified alpha-frequency feedback across the visual hierarchy as a key mechanism for integrating spatiotemporally consistent information in natural vision.
0
22
84
@LuChunYeh
Lu-Chun Yeh
3 years
We are looking for a new postdoc for Kaiser Lab (https://t.co/ZqcSt7fJlG). This is a three-year position investigating how brain mechanisms in frontal cortex contribute to the analysis of categorical inputs in visual cortex. Details and application:
danielkaiser.net
Webpage of Prof. Dr. Daniel Kaiser, psychologist and researcher in cognitive neuroscience. Mainly working on real-world vision and the brain processes underlying efficient naturalistic perception. My...
0
14
20
@lixiangchen_
Lixiang Chen
3 years
Together this demonstrates how the human brain orchestrates coherent visual experience across space: it uses feedback to integrate information from high-level to early visual cortex through a dedicated rhythmic code in the alpha frequency range.
0
0
1
@lixiangchen_
Lixiang Chen
3 years
Using an EEG/fMRI fusion analysis to link the spectral representations in the EEG with spatial representations in the fMRI, we demonstrate that alpha-frequency feedback is directly associated with representations in early visual cortex.
1
0
1
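The fusion described in the tweet above is commonly implemented as representational similarity analysis: condition-wise dissimilarity matrices (RDMs) from the EEG, one per time point or frequency, are correlated with an RDM from an fMRI region of interest. The sketch below is a minimal, generic version of that idea with random placeholder data; the shapes and the `upper_tri`/`fuse` helpers are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal RSA-style EEG/fMRI fusion sketch; all data below are random
# placeholders, and the shapes/names are illustrative assumptions.
import numpy as np
from scipy.stats import spearmanr

def upper_tri(rdm):
    """Vectorize the upper triangle of a square RDM (diagonal excluded)."""
    i, j = np.triu_indices(rdm.shape[0], k=1)
    return rdm[i, j]

def fuse(eeg_rdms, fmri_rdm):
    """Correlate each EEG RDM (e.g., one per time point or frequency band)
    with one fMRI ROI RDM, giving a fusion time course or spectrum."""
    target = upper_tri(fmri_rdm)
    out = []
    for rdm in eeg_rdms:
        rho, _ = spearmanr(upper_tri(rdm), target)
        out.append(rho)
    return np.array(out)

# Hypothetical data: 50 conditions, 100 EEG time points, one fMRI ROI
rng = np.random.default_rng(0)
eeg_rdms = rng.random((100, 50, 50))   # time x conditions x conditions
fmri_rdm = rng.random((50, 50))        # conditions x conditions
fusion = fuse(eeg_rdms, fmri_rdm)      # one correlation per EEG time point
```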
@lixiangchen_
Lixiang Chen
3 years
Decoding stimuli from fMRI multi-voxel patterns, we found that scene-selective cortex aggregates spatiotemporally consistent information across hemifields, suggesting PPA and MPA as likely generators of feedback signals guiding visual integration.
1
0
1
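Decoding stimuli from multi-voxel patterns, as in the tweet above, typically means training a cross-validated linear classifier on voxel responses within an ROI. Below is a minimal, generic sketch; the trial count, ROI size, and category labels are hypothetical placeholders, not the study's design.

```python
# Minimal cross-validated MVPA decoding sketch for one fMRI ROI; trial counts,
# voxel counts, and category labels are hypothetical placeholders.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 300
X = rng.standard_normal((n_trials, n_voxels))  # one response pattern per trial
y = rng.integers(0, 3, size=n_trials)          # e.g., 3 hypothetical video categories

clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000))
accuracy = cross_val_score(clf, X, y, cv=5).mean()  # chance level is ~1/3 here
print(f"ROI decoding accuracy: {accuracy:.2f}")
```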
@lixiangchen_
Lixiang Chen
3 years
Decoding stimulus information from frequency-specific EEG patterns, we found a shift from representations in feedforward-related gamma activity for spatiotemporally inconsistent videos to representations in feedback-related alpha activity for spatiotemporally consistent videos.
1
1
2
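Frequency-specific decoding of the kind described above can be approximated by band-pass filtering the EEG into the bands of interest (e.g., alpha and gamma), computing trial-wise band power per channel, and classifying the stimulus from each band separately. The sketch below uses generic placeholder data; the sampling rate, band limits, and labels are assumptions, not the authors' actual analysis.

```python
# Minimal sketch of band-resolved EEG decoding: band-pass each epoch, compute
# trial-wise log band power per channel, and classify. Sampling rate, band
# limits, and labels are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 500                                   # hypothetical sampling rate in Hz
bands = {"alpha": (8, 12), "gamma": (60, 90)}

rng = np.random.default_rng(0)
eeg = rng.standard_normal((120, 64, fs))   # trials x channels x samples (1 s epochs)
labels = rng.integers(0, 2, size=120)      # e.g., consistent vs. inconsistent videos

for name, (lo, hi) in bands.items():
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, eeg, axis=-1)
    power = np.log(filtered.var(axis=-1))  # trials x channels feature matrix
    acc = cross_val_score(LinearDiscriminantAnalysis(), power, labels, cv=5).mean()
    print(f"{name}-band decoding accuracy: {acc:.2f}")
```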
@lixiangchen_
Lixiang Chen
3 years
In EEG and fMRI experiments, we experimentally mimicked the spatially distributed nature of visual inputs by presenting short natural videos through two circular apertures right and left of fixation. Critically, we manipulated the spatiotemporal congruency of the videos.
1
1
1