
nilearn
@nilearn
Followers
6
Following
21
Media
16
Statuses
45
I’ve been building Nilearn to make neuroimaging + ML both approachable and powerful. Since my first GitHub drops, it’s evolved fast. Here’s a quick tour of what it can do and what I built into it https://t.co/zMPuvsuVGz
1
0
0
This reveals how networks strengthen or weaken moment to moment, a key step toward understanding cognitive flexibility and mental state transitions. It’s like watching the brain rewire itself in real time.
0
0
0
The brain isn’t static: its connections fluctuate over time, even at rest. With Nilearn, you can compute time-varying connectivity matrices by sliding a window across the fMRI time series.
1
0
0
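A minimal sketch of the sliding-window idea, assuming the regional time series were already extracted with a masker (a random array stands in here so it runs on its own); the window length and step are illustrative:

```python
# Sliding-window ("dynamic") connectivity. `time_series` stands in for the
# (n_timepoints, n_regions) output of a masker such as NiftiLabelsMasker.
import numpy as np
from nilearn.connectome import ConnectivityMeasure

time_series = np.random.default_rng(0).standard_normal((200, 39))  # placeholder data

window_length = 50  # in TRs; illustrative choice
step = 10           # slide by 10 TRs per window

conn = ConnectivityMeasure(kind="correlation")
windows = []
for start in range(0, time_series.shape[0] - window_length + 1, step):
    segment = time_series[start:start + window_length]
    # ConnectivityMeasure expects a list of subjects; each window plays that role here
    windows.append(conn.fit_transform([segment])[0])

dynamic_conn = np.array(windows)  # shape: (n_windows, n_regions, n_regions)
```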
Graph-based models bridge neuroscience and network science. Each node represents a brain region, and each edge represents functional coupling. By analyzing these connections, we can detect communities and identify hubs, the key nodes driving information flow across the brain.
0
0
0
In Nilearn, you can use graph analysis to uncover these properties and visualize neural architecture beyond voxel correlations.
0
0
0
Once you’ve built a brain connectivity matrix, the next step is interpreting the network structure. Metrics like degree, clustering coefficient, and modularity reveal how efficiently different brain regions communicate.
1
0
0
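A rough sketch of those metrics on a thresholded connectivity matrix. Nilearn builds the matrix; the graph measures below use networkx, which is my own choice of companion library rather than part of Nilearn, and the 0.3 threshold is arbitrary:

```python
# Degree, clustering, and modularity on a thresholded connectivity matrix.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
corr_matrix = np.corrcoef(rng.standard_normal((120, 20)).T)  # stand-in for a real matrix

adjacency = (corr_matrix > 0.3).astype(int)  # keep strong positive edges only
np.fill_diagonal(adjacency, 0)               # drop self-connections
graph = nx.from_numpy_array(adjacency)

degree = dict(graph.degree())              # connections per region
clustering = nx.average_clustering(graph)  # local "cliquishness"
communities = nx.algorithms.community.greedy_modularity_communities(graph)
modularity = nx.algorithms.community.modularity(graph, communities)
print(f"mean degree={np.mean(list(degree.values())):.1f}, "
      f"clustering={clustering:.2f}, modularity={modularity:.2f}")
```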
Depression hates a moving target. Keep your body & mind active. Depression thrives in stagnation & rumination.
67
404
3K
With Nilearn’s ConnectivityMeasure, you can compute correlations or partial correlations between brain regions, turning raw time series into network graphs. Then visualize the result with plot_matrix(corr_matrix, colorbar=True) to get a clear view of functional coupling across the brain.
0
0
0
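Something along these lines, assuming regional time series are already in hand (a random array stands in so the snippet runs by itself):

```python
# Region-by-region correlation matrix, then a quick matrix plot.
import numpy as np
from nilearn.connectome import ConnectivityMeasure
from nilearn.plotting import plot_matrix, show

time_series = np.random.default_rng(0).standard_normal((150, 30))  # placeholder (time x regions)

measure = ConnectivityMeasure(kind="correlation")  # or kind="partial correlation"
corr_matrix = measure.fit_transform([time_series])[0]

plot_matrix(corr_matrix, colorbar=True, vmin=-1, vmax=1, title="Functional connectivity")
show()
```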
After mapping seed-based correlations, the next step is building full-brain connectivity matrices to see how every region interacts with every other one.
1
0
0
After extracting seed-based time series, the next step is mapping how those signals interact across the brain. In Nilearn you can compute these seed-to-voxel correlations in just a few lines, revealing networks like the Default Mode Network.
0
0
1
Once you’ve extracted regional time series, you can map how that region’s activity correlates with every voxel in the brain. This is called seed-to-voxel analysis: it’s how we find functional networks like the Default Mode Network (DMN).
0
0
1
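Roughly how the correlation step looks, with stand-in data so it runs on its own; in a real analysis func_img is your preprocessed 4D BOLD image and seed_time_series comes out of NiftiSpheresMasker (sketched under the next post):

```python
# Seed-to-voxel correlation map from a seed time series (placeholder data).
import numpy as np
import nibabel as nib
from nilearn.maskers import NiftiMasker

rng = np.random.default_rng(0)
func_img = nib.Nifti1Image(rng.standard_normal((10, 10, 10, 100)).astype("float32"), np.eye(4))
mask_img = nib.Nifti1Image(np.ones((10, 10, 10), dtype="uint8"), np.eye(4))
seed_time_series = rng.standard_normal((100, 1))  # would come from NiftiSpheresMasker

brain_masker = NiftiMasker(mask_img=mask_img, standardize=True)
brain_time_series = brain_masker.fit_transform(func_img)  # (n_timepoints, n_voxels)

# Standardize the seed, then correlate it with every voxel's time course
seed = (seed_time_series - seed_time_series.mean()) / seed_time_series.std()
seed_to_voxel_corr = brain_time_series.T @ seed / seed.shape[0]

# Back to a 3D map, ready for plot_stat_map
corr_img = brain_masker.inverse_transform(seed_to_voxel_corr.T)
```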
In Nilearn, you can extract signals from specific brain regions (called “seeds”) using NiftiSpheresMasker. It takes 3D coordinates (x, y, z) and creates small spherical masks around them, pulling out the average time series from those spots 🧠
0
0
0
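For instance, something like this, using Nilearn’s development-fMRI sample data; the posterior cingulate coordinate and the 8 mm radius are illustrative choices:

```python
# Mean time series from a spherical seed in posterior cingulate cortex.
from nilearn import datasets
from nilearn.maskers import NiftiSpheresMasker

func_filename = datasets.fetch_development_fmri(n_subjects=1).func[0]

pcc_coords = [(0, -52, 18)]  # MNI coordinates of the seed center
seed_masker = NiftiSpheresMasker(pcc_coords, radius=8, detrend=True, standardize=True)
seed_time_series = seed_masker.fit_transform(func_filename)  # shape: (n_timepoints, 1)
```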
Nilearn is still evolving: new features, cleaner visualizations, faster pipelines. My next focus: expanding datasets, adding interactive visual tools, and making it easier for students + researchers to explore brain data with just a few lines of code.
0
0
0
From day one my goal with Nilearn was simple: make neuroimaging ML accessible, reproducible, and visual. Every tool, every plot, every example is built to help anyone explore the brain with machine learning. If you’ve used Nilearn or built on top of it, @ me, I’d love to see it :)
0
0
0
Over time, Nilearn grew into a full Example Gallery: decoding, GLMs, ICA, dictionary learning, surfaces, connectivity. Each example is runnable code with visual outputs, so it’s both documentation and a learning resource.
0
0
0
If you want to try this at home, here’s a smooth flow: 1. Install Nilearn 2. Plot an atlas 3. Load Haxby & run decoding 4. Build a connectome 5. Explore plotting extras. That’s a full ML pipeline on real brain data, with visuals, in under an afternoon.
0
0
0
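Step 2, for instance, is just a couple of calls; the Harvard-Oxford atlas here is one convenient choice among the atlases Nilearn can fetch (the decoding and connectome steps are sketched in the posts below):

```python
# Fetch and plot a bundled anatomical atlas (choice of atlas is illustrative).
from nilearn import datasets, plotting

atlas = datasets.fetch_atlas_harvard_oxford("cort-maxprob-thr25-2mm")
plotting.plot_roi(atlas.maps, title="Harvard-Oxford cortical atlas")
plotting.show()
```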
The Examples Gallery is the living playbook: decoding (SVM, FREM, multiclass strategies), GLMs, ICA, dictionary learning, surfaces, connectivity… all runnable, all documented: https://t.co/iakf8Qr40P
0
0
0
Surface visualizations are first-class: load a cortical surface atlas and project data onto the mesh for crisp, shareable figures. Great for highlights and covers.
0
0
0
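For example, a surface parcellation on the inflated fsaverage mesh looks roughly like this (for volumetric maps, nilearn.surface.vol_to_surf handles the projection onto the mesh):

```python
# Destrieux surface atlas on the inflated left hemisphere of fsaverage5.
from nilearn import datasets, plotting

fsaverage = datasets.fetch_surf_fsaverage()        # fsaverage5 meshes
destrieux = datasets.fetch_atlas_surf_destrieux()  # surface parcellation

plotting.plot_surf_roi(
    fsaverage.infl_left,
    roi_map=destrieux.map_left,
    hemi="left",
    bg_map=fsaverage.sulc_left,
    title="Destrieux parcellation, left hemisphere",
)
plotting.show()
```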
Nilearn also makes functional connectivity fast: extract time series, build connectomes, plot node-and-edge networks (e.g., Power-264). https://t.co/wlZVRmMi8C It’s a smooth jump from signals → graphs.
0
0
0
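Roughly, with the development-fMRI sample data standing in for your own scans; the sphere radius and the 99% edge threshold are illustrative:

```python
# Power-264 connectome: sphere time series -> correlation matrix -> node-edge plot.
import numpy as np
from nilearn import datasets, plotting
from nilearn.connectome import ConnectivityMeasure
from nilearn.maskers import NiftiSpheresMasker

power = datasets.fetch_coords_power_2011()
coords = np.vstack((power.rois["x"], power.rois["y"], power.rois["z"])).T

func_filename = datasets.fetch_development_fmri(n_subjects=1).func[0]
masker = NiftiSpheresMasker(
    coords, radius=5, allow_overlap=True, detrend=True, standardize=True
)
time_series = masker.fit_transform(func_filename)

corr_matrix = ConnectivityMeasure(kind="correlation").fit_transform([time_series])[0]

# Keep only the strongest 1% of edges so the figure stays readable
plotting.plot_connectome(corr_matrix, coords, edge_threshold="99%")
plotting.show()
```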
For stronger baselines, I implemented FREM (fast ensembling of regularized models). It boosts robustness on multi-class problems while keeping compute manageable, which is handy as projects scale.
0
0
0
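A minimal sketch on a two-condition subset of Haxby; the mask, estimator, and cv settings are illustrative, and the data handling mirrors the Decoder sketch below:

```python
# FREMClassifier on a face-vs-house subset of the Haxby dataset.
import numpy as np
import pandas as pd
from nilearn import datasets
from nilearn.decoding import FREMClassifier
from nilearn.image import index_img

haxby = datasets.fetch_haxby()
behavior = pd.read_csv(haxby.session_target[0], sep=" ")
keep = behavior["labels"].isin(["face", "house"])

fmri_niimgs = index_img(haxby.func[0], keep)
labels = behavior["labels"][keep].values

frem = FREMClassifier(estimator="svc", mask=haxby.mask_vt[0], cv=5)
frem.fit(fmri_niimgs, labels)
print({cond: np.mean(scores) for cond, scores in frem.cv_scores_.items()})  # CV score per condition
```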
I wanted decoding to feel straightforward. With the Haxby dataset, you can train classifiers that predict what the subject saw (faces vs houses, etc.) using familiar sklearn-style APIs. https://t.co/3C9d8fOEhV
0
0
0
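A minimal sketch of that workflow, restricted to faces vs houses; the ventral-temporal mask and the estimator/cv choices are illustrative:

```python
# Nilearn Decoder on the Haxby dataset: predict face vs house from fMRI volumes.
import numpy as np
import pandas as pd
from nilearn import datasets
from nilearn.decoding import Decoder
from nilearn.image import index_img

haxby = datasets.fetch_haxby()
behavior = pd.read_csv(haxby.session_target[0], sep=" ")
keep = behavior["labels"].isin(["face", "house"])

fmri_niimgs = index_img(haxby.func[0], keep)
labels = behavior["labels"][keep].values

decoder = Decoder(estimator="svc", mask=haxby.mask_vt[0], cv=5)
decoder.fit(fmri_niimgs, labels)
print({cond: np.mean(scores) for cond, scores in decoder.cv_scores_.items()})  # CV score per condition
```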