A Beautiful Brain Visualization Tool
An exciting new browser-based semantic brain visualization tool was released today, accompanying a Nature publication from the Gallant Lab at UC Berkeley.
Researchers used fMRI to measure brain activity while people listened to stories from The Moth Radio Hour. Lead author Alexander Huth and his team built models that predict how likely a specific voxel (a tiny 3D volume of brain tissue being imaged) is to activate when a listener hears a given word. In the visualizer, the cerebral cortex is divided into roughly 60,000 such voxels.
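To make the modeling idea concrete, here is a minimal sketch of a voxel-wise encoding model: each word is represented by a vector of semantic features, and a regularized linear regression maps those features to every voxel's response. The feature space, the choice of ridge regression, and all data below are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical setup: each stimulus word is described by a vector of
# semantic features, and we have the measured response of a set of
# cortical voxels to each word. All data here are random placeholders.
rng = np.random.default_rng(0)
n_words, n_features, n_voxels = 500, 300, 1000   # the real cortex has ~60,000 voxels

word_features = rng.standard_normal((n_words, n_features))   # stimulus feature matrix
voxel_responses = rng.standard_normal((n_words, n_voxels))   # placeholder fMRI responses

# Fit one regularized linear model per voxel; Ridge handles all voxels at
# once because each target column is fit independently.
model = Ridge(alpha=10.0)
model.fit(word_features, voxel_responses)

# Predict how strongly every voxel should respond to a new word,
# given that word's semantic feature vector.
new_word = rng.standard_normal((1, n_features))
predicted = model.predict(new_word)              # shape: (1, n_voxels)
print(predicted.shape)
```

Once fit, a voxel's learned weights say which kinds of words it is predicted to respond to, which is exactly what the visualizer's coloring summarizes.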
Colors correspond to the semantic categories of words each voxel is predicted to respond to. As the research team explains, “bright red and orange voxels are also predicted to respond to social or dramatic words. Darker red and brown voxels are predicted to respond to time and place words.”
Cool. But how does it help science? The researchers explain:
One of the main goals of our study was to create an atlas of semantic selectivity in the human brain. We developed a new probabilistic and generative model of areas tiling the cortex (PrAGMATiC) to create this atlas. PrAGMATiC assumes that each subject has the same functional areas, but it allows for some variation across individuals in the precise position and size of each area.
The PrAGMATiC atlas divides the left hemisphere into 192 distinct functional areas, 77 of which are semantically selective. The right hemisphere is divided into 128 functional areas, 63 of which are semantically selective. Click on an area to see more detail about its semantic selectivity (or lack thereof).
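The quoted description suggests a simple way to picture such a generative atlas: a shared group-level template of areas, with each subject's areas allowed to drift slightly in position and size. The toy sketch below only illustrates that shared-areas-with-individual-variation assumption; it is not the actual PrAGMATiC model or its fitting procedure, and every number and name in it is made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical group-level template: each functional area has a canonical
# 2D position on a flattened cortical sheet and a canonical radius.
n_areas = 192                                   # e.g. the left-hemisphere count above
template_centers = rng.uniform(0, 100, size=(n_areas, 2))
template_radii = rng.uniform(2.0, 5.0, size=n_areas)

def sample_subject_atlas(position_jitter=1.5, size_jitter=0.2):
    """Draw one subject's atlas: the same areas as the template, but with
    small random shifts in position and scale, mimicking the assumption
    that areas are shared while their exact layout varies per person."""
    centers = template_centers + rng.normal(0, position_jitter, size=(n_areas, 2))
    radii = template_radii * np.exp(rng.normal(0, size_jitter, size=n_areas))
    return centers, radii

subject_centers, subject_radii = sample_subject_atlas()
print(subject_centers.shape, subject_radii.shape)   # (192, 2) (192,)
```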
Nature describes the research in greater detail:
Where exactly are the words in your head? Scientists have created an interactive map showing which brain areas respond to hearing different words. The map reveals how language is spread throughout the cortex and across both hemispheres, showing groups of words clustered together by meaning. The beautiful interactive model allows us to explore the complex organisation of the enormous dictionaries in our heads.
Enjoy the stellar work and corresponding visualization at: http://gallantlab.org/huth2016/