Action categorization across vision and language


We investigate the semantic representations of actions depicted in videos and sentences using behavioral data and computational models.

Graph showing how various actions are clustered in the human mind.

Organizing dimensions in naturalistic action perception


We use a data-driven method to recover interpretable dimensions from action similarity judgments across two experiments.
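One common data-driven approach to recovering dimensions from similarity judgments is to embed the judged items in a low-dimensional space. The sketch below uses multidimensional scaling on a hypothetical similarity matrix; it illustrates the general idea only, not the specific method used in the paper.

```python
# Minimal sketch (assumed setup, not the paper's exact pipeline):
# recover a low-dimensional embedding from pairwise action
# similarity judgments via multidimensional scaling (MDS).
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# Hypothetical similarity judgments for 20 actions, rated 0-1 per pair.
n_actions = 20
sim = rng.random((n_actions, n_actions))
sim = (sim + sim.T) / 2          # symmetrize across rating order
np.fill_diagonal(sim, 1.0)       # each action is identical to itself

# Convert similarities to dissimilarities and embed in a few dimensions.
dissim = 1.0 - sim
mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
embedding = mds.fit_transform(dissim)
print(embedding.shape)  # (20, 3): one 3-d coordinate per action
```

Each recovered axis of the embedding can then be inspected for interpretability, e.g. by correlating it with rated action properties.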

Graph showing the analysis pipeline used in the paper.

Intuitive action representations in brain and behavior


We evaluate the contribution of visual, social, and action features to behavioral and neural (EEG) naturalistic action representations. 

Figures showing how resting-state connectivity in the brain differs between CNV carriers and controls.

MEG hypoconnectivity linked to rare copy number variants


We find decreased MEG resting-state connectivity in adults with rare high-risk copy number variants (CNVs), suggesting a potential common mechanism across genotypes.

Figure showing how different layers of a neural network correlate with neural activity in different brain regions.

Scene representations, from features to categories


We show that a categorical response to scenes emerges in visual cortex within 200 ms, potentially supported by low spatial frequency features, even in the absence of a categorization task.

Figure showing when and where face expression information can be decoded in the human brain.

Spatiotemporal dynamics of expression processing


Sensor- and source-space MVPA of whole-brain MEG data shows that the extraction of expression-related features from faces begins within 100 ms in visual cortex, even in the absence of an expression-related task.
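Time-resolved MVPA of this kind typically trains a classifier on the pattern across sensors at each time point separately. The sketch below illustrates the general logic on simulated data; the array shapes, labels, and classifier are assumptions, not the study's actual pipeline.

```python
# Minimal sketch of time-resolved MVPA decoding (assumed setup):
# at each time point, classify trials by facial expression from
# the pattern of activity across MEG sensors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 100 trials x 50 sensors x 30 time points,
# with binary expression labels (e.g. happy vs. fearful).
n_trials, n_sensors, n_times = 100, 50, 30
X = rng.standard_normal((n_trials, n_sensors, n_times))
y = rng.integers(0, 2, n_trials)

# Decode the expression label separately at each time point,
# using 5-fold cross-validated classification accuracy.
accuracy = np.array([
    cross_val_score(LogisticRegression(max_iter=1000),
                    X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
print(accuracy.shape)  # one decoding accuracy per time point
```

Plotting this accuracy time course against chance level (0.5 here) shows when expression information first becomes decodable.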

Full publication list: [link]