Using a novel motion index to study the neural basis of event segmentation

Frank Pollick, Katie Noble, Nikhil Darshane, Helen Murphy, Donald Glowinski, Phil McAleer, Corinne Jola, Kedzie Penfield, Antonio Camurri

Research output: Contribution to journal › Meeting Abstract


Our understanding of the perceived actions of those around us includes the ability to segment this continuous stream of activity into discrete events. We studied naïve observers' ability to segment a video of an unfamiliar dance style into events using a combination of behavioural, computational vision, and brain imaging methods. A 386 s video of a solo Bharatanatyam dancer was used as the basis for the study. A computational analysis provided, for every video frame, a Motion Index (MI) quantifying the movement of the entire dancer. A behavioural analysis using 30 naïve observers provided the time points at which observers were most likely to place an event boundary. These behavioural and computational data were used to interpret the brain activity of another 11 participants who viewed the dance video while in an MRI scanner. Results showed that the Motion Index predicted brain activity in a single cluster in the right hemisphere, located close to the Extrastriate Body Area (EBA). Event boundaries in the video were related to extensive clusters of bilateral activity in the Inferior Occipital Gyrus that extended towards the posterior Superior Temporal Sulcus (pSTS). Event boundaries also activated a region in the right Inferior Frontal Gyrus. These results extend our understanding of how movement kinaesthetics modulate action interpretation.
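The abstract does not specify how the per-frame Motion Index was computed. One common and minimal way to quantify whole-body movement from video is mean absolute frame differencing; the sketch below (a hypothetical illustration, not the authors' method — the function name `motion_index` and the toy frames are assumptions) shows that approach on synthetic grayscale frames.

```python
import numpy as np

def motion_index(frames):
    """Per-frame motion score: mean absolute pixel difference between
    consecutive grayscale frames. The first frame scores 0 since it
    has no predecessor. This is an illustrative proxy, not the MI
    actually used in the study."""
    frames = np.asarray(frames, dtype=float)          # (T, H, W)
    diffs = np.abs(np.diff(frames, axis=0))           # (T-1, H, W)
    per_frame = diffs.mean(axis=(1, 2))               # (T-1,)
    return np.concatenate(([0.0], per_frame))         # (T,)

# Toy example: four 2x2 "frames" with changing uniform brightness.
frames = [np.full((2, 2), v) for v in (0, 10, 10, 30)]
mi = motion_index(frames)
print(mi)  # [ 0. 10.  0. 20.]
```

In a real pipeline the frames would come from decoding the dance video (e.g. with OpenCV), and the resulting MI time series could then be convolved with a haemodynamic response function for use as an fMRI regressor.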
Original language: English
Pages (from-to): 225-225
Number of pages: 1
Issue number: 4
Publication status: Published - 1 May 2012
