Neural Correlates of Spatiotemporal Event Recognition: Application to Brain-Computer Interfaces for Video Exploitation

Detection of events of interest in video involves evidence accumulation across space and time; the observer must integrate features of both motion and form to decide whether a behavior constitutes a target event. Do such temporally extended events elicit evoked responses comparable in strength to those associated with instantaneous events, such as the presentation of a static target image? Using a set of simulated scenarios with avatars/actors exhibiting different behaviors, we identified evoked neural activity discriminative of target vs. distractor events (behaviors) at discrimination levels comparable to those for static imagery. EEG discriminative activity was largely confined to the time-locked evoked response rather than oscillatory activity, with the exception of very low EEG frequency bands such as delta and theta, which simply represent the bands dominating the event-related potential (ERP). The discriminative evoked response activity is observed in all target/distractor conditions and is robust across different recordings from the same subjects. The results suggest that we have identified a robust neural correlate of target detection in video, at least for the stimulus set we used, i.e., the dynamic behavior of an individual in a low-clutter environment. We discuss implications for using such a neural correlate to build a brain-computer interface (BCI) for searching and annotating video. This work was done with Lucas Parra of the City College of New York (CCNY) and Dan Rosenthal and Paul DeGuzman of Neuromatters, LLC.
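To illustrate the kind of single-trial ERP discrimination described above, the sketch below simulates target epochs carrying a P300-like evoked deflection and distractor epochs without one, then separates the two classes with a mean-amplitude feature and a simple midpoint threshold. All data, window bounds, and amplitudes here are invented for illustration; the paper's actual electrodes, classifier, and preprocessing are not reproduced.

```python
# Hypothetical sketch: discriminating simulated target vs. distractor EEG
# epochs using a post-stimulus mean-amplitude feature. Not the authors'
# pipeline; parameters (window, amplitude, noise level) are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 200, 256          # trials per class, samples per epoch
t = np.linspace(-0.2, 0.8, n_samples)   # seconds relative to event onset

# Simulated evoked response: targets carry a positivity peaking ~300 ms.
erp = np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
targets = rng.normal(0, 1, (n_trials, n_samples)) + 2.0 * erp
distractors = rng.normal(0, 1, (n_trials, n_samples))

# Feature: mean amplitude in a 250-450 ms post-stimulus window.
win = (t >= 0.25) & (t <= 0.45)
x_t = targets[:, win].mean(axis=1)
x_d = distractors[:, win].mean(axis=1)

# Classify with a threshold at the midpoint of the two class means.
thresh = 0.5 * (x_t.mean() + x_d.mean())
acc = 0.5 * ((x_t > thresh).mean() + (x_d <= thresh).mean())
print(f"single-trial accuracy: {acc:.2f}")
```

Because the simulated evoked component is time-locked to event onset, trial averaging and windowed features recover it well, which mirrors the paper's finding that the discriminative information sits in the time-locked evoked response rather than in induced oscillatory activity.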

Available Online
Accepted 7 November 2014
