Paper: PS-1A.54
Session: Poster Session 1A
Location: H Lichthof
Session Time: Saturday, September 14, 16:30 - 19:30
Presentation Time: Saturday, September 14, 16:30 - 19:30
Presentation: Poster
Publication: 2019 Conference on Cognitive Computational Neuroscience, 13-16 September 2019, Berlin, Germany
Paper Title: The Accumulation of Salient Changes in Visual Cortex Predicts Subjective Time
License: This work is licensed under a Creative Commons Attribution 3.0 Unported License.
DOI: https://doi.org/10.32470/CCN.2019.1302-0
Authors: Maxine Sherman, University of Sussex, United Kingdom; Zafeirios Fountas, University College London, United Kingdom; Anil Seth, Warrick Roseboom, University of Sussex, United Kingdom
Abstract: The mechanisms underlying human estimations of duration are frequently said to rely on conceptually unrealistic ‘internal clocks’ that track elapsed time. Roseboom et al. recently presented a novel model of duration estimation, in which human reports of subjective time were replicated by accumulating salient changes in activity across hierarchically organized perceptual classifiers responding to sensory input [1]. Here we tested this model on human neuroimaging data, acquiring fMRI scans while subjects watched silent videos and estimated their duration. Using a pre-registered, model-based fMRI analysis, we will test whether the accumulation of salient moment-to-moment signal changes in visual cortex voxels predicts human subjective time. We hypothesize that this will not occur when accumulating changes detected by auditory or somatosensory cortices, indicating that our ‘internal clock’ is not instantiated by a specialized system for time, but rather is grounded in the sensory systems with which we perceive our environment.
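The core idea of the accumulation model described in the abstract — counting moment-to-moment changes in perceptual signals that exceed a salience criterion, and using the running total as a proxy for subjective duration — can be illustrated with a minimal sketch. This is not the authors' implementation; the activity array, the Euclidean change metric, and the fixed threshold are all illustrative assumptions.

```python
import numpy as np

def accumulate_salient_changes(activity, threshold):
    """Count frames whose frame-to-frame change in (hypothetical)
    perceptual-signal activity exceeds a salience threshold.

    activity  : (n_frames, n_units) array of activations per video frame
    threshold : scalar salience criterion (illustrative; the published model
                uses an adaptive threshold rather than a fixed one)
    """
    # Euclidean magnitude of change between consecutive frames
    diffs = np.linalg.norm(np.diff(activity, axis=0), axis=1)
    # Accumulated count of supra-threshold ("salient") changes serves
    # as the proxy for subjective elapsed time
    return int(np.sum(diffs > threshold))

# Toy usage with random activity standing in for classifier responses
rng = np.random.default_rng(0)
activity = rng.normal(size=(300, 64))   # 300 frames, 64 units (hypothetical)
count = accumulate_salient_changes(activity, threshold=10.0)
```

In the neuroimaging analysis described above, the same accumulation would be applied to voxel-wise BOLD signal changes in a given sensory cortex, and the resulting totals compared against participants' duration reports.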