Dora Angelaki, "Vision Sciences Society Annual Meeting 2013: Optimal Integration of Sensory Evidence: Building Blocks and Canonical Computations"

A fundamental aspect of our sensory experience is that information from different modalities is often seamlessly integrated into a unified percept. Recent computational and behavioral studies have shown that humans combine sensory cues according to a statistically optimal scheme derived from Bayesian probability theory: discrimination performance is better when two sensory cues are combined than with either cue alone. We have explored multisensory cue integration for self-motion (heading) perception based on visual (optic flow) and vestibular (linear acceleration) signals. Neural correlates of optimal cue integration during a multimodal heading discrimination task are found in the activity of single neurons in the macaque visual cortex. Neurons with congruent heading preferences for visual and vestibular stimuli (‘congruent cells’) show improved sensitivity under cue combination. In contrast, neurons with opposite heading preferences (‘opposite cells’) show diminished sensitivity under cue combination. Responses of congruent neurons also reflect trial-by-trial re-weighting of visual and vestibular cues, as expected from optimal integration, and population responses can predict the main features of perceptual cue weighting that have been repeatedly observed in humans. The trial-by-trial re-weighting can be simulated using a divisive normalization model extended to multisensory integration. Behavioral deficits following reversible chemical inactivation provide further support for the hypothesis that extrastriate visual cortex mediates multisensory integration for self-motion perception.
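As a concrete illustration of the optimal-combination rule referenced above, the sketch below implements the standard inverse-variance (reliability-weighted) Bayesian combination of two Gaussian cues, plus a deliberately simplified, toy divisive-normalization response. The function names, parameter values, and the single-unit pooling shortcut are illustrative assumptions, not the specific models used in the studies described in the talk.

```python
import numpy as np

def optimal_cue_combination(s_vis, sigma_vis, s_vest, sigma_vest):
    """Inverse-variance (reliability-weighted) combination of two Gaussian cues.

    Each cue is treated as a Gaussian likelihood over heading, N(s, sigma^2).
    The Bayesian estimate weights each cue by its reliability (1 / variance),
    and the combined variance is smaller than either single-cue variance,
    which is why discrimination improves when the cues are combined.
    """
    w_vis = (1.0 / sigma_vis**2) / (1.0 / sigma_vis**2 + 1.0 / sigma_vest**2)
    w_vest = 1.0 - w_vis
    s_hat = w_vis * s_vis + w_vest * s_vest
    sigma_hat = np.sqrt((sigma_vis**2 * sigma_vest**2) /
                        (sigma_vis**2 + sigma_vest**2))
    return s_hat, sigma_hat


def toy_normalized_response(e_vis, e_vest, w_vis=1.0, w_vest=1.0,
                            exponent=2.0, semisaturation=1.0):
    """Toy divisive-normalization response of one multisensory unit.

    The driving input is a weighted sum of the two unisensory drives, divided
    by a normalization pool (crudely approximated here by the summed drives).
    Because the denominator grows with overall input strength, the effective
    weight given to each cue shifts with relative cue strength across trials.
    """
    drive = w_vis * e_vis + w_vest * e_vest
    pool = e_vis + e_vest
    return drive**exponent / (semisaturation**exponent + pool**exponent)


# Example: a noisy visual cue (large sigma) is down-weighted relative to the
# vestibular cue, and the combined estimate is more precise than either alone.
s_hat, sigma_hat = optimal_cue_combination(s_vis=4.0, sigma_vis=2.0,
                                           s_vest=0.0, sigma_vest=1.0)
print(s_hat, sigma_hat)  # estimate pulled toward the vestibular cue; sigma_hat < 1.0
```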

However, objects that move through the environment can distort optic flow and bias perceptual estimates of heading. In biologically-constrained simulations, we show that decoding a mixed population of congruent and opposite cells according to their vestibular heading preferences can allow estimates of heading to be dissociated from object motion. These theoretical predictions are further supported by perceptual and neural responses: (1) Combined visual and vestibular stimulation reduces perceptual biases during object-motion and heading discrimination tasks. (2) As predicted by model simulations, visual/vestibular integration creates a more robust representation of heading in congruent cells and a more robust representation of object motion in opposite cells.
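The intuition behind that dissociation can be sketched with a deliberately simplified, one-dimensional toy. This is not the biologically constrained population model described in the talk; the variable names and numbers are hypothetical. The key assumption it illustrates is that optic flow mixes self-motion and object motion while the vestibular cue reflects self-motion alone, so comparing the two signals separates the two components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D lateral velocities (arbitrary units); values are illustrative only.
self_motion   = 0.30   # true heading-related component
object_motion = 0.12   # component imparted by an independently moving object

# Optic flow is contaminated by the moving object; the vestibular signal is not.
visual_cue     = self_motion + object_motion + rng.normal(0.0, 0.02)
vestibular_cue = self_motion + rng.normal(0.0, 0.05)

# Averaging the cues (loosely, a congruent-cell-like readout) gives a heading
# estimate whose object-induced bias is roughly halved relative to vision alone,
# while the visual/vestibular discrepancy (loosely, what opposite cells are well
# placed to signal) isolates the object-motion component.
heading_estimate = 0.5 * (visual_cue + vestibular_cue)
object_estimate  = visual_cue - vestibular_cue

print(heading_estimate, object_estimate)
```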

In summary, these findings provide direct evidence for a biological basis of the benefits of multisensory integration, both for improving sensitivity and for resolving sensory ambiguities. The studies summarized here identify both the computations and the neuronal mechanisms that may form the basis of cue integration. Individuals with disorders such as autism spectrum disorder may have deficits in one or more of these canonical computations, which are fundamental in helping merge our senses to interpret and interact with the world.

Duration: 55:07

Posted: Thursday, July 4, 2013

Video tags: Vision Sciences Society, Vision Science Sponsored Talks