Accurate estimation of self-motion through space (i.e., heading perception) requires the integration of visual and vestibular cues. We use neurophysiology and computational modeling in awake, behaving nonhuman primates, together with human psychophysics, to study how the brain combines multisensory signals to optimize perception. For example, we have trained rhesus monkeys to perform demanding discrimination tasks in which the animals integrate visual and vestibular cues in a statistically optimal way to improve their heading judgments. Simultaneous recordings from single neurons in cortical areas (e.g., the dorsal medial superior temporal area, MSTd) of these monkeys allowed us to directly compare neuronal sensitivity with the animals' perceptual thresholds. We also artificially perturb cortical activity by delivering electrical currents or chemicals to examine causal links between neural activity and perceptual decisions. Our current research focuses mainly on the following two areas.
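The "statistically optimal" benchmark referred to above is the standard maximum-likelihood cue-combination prediction: given the discrimination thresholds measured with each cue alone, the combined-cue threshold should fall below both. The sketch below illustrates this prediction with invented threshold values, not measured data.

```python
import numpy as np

def optimal_combined_threshold(sigma_vis, sigma_vest):
    """Maximum-likelihood prediction for the combined-cue threshold
    (std of the fused estimate) given two single-cue thresholds."""
    return np.sqrt((sigma_vis**2 * sigma_vest**2) /
                   (sigma_vis**2 + sigma_vest**2))

# Illustrative single-cue heading thresholds in degrees (not real data)
sigma_vis, sigma_vest = 3.0, 4.0
sigma_comb = optimal_combined_threshold(sigma_vis, sigma_vest)

# The optimal observer always improves on the better single cue
print(round(sigma_comb, 2))  # -> 2.4, below both 3.0 and 4.0
```

Comparing measured combined-cue thresholds against this prediction is how near-optimal integration is typically assessed behaviorally.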
Identifying candidate brain areas involved in multisensory heading perception
Multiple brain areas are modulated by both complex optic flow patterns and vestibular stimuli. Our goal is to identify the roles these areas play in Bayesian-optimal cue integration. In particular, we are interested in how the smooth-pursuit region of the frontal eye field (FEFsem) may integrate visual, vestibular, and extra-retinal signals to achieve a more robust heading estimate.
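Under the standard Bayesian cue-integration model, each cue's estimate is weighted by its reliability (inverse variance), so the fused estimate is pulled toward the more reliable cue. The sketch below is a minimal illustration of that weighting with invented numbers; it is not the lab's analysis code.

```python
def fuse_headings(est_vis, sigma_vis, est_vest, sigma_vest):
    """Reliability-weighted (inverse-variance) average of two
    single-cue heading estimates, in degrees."""
    r_vis, r_vest = 1.0 / sigma_vis**2, 1.0 / sigma_vest**2
    w_vis = r_vis / (r_vis + r_vest)
    return w_vis * est_vis + (1.0 - w_vis) * est_vest

# Illustrative cue-conflict trial: cues disagree by 4 degrees (not real data).
# The visual cue is twice as precise, so it gets weight 0.8.
fused = fuse_headings(est_vis=2.0, sigma_vis=2.0,
                      est_vest=-2.0, sigma_vest=4.0)
print(round(fused, 2))  # -> 1.2, biased toward the visual estimate
```

Cue-conflict trials like this one are the classic behavioral test of reliability-based weighting: as one cue is degraded, the fused estimate should shift toward the other.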
Functional interactions among areas during heading perception
Anatomical studies have shown that multiple brain areas involved in heading perception, including MSTd, FEFsem, and the ventral intraparietal area (VIP), are reciprocally connected, suggesting that they functionally interact to coordinate their activity and form a unified heading representation. We will perform simultaneous recordings from multiple brain areas, analyze coherence between local field potentials (LFPs) and spiking activity across areas, and explore possible patterns of information flow in the circuits connecting them.
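Spike-LFP coherence of the kind described above can be estimated with standard Welch-based routines such as `scipy.signal.coherence`. The sketch below applies it to synthetic signals in which a shared 20 Hz rhythm drives both an LFP and a binned spike train; all parameters are invented for demonstration.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs, dur = 1000.0, 20.0                      # sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1.0 / fs)

# Shared 20 Hz rhythm drives both the LFP and the spiking probability
rhythm = np.sin(2 * np.pi * 20.0 * t)
lfp = rhythm + 0.5 * rng.standard_normal(t.size)
p_spike = 0.02 * (1.0 + 0.8 * rhythm)       # firing rate modulated by rhythm
spikes = (rng.random(t.size) < p_spike).astype(float)  # 1-ms spike bins

# Magnitude-squared coherence between the spike train and the LFP
f, cxy = coherence(spikes, lfp, fs=fs, nperseg=1024)
peak_freq = f[np.argmax(cxy)]
print(round(peak_freq, 1))  # coherence peaks near the shared 20 Hz rhythm
```

With real recordings the same call is applied per channel pair, and significance is usually assessed against trial-shuffled or jittered surrogates rather than read off a single spectrum.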