

Spotlight

Comparing Bayesian models for multisensory cue combination without mandatory integration

Ulrik Beierholm · Konrad P Kording · Ladan Shams · Wei Ji Ma


Abstract:

Bayesian models of multisensory perception traditionally address the problem of estimating a variable that is assumed to be the underlying cause of two sensory signals. The brain, however, has to solve a more general problem: it has to establish which signals come from the same source and should be integrated, and which come from different sources and should be segregated. In recent years, several models have been proposed to solve this problem in a Bayesian fashion. One of these has the advantage that it explicitly formalizes the causal structure of the sensory signals. We describe these models and conduct an experiment to test human performance in an auditory-visual spatial localization task in which integration is not mandatory. We find that the causal Bayesian inference model accounts for the data better than the other models.
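To make the integrate-versus-segregate decision concrete, the sketch below implements one standard formulation of causal Bayesian inference for audiovisual localization: it computes the posterior probability that the auditory and visual measurements share a common cause and averages the fused and segregated location estimates by that posterior. This is a minimal illustration under assumed Gaussian noise, a zero-mean Gaussian spatial prior, and illustrative parameter values; the function name and numbers are not the paper's fitted quantities.

```python
import numpy as np

def causal_inference_estimate(x_v, x_a, sigma_v=2.0, sigma_a=8.0,
                              sigma_p=15.0, p_common=0.5):
    """Model-averaged visual location estimate under causal inference.

    x_v, x_a : noisy visual and auditory measurements (degrees)
    sigma_v, sigma_a : sensory noise standard deviations (illustrative)
    sigma_p : std of a zero-mean Gaussian prior over source location
    p_common : prior probability that both signals share one cause
    """
    var_v, var_a, var_p = sigma_v ** 2, sigma_a ** 2, sigma_p ** 2

    # Likelihood of both measurements under a single common cause (C = 1),
    # with the shared source location integrated out of the generative model.
    denom_c1 = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = np.exp(-0.5 * ((x_v - x_a) ** 2 * var_p
                             + x_v ** 2 * var_a
                             + x_a ** 2 * var_v) / denom_c1)
    like_c1 /= 2 * np.pi * np.sqrt(denom_c1)

    # Likelihood under two independent causes (C = 2).
    like_c2 = np.exp(-0.5 * (x_v ** 2 / (var_v + var_p)
                             + x_a ** 2 / (var_a + var_p)))
    like_c2 /= 2 * np.pi * np.sqrt((var_v + var_p) * (var_a + var_p))

    # Posterior probability that the signals came from the same source.
    post_c1 = (like_c1 * p_common
               / (like_c1 * p_common + like_c2 * (1 - p_common)))

    # Optimal visual-location estimate under each causal structure:
    # precision-weighted fusion (C = 1) vs. vision alone (C = 2),
    # both shrunk toward the prior mean at zero.
    s_hat_c1 = (x_v / var_v + x_a / var_a) / (1 / var_v + 1 / var_a + 1 / var_p)
    s_hat_c2 = (x_v / var_v) / (1 / var_v + 1 / var_p)

    # Model averaging: weight the two estimates by the causal posterior.
    return post_c1 * s_hat_c1 + (1 - post_c1) * s_hat_c2


# Small audiovisual discrepancy -> near-complete integration;
# large discrepancy -> estimate falls back toward the visual signal alone.
print(causal_inference_estimate(x_v=5.0, x_a=7.0))
print(causal_inference_estimate(x_v=5.0, x_a=25.0))
```

With these assumed noise levels, a small audiovisual conflict yields a high common-cause posterior and a strongly fused estimate, while a large conflict drives the response back toward the segregated, vision-only estimate, which is the qualitative behavior that distinguishes causal inference from mandatory integration.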
