Information optimization in coupled audio-visual cortical maps

AUTHOR(S)
SOURCE

National Academy of Sciences

ABSTRACT

Barn owls hunt in the dark by using cues from both sight and sound to locate their prey. This task is facilitated by topographic maps of the external space formed by neurons (e.g., in the optic tectum) that respond to visual or aural signals from a specific direction. Plasticity of these maps has been studied in owls forced to wear prismatic spectacles that shift their visual field. Adaptive behavior in young owls is accompanied by a compensating shift in the response of (mapped) neurons to auditory signals. We model the receptive fields of such neurons by linear filters that sample correlated audio-visual signals and search for filters that maximize the gathered information, subject to the costs of rewiring neurons. Assuming a higher fidelity of visual information, we find that the corresponding receptive fields are robust and unchanged by artificial shifts. The shape of the aural receptive field, however, is controlled by correlations between sight and sound. In response to prismatic glasses, the aural receptive fields shift in the compensating direction, although their shape is modified due to the costs of rewiring.
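The optimization described above can be illustrated with a small numerical sketch (not the authors' code). It assumes a Gaussian-correlated scene sampled by noisy visual and auditory detector arrays, holds the high-fidelity visual filter of a single map neuron fixed, and maximizes the closed-form Shannon information of the neuron's linear output over its auditory filter, minus a quadratic rewiring penalty; the noise levels, tuning widths, prism shift, and penalty weight are all illustrative assumptions.

```python
# Illustrative sketch only (not the authors' model or code): Gaussian scene,
# linear visual/auditory filters, closed-form Shannon information for one map
# neuron, and a quadratic "rewiring" penalty on changes to the aural filter.
import numpy as np
from scipy.optimize import minimize

N = 41
x = np.linspace(-20.0, 20.0, N)          # direction axis (deg) in the map
ell = 4.0                                # scene correlation length (assumed)
sigma_v, sigma_a = 0.1, 1.0              # visual noise << auditory noise
delta = 10.0                             # prismatic shift of the visual field

def K(xa, xb):
    """Scene covariance between two sets of directions (Gaussian kernel)."""
    return np.exp(-0.5 * (xa[:, None] - xb[None, :]) ** 2 / ell ** 2)

# Joint covariance of (v, a): the visual array reports the scene displaced
# by the prisms, the auditory array reports it veridically.
C_vv = K(x - delta, x - delta) + sigma_v ** 2 * np.eye(N)
C_aa = K(x, x) + sigma_a ** 2 * np.eye(N)
C_va = K(x - delta, x)
C = np.block([[C_vv, C_va], [C_va.T, C_aa]])

def bump(center, width=3.0):
    """Gaussian weight profile centered at a map direction, unit norm."""
    b = np.exp(-0.5 * ((x - center) / width) ** 2)
    return b / np.linalg.norm(b)

w_v = bump(0.0)                          # fixed (robust) visual receptive field
w_a_old = bump(0.0)                      # pre-prism auditory receptive field
lam = 0.05                               # cost per unit of rewiring (assumed)

def info_bits(w_a):
    """I(y; scene) in bits for y = w_v.v + w_a.a with Gaussian scene/noise."""
    w = np.concatenate([w_v, w_a])
    total = w @ C @ w
    noise = sigma_v ** 2 * w_v @ w_v + sigma_a ** 2 * w_a @ w_a + 1e-12
    return 0.5 * np.log2(total / noise)

def objective(w_a):
    # Negative of (information gained minus rewiring cost), to minimize.
    return -(info_bits(w_a) - lam * np.sum((w_a - w_a_old) ** 2))

res = minimize(objective, w_a_old, method="L-BFGS-B")
w_a_opt = res.x
print("aural RF peak:", x[np.argmax(np.abs(w_a_opt))], "deg")
```

Under these assumptions the optimal aural filter moves toward the direction correlated with the displaced visual input, while the rewiring term limits and reshapes that shift, in line with the qualitative result stated in the abstract.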
