Audio-visual integration during overt visual attention

Author(s): Quigley, Cliodhna
Onat, Selim 
Harding, Sue
Cooke, Martin
Koenig, Peter 
Keywords: CONVERGENCE; MULTISENSORY INTEGRATION; Ophthalmology
Publication date: 2007
Publisher: INT GROUP EYE MOVEMENT RESEARCH
Journal: JOURNAL OF EYE MOVEMENT RESEARCH
Volume: 1
Issue: 2
Abstract:
How do different sources of information arising in different modalities interact to control where we look? To answer this question under conditions approaching real-world complexity, we presented natural images together with spatially localized sounds in visual (V), audiovisual (AV), and auditory (A) conditions and measured subjects' eye movements. Our results demonstrate that eye movements in the AV condition are spatially biased towards the part of the image corresponding to the sound source. Interestingly, this spatial bias depends on the probability that a given image region is fixated (its saliency) in the V condition, indicating that fixation behaviour in the AV condition is the result of an integration process. Regression analysis shows that this integration is best accounted for by a linear combination of the unimodal saliencies.
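The regression model named in the abstract can be illustrated with a minimal sketch: predicting per-region AV saliency as a weighted sum of the unimodal V and A saliencies via least squares. All variable names, the toy data, and the ground-truth weights below are hypothetical and illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical toy data: per-region saliencies, i.e. empirical fixation
# probabilities measured in the unimodal V and A conditions.
rng = np.random.default_rng(0)
n_regions = 200
sal_v = rng.random(n_regions)  # visual-only saliency per image region
sal_a = rng.random(n_regions)  # auditory-only saliency per region

# Simulate AV saliency as a noisy linear combination (the weights 0.7 and
# 0.3 are illustrative assumptions, not values from the paper).
sal_av = 0.7 * sal_v + 0.3 * sal_a + 0.02 * rng.standard_normal(n_regions)

# Linear regression: find weights (w_v, w_a) and an intercept minimising
# the squared error between predicted and observed AV saliency.
X = np.column_stack([sal_v, sal_a, np.ones(n_regions)])
coef, *_ = np.linalg.lstsq(X, sal_av, rcond=None)
w_v, w_a, bias = coef

print(f"w_v={w_v:.2f}, w_a={w_a:.2f}, bias={bias:.3f}")
```

With the simulated data above, the fitted weights recover the assumed mixing proportions, which is the kind of evidence a regression analysis of this form would provide for a linear integration account.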
ISSN: 1995-8692
