Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources

DC Field: Value

dc.contributor.author: Wahn, Basil
dc.contributor.author: Koenig, Peter
dc.date.accessioned: 2021-12-23T16:13:15Z
dc.date.available: 2021-12-23T16:13:15Z
dc.date.issued: 2016
dc.identifier.issn: 1662-5145
dc.identifier.uri: https://osnascholar.ub.uni-osnabrueck.de/handle/unios/10475
dc.description.abstract: Humans constantly process and integrate sensory input from multiple sensory modalities. However, the amount of input that can be processed is constrained by limited attentional resources. A matter of ongoing debate is whether attentional resources are shared across sensory modalities, and whether multisensory integration is dependent on attentional resources. Previous research suggested that the distribution of attentional resources across sensory modalities depends on the type of tasks. Here, we tested a novel task combination in a dual-task paradigm: participants performed a self-terminated visual search task and a localization task either in separate sensory modalities (i.e., haptics and vision) or both within the visual modality. The tasks interfered considerably. However, participants performed the visual search task faster when the localization task was performed in the tactile modality than when both tasks were performed within the visual modality. This finding indicates that tasks performed in separate sensory modalities rely in part on distinct attentional resources. Nevertheless, participants integrated visuotactile information optimally in the localization task even when attentional resources were diverted to the visual search task. Overall, our findings suggest that visual search and tactile localization partly rely on distinct attentional resources, and that optimal visuotactile integration does not depend on attentional resources.
dc.description.sponsorship: H2020 - H2020-FETPROACT [2014 641321]; ERC-AdG [269716]; We gratefully acknowledge the support of H2020 - H2020-FETPROACT-2014 641321 - socSMCs (for BW) and ERC-2010-AdG #269716 - MULTISENSE (for PK).
dc.language.iso: en
dc.publisher: FRONTIERS MEDIA SA
dc.relation.ispartof: FRONTIERS IN INTEGRATIVE NEUROSCIENCE
dc.subject: attentional load
dc.subject: attentional resources
dc.subject: AUDIOVISUAL INTEGRATION
dc.subject: Behavioral Sciences
dc.subject: BLINKS
dc.subject: CAPACITY
dc.subject: CAPTURE
dc.subject: CONGRUENCY
dc.subject: MULTISENSORY INTEGRATION
dc.subject: Neurosciences
dc.subject: Neurosciences & Neurology
dc.subject: SELECTIVE ATTENTION
dc.subject: SIGNALS
dc.subject: tactile display
dc.subject: tactile modality
dc.subject: VISION
dc.subject: visual modality
dc.subject: visual search
dc.subject: WITHIN-MODALITY
dc.title: Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources
dc.type: journal article
dc.identifier.doi: 10.3389/fnint.2016.00013
dc.identifier.isi: ISI:000371519300002
dc.description.volume: 10
dc.contributor.orcid: 0000-0003-3654-5267
dc.contributor.orcid: 0000-0002-0318-7160
dc.contributor.researcherid: ABB-2380-2020
dc.contributor.researcherid: AAV-5770-2021
dc.publisher.place: AVENUE DU TRIBUNAL FEDERAL 34, LAUSANNE, CH-1015, SWITZERLAND
dcterms.isPartOf.abbreviation: Front. Integr. Neurosci.
dcterms.oaStatus: Green Published, gold
crisitem.author.dept: Institut für Kognitionswissenschaft
crisitem.author.dept: Institut für Kognitionswissenschaft
crisitem.author.dept: FB 05 - Biologie/Chemie
crisitem.author.deptid: institute28
crisitem.author.deptid: institute28
crisitem.author.deptid: fb05
crisitem.author.orcid: 0000-0002-0318-7160
crisitem.author.orcid: 0000-0003-3654-5267
crisitem.author.parentorg: FB 08 - Humanwissenschaften
crisitem.author.parentorg: FB 08 - Humanwissenschaften
crisitem.author.parentorg: Universität Osnabrück
crisitem.author.grandparentorg: Universität Osnabrück
crisitem.author.grandparentorg: Universität Osnabrück
crisitem.author.netid: WaBa169
crisitem.author.netid: KoPe298