Record display
%0 Conference Proceedings
%A Suied, Clara
%A Bonneel, Nicolas
%A Viaud-Delmon, Isabelle
%T Integration of auditory and visual information in fast recognition of realistic objects
%D 2008
%B ICON X (International Conference on Cognitive Neuroscience)
%C Bodrum
%F Suied08d
%X Recent studies have used complex and meaningful stimuli to investigate the neural mechanisms involved in the processing of sensory information necessary for object perception. However, relatively few studies have focused on behavioural measures of multisensory object processing. Realistic objects are of particular interest in the study of multisensory integration, since a given object can generally be identified through any of several single sensory modalities. The fact that the same semantic knowledge can be accessed through different modalities allows us to explore the different processing levels that underlie retrieval of supramodal object concepts. Here we studied the influence of semantic congruence on auditory-visual object recognition in a go/no-go task. Participants were asked to react as fast as possible to a target object presented in the visual and/or the auditory modality, and to inhibit their response to a distractor object. The experiment was run in an immersive and realistic virtual environment including 3D images and free-field audio. Reaction times were significantly shorter for semantically congruent bimodal stimuli than predicted by independent processing of information about the objects presented unimodally. Interestingly, this effect was twice as large as that found in previous studies that used information-rich stimuli. The processing of bimodal objects was also influenced by their semantic congruence: reaction times were significantly shorter for semantically congruent bimodal stimuli (i.e., visual and auditory stimuli from the same target object) than for semantically incongruent bimodal stimuli (i.e., target represented in only one sensory modality and distractor presented in the other modality). Importantly, an interference effect was observed (i.e., longer reaction times to semantically incongruent stimuli than to the corresponding unimodal stimulus) only when the distractor was auditory. When the distractor was visual, the semantic incongruence did not impair recognition. Our results show that immersive displays may provide large multimodal integration effects, and reveal a possible asymmetry in the attentional filtering of irrelevant auditory and visual information.
%1 7
%2 2
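The abstract compares observed bimodal reaction times with the prediction derived from independent processing of the unimodal stimuli. That kind of prediction is commonly formalized as the race model inequality (Miller, 1982), P(RT <= t | AV) <= P(RT <= t | A) + P(RT <= t | V); the record does not state which formulation the authors used, so the sketch below only illustrates the general technique, using made-up reaction-time values rather than data from the study.

import numpy as np

# Hypothetical reaction-time samples in milliseconds (illustrative only,
# not data from the study described in the record above).
rt_auditory = np.array([420, 455, 410, 470, 440, 430])
rt_visual   = np.array([400, 435, 415, 450, 425, 445])
rt_bimodal  = np.array([350, 370, 365, 355, 380, 360])

def ecdf(samples, t):
    # Empirical cumulative distribution P(RT <= t), evaluated on a grid t.
    samples = np.asarray(samples)
    return np.mean(samples[:, None] <= t, axis=0)

# Race model bound: if the two modalities were processed independently,
# P(RT <= t | AV) could not exceed P(RT <= t | A) + P(RT <= t | V).
t_grid = np.linspace(300, 500, 21)
bound = np.minimum(ecdf(rt_auditory, t_grid) + ecdf(rt_visual, t_grid), 1.0)
violations = ecdf(rt_bimodal, t_grid) > bound

for t, violated in zip(t_grid, violations):
    if violated:
        print(f"race model inequality violated at t = {t:.0f} ms")

A violation at any time point indicates bimodal responses faster than independent unimodal processing can account for, which is the sense in which the abstract reports reaction times "shorter than predicted".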