Ircam-Centre Pompidou


Document category Conference or congress paper
Title Integration of auditory and visual information in fast recognition of realistic objects
Lead author Clara Suied
Co-authors Nicolas Bonneel, Isabelle Viaud-Delmon
Conference / congress ICON X (International Conference on Cognitive Neuroscience). Bodrum: September 2008
Peer review Undetermined
Year 2008
Editorial status Accepted - publication in progress
Abstract

Recent studies have used complex and meaningful stimuli to investigate the neural mechanisms involved in the processing of sensory information necessary for object perception. However, relatively few studies have focused on behavioural measures of multisensory object processing. Realistic objects are of particular interest in the study of multisensory integration, since a given object can generally be identified through any of several single sensory modalities. The fact that the same semantic knowledge can be accessed through different modalities allows us to explore the different processing levels that underlie retrieval of supramodal object concepts.

Here we studied the influence of semantic congruence on auditory-visual object recognition in a go/no-go task. Participants were asked to react as fast as possible to a target object presented in the visual and/or the auditory modality, and to inhibit their response to a distractor object. The experiment was run in an immersive and realistic virtual environment including 3D images and free-field audio. Reaction times were significantly shorter for semantically congruent bimodal stimuli than predicted by independent processing of the information about the objects presented unimodally. Interestingly, this effect was twice as large as that found in previous studies using information-rich stimuli. The processing of bimodal objects was also influenced by their semantic congruence: reaction times were significantly shorter for semantically congruent bimodal stimuli (i.e., visual and auditory stimuli from the same target object) than for semantically incongruent bimodal stimuli (i.e., the target presented in one sensory modality and the distractor presented in the other). Importantly, an interference effect (i.e., longer reaction times to semantically incongruent stimuli than to the corresponding unimodal stimulus) was observed only when the distractor was auditory. When the distractor was visual, the semantic incongruence did not impair recognition. Our results show that immersive displays may produce large multimodal integration effects, and reveal a possible asymmetry in the attentional filtering of irrelevant auditory and visual information.

Research team Espaces acoustiques et cognitifs
Call number Suied08d

    © Ircam - Centre Pompidou 2005.