Record details
Document category: Conference or congress contribution
Title: Investigation of Auditory-visual integration in VR environments
Main author: Khoa-Van Nguyen
Co-authors: Clara Suied, Matteo Dellepiane, Olivier Warusfel, Isabelle Viaud-Delmon
Conference / congress: IMRF - International Multisensory Research Forum. Sydney: July 2007
Peer-reviewed: Yes
Year: 2007
Editorial status: Unpublished
Abstract:
Investigating the temporal and spatial constraints under which visual and auditory stimuli are perceived as a single percept, or as spatially coincident, has been the topic of numerous studies. Until now, however, these findings have been obtained in extremely simplified stimulation contexts consisting of combinations of elementary auditory and visual stimuli, usually presented in dark and anechoic conditions. The present experiment is conducted in a VR environment using a passive stereoscopic display and binaural audio rendering. Subjects have to indicate the point of subjective spatial alignment (PSSA) between a horizontally moving visual stimulus that crosses the direction of a stationary sound. Auditory stimuli are presented over headphones using individualized head-related transfer functions, and the visual stimulus is embedded in a background texture in order to convey visual perspective. Two types of audio stimuli are used to evaluate the influence of auditory localisation acuity on auditory-visual integration: periodic white-noise bursts, which provide optimal localisation cues, and periodic 1 kHz tone bursts. The present study will indicate whether previous findings (Lewald et al., Behavioural Brain Research, 2001) still hold in more complex audio-visual contexts such as those offered by cutting-edge VR environments.
Team: Espaces acoustiques et cognitifs
Reference: Nguyen07a