Saturday, October 1, 2:00 pm — 3:30 pm (Rm 409A)
Spatial Music, Virtual Reality, and 360 Media—Enda Bates, Trinity College Dublin - Dublin, Ireland; Francis Boland, Trinity College Dublin - Dublin, Ireland
This paper documents the composition, recording, and post-production of a number of works of instrumental spatial music for presentation as 360 video and audio. The filming and recording of an orchestral work of spatial music is described, with particular reference to the various ambisonic microphones used in the recordings, the post-production techniques, and the delivery of 360 video with matching 360 audio. The recording and production of a second performance of a newly composed spatial work for an acoustic quartet are also presented, and the relationship between spatial music and 360 content is discussed. Finally, the creative possibilities of VR for soundscape and acousmatic composition are explored.
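As a point of reference for the ambisonic workflow mentioned in the abstract (not drawn from the paper itself), the minimal Python sketch below shows how a mono source might be encoded to first-order ambisonics, assuming the AmbiX convention, and how the resulting soundfield can be rotated in yaw, the basic operation a 360 player uses to keep audio aligned with 360 video as the viewer turns. All function and variable names are illustrative.

```python
import numpy as np

def encode_foa(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order ambisonics.

    Assumes the AmbiX convention (ACN channel order W, Y, Z, X; SN3D gains).
    Azimuth is counter-clockwise from straight ahead; elevation is positive
    upwards.  Returns an array of shape (4, n_samples).
    """
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    w = mono                               # omnidirectional component
    y = mono * np.sin(az) * np.cos(el)     # left-right
    z = mono * np.sin(el)                  # up-down
    x = mono * np.cos(az) * np.cos(el)     # front-back
    return np.stack([w, y, z, x])

def rotate_yaw(bformat, yaw_deg):
    """Rotate a first-order soundfield about the vertical axis.

    In a 360 player this is driven by head tracking so that the audio scene
    stays locked to the video; only the X and Y components change.
    (Sign conventions differ between tools, so check against your renderer.)
    """
    yaw = np.radians(yaw_deg)
    w, y, z, x = bformat
    x_rot = x * np.cos(yaw) - y * np.sin(yaw)
    y_rot = x * np.sin(yaw) + y * np.cos(yaw)
    return np.stack([w, y_rot, z, x_rot])

# Example: a 1 kHz tone placed 90 degrees to the listener's left, then the
# whole scene rotated by -90 degrees so the tone ends up straight ahead.
fs = 48000
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
scene = encode_foa(tone, azimuth_deg=90, elevation_deg=0)
scene_front = rotate_yaw(scene, yaw_deg=-90)
```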
This session is part of the co-located AVAR Conference, which is not included in the normal convention All Access badge.
Positioning of Musical Foreground Parts in Surrounding Sound Stages—Christoph Hold, Technische Universität Berlin - Berlin, Germany; Lukas Nagel, Technische Universität Berlin - Berlin, Germany; Hagen Wierstorf, Technische Universität Ilmenau - Ilmenau, Germany; Alexander Raake, Technische Universität Ilmenau - Ilmenau, Germany
Object-based audio offers several new possibilities during the sound mixing process. While stereophonic mixing techniques are highly developed, not all of them produce convincing results in an object-based audio environment. One distinctive feature is the ability to position sound objects freely within the musical sound scene, offering stable localization throughout the whole listening area. Previous studies have shown that, even though object-based reproduction systems can enhance playback, the attributes that critically guide the mix remain uncertain. This study investigates the impact of different spatial distributions of sound objects on listener preference, with particular emphasis on the high-attention foreground parts of the presented music track.
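To make the idea of positioning a sound object concrete, the sketch below computes pairwise amplitude-panning gains in the spirit of 2D VBAP for a single object whose azimuth is carried as metadata; this is a generic illustration, not the rendering method used in the study, and the loudspeaker layout and names are assumptions.

```python
import numpy as np

def vbap_pair_gains(source_az_deg, speaker_az_deg):
    """Pairwise 2D amplitude-panning gains (VBAP-style) for one sound object.

    source_az_deg:  object azimuth in degrees (counter-clockwise, 0 = front).
    speaker_az_deg: list of loudspeaker azimuths in degrees.
    Returns one gain per loudspeaker; only the pair enclosing the object is
    non-zero, which is what keeps the object's perceived position stable.
    """
    spk = np.radians(np.asarray(speaker_az_deg, dtype=float))
    src = np.radians(source_az_deg)
    src_vec = np.array([np.cos(src), np.sin(src)])
    gains = np.zeros(len(spk))
    order = np.argsort(spk)                    # walk adjacent pairs around the circle
    for i in range(len(order)):
        a, b = order[i], order[(i + 1) % len(order)]
        la = np.array([np.cos(spk[a]), np.sin(spk[a])])
        lb = np.array([np.cos(spk[b]), np.sin(spk[b])])
        g = np.linalg.solve(np.column_stack([la, lb]), src_vec)
        if np.all(g >= -1e-9):                 # object lies between this pair
            g = np.clip(g, 0.0, None)
            gains[[a, b]] = g / np.linalg.norm(g)   # constant-power normalisation
            break
    return gains

# Example: an object carrying the position metadata "15 degrees left of front"
# rendered to a horizontal 5-channel layout (L, R, C, Ls, Rs).
gains = vbap_pair_gains(15, [30, -30, 0, 110, -110])
```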
This session is part of the co-located AVAR Conference, which is not included in the normal convention All Access badge.
The Soundfield as Sound Object: Virtual Reality Environments as a Three-Dimensional Canvas for Music Composition—Richard Graham, Stevens Institute of Technology - Hoboken, NJ, USA; Seth Cluett, Stevens Institute of Technology - Hoboken, NJ, USA
Our paper presents ideas raised by recent projects exploring the embellishment, augmentation, and extension of environmental cues, spatial mapping, and the immersive potential of scalable multichannel audio systems for virtual and augmented reality. Moving beyond the issues of reproductive veracity raised by merely recreating the soundscape of the physical world, these works exploit characteristics of the natural world to accomplish creative goals, including the development of models for interactive composition, composing with physical and abstract spatial gestures, and linking sound and image. We present a novel system that allows the user to treat the soundfield as a fundamental building block for spatial music composition and sound design.
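As one hypothetical illustration of treating the soundfield itself as the compositional unit (not the authors' system), the sketch below applies whole-field gestures, a continuous yaw spin and a front-back mirror, to first-order ambisonic scenes and layers the results. The AmbiX channel order, the parameter names, and the noise stand-in scenes are all assumptions made for the example.

```python
import numpy as np

def spin(bformat, rev_per_sec, fs):
    """Continuously rotate an entire first-order soundfield in yaw.

    bformat is assumed to be a (4, n_samples) array in AmbiX channel order
    (W, Y, Z, X); the whole field is treated as a single object and spun
    at rev_per_sec revolutions per second.
    """
    n = bformat.shape[1]
    yaw = 2.0 * np.pi * rev_per_sec * np.arange(n) / fs
    w, y, z, x = bformat
    x_rot = x * np.cos(yaw) - y * np.sin(yaw)
    y_rot = x * np.sin(yaw) + y * np.cos(yaw)
    return np.stack([w, y_rot, z, x_rot])

def mirror_front_back(bformat):
    """Reflect the soundfield front-to-back by negating its X component."""
    w, y, z, x = bformat
    return np.stack([w, y, z, -x])

# A composite gesture: layer two soundfields, spinning one slowly and
# mirroring the other.  Noise scenes stand in for recorded AmbiX material.
fs = 48000
rng = np.random.default_rng(0)
scene_a = 0.1 * rng.standard_normal((4, fs))
scene_b = 0.1 * rng.standard_normal((4, fs))
composite = spin(scene_a, rev_per_sec=0.25, fs=fs) + mirror_front_back(scene_b)
```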
This session is part of the co-located AVAR Conference, which is not included in the normal convention All Access badge.