SADIE is a multidisciplinary research project that aims to provide transformative improvements in spatial sound for interactive media experiences.
Interactive media systems such as games consoles have become commonplace in UK domestic environments, offering increased connectivity and media integration. Whilst many systems support 3-D visuals and Ultra-High Definition video, most do not address the importance of 3-D sound for immersive experiences. 5.1 surround is the most common format supported by such systems, but its immersive capability is limited to a two-dimensional plane. New formats incorporating height channels have been targeted at cinema, but could also naturally complement the virtual reality experience of gaming by extending the soundfield to three dimensions. The benefits include not only new audio features for enhanced gameplay, but also the ability to design immersive sound-centric games with significant social, cultural, educational and healthcare gains. However, sound immersion is difficult to achieve in the home, as it is impractical to place dozens of transducers around the living room. SADIE pioneers new approaches to rendering interactive spatial sound in the home, including the rendering of sound sources with height.
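One standard way to represent a soundfield with height is first-order Ambisonics (B-format), where a dedicated Z channel carries elevation information that a flat 5.1 layout cannot convey. The sketch below is a generic encoding example for illustration only, not the project's specific rendering method; the source signal, angles and sample rate are arbitrary placeholders.

```python
import numpy as np

def encode_first_order_ambisonics(signal, azimuth_deg, elevation_deg):
    """Encode a mono signal into traditional first-order B-format (W, X, Y, Z).

    Azimuth is measured counter-clockwise from the front, elevation upwards
    from the horizontal plane. The Z channel carries the height component.
    """
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = signal / np.sqrt(2.0)               # omnidirectional component
    x = signal * np.cos(az) * np.cos(el)    # front-back
    y = signal * np.sin(az) * np.cos(el)    # left-right
    z = signal * np.sin(el)                 # up-down (height)
    return np.stack([w, x, y, z])

# Example: a 1 kHz tone placed 45 degrees to the left and 30 degrees up.
fs = 48000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
bformat = encode_first_order_ambisonics(tone, azimuth_deg=45, elevation_deg=30)
```

The encoded B-format signal can then be decoded to any loudspeaker layout with height, which is why such scene-based formats suit living rooms where speaker positions vary from home to home.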
For gaming, there is also the complex issue of listener position. For example, in games based on kinetic motion tracking, the listener moves in reaction to the gameplay, and forming a stable soundfield becomes difficult because the listener is no longer located at the acoustic sweet spot. Real-time soundfield compensation through tracking of the listener can be used to counteract this, but listener movement also makes digital equalisation of the room acoustics a considerable challenge. The result is a flawed virtual environment in which visual and auditory cues are not spatially coincident.
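To illustrate the idea of real-time compensation, the sketch below computes per-loudspeaker delay and gain trims so that wavefronts arrive time- and level-aligned at a tracked listener position rather than at the geometric centre of the array. It is a minimal free-field illustration (straight-line distances, 1/r spreading), not the project's algorithm, and the loudspeaker layout is hypothetical.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def sweet_spot_compensation(speaker_positions, listener_position):
    """Return (delays_in_seconds, linear_gains) for each loudspeaker so that
    signals arrive time- and level-aligned at the tracked listener position.

    speaker_positions: (N, 3) loudspeaker coordinates in metres.
    listener_position: (3,) tracked listener coordinates in metres.
    """
    distances = np.linalg.norm(speaker_positions - listener_position, axis=1)
    farthest = distances.max()
    # Delay nearer speakers so all wavefronts arrive together.
    delays = (farthest - distances) / SPEED_OF_SOUND
    # Attenuate nearer speakers to equalise level (1/r spherical spreading).
    gains = distances / farthest
    return delays, gains

# Hypothetical square layout, listener displaced from the centre of the room.
speakers = np.array([[ 1.5,  1.5, 0.0],
                     [-1.5,  1.5, 0.0],
                     [-1.5, -1.5, 0.0],
                     [ 1.5, -1.5, 0.0]])
listener = np.array([0.4, -0.2, 0.0])
delays, gains = sweet_spot_compensation(speakers, listener)
```

In practice the trims would be recomputed continuously from the tracking data, and room equalisation would need to adapt alongside them, which is precisely where the challenge described above lies.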
SADIE will address the improvement of spatial audio quality for immersive interactive media experiences in the home. It will undertake novel science to improve soundfield immersion and the formation of 3-D sound sources beyond the horizontal plane, lifting the constraints of loudspeaker placement and dynamic source-listener movement whilst preserving good sound reproduction quality. The research will pioneer new methods for soundfield rendering, informed by characterisation of the cues required for the perception of sources with height during dynamic listening. The work is poised to have a significant transformative impact on sound reproduction in UK homes, whilst also addressing wider questions in auditory perception and acoustic signal processing that serve research disciplines beyond interactive media.
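As one concrete way of forming a source beyond the horizontal plane, headphone (binaural) reproduction convolves a dry signal with a measured head-related impulse response (HRIR) pair for the desired direction, such as those available via the project's binaural database downloads listed below. The sketch that follows is a generic illustration; the HRIRs here are synthetic placeholders rather than measured data.

```python
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(mono_signal, hrir_left, hrir_right):
    """Convolve a dry mono source with a left/right HRIR pair to place it
    at the direction (including elevation) where the HRIRs were measured."""
    left = fftconvolve(mono_signal, hrir_left)
    right = fftconvolve(mono_signal, hrir_right)
    return np.stack([left, right])

# Placeholder HRIR pair standing in for a measurement at, say, 45 degrees
# azimuth and 30 degrees elevation; real use would load a measured set.
rng = np.random.default_rng(0)
decay = np.exp(-np.arange(256) / 32)
hrir_l = rng.standard_normal(256) * decay
hrir_r = rng.standard_normal(256) * decay
source = rng.standard_normal(48000)  # one second of noise at 48 kHz
binaural = render_binaural(source, hrir_l, hrir_r)
```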
Members
- Gavin Kearney
Funding
- EPSRC
Links
- Project web site (including binaural database downloads)
- York Research Database (Publications, Related Projects, etc.)
Research