Spatial composition is a key aspect of contemporary acousmatic and computer music. Over its history, spatial composition practice has produced many different approaches to composition and performance tools, instruments, and interfaces. Furthermore, current developments and the increasing availability of virtual and augmented reality (XR) systems extend the possibilities, both in terms of sound rendering engines and of environments and tools for creation and experience. In contrast to systems that control parameters of simulated sound fields and virtual sound sources, we present an approach to XR-based, real-time, body-controlled (via motion and biofeedback sensors) sound field manipulation in the spatial domain. It can be applied not only to simulated sound fields but also to measured ones, and reproduced with various spatial rendering procedures.
Authors: Damian Thomas Dziwis (University of Applied Sciences Cologne), Tim Lübeck (University of Applied Sciences Cologne) and Christoph Pörschmann (University of Applied Sciences Cologne)
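As a minimal illustration of what "sound field manipulation in the spatial domain" can mean, the sketch below rotates a first-order Ambisonics (B-format) sound field by a yaw angle, as might be driven in real time by a head- or body-tracking sensor. This is a generic, hypothetical example, not the authors' implementation: the encoding convention (W gain of 1, horizontal plane only, no Z channel) and the rotation sign convention (counter-clockwise positive) are simplifying assumptions.

```python
import math

def encode_foa(azimuth):
    """Encode a horizontal plane wave at the given azimuth (radians)
    into first-order Ambisonics components (W, X, Y).
    Simplified convention assumed here: W gain 1, no height (Z) channel."""
    return (1.0, math.cos(azimuth), math.sin(azimuth))

def rotate_foa_yaw(w, x, y, theta):
    """Rotate the horizontal sound field by yaw angle theta (radians),
    e.g. theta taken from a motion sensor each audio block.
    W is rotation-invariant; X and Y mix via a 2-D rotation matrix.
    Counter-clockwise-positive rotation is an assumed convention."""
    x_r = math.cos(theta) * x + math.sin(theta) * y
    y_r = -math.sin(theta) * x + math.cos(theta) * y
    return (w, x_r, y_r)

# A source encoded at 90 degrees (to the left); a yaw rotation of
# 90 degrees brings it to the front (azimuth 0).
w, x, y = encode_foa(math.pi / 2)
w2, x2, y2 = rotate_foa_yaw(w, x, y, math.pi / 2)
```

Because the manipulation acts on the Ambisonics components themselves rather than on virtual source parameters, the same operation applies equally to measured (recorded) sound fields, which is the property the abstract highlights.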