Gaming & Metaverse
Immersive Music in the Virtual World
Immersive music plays a crucial role in virtual environments, including gaming and the metaverse. Both domains can incorporate six degrees of freedom (6DoF), allowing users to experience spatial audio dynamically as they move within these virtual spaces.

Gaming
In gaming, immersive music and sound effects are typically not generated directly by the game engine but are instead processed through specialized audio middleware. Unreal Engine is among the most widely used game engines. Audio middleware functions as an intermediary layer between the game engine and the audio hardware, providing the audio functionalities required for each project. These include pitch and volume randomization, sound fading, selection of randomized sounds from a predefined set (for specific actions), filtering, positioning (panning), and more.
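The randomization features described above can be sketched conceptually. The following Python class is a hypothetical illustration, not the API of FMOD, Wwise, or any real middleware: it models an event that picks a random variation from a predefined set (avoiding immediate repeats) and applies pitch and volume randomization around neutral values.

```python
import random

class SoundEvent:
    """Conceptual sketch of a middleware-style sound event (hypothetical,
    not a real FMOD/Wwise API): random variation selection plus
    pitch and volume randomization."""

    def __init__(self, variations, pitch_spread=2.0, volume_db_spread=3.0):
        self.variations = list(variations)        # candidate audio clips
        self.pitch_spread = pitch_spread          # +/- semitones
        self.volume_db_spread = volume_db_spread  # +/- decibels
        self._last = None                         # avoid immediate repeats

    def trigger(self):
        # Pick a clip different from the previous one when possible.
        choices = [v for v in self.variations if v != self._last] or self.variations
        clip = random.choice(choices)
        self._last = clip
        # Randomize pitch (semitones) and gain (dB) within the configured spread.
        pitch = random.uniform(-self.pitch_spread, self.pitch_spread)
        gain_db = random.uniform(-self.volume_db_spread, self.volume_db_spread)
        return {"clip": clip, "pitch_semitones": pitch, "gain_db": gain_db}

# Example: a footstep event that never plays the same clip twice in a row.
footsteps = SoundEvent(["step_01.wav", "step_02.wav", "step_03.wav"])
playback = footsteps.trigger()
```

In real middleware these parameters are usually authored by a sound designer in a GUI tool rather than in code; the sketch only illustrates the underlying behavior.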
Two of the most prominent audio middleware packages are FMOD and Wwise. FMOD offers seamless integration with Dolby Atmos, while Wwise additionally supports Auro-3D. This integration enables immersive music to be recorded, produced, and mixed specifically for games before being incorporated directly into the audio middleware. The middleware also facilitates the conversion of audio into Ambisonics and, potentially, binaural formats, ensuring compatibility with headphone-based listening experiences.
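The Ambisonics conversion mentioned above can be illustrated with a minimal first-order encoder. The sketch below uses the classic B-format channels (W, X, Y, Z) with FuMa-style gains, which is one common convention; production pipelines may use other orderings and normalizations (e.g., AmbiX).

```python
import math

def encode_foa(sample, azimuth_deg, elevation_deg):
    """Encode a mono sample into first-order Ambisonics B-format
    (FuMa convention: W, X, Y, Z) for a source at the given direction."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample * (1.0 / math.sqrt(2.0))       # omnidirectional component
    x = sample * math.cos(az) * math.cos(el)  # front-back axis
    y = sample * math.sin(az) * math.cos(el)  # left-right axis
    z = sample * math.sin(el)                 # up-down axis
    return (w, x, y, z)

# A source directly in front at ear height contributes only to W and X.
w, x, y, z = encode_foa(1.0, azimuth_deg=0.0, elevation_deg=0.0)
```

Middleware applies this kind of encoding per sample (or per block) as the listener and sources move, and a separate decoding stage renders the Ambisonic field to speakers or, via HRTF filtering, to binaural headphone output.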
Metaverse
The metaverse encompasses an interconnected network of virtual 3D spaces where users, often represented by avatars, can explore and interact. In addition to virtual reality (VR) environments, it integrates augmented reality (AR), online digital platforms, robotic process automation (RPA), micro-local concepts, news, advertising modules, and other internet-based applications. The term "metaverse" combines the prefix "meta" (meaning "beyond" or "about") with the word "universe," signifying the next generation of the internet, in which all existing and future shared 3D virtual spaces are interconnected within a comprehensive virtual ecosystem.
The term was first introduced by Neal Stephenson in his 1992 science fiction novel Snow Crash, in which humans interact as avatars within a three-dimensional digital space that serves as a metaphor for the real world. Stephenson envisioned the metaverse as a VR-based successor to the internet.
Given the advancements in technology, various interfaces and communication protocols are currently being developed to facilitate interoperability between different virtual environments. Multiple organizations and research groups are collaborating to establish standards and protocols for seamless integration across platforms. Some key initiatives include:
- Facebook – Horizon (since 2019)
- Delta Media GBE – RPA Concept (since 2015)
- Virtual Worlds – Standard for Systems Virtual Components Working Group (since 2010)
- ISO/IEC 23005-4:2011 – Information technology — Media context and control — Part 4: Virtual world object characteristics (since 2008)
- Immersive Education Technology Group (IETG) – Media Grid (since 2008)
- Virtual World Region Agent Protocol (VWRAP) (2009–2011)
- The Metaverse Roadmap – Acceleration Studies Foundation (2006–2007)
- The Open Source Metaverse Project (2004–2008)
Many of these working groups are still in the process of drafting specifications and formulating open standards aimed at ensuring interoperability among different virtual ecosystems.