At AES: The Realities of Mixing in Virtual, Mixed, and Augmented Reality

From problems to best practices, this Game Audio session explored VR, MR and AR audio challenges from all angles.

New York, NY (October 18, 2019)—The challenges of mixing for Virtual (VR), Mixed (MR), and Augmented Reality (AR) are well known—how do you provide the best experience for users when they are the ones in control of how the story progresses? Scott Selfon, audio experiences lead at Facebook Reality Labs (Oculus Research), tackled the topic head-on in a packed Wednesday panel titled “Real-time Mixing and Monitoring Best Practices for Virtual, Mixed, and Augmented Reality.”

To start, Selfon looked to the 100-plus years of linear media mixing for inspiration, drawing comparisons between that tradition and mixing for VR. What the two have in common, according to Selfon, is a focus on “the important versus the other thing.” In film, the director directs the audience’s attention to “the important”; in VR and related media, the user chooses what “the important” is. Other similarities include gathering the best assets and mixing based on the content.

Naturally, differences abound, and Selfon outlined them in five parts:

• Rendering Pipeline—All sounds in VR, AR, and MR are spatialized.


• Actual Mix Process—Other formats don’t have to worry about mixing while wearing a head-mounted display (HMD), nor do they have to mix while moving around.

• Listening Pipeline—Unlike film, where the theater is well calibrated, VR experiences are heard over earbuds or mobile devices and are extremely unpredictable.

• The Listener(s)—Shared or solo experience? Are they participating in location-based VR? Again, many possibilities.

• Mixing with the Real World—Most of the time, you will hear sound from the real world during the augmented reality experience.

Still, even with all the possibilities and options that VR, MR and AR provide, Selfon was able to offer some best practice guidelines, including:

• Critical listening/mixing in a high-quality environment

• Listen and validate on the devices consumers will actually use (earbuds, phone)

• Validate the mix in expected actual playback environments (if location-based, note what else is making noise in the environment)

• Use all of the best practices you’ve learned so far for storytelling


• Mix relative to well-defined playback levels—compare/balance existing “system” experiences, using ITU-R BS.1770-4 LUFS as a metering benchmark

• Plan for the real world’s potential impact, including loud or shared playback environments in VR and the competing/complementary sounds of the entire world in AR.
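For readers curious what metering against ITU-R BS.1770-4 involves, the sketch below shows the standard’s gated integrated-loudness calculation in simplified form: 400 ms blocks with 75% overlap, an absolute gate at −70 LUFS, and a relative gate 10 LU below the absolutely gated average. This is an illustrative mono-only sketch that assumes the input has already been K-weighted (a real BS.1770 meter first applies the standard’s shelving and high-pass pre-filters, and sums weighted channel powers for multichannel audio); it is not a compliance-grade meter.

```python
import numpy as np

def integrated_loudness(x, rate):
    """Simplified BS.1770-4 integrated loudness for a mono, K-weighted signal.

    Blocks are 400 ms with 75% overlap; gating follows the standard's
    two-stage scheme (absolute gate, then relative gate).
    """
    block = int(0.400 * rate)
    hop = block // 4  # 75% overlap
    if len(x) < block:
        raise ValueError("signal shorter than one 400 ms block")

    # Mean-square power of each overlapping 400 ms block
    z = np.array([np.mean(x[i:i + block] ** 2)
                  for i in range(0, len(x) - block + 1, hop)])
    lj = -0.691 + 10 * np.log10(z + 1e-12)  # per-block loudness, LUFS

    # Stage 1: absolute gate at -70 LUFS
    abs_mask = lj > -70.0
    # Stage 2: relative gate 10 LU below the absolutely gated average
    rel_thresh = -0.691 + 10 * np.log10(np.mean(z[abs_mask])) - 10.0
    gated = z[abs_mask & (lj > rel_thresh)]

    return -0.691 + 10 * np.log10(np.mean(gated))
```

A full-scale sine (mean-square power 0.5) measures about −3.7 LUFS with this sketch, which is why mixes targeting, say, −16 LUFS for headphone playback need substantial headroom below peak.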