Putting the Virtual back into Reality

Location-Based Entertainment (LBE) / VR Experiences

Imagine taking a Virtual Reality experience out of its virtuality and placing it in the real world, while still keeping all of its virtual-world elements. It's a kind of mixed reality set in a room where interactions happen as they would in a regular VR experience, but people step into it without any goggles or headphones.

This was the challenge faced by the Riverside Studios/Lucha Libre Audio team in Berlin for the XR Room experience developed by NEEEU for Factory Berlin. NEEEU is one of the most renowned design studios in town, and Factory Berlin is Europe's largest international innovation community and co-working space, gathering creative people and helping them thrive with their ideas and concepts, similar to Andy Warhol's New York Silver Factory.

[Photo: Factory Berlin, by Cherie Birkner]

NEEEU developed a 3D experience but, instead of delivering it on a regular VR headset, installed eight projectors in a large room so that people could interact with the experience in real life. Our first challenge was how to correctly add sound and music to the VR elements in this kind of presentation. The solution was to build a 64-speaker array behind fake walls and implement all of the audio in Ambisonics. The fake walls were built and the speakers installed by WeSound, a Hamburg-based company that delivers massive soundscapes for projects all around Europe. WeSound used the ICST Ambisonics tools from the Institute for Computer Music and Sound Technology at the Zürcher Hochschule der Künste in Zurich.

[Photo: Factory Berlin, by Cherie Birkner]

When all the hardware setup was completed, it was time to start adding music and sound effects to the project. The project was developed in Unity, and given its complexity, I decided to use Wwise for the audio. Wwise helped me organize all of the music and sound effects for each of the "levels" of the experience. I had to use all 64 speakers in the best way possible, so I decided to split the music for each level into stems, place them spatially in Unity and Wwise according to the positions of the visuals, and add spatialized sound effects and voice-overs, which were performed by the amazing actress and voice talent Emily Behr.
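To give an idea of what "placing stems according to the positions of the visuals" can look like on the Unity side, here is a minimal sketch using the Wwise Unity integration. The component and event names are hypothetical, not the project's actual setup: a stem emitter is simply parented to the visual it belongs to and posts its Play event there.

```csharp
using UnityEngine;

// Hypothetical sketch: one emitter per music stem, parented to the visual it follows.
// "Play_Level1_Stem_Pads" is a placeholder event name, not from the actual project.
public class StemEmitter : MonoBehaviour
{
    [SerializeField] private string playEventName = "Play_Level1_Stem_Pads";

    private void Start()
    {
        // Posting the Wwise event on this GameObject spatializes the stem
        // at the position of the visual element it is attached to.
        AkSoundEngine.PostEvent(playEventName, gameObject);
    }
}
```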

Since the XR Room was a reactive room, meaning people could interact with the walls using HTC Vive controllers, all sound effects needed to be precisely placed. I separated everything and created multiple attenuations, and Wwise helped a lot in delivering the right sound effects for each scene, since the scenes all differ in size, duration, and intent.
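As a rough illustration of how an interaction sound could be anchored to a wall touch, here is a hedged sketch; the method name, event name, and the way the interaction code reports the hit point are all assumptions, and the per-scene attenuation curves themselves live in the Wwise project.

```csharp
using UnityEngine;

// Hypothetical sketch: move a small emitter to the point where a Vive controller
// interaction hits the wall, then post the one-shot there. "Play_Wall_Touch" is
// a placeholder event name; distance and spread come from the Wwise attenuations.
public class WallTouchSound : MonoBehaviour
{
    [SerializeField] private string touchEventName = "Play_Wall_Touch";

    // Called by the interaction code when a controller touches the wall.
    public void OnWallTouched(Vector3 hitPoint)
    {
        transform.position = hitPoint;
        AkSoundEngine.PostEvent(touchEventName, gameObject);
    }
}
```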

We were on a tight time frame, and I couldn't ask too much of the developers, who were busy figuring out how to make everything work flawlessly on their end. All music and sound effects were different for each of the scenes, with no repetition, so using Wwise I routed each scene into its own Audio Bus to avoid any sounds bleeding from scene to scene. I kept the voice in its own separate Audio Bus, however, since it was the only constant through all levels, and some auto-ducking was implemented for it.
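The bus routing and the auto-ducking of music and effects under the voice are authored in the Wwise project itself, but a scene transition still has to be triggered from Unity. A minimal, hypothetical sketch of that hand-off might look like this; the event and state names are placeholders.

```csharp
using UnityEngine;

// Hypothetical sketch of a scene transition: stop the outgoing scene's content
// (a Stop event authored against that scene's Audio Bus) and start the next one.
// All event and state names are placeholders.
public class SceneAudioSwitcher : MonoBehaviour
{
    public void EnterScene(int sceneIndex)
    {
        AkSoundEngine.PostEvent("Stop_Scene_All", gameObject);          // silence the previous scene's busses
        AkSoundEngine.SetState("CurrentScene", "Scene" + sceneIndex);   // optional state for music/mix switching
        AkSoundEngine.PostEvent("Play_Scene" + sceneIndex, gameObject); // start the new scene's music and ambiences
    }
}
```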

Screen Shot 01

Screen Shot 01B

For the music, I needed to create the soundscapes for each of the scenes and also use objects in Unity. I mixed some instruments and some stems in AmbiX 1st Order, which also helped a lot when prepping the room and mixing the sounds. While the whole foundation was in AmbiX, some specific instruments and sounds that needed precise positioning were delivered in mono and attached to the objects of each scene.
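One way to picture that split on the Unity side is a per-scene component that starts the first-order AmbiX bed on a fixed emitter and posts the mono layers on the GameObjects of their visuals. This is a sketch under those assumptions; the emitter placement and event names are not taken from the actual project.

```csharp
using UnityEngine;

// Hypothetical sketch: a first-order AmbiX "bed" on a static, room-centred emitter,
// plus mono instruments posted on the visuals that need precise positioning.
// Event names are placeholders; the AmbiX vs. mono split itself is authored in Wwise.
public class SceneMusic : MonoBehaviour
{
    [SerializeField] private GameObject ambisonicBedEmitter;      // static reference point
    [SerializeField] private GameObject[] monoInstrumentVisuals;  // visuals carrying mono layers

    public void StartScene(int sceneIndex)
    {
        AkSoundEngine.PostEvent("Play_Scene" + sceneIndex + "_Bed_AmbiX", ambisonicBedEmitter);

        foreach (var visual in monoInstrumentVisuals)
            AkSoundEngine.PostEvent("Play_Scene" + sceneIndex + "_Mono_Layer", visual);
    }
}
```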

Screen Shot 02

Another important task was to add the audio precisely into the animations, which I did inside Unity's Timeline. I separated each instrument into stems and went into Unity to place them exactly when and where things happened during the presentations.
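There are several ways to fire Wwise events from Unity's Timeline; one simple, hypothetical approach is a small receiver component that a Timeline signal or animation event calls on the exact frame. The class and event names below are illustrative only.

```csharp
using UnityEngine;

// Hypothetical sketch: a receiver that Timeline can invoke (e.g. via a Signal
// Receiver or an Animation Event) so a stem or hit lands exactly on the frame
// where something happens in the projection. Event names are placeholders.
public class TimelineAudioCue : MonoBehaviour
{
    // Wire this method to a Timeline signal / animation event, passing the Wwise event name as its string argument.
    public void Fire(string wwiseEventName)
    {
        AkSoundEngine.PostEvent(wwiseEventName, gameObject);
    }
}
```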

Screen Shot 03

Screen Shot 04

In the end, the SoundBanks were separated into Music, SFX, and Voice, with each of these further divided by level. This was the easiest way to track everything and to keep the Unity developers from getting lost in the naming while I implemented everything. Although the developers didn't do anything sound-related (apart from when we discussed some specific voice-overs triggered by scores at Level 4), they were able to read and understand which sounds were where.
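With a naming scheme like that, per-level bank management from Unity can stay very small. The following is a hedged sketch assuming bank names such as "Music_Level1" or "Voice_Level4", which are not the project's actual names.

```csharp
using UnityEngine;

// Hypothetical sketch of per-level bank management, assuming banks named by
// type and level (e.g. "Music_Level1", "SFX_Level1", "Voice_Level1").
public class LevelBankLoader : MonoBehaviour
{
    private static readonly string[] Types = { "Music", "SFX", "Voice" };

    public void LoadLevel(int level)
    {
        foreach (var type in Types)
            AkBankManager.LoadBank(type + "_Level" + level, false, false);
    }

    public void UnloadLevel(int level)
    {
        foreach (var type in Types)
            AkBankManager.UnloadBank(type + "_Level" + level);
    }
}
```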

Screen Shot 05

The whole experience was implemented only a few days before its opening, and while the room was ready, I was working at my studio the entire time and had only one day to mix things on-site before it started. The Resonance Audio Renderer was the plug-in chosen to help me with that: it gave me an overall idea of how things would sound in the room over my headphones during the creative process, and I only needed to make minor changes during implementation.

Put on your headphones and check out these three scenes from the experience. You can only hear 1st Order Ambisonics through these YouTube links, but if you have the chance to come by Factory Berlin to listen to and experience this project for yourself, you'll be my guest. :)

 

Scene 1 

Scene 2 

Scene 5 

 

  • Client - Factory Berlin - Paul Bankewitz
  • Architect - Anna Caspar
  • Construction - Ralf Norkeit
  • Acoustics - Rummels Acoustics
  • 3D visuals and concepts - NEEEU
  • Speaker Array Concept - WeSound
  • Music Composers - Oliver Laib/Billy Mello/Paulinho Corcione
  • Sound Effects and Sound Design - Billy Mello (Lucha Libre Audio)
  • Voice Over - Emily Behr
  • Project Manager - Ima Johnen (Riverside Studios)

 

Billy Mello

Sound Designer and Music Producer

Immersive Audio Specialist for Lucha Libre Audio at Riverside Studios, Berlin. A Brazilian sound designer, music producer, and DJ for more than 25 years, he recently specialized in Audio for Video Games at Berklee College of Music and nowadays focuses on VR, AR, video games, and 360 videos.

 @maestrobilly
