Wwise MIDI Basics: New Super Lucky's Tale Foxberry Timer Musical MIDI Magic!

Audio Programming / Sound Content for Video Games

Hello there all you Wonderful Wwise users :)

The essential skill needed to survive game audio is the ability to problem-solve. Knowing the ins and outs of your tools can mean the difference between flying blind and aiming right at your target. Last year, I was composing for the charming and playful Switch title “New Super Lucky’s Tale” when I came across a problem requiring that essential skill. The Wwise MIDI system was my solution.

MIDI has been solving problems for game audio composers since the days of iMUSE and Creative Labs sound cards. You may be thinking, “But Aaron, it’s 2020. Isn’t MIDI as outdated as the keytar?” First, the keytar is the finest instrument gifted to us by the synth gods above. Second, absolutely not, and this tutorial will show exactly how useful it still is today, if you’ve ever wondered how to utilize the power of Wwise MIDI.

The Foxberry Timer Challenge

There were a few unique challenges in designing this melody:

  • Each Foxberry Timer has a very different length.
  • That length had no set time; it was determined at run time based on how difficult the challenge’s algorithm calculated it to be.
  • The music can occur at any time, over any melody, in a wide array of musical keys and time signatures.
  • Each challenge should feel as though it is counting down in a unified way, adding tension while the player collects coins.

The traditional Wwise music system couldn’t handle all those variables. Luckily, we had the power of Wwise and Wwise MIDI to provide the solution to each of those challenges!

This video shows my workflow: creating the melody, mocking up the BPM curves, exporting the samples and the MIDI from Ableton, chopping up the samples in Reaper, importing it all into Wwise, and hooking it up with an RTPC so it works at any time for all the different Foxberry Timer lengths.


Part 1: The Ableton Mock-up

First, I wanted to prove the system out. For this I used Ableton Live. I absolutely love working with Ableton for quick iteration and creative tasks. In this case, I wrote out the full melody and mocked up the BPM ramp to perform as the RTPC would in the game. I quickly tested different curves and ensured the general sound worked in any scenario.

Then I flattened out the BPM in order to prep the samples and the MIDI. I spread out the samples using only C and G from each octave to save space on disk. I’ve found that in many cases two samples per octave works quite well and maximizes your time. Of course, this will depend on your samples and the complexity of your instrumentation.
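The two-samples-per-octave idea can be sketched as a nearest-root lookup. Assuming roots at C and G in every octave (the function name and layout here are illustrative, not from the project), any incoming note is at most three semitones from a sample, which keeps pitch-shifting artifacts small:

```python
def nearest_sample(midi_note: int):
    """Return (sample_root, semitone_offset) for a note, given samples
    recorded only at C (pitch class 0) and G (pitch class 7) of each octave.
    The sampler plays `sample_root` shifted by `semitone_offset`."""
    octave, _ = divmod(midi_note, 12)
    # Candidate roots: C and G in this octave, plus the nearest neighbors
    # in the octaves below and above.
    roots = [octave * 12, octave * 12 + 7,
             (octave - 1) * 12 + 7, (octave + 1) * 12]
    root = min(roots, key=lambda r: abs(midi_note - r))
    return root, midi_note - root
```

For example, E4 (MIDI 64) resolves to the G4 sample (MIDI 67) pitched down three semitones, the worst case under this layout.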

 

Part 2: Reaper for Editing and Exporting

Once I had the mock-ups, I pulled the files into Reaper to prep the samples. Why didn’t I edit straight from Ableton? Well, Ableton editing is… tedious, to say the least, and the faster I can achieve my goals, the faster I can express the creative ideas in my brain. The quicker the workflow, the more creative I can be. Reaper is lightning fast at cutting up and exporting files while keeping specific naming structures. Once I chop everything up using dynamic split, I export the files using Reaper’s region manager and wildcards. This ensures all the files are uniformly edited, prepped, and named, which makes them much easier to map in Wwise.
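As a hypothetical naming scheme (not the exact one from this project), naming each region after its sample’s MIDI root lets Reaper’s region render with the `$region` wildcard spit out files that are already labeled for keymapping:

```
Region names:       foxberry_C4, foxberry_G4, foxberry_C5, ...
Render file name:   $region          (Reaper render wildcard)
Resulting files:    foxberry_C4.wav, foxberry_G4.wav, foxberry_C5.wav
```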

 

Part 3: Wwise for In-Game Usage

I import the Reaper files into Wwise to set up the MIDI music system. I built the system using Blend Containers to hold all the samples. I use the MIDI keymap editor to map each of my files to a different MIDI key range so they trigger correctly within Wwise. After that, I pull in the MIDI tracks and map them to the Blend Container to play the samples. Finally, I tie an RTPC to playback speed to control the speed of the MIDI track.

After all that, the MIDI music will trigger, the playback speed of the MIDI will be tied to the RTPC, which can ramp up as the challenge gets closer to being over, and the MIDI will trigger notes that stay in key because playback speed doesn’t affect pitch.
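The RTPC-to-playback-speed relationship can be sketched as a simple ramp. The function name and the 1.0x–1.5x range below are assumptions for illustration, not values from the game; the point is that speed scales with how little time remains, regardless of the timer’s total length:

```python
def playback_speed(time_remaining: float, total_time: float,
                   start: float = 1.0, end: float = 1.5) -> float:
    """Map a 'time remaining' RTPC to a MIDI playback-speed multiplier.
    Speed ramps linearly from `start` (timer full) to `end` (timer empty),
    so the countdown melody accelerates without changing pitch."""
    t = max(0.0, min(1.0, time_remaining / total_time))  # 1.0 down to 0.0
    return end - (end - start) * t
```

Because the ratio `time_remaining / total_time` normalizes the timer, the same curve serves every Foxberry Timer length the algorithm produces.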

 

Tip: Speeding up the Process Using WAAPI

After I posted this YouTube video, MIDI magician Bernard Rodrigue made a post showing how to use WAAPI to MIDI-keymap a Blend Container in five seconds. Source code and instructions are on GitHub. This is a GIGANTIC workflow enhancement compared to the manual process. Definitely go check it out to save yourself tons of time.
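To give a flavor of the idea (this is my own sketch, not Bernard’s actual script): compute a contiguous key range for each sample child of the Blend Container by splitting at the midpoints between neighboring roots, then push each range over WAAPI. The object paths are illustrative, and I’m assuming the `MidiKeyFilterMin`/`MidiKeyFilterMax` property names; check the Wwise object reference for your version:

```python
def keymap_payloads(children):
    """children: list of (object_path, root_midi_note) sorted by root.
    Returns ak.wwise.core.object.setProperty argument dicts that split
    the 0-127 key range at midpoints between neighboring sample roots."""
    payloads = []
    for i, (path, root) in enumerate(children):
        lo = 0 if i == 0 else (children[i - 1][1] + root) // 2 + 1
        hi = 127 if i == len(children) - 1 else (root + children[i + 1][1]) // 2
        for prop, val in (("MidiKeyFilterMin", lo), ("MidiKeyFilterMax", hi)):
            payloads.append({"object": path, "property": prop, "value": val})
    return payloads

# Sending them would look roughly like this (requires the waapi-client
# package and a running Wwise authoring instance):
#   with WaapiClient() as client:
#       for p in keymap_payloads(children):
#           client.call("ak.wwise.core.object.setProperty", p)
```

With roots at C4 and G4, for instance, the C4 sample covers keys 0–63 and the G4 sample covers 64–127: contiguous and non-overlapping.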

 

For more MIDI videos: 

Connecting a MIDI Controller 


Importing MIDI Files

 

Finale: Wrapping it all up

If you've ever considered using Wwise MIDI, I hope this serves as a basic guide to triggering the music of your game in Wwise. I’m sharing this to help the community use this powerful tool, because you are all wonderful audio people, and I sincerely hope this information injects some MIDI power into your musical workflows. If you create something awesome with it, please reach out on social media and share your own MIDI magic :)

Aaron Brown

AARON BROWN SOUND

Since 2007 Aaron has created audio experiences with industry juggernauts including LucasArts, Playful, Epic, Electronic Arts, Activision, Naughty Dog, Twisted Pixel, Starbreeze, Raven and notable mobile gaming developers. During his career he has been fortunate enough to have contributed his skills to some video games that have won audio awards including Uncharted 3 and The Force Unleashed 2. Aaron has also done composition, sound design and implementation in VR with titles such as Path Of The Warrior, Lucky’s Tale for Oculus, Robo Recall for Oculus and multiple projects for Starbreeze. He also focuses on doing what he can to build a better sound community. He currently lives in Austin where he runs the Austin Game Audio Meetups, hosts gatherings at Mosaic Sound Collective and is on the AES Central Texas Board.

 @aaronbrownsound

