How to Use Interactive Music to Score Gameplay

During our recent Interactive Music Symposium, Joe Thwaites dove into how his team used interactive music systems in Sackboy: A Big Adventure to help elevate the game’s score. Below, you’ll find a summary of key points from his talk.

Expanding the Core DNA

"The goal of the music is always to support the story, and interactive techniques are really great tools for helping us achieve this goal," says Joe. When creating Sackboy: A Big Adventure, the audio team decided to build on the musical legacy of the LittleBigPlanet titles, specifically the combination of licensed tracks and original pieces. Only this time, they wanted both licensed and original pieces to feel "equally as embedded and interactive." To them, this meant thoughtfully making use of interactive techniques that supported the narrative of the game, such as horizontal resequencing, vertical layering, stingers & embellishments, and runtime processing effects.

Let’s dive into how Joe’s team used these basic systems in Sackboy: A Big Adventure.

Horizontal Resequencing

Horizontal resequencing is an interactive music technique where music is dynamically pieced together based on a player’s actions. For Joe's team, this meant splitting each track into loopable chunks, and it allowed the music to move to a new section or repeat a previous section depending on what the player did. They used this technique whenever they wanted to:

• Help create momentum

• Change the mood

• Add a cadence to the music

• Change seamlessly to a new piece of music

Here are a few examples:
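To make the idea concrete, here is a minimal sketch of the section-switching logic behind horizontal resequencing. This is an illustration only, not Sackboy's actual implementation: the class, section names, and loop-boundary callback are all invented for the example. The key idea is that jumps are queued and only taken at a loop boundary, so the transition stays musically seamless.

```python
# Sketch of horizontal resequencing: a track is split into loopable
# chunks, and gameplay requests move the music to a new section (or
# back to a previous one) at the next loop boundary.

class HorizontalSequencer:
    def __init__(self, sections):
        self.sections = sections          # ordered list of loopable chunks
        self.index = 0                    # currently playing section
        self.pending = None               # section queued for the next boundary

    def request_section(self, name):
        """Queue a jump (forward or back) triggered by a gameplay event.
        The jump is deferred until the next loop boundary so it lands
        in time with the music."""
        self.pending = self.sections.index(name)

    def on_loop_boundary(self):
        """Called each time the current chunk finishes a loop; returns
        the section to play next."""
        if self.pending is not None:
            self.index = self.pending     # take the queued transition
            self.pending = None
        return self.sections[self.index]  # otherwise keep looping
```

In practice the same pattern covers all four goals above: looping the current chunk sustains momentum, jumping to a contrasting chunk changes the mood, jumping to an ending chunk adds a cadence, and queueing a chunk from a different track changes pieces seamlessly.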

Vertical Layering

Vertical layering is an interactive music technique in which layers of music can be added or removed to:

• Add variation

• Change the mood

• Build or release tension

• Support narrative development

Joe’s team split each track on a case-by-case basis, but they found themselves generally splitting the music into the following layers: drums, bass, lead, accompaniment, and vocals. Depending on gameplay events, they would add or remove layers to help elevate moments that supported the narrative. Let's have a look at some examples of this:
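As a rough sketch of how vertical layering can be driven from gameplay, the example below keeps all stems playing in sync and uses a single intensity value to decide how many are audible. The layer names follow the split described above; the intensity mapping and class are invented for illustration.

```python
# Sketch of vertical layering: every stem runs in sync with the others,
# and gameplay intensity fades layers in or out rather than starting
# or stopping them, so additions and removals stay seamless.

LAYERS = ["drums", "bass", "lead", "accompaniment", "vocals"]

class VerticalMixer:
    def __init__(self):
        # volume per layer; 0.0 means the stem plays silently in sync
        self.volumes = {layer: 0.0 for layer in LAYERS}

    def set_intensity(self, level):
        """Map a 0..1 gameplay intensity to how many layers are audible."""
        audible = round(level * len(LAYERS))
        for i, layer in enumerate(LAYERS):
            self.volumes[layer] = 1.0 if i < audible else 0.0

    def active_layers(self):
        return [l for l, v in self.volumes.items() if v > 0.0]
```

A calm exploration beat might sit at a low intensity (drums and bass only), while approaching a boss fight pushes intensity toward 1.0 to bring in the full arrangement and build tension.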

Stingers & Embellishments

Stingers & embellishments are short musical elements triggered by gameplay events. "These are great for highlighting specific moments of gameplay instantly, and giving feedback to the player when they do certain actions or collect certain things," says Joe. His team used these in Sackboy to:

• Reward the player

• Support gameplay mechanics

• Smooth transitions

Here are a few examples of this:
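One common way to keep stingers feeling musical rather than arbitrary is to quantize them to the beat. The sketch below illustrates that idea; the event names, stinger names, and tempo are all made up for the example and are not from Sackboy's actual system.

```python
# Sketch of a stinger scheduler: short one-shot musical phrases are
# triggered by gameplay events, but their playback is quantized to the
# next beat so they land in time with the underlying score.

class StingerScheduler:
    def __init__(self, beat_length=0.5):
        self.beat_length = beat_length    # seconds per beat (0.5 = 120 BPM)
        # hypothetical mapping from gameplay events to stinger assets
        self.stingers = {
            "collect_orb": "reward_chime",
            "checkpoint": "fanfare_short",
        }

    def schedule(self, event, now):
        """Return (stinger_name, play_time), with play_time snapped to
        the next beat after the current time `now` (in seconds)."""
        beats_elapsed = now / self.beat_length
        next_beat = (int(beats_elapsed) + 1) * self.beat_length
        return self.stingers[event], next_beat
```

The same scheduler covers all three uses above: reward stingers fire on collectibles, mechanic-linked stingers reinforce player actions, and a well-timed embellishment can mask the seam when the underlying music changes section.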

Runtime Processing

As well as making actual edits to the music, Joe’s team looked at applying filters and effects at runtime that manipulated the music depending on gameplay. They did this to:

• Embed the music into the game world

• React to gameplay states

In the following video, you'll hear some of these runtime processing effects:
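A minimal sketch of the idea, for readers without the video to hand: a gameplay state drives a filter parameter applied to the music at runtime. The states, cutoff values, and function here are invented for illustration; they are not taken from Sackboy or any particular audio engine.

```python
# Sketch of runtime processing: instead of editing the music itself,
# a gameplay state controls a low-pass filter cutoff on the music bus,
# e.g. muffling the score underwater or softening it behind a menu.

CUTOFFS = {
    "normal": 20000.0,     # filter fully open
    "underwater": 800.0,   # heavily muffled, embedding music in the world
    "paused": 2000.0,      # softened behind the pause menu
}

def music_filter_cutoff(state, fade=1.0, current=20000.0):
    """Move the cutoff toward the target for the given gameplay state.
    fade=1.0 snaps immediately; smaller values, called once per frame,
    smooth the change over time."""
    target = CUTOFFS.get(state, 20000.0)
    return current + (target - current) * fade
```

Calling this every frame with a small fade value gives a gradual sweep as the player dips underwater and back out, which is what makes the music feel embedded in the world rather than simply switched.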

Conclusion: Scoring Gameplay to Support the Narrative

Before planning to explore the use of interactive music techniques, it's vital to consider how those techniques are going to support the narrative of the game. For Sackboy, Joe's team wanted the music to support momentum through the level, so they decided to use horizontal resequencing to dynamically change the music as the player progressed. They wanted the music to build up and into narrative moments and boss fights, so they decided to use vertical layering to help build tension and drama. They wanted the music to reward the player when they made progress towards the goal, so they decided to use stingers and musical embellishments to give feedback. And because they wanted the music to feel embedded in the game world, they decided to use runtime processing effects to react to gameplay states. "In a game like this," says Joe, "there's so much that has potential to feed the music system, it is important for us to be guided by the journey of the player within the context of the game."

You can watch Joe Thwaites' full talk here:

Joe Thwaites

Principal Composer & Music Producer

Sony Interactive Entertainment

Joe is Principal Composer & Music Producer at Sony Interactive Entertainment Europe.

 @joethwaites


More articles

The School of Video Game Audio's 10 Tips for #GameAudioGDC 2019

A lot of people have downloaded this list from when I first posted it for GDC 2015, but I've given...

12.3.2019 - By Leonard Paul

Music for Games Should be More than Just Music: Part 1

What is video game music? What is interactive music? The answers to these questions are not as...

6.11.2020 - By Olivier Derivière

Why Wwise for 3D Interactive Music Experience

‘Dirty Laundry by Blake Ruby’ Mobile VR App: Audio Explanation.

12.2.2021 - By Julian Messina

An Introduction to Controllers for Composers

During her presentation at our Interactive Music Symposium, Ressa Schwarzwald talked about how using...

21.1.2022 - By Ressa Schwarzwald

How to Create Audio-Reactive Objects Using Wwise and Unity

I would like to show you how to use RTPCs to move game objects in Unity, and how to create...

9.2.2023 - By Tomokazu Hiroki

KID A MNESIA Exhibition: An Interview With the Audio Team

Kid A Mnesia Exhibition is a digital exhibition of music and artwork created for Radiohead albums...

18.5.2023 - By KID A MNESIA Exhibition Audio Team
