Have you been working in the film industry and wondered what it would be like to work in game audio? Or perhaps vice versa? I’ve been working as a composer and sound designer for both games and films for the last 10 years. On a macro level, we are basically delivering the same thing: sounds and music. On a micro level, these two mediums can be vastly different in terms of workflow, approach, and environment. In this two-part blog, I share some thoughts on the similarities and differences, and, to make sure that I'm not just making things up, I got in touch with four renowned artists who shared some of their own thoughts and experiences. What you'll find here are perspectives, of course, but they should give you a better understanding of how working in one industry compares to the other. To clarify, this blog is mainly for newcomers starting out in their audio careers, or for those who have only worked in one of the industries and wonder what the other might offer. Let’s dive into it!
THE WORK AND THE DELIVERY
Let’s first talk about the work. What we actually do, and what we have to deliver. A recurring theme is that with games you will deliver flexible sounds/music that can adapt to the gameplay whereas in film, it’s pretty much one fixed delivery that will not change much once it’s delivered. In film, we try to finely shape every little part of the sound/music to emphasize the narrative. In games, we need to give the option for the narrative to be stretched if the player decides to explore (all relative to the game and its mechanics obviously).
But how do these differences affect how we work?
Let’s say you have to create the sound of a door opening and closing. In film, you would just create that one sound for the one time you see the door open and close. But in games, you won’t know whether the player will open and close the door 20 times, so you’ll need to provide variations; a door opening and closing in real life doesn't sound exactly the same each time, and creating variations helps keep the player's illusion intact. Since games run on software, we can often get by with a few sound effects and then add variance through slight pitch and filter changes.
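To make the idea concrete, here is a minimal, engine-agnostic Python sketch of the variation trick described above: pick one of a few pre-recorded clips and apply a small random pitch offset at playback time. The file names and the pitch range are hypothetical; real engines and middleware offer this kind of randomisation built in.

```python
import random

def pick_door_variation(variations, pitch_range=0.05):
    """Choose one pre-recorded clip and a slight random pitch offset,
    so a repeated door sound doesn't feel identical every time.

    `variations` is a list of clip identifiers (hypothetical names here).
    The returned pitch is a playback-rate multiplier: 1.0 = unchanged.
    """
    clip = random.choice(variations)
    pitch = random.uniform(1.0 - pitch_range, 1.0 + pitch_range)
    return clip, pitch

# Example: three recorded takes cover endless in-game door openings.
clip, pitch = pick_door_variation(
    ["door_open_01.wav", "door_open_02.wav", "door_open_03.wav"]
)
```

In practice you would hand `clip` and `pitch` to the engine's audio playback call; the point is simply that a handful of assets plus light randomisation can stand in for dozens of recordings.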
When delivering sound for a film, the standard is a Pro Tools session for the length of the film, which then goes to mixing. For games, there’s the big additional task of implementation, where the sounds are integrated into the game engine so that they get triggered correctly throughout the game. Game audio sound designers may or may not have the task of integrating the sounds themselves, and may create variations directly in game audio middleware.
Richard Gould on what the work entails:
“One of the big things in film one has to consider is the production track. The production track is made up of whatever sound was recorded on set, typically consisting of dialogue but also sometimes sounds like footsteps, doors closing, things like that. In post production, we often include these sounds from the production track in the final mix so any sounds that we add, whether they’re foley or a sound effect, have to match the quality and feel of the production track. The same goes for dialog in that live action films often have a mix of production dialog and ADR (Automated Dialog Replacement) which also have to match. Sometimes, due to technical or practical issues, the production sound that was recorded isn’t optimal and you can find yourself fighting with the production sound, trying desperately to make it sound better. This isn’t an issue in animated films (where there’s no production track) or with video games, as all of the dialog is recorded in the studio.”
Jamey Scott about his workflow and delivery:
“It’s a completely different dynamic. When I work on a film, it’s all about a linear production and putting the sounds in a format in Pro Tools so they can go down the assembly line into the mix. Now when I work on a game, I’m not working in a linear structure in Pro Tools. I’ll add a marker, for instance labeling a type of gun, and then I'll go in and make all of the variations for firing, recoil etc. And I'll always do multiples and I'll consolidate them into little files that then go into the game engine or audio middleware. So the delivery is very different because you deliver lots of individual files for games as opposed to one big single file in the different formats (i.e. Dolby Atmos, 5.1, stereo) for films.”
In film, the music is mainly written for a scene (scored to picture), and will have a specific length. You often fine-tune every detail to specifically fit each moment. A main consideration in film is to shape the music around potential dialog and sound design. The delivery will be one long file for a cue, potentially split into layers/stems (Drums, Brass, Synth etc.) to give some flexibility in the final mix/dub. The number of layers varies from project to project, and for smaller films with lower budgets a single stereo mix delivery will sometimes do.
In a game, most music has to be shaped around player interaction. Since it is unknown how long a player will stay in the same spot, the music needs to be interactive and flexible enough to adapt to the player's pace. There are different ways to create interactive music, to name a few: looping stems; creating several layers/versions at different intensities that can be cross-faded; and creating shorter musical sections that can be shuffled between stems to make the music less repetitive. Cut scenes in games are basically small films within the game, and this is where the two types of work are most similar: the composer simply delivers one long file that fits the cut scene, free of player interaction.
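The cross-faded intensity layers mentioned above can be sketched in a few lines. This is a minimal, engine-agnostic Python illustration, assuming at least two vertically stacked layers and using an equal-power curve (one common choice, not the only one): a single gameplay intensity value between 0 and 1 is mapped to per-layer gains, with at most two adjacent layers audible at once.

```python
import math

def layer_gains(intensity, num_layers):
    """Equal-power crossfade gains for vertically layered music.

    `intensity` runs from 0.0 (calm) to 1.0 (full action) and is mapped
    onto the layer stack (assumed to have at least two layers). At any
    moment at most two adjacent layers are audible, crossfaded so the
    combined energy stays constant (cos^2 + sin^2 = 1).
    """
    position = intensity * (num_layers - 1)      # where we sit on the stack
    lower = min(int(position), num_layers - 2)   # index of the lower layer
    frac = position - lower                      # 0..1 between the two layers
    gains = [0.0] * num_layers
    gains[lower] = math.cos(frac * math.pi / 2)      # lower layer fades out
    gains[lower + 1] = math.sin(frac * math.pi / 2)  # upper layer fades in
    return gains

# Example: with three layers, low intensity plays only the calm layer.
calm = layer_gains(0.0, 3)   # full gain on layer 0, silence elsewhere
```

A game would recompute these gains every frame (or smooth toward them) as combat intensity, player health, or proximity to danger changes; middleware exposes the same idea through parameter-driven mixing.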
Garry Schyman on how he delivers things:
“In video games it seems a little counterintuitive, because games use such complex technologies, but I'm still mostly asked to deliver stereo wav files. Obviously you are separating out layers, intros and outros and those sorts of things that are part of the game's interactive dynamics. Sometimes I’m asked to deliver stems, especially with Sony. I’ve only been asked twice to deliver 5.1 mixes, which is much more the norm in film: even 7.1 mixes, and everything in 8-12 stems so they can re-mix it on the dub stage.”
Brian Tyler about how he approaches the work:
“In the beginning stages, technically, that’s where they meet a little bit more. When composing for film, I don’t start writing for each scene at the beginning. I will first watch the film to get a feel for it, and in a game I will play a couple of levels to get an overview of the story and cut scenes etc. Then I’ll step away and start writing music and themes, trying to get some of the emotions that I got from watching the film or playing the game. But from there on, they start to diverge a bit.
In a game, there’s often these distinct areas and worlds that need specific music to represent them. It’s almost like 5-7 small movies where each have their own complete arc, whereas in a movie there’s kind of only one big arc. And in a game, because the music changes depending on what the player does, it must be written in a modular way so it’s cohesive. When it comes to movies, I’m in complete control of how the music works - it will sound the same way every single time so I can easily control how it will flow.”
When sound and music are delivered for a film, they are usually all brought together in one Pro Tools session and the filmmakers will create a final mix. On bigger productions, this is done on a dubbing stage, a room the size of a cinema viewing room, to give the right impression of how the film will play in a movie theater. On smaller productions the final dub might just be done in a studio. The final mix is done either by the sound designers or by a dedicated re-recording mixer, also called a dubbing mixer.
For games, when music and sound are delivered, they need to be implemented in the game. This is done either directly in the game engine (i.e. Unity, Unreal etc.) or using game audio middleware such as Wwise, in which case the sounds in the middleware are then triggered from the game engine. This implementation can be done by the sound designer or composer, or by the coders on the audio production team. The advantage of using game audio middleware is that it usually contains tools for panning, volume, and creating sound variations. Using these tools simplifies the mixing process without requiring extra coding resources from the game development team. Mixing a game is just as important as mixing a film, but it’s often done throughout the game's production and during play-testing, not only as a final step as is the case for most films. Technical implementation, including creating sound variations outside of the DAW and building audio systems with middleware, is increasingly a core part of the sound designer's or composer's job in game audio, rather than merely a good skill to have.
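One of the variation tools middleware typically provides is a "random container": an event triggers one of several clips while avoiding an immediate repeat of the last one played. Here is a minimal Python sketch of that behaviour, assuming hypothetical asset names; tools like Wwise expose this (plus pitch and volume randomisation) without any custom code.

```python
import random

class RandomContainer:
    """Minimal sketch of a middleware-style random container: each
    trigger plays one of several variations, never repeating the clip
    that was played immediately before (assumes two or more clips).
    """
    def __init__(self, clips):
        self.clips = clips
        self.last = None  # clip played on the previous trigger, if any

    def trigger(self):
        # Exclude the previously played clip from the candidates.
        candidates = [c for c in self.clips if c != self.last]
        choice = random.choice(candidates)
        self.last = choice
        return choice

# Example: a footstep event triggered every time the player takes a step.
footsteps = RandomContainer(["step_01.wav", "step_02.wav", "step_03.wav"])
```

The game code only ever says "play footstep"; which take actually plays, and how it is varied, stays in the audio team's hands rather than the programmers'.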
CREATIVE FREEDOM

This point is definitely subjective, but still an interesting one to discuss. I’ve heard many different opinions about this, but they often lean towards the idea that working in games gives sound designers and composers a bit more creative freedom.
For sound in games it definitely depends on the size of the game, but from my own experience and what others have shared with me, games tend to give a bit more freedom. Both Richard Gould and Jamey Scott provide some good insights:
Richard Gould on creative freedom:
"I think the scale of a project, how large or small and whether it's being developed at a studio or if it's an indie development, affects the creative freedom. As far as trying to compare the two mediums;
I think there's more design opportunities in video games because you're often creating worlds that don't exist in reality. This isn't necessarily as true for film where you’re often creating a reality based on the world in which we live in. Now obviously there are exceptions to that but generally I find this to be true. Another reason why I think games sometimes offer you a little more creativity is the fact that you often have more time to develop ideas. (see next section)
Whereas in film you're often having to move quite quickly. It's not to say there isn't time to experiment [in film] but I think that there’s more time for experimentation in video games.”
Jamey Scott on creative freedom:
“I feel a little less literal when I work on games. In films (unless it's animated), if you get too far away from the reality of how a sound plays, I feel like people check out, whereas in games it's all animated and I can do things that don't quite fit with the action. So maybe games are a little bit more creatively stimulating.

There's sort of a linear scale of games. There are the triple-A games where there might be an audio director who really knows their stuff and would not tolerate any sort of deviation from their vision. Then there are smaller games where they don't know what they want apart from it sounding great. In films you're dealing with 100 years of tradition, and filmmakers are generally as in tune with sound as they are with music. They know what they want and they know how to articulate it. So I feel with films I have to be a lot more careful with what I present.”
In music, too, many will argue that since you don’t have to fit specific timings and hit points in the picture, games give more creative freedom. Others will say that there’s a limitation in game music since it has to stay flexible (as described in the previous section), which sometimes forces you to limit musical devices such as modulation (key changes) or tempo shifts in order to make the music loop-able or crossfade-able and easier to adapt to the gameplay.
Garry Schyman on creative freedom:
"I definitely feel there’s more freedom in games because your compositional decisions are inspired by what's going on but aren't necessarily locked to any specific image or moment or action by a character. Cut-scenes in games are scored like a film. So those are identical. So you have more compositional freedom [in games] but also more responsibility to the composer, at least that's how I view it."
Brian Tyler on creative freedom:
“I feel you have a good amount of freedom in a game because you’re doing things that aren’t specific to every single line of dialog, so you can step back a bit more and listen to the music as a whole. In movies, you can micro analyze things to the point of absurdity, trying to figure out if you should have music playing on that piece of dialog or not. Most moviegoers aren’t analyzing the music when watching a film. The score is manipulating emotions in the background. On the whole, film can be more micro analytical, but I’ve also worked on a game where they micromanaged the music to the point where there was no longer any real benefit."
Ultimately, I believe it comes down to what type of composer you are and what type of music you write. It’s important to understand the different “limitations” each medium demands and to keep that in mind as a creative starting point.
TIME

Artists today are required to deliver more content in shorter time periods, and for less money. This seems to be the tendency across all media these days. TV series demand much shorter turnarounds than film. But how do the games and film worlds compare?
Jamey Scott on time:
“I feel right now in games there's maybe more of a time luxury than there is in films. Maybe because of the fact that I've gone from working on big films to working on TV shows, just because that's where the work is right now. Often with films (and TV) you’re asked to deliver hard effects for the entire 110 minutes in 2 weeks, so you’re furiously scrambling to create that much content. With games I'll think about the time it's going to take to record and create all the assets, but I'm also tacking on double that amount of time for implementation and technical back and forth.”
Richard Gould on time:
“Film is such a well-established medium. Whilst there are certainly still technological developments and new delivery formats, Dolby Atmos being a relatively recent example, generally the workflow is pretty set in place, both at a macro scale in terms of the processes of editorial, mixing and mastering, but also at an individual level in terms of people’s workflows and toolsets. So the machine (in film), if you want to call it that, is well-oiled and capable of moving quickly, which is why I typically spend much less time on a film project than a game project. I spend at most three to four months on a film, whereas video games can span over a year in terms of the production schedule.”
Garry Schyman on time:
“In my experience with games you have much more time. That's probably because games tend to have a longer development process than films. They [game developers] tend to bring you in earlier in the process, either at the start or the middle. That also occasionally happens in films, where you hear about some composer being hired a year before the movie is finalized. But I would say more often than not you’re hired on a film where there's an edited picture to score and limited time to finish.”
Brian Tyler on time:
“On average, I’m on a game longer. There’s a game that I started 2 years ago that’s not coming out until next year. Granted, I tend to get tied in to pretty epic games, but there’s also times I’m on a movie and will start to speak with the director maybe a year or so before the premiere.”
Stay tuned for Part 2 of this blog where we will cover some differences when it comes to building a career within each industry, a little on the social aspects, and how deals are made including how we get paid as sound designers and composers.