This is the 13th installment in guest blogger Frank Klepacki’s series on music production. Today Frank talks about sound effects for video games. If you missed Frank’s previous post, you can read it here.
Establishing the game's “mixing board,” in my experience, starts with what we refer to as “Presets.” Presets are basically a defined set of parameters containing all the basic things you need to adjust for a sound effect: volume, pitch, distance settings, panning, filtering levels, priority, and anything else of importance. You could compare this to the idea of setting up a “Bus” for sub-mixing.
You create your sound effect events as data that reference the audio files you’ve created. Then you assign each sound effect event the “preset” you want it to use, like routing a track (sound event) to a bus (preset). This way, at a global level, you can have all major explosions using one preset, all the guns using another, all the spoken dialog using another, and so on. Then, when you are balancing how this sounds in the context of playing the game, you can simply change the preset parameters, which in turn affects every sound assigned to it at once.
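To make the preset-as-bus idea concrete, here is a minimal sketch in Python. The class and field names (`Preset`, `SoundEvent`, `effective_volume`) are hypothetical illustrations, not the API of any particular engine or middleware:

```python
class Preset:
    """A shared set of playback parameters, like a mixing bus."""
    def __init__(self, volume=1.0, pitch=1.0, priority=0):
        self.volume = volume
        self.pitch = pitch
        self.priority = priority

class SoundEvent:
    """A data entry that references an audio file and a preset."""
    def __init__(self, audio_file, preset):
        self.audio_file = audio_file
        self.preset = preset

    def effective_volume(self, base=1.0):
        # The preset scales every event routed through it.
        return base * self.preset.volume

# Route every explosion through one preset, every gun through another.
explosions = Preset(volume=0.9)
guns = Preset(volume=0.7)

grenade = SoundEvent("grenade_blast.wav", explosions)
barrel  = SoundEvent("barrel_blast.wav", explosions)
rifle   = SoundEvent("rifle_shot.wav", guns)

# Balancing pass: one tweak to the preset affects every sound on it.
explosions.volume = 0.6
assert grenade.effective_volume() == barrel.effective_volume() == 0.6
```

The payoff is the last two lines: adjusting one preset rebalances every event assigned to it, instead of touching each sound individually.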
These sound effect events now need to be attached to their corresponding things in the game, and this is where your hands get dirty. Every single instance of a sound effect needs to be scripted and assigned to the data of each object that will use it. Depending on how big the game is, this can take quite a while. And once you’re finished, game design can still change course, which may require the sound events to be renamed or re-configured.
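The attachment step might look something like the following sketch, where each object's data entry carries its own table of sound event hooks. The object and event names here are made up for illustration:

```python
# Hypothetical object data: every object that needs audio gets its
# sound events scripted into its data entry, action by action.
tank = {
    "model": "tank.mesh",
    "sounds": {
        "fire": "tank_cannon_fire",
        "move": "tank_tread_loop",
        "destroyed": "tank_explode",
    },
}

def sound_for(obj, action):
    # Look up which sound event this object plays for a given action.
    return obj["sounds"].get(action)

assert sound_for(tank, "fire") == "tank_cannon_fire"
assert sound_for(tank, "repair") is None  # no event scripted yet
```

Multiply this by every object in the game and you can see why the assignment pass takes a while, and why a design change that renames events ripples through a lot of data.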
If the game has its own customized tools that you’ll need to use, then it’s good to have the programmer also take audio needs into account for anything that would require it, such as placing objects on a map. For example, you may want a virtual sound marker you can place yourself to cover a general area with ambient sound, such as an intermittent ocean wave, or a single area of birds or insects.
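A placeable ambient marker could be sketched as a position plus a radius of coverage. This is an assumed design, with hypothetical names, just to show the idea of a sound fading out toward the edge of the area it covers:

```python
import math

class AmbientMarker:
    """A hypothetical sound marker placed on the map, covering a
    circular area with an ambient loop (waves, birds, insects)."""
    def __init__(self, x, y, radius, event_name):
        self.x, self.y = x, y
        self.radius = radius
        self.event_name = event_name

    def gain_at(self, px, py):
        # Full volume at the marker, fading linearly to silence
        # at the edge of its radius.
        dist = math.hypot(px - self.x, py - self.y)
        return max(0.0, 1.0 - dist / self.radius)

waves = AmbientMarker(0, 0, 100, "ocean_wave_loop")
assert waves.gain_at(50, 0) == 0.5   # halfway out, half volume
assert waves.gain_at(200, 0) == 0.0  # outside the area, silent
```

A real engine would likely use a fancier attenuation curve, but the point is that the sound designer places and tunes these markers without programmer help once the tool supports them.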
You will also have animations to consider. Everything that moves in the game plays an animation, and transitions from one animation to the next. You need to be able to access these animations in a tool and add sound events to each frame that requires a sound, so they play at precisely the right time. I’ve found that you can’t just assume one long sound file will line up perfectly with one long animation. It’s best to split it into segments. For one thing, computer speeds may vary; for another, animations can switch at any time mid-way through playing. You wouldn’t want lingering audio that no longer matches what you’re seeing, or sound that cuts off abruptly — both feel jarring and less realistic.
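The frame-keyed approach can be sketched as a mapping from frame numbers to sound events. All names below are hypothetical; the sketch shows why per-frame segments handle an animation switch more gracefully than one long file would:

```python
class Animation:
    """A hypothetical animation with sound events keyed to frames,
    rather than one long audio file spanning the whole clip."""
    def __init__(self, name, frame_events):
        self.name = name
        self.frame_events = frame_events  # {frame_number: event_name}

def play(animation, switch_at_frame=None):
    """Step through the animation, triggering each sound segment at
    its exact frame; switching mid-way simply never reaches the
    later segments, so nothing lingers and nothing is cut off."""
    triggered = []
    for frame in sorted(animation.frame_events):
        if switch_at_frame is not None and frame >= switch_at_frame:
            break  # animation switched; later segments never fire
        triggered.append(animation.frame_events[frame])
    return triggered

walk = Animation("walk", {0: "step_left", 12: "step_right", 24: "cloth_rustle"})
assert play(walk) == ["step_left", "step_right", "cloth_rustle"]
# Switching animations at frame 12 leaves no lingering audio:
assert play(walk, switch_at_frame=12) == ["step_left"]
```

Because each segment is short and tied to a frame, a mid-animation switch just stops triggering new segments — there is no long file to fade or truncate.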
You also need to account for things such as what surface the object is moving on. For example, is the character running through water? Then you can’t use the same footstep sound as on dirt or grass. You need programming support so your events know what surface they are playing on at any given time and can switch to the appropriate audio file event.
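In its simplest form, this surface switch is a lookup table keyed by whatever surface identifier the engine reports under the character. The table and event names here are assumptions for illustration:

```python
# Hypothetical surface -> footstep-event lookup; the engine would
# report the current surface under the character on each step.
FOOTSTEPS = {
    "dirt":  "footstep_dirt",
    "grass": "footstep_grass",
    "water": "footstep_water",
}

def footstep_event(surface):
    # Fall back to a generic event for surfaces without a custom sound.
    return FOOTSTEPS.get(surface, "footstep_generic")

assert footstep_event("water") == "footstep_water"
assert footstep_event("metal_grate") == "footstep_generic"
```

The fallback matters in practice: a new surface type added by design late in the project degrades to a generic footstep instead of silence until you record something specific.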
There is so much more you can customize for the audio to your heart’s content beyond that, and every game has a different way of doing things. These are just some of the basic fundamentals that are dealt with right from the start when doing sound effects for a video game.
– Frank Klepacki
Frank Klepacki is an award-winning composer for video games and television, with credits including Command & Conquer, Star Wars: Empire at War, and MMA sports programs such as Ultimate Fighting Championship and Inside MMA. He is the audio director at Petroglyph, in addition to being a recording artist, touring performer, and producer. For more info, visit www.frankklepacki.com