Base Principles of Game Audio Testing
Whether you've been an avid video game player for the past 15 years or only play video games when hanging out with geeky friends, you're probably well aware of the important role audio plays in every game. From background ambience, like the gentle babbling of a nearby river, to unique sound effects (SFX) referencing specific events in the game, video games are filled with a wide range of audio that ensures an immersive experience. While this is true for most visual media, there's one crucial difference that separates video games from most other forms of entertainment—they're interactive. The audio featured in video games responds to player actions and the environment it is played back in, changing dynamically as the player experience develops.
However, have you ever wondered how the sound effects you hear in the game seem to match almost perfectly with the action you perform or the environment you enter? Finding the right audio samples to fill the soundscape and integrating them into a video game as seamlessly as possible to create an immersive experience is no easy feat. To ensure the proper use, high quality, and smooth fit of audio in video games, game audio quality testing is a must. And that’s where we come in. We have a dedicated audio quality testing laboratory, along with a variety of video game consoles and devices, that we use specifically for audio quality testing for games.
This brings us to the next point—how is game audio quality testing performed and what are its base principles? In this blog post we will look at the process in more detail and share some examples to answer this question.
Now, this is game testing!
When testing game audio, it's easy to check if a given sound accurately represents its intended action. But what if we consider the overall game environment, does the sound fit in—is it what you'd expect to hear? What if the sound doesn't match the visual reference and it breaks the player's immersion entirely—what then?
That's the thing about integrating audio into video games—the process is like a double-edged sword. Depending on the care taken, it can fully immerse players into the game or it can break their immersion entirely, ruining their gaming experience in the process.
If we compare video games to movies, a movie features linear audio throughout, while the interactive nature of video games lets players experience a game in countless ways that developers can never fully predict. Nevertheless, they certainly try their best, and the testing department should be no less enthusiastic in ensuring that nothing gets broken in the process or disrupts player immersion.
That being said, there are various ways to test audio in games—by way of sound design, in-engine implementation, and audio middleware.
By Way of Sound Design
From a creative perspective, when testing game audio, you should look for connections between the different elements of the sounds being triggered around the in-game world. Specifically, pay attention to sound effects (SFX), music, and voice-over (VO). The sections here complement the more technical principles covered later, making sure no obvious red flags are missed.
These principles should be checked on every platform the game is available on—PC, consoles, mobile, VR, and so on. Issues found on a certain platform may be triggered by scenarios unique to it, so take care to write platform-specific reproduction steps when reporting bugs.
Sound effects (SFX)
Even without a list to reference, you can simply navigate around the game space, following the game's natural progression and see—hear—how well the soundscape fits the expected outcome. Below we have two examples of a sound effect—full and stripped—depicting the sound of a hammer falling. If you listen to them closely, you will notice the stripped sound effect has elements missing.
Full sound effect depicting the sound of a hammer falling:
Stripped sound effect depicting the sound of a hammer falling:
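If listening alone is inconclusive, a quick numerical comparison can back up your ears. Below is a minimal Python sketch, assuming the two hammer samples are saved as hammer_full.wav and hammer_stripped.wav (placeholder names), that compares how much energy each clip carries in a few frequency bands; a band that is nearly empty in the stripped version points to the missing element.

```python
# Rough way to quantify "missing elements": compare how energy is
# distributed across frequency bands in the two clips.
# The file names are placeholders for the two hammer samples above.
import numpy as np
from scipy.io import wavfile

def band_energies(path, bands=((20, 250), (250, 2000), (2000, 8000), (8000, 20000))):
    rate, data = wavfile.read(path)
    if data.ndim > 1:                      # mix stereo down to mono
        data = data.mean(axis=1)
    spectrum = np.abs(np.fft.rfft(data))   # magnitude spectrum
    freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)
    return {f"{lo}-{hi} Hz": float(np.sum(spectrum[(freqs >= lo) & (freqs < hi)] ** 2))
            for lo, hi in bands}

full = band_energies("hammer_full.wav")
stripped = band_energies("hammer_stripped.wav")
for band in full:
    print(band, f"full={full[band]:.3e}", f"stripped={stripped[band]:.3e}")
```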
It's extremely difficult to pay attention to every audio detail, but training your ear to pick up on oddities and similarities will help you notice errors almost subconsciously, without additional effort.
Some sounds will also have a more important role in the game, such as a gate unlocking after the enemies are defeated. Knowing what's important to the game helps the tester classify these sounds beforehand and find the crucial faults more efficiently.
Music
Outside of rhythm-driven games, the properties of music won't be a top priority, and checking that it doesn't keep playing indefinitely or stop abruptly will often suffice. That said, here's a short demo of how music could react to testable in-game events.
Trigger events that signal a change in gameplay dynamics, such as the player entering a combat scenario, should give the music a seamless flow. Testing whether that flow gets broken is as simple as running through the trigger events.
Note how smoothly the musical segments cross-fade. Do they leave inconsistent pauses between each other? Does the music persist even after a change of game state or level? For example, the music system might not respond properly to quick, successive action-state changes, resulting in the issue visualized below—the system, set to switch the music when the timing clock shows '1', misses the additional action-state change and incorrectly switches to exploration music anyway.
Animation showing music state switching error:
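To make the failure mode concrete, here is a toy Python sketch (not actual engine code) of a music system that latches the first requested state until the next beat and silently drops any later request, which is the kind of logic that produces the error shown above.

```python
# Illustrative only: a naive music system that only remembers the FIRST
# state requested before the next beat boundary and ignores later ones.
class NaiveMusicSystem:
    def __init__(self):
        self.current = "combat"
        self.pending = None

    def request_state(self, state):
        if self.pending is None:      # bug: later requests are dropped
            self.pending = state

    def on_beat(self):                # called when the timing clock hits '1'
        if self.pending is not None:
            self.current = self.pending
            self.pending = None
        return self.current

music = NaiveMusicSystem()
music.request_state("exploration")   # player breaks line of sight...
music.request_state("combat")        # ...but is spotted again before the next beat
print(music.on_beat())               # prints 'exploration' instead of the expected 'combat'
```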
Voice-over (VO)
Voice-over, or VO, mainly refers to dialogue and announcements. Its priority is the parameter to look at here, as it dictates what voice lines should take precedence when a multitude of them are triggered.
Here are a few example scenarios you may come across when testing voice-over audio. If a character shouts to the player on a busy street, is that voice line discernible among the chatter? Or if multiple character interactions are triggered, do they keep playing after the player has moved away? Interacting with a variety of characters and hearing them all speak to the player at the same loudness level could also lead to crucial information being missed.
Additionally, large variations in loudness or pitch in lines that are spoken by the same character are usually a good indicator that something is due for a closer inspection.
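To illustrate the idea of priority, here is a small Python sketch assuming a hypothetical budget of two concurrent voice lines and made-up priority values; lines above the budget should be dropped or ducked rather than all playing at once.

```python
# Toy model of VO priority handling: when more voice lines are triggered
# than the dialogue budget allows, lower-priority lines are dropped.
# The priorities and the budget of 2 concurrent lines are illustrative.
import heapq

def select_voice_lines(triggered, max_concurrent=2):
    """Return the lines allowed to play, highest priority first."""
    return heapq.nlargest(max_concurrent, triggered, key=lambda line: line[1])

triggered = [
    ("street_vendor_barks", 10),
    ("companion_warning", 80),      # critical gameplay information
    ("crowd_chatter", 5),
    ("quest_giver_shout", 60),
]
print(select_voice_lines(triggered))
# -> [('companion_warning', 80), ('quest_giver_shout', 60)]
```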
By Way of In-Engine Implementation
A game engine is what merges all the linear elements, our beloved sounds included, into an interactive experience. A couple of the more popular ones are Unreal Engine and Unity, while proprietary engines built and improved in-studio, such as CryEngine, tend to be more specialized.
The following video snippet explains how non-linearity in game audio is managed by game implementation systems that require testing, and why understanding the principles behind those systems matters more to testers than engine-specific instructions.
Audio console commands—debug code lines—will be featured in each section. Most well-known engines have an equivalent, though they differ slightly, hence they are written up here in general terms. Keep in mind that full command functionality will most likely only be available to the tester if the game's source project is provided. For reference, here's the list of Unreal Engine audio console commands.
Randomization
Randomization is a simple concept that runs through all of game audio. An effect sounding the same every time might be intentional, as with alarms or victory jingles, but most repeated sounds grow tiresome fast. Luckily, game engines can randomly adjust the pitch, loudness, and delay of the elements of each sound, making every trigger sound slightly different while reducing the resources needed to run the game itself.
Listening to the footstep samples below, you can note the difference between repeated and varied sound effects. The former is more monotonous, while the latter sounds more natural.
Redundant footsteps:
Varied footsteps:
To test randomization, you can keep inputting similar actions and note how the soundscape reactions change. These inputs could also be automated for more efficient data collection.
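To make the concept concrete, here is a minimal Python sketch of per-trigger randomization; the parameter ranges are illustrative, not values taken from any particular engine.

```python
# Rough model of per-trigger randomization: each footstep picks a random
# sample and nudges pitch, volume and start delay within small ranges.
import random

FOOTSTEP_SAMPLES = ["footstep_01.wav", "footstep_02.wav", "footstep_03.wav"]

def randomized_footstep():
    return {
        "sample": random.choice(FOOTSTEP_SAMPLES),
        "pitch": random.uniform(0.95, 1.05),     # +/- 5% pitch shift
        "volume": random.uniform(0.85, 1.0),     # slight level variation
        "delay_ms": random.uniform(0.0, 20.0),   # tiny timing offset
    }

# A tester (or a test script) can trigger the same action repeatedly and
# check that consecutive results actually differ.
for _ in range(5):
    print(randomized_footstep())
```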
Audio Console Commands
- Isolate a certain sound class/cue/wave
- Show/debug all sound cues/waves
Spatialization & Attenuation
In short, spatialization deals with placing a sound object in the game world, and attenuation manages how far away it will be heard.
Testing sound spatialization and attenuation involves navigating around the game's 3D space and paying close attention to audio positioning details. For example: do adjacent ambient sections overlap well? Do sounds react properly to changes in distance? And when the player perspective shifts, do sounds pan logically?
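As a simplified model of what you are listening for, here is a Python sketch of linear distance attenuation and left/right panning for a single sound source; the inner and outer radii are made-up values.

```python
# Minimal sketch of distance attenuation and stereo panning for a point
# sound source, assuming a linear falloff between an inner and an outer
# radius (both radii are example values for illustration).
import math

def attenuation(distance, inner_radius=2.0, outer_radius=30.0):
    """1.0 inside the inner radius, fading linearly to 0.0 at the outer radius."""
    if distance <= inner_radius:
        return 1.0
    if distance >= outer_radius:
        return 0.0
    return 1.0 - (distance - inner_radius) / (outer_radius - inner_radius)

def pan_and_gain(listener_pos, listener_forward, source_pos):
    dx, dy = source_pos[0] - listener_pos[0], source_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    if distance == 0:
        return 0.0, 1.0
    right = (listener_forward[1], -listener_forward[0])   # listener's right vector
    pan = (dx * right[0] + dy * right[1]) / distance       # -1 = hard left, +1 = hard right
    return pan, attenuation(distance)

# Sound 10 units directly to the right of a listener facing +y:
print(pan_and_gain((0, 0), (0, 1), (10, 0)))   # -> (1.0, ~0.71): hard right, partly attenuated
```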
This is where audio console commands come in handy. Visualizing the sound sources helps immensely, as sounds might be set up correctly yet still be hard to place in the world. A simulation of faulty behavior, using Unreal Engine's console commands as a testing tool, can be enjoyed below:
Audio Console Commands
- Visualize active/all sounds
- Isolate a certain sound class/cue/wave
- Enable/disable audio spatialization
- Enable/disable audio attenuation
- Enable/disable binaural audio
- Show/debug all sound cues/waves
Occlusion
Occlusion describes the degree to which a sound is obstructed by objects between the audio source and the listener. For example, if you're standing outside a bar, the muffled music coming from within is occluded.
Looking at this example of simulated occlusion, we can see how a sound behaves when obstructed, losing some of its spectral content in the process.
To test for occlusion inconsistencies, consider the following scenarios. Is the delay before occlusion kicks in consistent? Is the drop in loudness appropriate? Are frequency filters applied too heavily? Some of these faults can be observed in the audio console command simulation video from the previous section.
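For illustration, here is a rough Python sketch of one common way occlusion is modeled: each obstructing surface reduces the volume and pulls a low-pass filter cutoff down. The per-surface amounts and cutoff limits are assumptions, not values from any specific engine.

```python
# Toy occlusion model: each obstructing surface between the source and the
# listener lowers the volume and drags the low-pass cutoff toward a
# "muffled" value. All numbers below are illustrative.
def occlusion_params(obstruction_count,
                     occlusion_per_surface=0.25,
                     open_cutoff_hz=20000.0,
                     fully_occluded_cutoff_hz=800.0):
    """Return the volume and low-pass cutoff for a given number of obstructions."""
    occlusion = min(1.0, obstruction_count * occlusion_per_surface)
    volume = 1.0 - occlusion                      # simple linear loudness drop
    # Interpolate the low-pass cutoff toward the fully occluded value.
    cutoff = open_cutoff_hz + (fully_occluded_cutoff_hz - open_cutoff_hz) * occlusion
    return {"volume": round(volume, 2), "lowpass_cutoff_hz": round(cutoff)}

# e.g. standing outside the bar from the example, behind 0-3 surfaces:
for surfaces in range(4):
    print(surfaces, occlusion_params(surfaces))
```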
Audio Console Commands
- Visualize active/all sounds
- Enable/disable occlusion filtering
- Enable/disable low-pass & high-pass filtering
Concurrency
Concurrency is the limit on how many active sounds can be heard simultaneously in a game engine. It can be applied to an individual sound effect or a variety of manually grouped sound cues, like footsteps. Here's how concurrency is used to aid a game's resource usage and sonic intelligibility.
The testing process for concurrency is the simplest of all—spam away! Really, triggering sounds as often as possible or inputting as many different actions in rapid succession are the best methods to use. If the game crashes in the process, well done.
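As a toy example of the system under test, here is a Python sketch of a concurrency group that allows at most four simultaneous voices and steals the oldest one when the limit is exceeded; the limit of four is an example value.

```python
# Minimal sketch of a concurrency group: at most `max_voices` instances of
# a grouped sound (e.g. footsteps) play at once; the oldest voice is
# stopped ("stolen") when the limit is exceeded.
from collections import deque

class ConcurrencyGroup:
    def __init__(self, max_voices=4):
        self.max_voices = max_voices
        self.active = deque()

    def play(self, sound_id):
        if len(self.active) >= self.max_voices:
            stolen = self.active.popleft()          # stop the oldest voice
            print(f"stole voice: {stolen}")
        self.active.append(sound_id)
        print(f"playing: {sound_id} (active: {len(self.active)})")

footsteps = ConcurrencyGroup(max_voices=4)
for i in range(6):                                   # "spam away"
    footsteps.play(f"footstep_{i}")
```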
Audio Console Commands
- Show/debug all sound cues/waves
- Isolate a certain sound class/cue/wave
- Set maximum audio channel count
Additional Systems
Some additional game engine systems to consider are reverb, submixes, sound classes, audio volumes, and more detailed sound cue parameters. Though niche, these may be of higher importance to some game projects.
By Way of Audio Middleware
Audio middleware is a type of software used when a game engine's native audio implementation tools don't provide sufficient control over the implemented sounds. Wwise and FMOD are the more popular examples to mention here. You can check out a tutorial for each below:
If a game project features audio middleware integration, getting acquainted with its documentation and systems is a must. Tools such as dynamic parameter sliders, context switches, and more can be effectively debugged using the middleware's internal systems, which can act as a fully functional testing environment on their own.
Looking back at the video featured in the Music section of this blog, the Game State and Music State indicators demonstrate what you can expect to find within typical audio middleware.
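As a generic stand-in, deliberately not the actual Wwise or FMOD API, here is a Python sketch of the kind of controls a tester drives inside middleware tooling: a real-time parameter that should scale music intensity and a state switch that should change which music plays.

```python
# Generic stand-in for middleware-style controls (not real Wwise/FMOD calls):
# a real-time parameter and a state group a tester can drive directly in
# the middleware's own tooling before verifying the behavior in-game.
class MiddlewareSession:
    def __init__(self):
        self.parameters = {"enemy_count": 0.0}
        self.states = {"game_state": "exploration"}

    def set_parameter(self, name, value):
        self.parameters[name] = value
        print(f"parameter {name} -> {value}")

    def set_state(self, group, value):
        self.states[group] = value
        print(f"state {group} -> {value}")

session = MiddlewareSession()
session.set_parameter("enemy_count", 5)      # should raise music intensity
session.set_state("game_state", "combat")    # should switch to combat music
```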
Conclusion
While all the information we shared above is fascinating and provides an interesting view into sound design, it's important to keep some standard testing practices in mind when evaluating game audio. These include setting priorities for audio areas of higher importance, noting anything else that seems out of place in your reports, and carefully describing the steps for reproducing any buggy scenarios afterward.
We can classify the principles of game audio testing explained here as exploratory testing; however, further functional testing ideas, as well as segmented automation, can be derived from them. TestDevLab has been working on developing these kinds of solutions to make game audio testing more efficient and accurate. Besides being spectacular ear candy, game audio serves a functional purpose, too: it provides information to players and offers additional insight into the world they're exploring. Prudent development of audio is paramount to a game's success, so it is important that all aspects of audio—sound effects, background music, voice-over, and so on—work as intended to deliver an immersive gaming experience. And we would love to be of assistance in making sure it does.
And remember, you should approach game audio testing in a less structured and more improvisational manner. Simply having fun while playing the game can be immensely beneficial, as it helps testers stumble upon the same sections and ideas that regular players, who often have similar interests, will come across while playing. Bugs found this way will naturally be of higher importance and lead to more efficient test results.
Need help testing game audio—or require other audio and video quality testing services? Contact us with your project details and find out how we can help.
Additional Material
Want to learn more about game audio testing and advance your skills in the process? If you have some extra time to spare, we recommend watching these videos relevant to game audio! They're sure to help reduce your stress levels, if nothing else.