Performance and Development Efficiencies via Code Base Integration

Being the developer of the audio engine as well as the composer and sound designer for Children of Liberty led to several advantages for the project and team. It allowed me to integrate audio functionality directly into the main codebase, which created opportunities for runtime efficiencies and spared the game developers from having to handle audio themselves.

Here are a couple of examples:

  1. Sprite Manager 2 Animation Cell Integration
    For animation-driven sounds, rather than relying on SM2's expensive event handling, which issued a delegate call out to the audio engine every frame (30x per second), I created an SMIntegration class that assigns sound effects to individual animation cells at load time. This removed the per-frame delegate calls entirely and replaced them with a simple existence check: the audio engine is only called when a sound effect has been assigned to the given cell (a sketch of this cell-mapping approach follows the list).
  2. Parameter Integration
    Several parameters drive interactive audio, for example the volume and rate of the player's heartbeat while they are in cover, or the dynamic mixing of music based on the current tension level of the scene. Rather than requiring the game developers to know when to push updated values, the audio engine itself knows how and when to retrieve them through a set of delegate functions whose call frequency and other properties are configurable (a sketch of this parameter-binding approach also follows the list).
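
To make the cell-mapping idea concrete, here is a minimal C# sketch of the approach. It is not the actual SMIntegration code: the names (AnimationCellSounds, IAudioEngine, AssignSound, OnCellDisplayed) are illustrative assumptions, and the real class works against SM2's own animation cell data rather than a standalone dictionary.

    using System.Collections.Generic;

    // Stand-in for the game's audio engine entry point (illustrative only).
    public interface IAudioEngine
    {
        void Play(string soundId);
    }

    public class AnimationCellSounds
    {
        private readonly IAudioEngine _audio;

        // animation name -> (cell index -> sound effect id)
        private readonly Dictionary<string, Dictionary<int, string>> _cellSounds =
            new Dictionary<string, Dictionary<int, string>>();

        public AnimationCellSounds(IAudioEngine audio)
        {
            _audio = audio;
        }

        // Called once at load time for each animation cell that carries a sound.
        public void AssignSound(string animationName, int cellIndex, string soundId)
        {
            Dictionary<int, string> cells;
            if (!_cellSounds.TryGetValue(animationName, out cells))
            {
                cells = new Dictionary<int, string>();
                _cellSounds[animationName] = cells;
            }
            cells[cellIndex] = soundId;
        }

        // Called when an animation advances to a new cell. Most cells have no
        // sound assigned, so this is usually just a lookup and an early-out;
        // the audio engine is only invoked when a sound was assigned above.
        public void OnCellDisplayed(string animationName, int cellIndex)
        {
            Dictionary<int, string> cells;
            string soundId;
            if (_cellSounds.TryGetValue(animationName, out cells)
                && cells.TryGetValue(cellIndex, out soundId))
            {
                _audio.Play(soundId);
            }
        }
    }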
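
The parameter-binding approach can be sketched the same way, again in C# with illustrative names (AudioParameterBinding, AudioParameterSet, Register) rather than the audio engine's real API: each parameter is registered once with a delegate that reads the current game value and a configurable update interval, and the audio engine's own update loop refreshes the values at that rate.

    using System;
    using System.Collections.Generic;

    // One bound parameter: a delegate into game state plus a refresh interval.
    public class AudioParameterBinding
    {
        public string Name;            // e.g. "Heartbeat" or "Tension" (illustrative)
        public Func<float> GetValue;   // delegate that reads the current game value
        public float UpdateInterval;   // seconds between refreshes (configurable)
        public float CurrentValue;     // last value retrieved

        private float _elapsed;

        // Driven from the audio engine's update loop, not by gameplay code.
        public void Update(float deltaTime)
        {
            _elapsed += deltaTime;
            if (_elapsed >= UpdateInterval)
            {
                _elapsed = 0f;
                CurrentValue = GetValue();
            }
        }
    }

    public class AudioParameterSet
    {
        private readonly List<AudioParameterBinding> _bindings =
            new List<AudioParameterBinding>();

        // Registered once, typically at scene load.
        public AudioParameterBinding Register(string name, Func<float> getValue,
                                              float updateInterval)
        {
            var binding = new AudioParameterBinding
            {
                Name = name,
                GetValue = getValue,
                UpdateInterval = updateInterval,
                CurrentValue = getValue()
            };
            _bindings.Add(binding);
            return binding;
        }

        // The audio engine pulls fresh values on its own schedule.
        public void Update(float deltaTime)
        {
            foreach (var binding in _bindings)
            {
                binding.Update(deltaTime);
            }
        }
    }

    // Example registration (hypothetical parameter name and interval): the
    // heartbeat rate is read from player state four times a second, with no
    // further involvement from gameplay code.
    //
    //   parameters.Register("HeartbeatRate", () => player.HeartbeatRate, 0.25f);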