With each new generation of hardware comes a whole host of strange things to learn about making effective games. With our first game, Taxiball, we ended up doing a lot of unusual things while creating the soundtrack. Instead of the standard sound effects and musical score, we decided to try something radically different - an all-vocal beatbox soundtrack that responds directly to player input.
For the Art of Sound contest, I thought it might be neat to give people some insight into how we put together this unique take on in-game audio, and more importantly, why. While Taxiball isn't primarily a game about music, the music is an integral part of the game - not only does it respond to players' actions, it also communicates some very specific information back to the player. The Art of Sound, in this case, is the way Taxiball's audio responds to the player's interactions and the meaning it communicates in return.
Here's a video of Taxiball's gameplay - a preview video we made just before the game launched - that gives a good representation of the general style of the game's audio:
We're really happy with the way the game turned out - and since we learned so much during development, it seemed only sensible to share our experience with others. If you're interested in a discussion of how a game gets designed and built - particularly a part of it that most people might not give a second thought - hopefully this will offer a useful look at the process.