Dan Reynolds is an award-winning composer and sound designer with a love for video games and interactive arts. For almost a decade, Dan has treated every new project as an opportunity to create compelling audio experiences. Whether cutting dialog, designing sounds, producing music, implementing in-engine audio, programming MIDI, or just writing music, Dan embraces new and varied opportunities.
Game Audio Network Guild
2010 Best Handheld Audio
Monkey Island 2 Special Edition: LeChuck’s Revenge
Guilty Gear Xrd -SIGN- (PS4/PS3/Arcade)
Dialog Editor (2014)
Fix the Leaks (iPhone/iPad)
Composer & Sound Designer (2011, 2013)
AMD Moovida Media Center
Wwise Implementation (2011)
Caribbean Treasures (iPhone)
Composer & Sound Designer (2011)
Red Orchestra 2: Heroes of Stalingrad
Choir Contractor (2011)
Monkey Island 2 Special Edition: LeChuck’s Revenge (PC:Steam, XBLA, PSN, iPhone/iPad)
Music Producer & Arranger / MIDI Mock-Up (2010)
Wonderland Adventures 3: Planet of the Z-Bots (PC)
Marvel Super Hero Squad (Wii, PS2)
Music/MIDI Programmer/MIDI Mock-Up (2009)
Composer & Sound Designer (2009)
Above is a selection of compositions for games and for fun.
Here is a recent example of interactive music composition, design, and implementation. I wanted to experiment with Unreal Engine 4's visual scripting system, Blueprints--I had a hypothesis about how to create a smooth crossfade between multiple layers of music activity, so I built a level to test it.
The first thing I had to build was a case manager hooked into the Player Controller.
The interactivity concept required real-time fades driven by player character movement, so I built a simple switch case that sets a numerical target for a mathematical interpolation system.
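As a rough sketch in C++ (the language beneath UE4's Blueprints), that switch case might look like the following. The movement states and target values here are my own illustration, not the project's actual names; the original Blueprint may only distinguish moving from idle.

```cpp
// Hypothetical movement states; the original graph may be simpler.
enum class EMovementState { Idle, Walking, Sprinting };

// Map the player's movement state to a numeric Activity Target for the
// interpolation system (the specific target values are illustrative).
float ActivityTargetFor(EMovementState State)
{
    switch (State)
    {
    case EMovementState::Sprinting: return 1.0f; // full-activity layer
    case EMovementState::Walking:   return 0.5f; // medium-activity layer
    default:                        return 0.0f; // idle / ambient layer
    }
}
```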
Next in line is the interpolation manager. This series of Blueprints manages which layer of music is fading at any given moment and whether that fade is coming in or going out.
The MovementControlValue is essentially the current activity level, and it is constantly modified by the FInterp To function. It is always trying to reach the Movement Target value, which is passed in as the Activity Target from the previous Blueprints. Essentially, if the player character is in motion, the Movement Target is high and the MovementControlValue climbs toward it; if the player is not moving, the Movement Target is low and the MovementControlValue falls.
I vary the interpolation speed based on whether it's going up or down. This way, the music ramps up faster than it ramps down.
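In C++ terms, one tick of this system might look like the sketch below. InterpTo approximates the behavior of UE4's FMath::FInterpTo, and the two speed constants are assumptions for illustration, not the project's actual numbers.

```cpp
#include <algorithm>

// Approximation of UE4's FMath::FInterpTo: move a clamped fraction of
// the remaining distance each tick (an exponential ease toward Target).
float InterpTo(float Current, float Target, float DeltaTime, float Speed)
{
    const float Dist = Target - Current;
    if (Dist * Dist < 1.e-8f)
        return Target;
    return Current + Dist * std::clamp(DeltaTime * Speed, 0.0f, 1.0f);
}

// One tick of the MovementControlValue update, ramping up faster than
// it ramps down (speed values are illustrative).
float TickMovementControl(float ControlValue, float MovementTarget, float DeltaTime)
{
    const float RampUpSpeed   = 4.0f;
    const float RampDownSpeed = 1.0f;
    const float Speed = (MovementTarget > ControlValue) ? RampUpSpeed : RampDownSpeed;
    return InterpTo(ControlValue, MovementTarget, DeltaTime, Speed);
}
```

With these numbers, a tick of 0.1 s moves the value 40% of the way up toward a rising target but only 10% of the way down toward a falling one.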
After this point, I set Med Status and Act Status (the volume levels of the two fading music layers) and feed them into a reference called Current Track, which points to the currently playing audio component.
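For illustration, the simplest version of that mapping is a complementary linear crossfade. The real Blueprint graph may use a different curve; the struct and field names below mirror the write-up rather than the actual variables.

```cpp
// Hypothetical complementary crossfade: as the control value rises, the
// active layer comes up and the medium layer drops by the same amount.
struct FLayerVolumes
{
    float MedStatus; // medium-activity layer volume (0..1)
    float ActStatus; // high-activity layer volume (0..1)
};

FLayerVolumes CrossfadeVolumes(float MovementControlValue)
{
    FLayerVolumes Volumes;
    Volumes.ActStatus = MovementControlValue;
    Volumes.MedStatus = 1.0f - MovementControlValue;
    return Volumes;
}
```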
Below, you can see my Music Segment--in this case, two bars of music (or a single vertical slice of music). Here is where I set the value of the play length, play the music cue, and then delay the tick for the duration of the music.
By using a delay, I can start the next piece of music at an arbitrary point--this lets my music segments keep their reverb tails, giving a smoother transition between segments and a more musical stop if I branch off to another segment or an exit point.
The Music Delay Duration is a custom function I built within Blueprints. It's essentially a music math function: it determines the duration of each individual beat from the BPM and then multiplies that amount by the number of beats reported.
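The underlying math is straightforward; a sketch of the beat-length calculation (my naming, following the description above) might be:

```cpp
// Sketch of the Music Delay Duration math: seconds per beat from the
// tempo, multiplied by the number of beats in the segment.
float MusicDelayDuration(float BPM, float Beats)
{
    const float SecondsPerBeat = 60.0f / BPM; // one beat at this tempo
    return SecondsPerBeat * Beats;
}
```

Two 4/4 bars at 120 BPM report eight beats, so the segment delays for 60/120 * 8 = 4 seconds before the next segment can begin.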
Sound Replacement Demo
I recently spotted a very cool student animation project and got permission from the artist to redo the sound design for my own demo purposes. I feel this recent work expresses my sense of mixing, editing, and naturalistic design well (approx. 20-30% originally sourced sounds).
Original Sound Work
This is a more cartoony, arts-and-crafts example of a sound design aesthetic. It's a casual physics puzzler I did sound and music for; the sound was a lot of fun to work on, and the sound sources are approx. 95% originally recorded.
I won’t lie: I spent a fair amount of my time trying to get an endless sludgy loop of leaking goo. The creature vocalizations are actually more varied than what's heard in the video, but the client was still tweaking an emotional state system for the critters when he recorded this.