After studying Video Game Music Implementation at Berklee College of Music, Gregory discovered VR's power to convert body movements into inputs that trigger musical events.
Oversaturated was Gregory's first attempt at what he called an "interactive music video in VR". While the project taught him much about Unity development, Wwise audio implementation, and audio-reactive visuals, playtesting revealed that the system was too complex to explain to non-musicians, so he pivoted to a new project.
Rave Gazebo is a dance-interactive virtual reality album that uses your dancing to remix a song in real time. The main input is simply gesturing in one of six directions, which switches between pre-composed layers of music. Each gesture interacts with the environment, sending physics objects into the scene and generally creating chaos. Learn more in our special Rave Gazebo Page!
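The core mechanic described above can be sketched in a few lines: each of six gesture directions toggles one pre-composed layer of the song. This is a minimal illustrative sketch, assuming a simple toggle model; the direction names, class, and method names are hypothetical, not Gregory's actual Unity/Wwise implementation.

```python
# Illustrative sketch only: six gesture directions, each toggling a
# pre-composed music layer, as in Rave Gazebo's remix-by-dancing idea.
# All names here are assumptions, not the project's real code.

DIRECTIONS = ["up", "down", "left", "right", "forward", "back"]

class LayerRemixer:
    """Maps gesture directions to on/off states of pre-composed stems."""

    def __init__(self):
        # All six layers start muted; dancing brings them in.
        self.active = {d: False for d in DIRECTIONS}

    def gesture(self, direction):
        """A gesture in one of six directions toggles its layer."""
        if direction not in self.active:
            raise ValueError(f"unknown gesture direction: {direction}")
        self.active[direction] = not self.active[direction]
        return self.active[direction]

    def mix(self):
        """Layers currently audible (e.g. to drive bus volumes)."""
        return [d for d in DIRECTIONS if self.active[d]]
```

For example, gesturing "up" brings in that layer, and gesturing "up" again removes it, so the arrangement always reflects the dancer's recent movements.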
Composer Demo Reel
Here is a video compilation of the interactive audio projects Gregory worked on through 2019, before he pivoted to focus on virtual reality.
While interning at the Brookline Interactive Group, Gregory created this tutorial on quickly and cheaply creating ambisonic files using Reaper and the Ambisonic Toolkit.
Gregory was given the opportunity to talk about his work on interactive music in virtual reality with the Simulation Series in 2019 while he was still at Berklee. He has been working ever since to bring his visions to reality.
Starting in the middle of the COVID-19 lockdown, Gregory was hired to develop curriculum for XRTerra's online boot camp.
He has used his own VR development journey to create hands-on courses with a heavy emphasis on live troubleshooting and support.