
Gregory Osborne

- Interactive Audio Specialist
- XR Curriculum Developer


Interactive Audio

After studying Video Game Music Implementation at Berklee College of Music, Gregory discovered the power of VR to convert body movements into inputs that trigger musical events.

Oversaturated

Oversaturated was Gregory's first attempt to create what he called an "interactive music video in VR". While the project taught him a great deal about Unity development, Wwise audio implementation, and audio-reactive visuals, playtesting revealed that the system was too complex to explain to non-musicians, so he pivoted to a new project.

Rave Gazebo

Rave Gazebo is a dance-interactive virtual reality album that uses your dancing to remix a song in real time. The main input is simply gesturing in one of six directions, which switches between pre-composed layers of music. Each gesture interacts with the environment, sending physics objects into the scene and generally creating chaos. Learn more in our special Rave Gazebo Page!
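
For a sense of how such a system can be wired up, here is a minimal Unity C# sketch of gesture-to-layer switching. Everything in it is illustrative (the class name, the speed threshold, and the plain AudioSource stems standing in for the project's actual audio engine); it is not Rave Gazebo's real code:

    using UnityEngine;

    // Hypothetical sketch: classify a fast controller movement into one of
    // six axis-aligned directions and toggle the matching music layer.
    // Plain AudioSource stems stand in for real audio middleware here.
    public class GestureLayerSwitcher : MonoBehaviour
    {
        [SerializeField] AudioSource[] layers = new AudioSource[6]; // one looping stem per direction
        [SerializeField] float minGestureSpeed = 1.5f;              // metres per second
        [SerializeField] float cooldown = 0.5f;                     // seconds between triggers

        Vector3 lastPosition;
        float lastTriggerTime;

        void Start() => lastPosition = transform.position;

        void Update()
        {
            Vector3 velocity = (transform.position - lastPosition) / Time.deltaTime;
            lastPosition = transform.position;

            if (velocity.magnitude < minGestureSpeed) return;
            if (Time.time - lastTriggerTime < cooldown) return;
            lastTriggerTime = Time.time;

            int direction = ClassifyDirection(velocity.normalized);
            layers[direction].mute = !layers[direction].mute; // swap that layer in or out
        }

        // Map a unit vector to one of six directions:
        // 0 = +X, 1 = -X, 2 = +Y, 3 = -Y, 4 = +Z, 5 = -Z.
        int ClassifyDirection(Vector3 v)
        {
            Vector3 a = new Vector3(Mathf.Abs(v.x), Mathf.Abs(v.y), Mathf.Abs(v.z));
            if (a.x >= a.y && a.x >= a.z) return v.x > 0f ? 0 : 1;
            if (a.y >= a.z) return v.y > 0f ? 2 : 3;
            return v.z > 0f ? 4 : 5;
        }
    }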

Composer Demo Reel

Here is a video compilation of Gregory's interactive audio projects from 2019, before he pivoted to focus on virtual reality.

Ambisonic Tutorial

While interning at the Brookline Interactive Group, Gregory created this tutorial on how to quickly and cheaply create ambisonic files using Reaper and the Ambisonic Toolkit.

Simulation Series Interview

Gregory was given the opportunity to talk about his work on interactive music in virtual reality with the Simulation Series in 2019 while he was still at Berklee. He has been working ever since to bring his visions to reality.

XR Instructor

Starting in the middle of the Covid-19 lockdown, Gregory was hired to develop curriculum for the online boot camp at XRTerra. He has used his own VR development journey to create hands-on courses with a heavy emphasis on live troubleshooting and support.

XR Device Simulator 2.3.0 And Later (22:05)

This video explains how to use the XR Device Simulator to control a VR rig from the Unity Editor using just the keyboard and mouse. Unity updated the XR Device Simulator in the XR Interaction Toolkit package version 2.3.0, so if you're using an older version of the package, see our other video.

XRTerra Links:

- Programs and Courses: http://www.xrterra.com/programs
- Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
- XR Device Simulator 2.2.0 And Earlier: https://youtu.be/Lo6_1CnycFE
- Your First VR Scene with the XR Interaction Toolkit in Unity: https://youtu.be/nlRzw2lCIkk
- VR Locomotion with the XR Interaction Toolkit in Unity: https://youtu.be/sQFdjAV-dBg

Chapters:

00:00 Intro
00:23 The XR Device Simulator
01:49 Updating XRITK package to 2.3.0
02:27 Importing XR Device Simulator
03:46 Setting up a VR Scene
04:32 Adding XR Device Simulator Prefab to our scene
05:00 Looking around
05:35 Rotating only the camera
06:10 Simulating walking around
07:09 Locking the cursor
08:04 Toggling between rotation and translation
08:42 Scrolling the mouse
09:34 Selecting the controllers with toggle button
10:56 Adding locomotion to our scene
11:29 Using the controller joysticks for locomotion
12:39 Selecting the controllers with Left Shift and Spacebar
13:43 Controlling both controllers simultaneously
14:46 Controller buttons
15:19 Creating a grabbable cube
15:46 Grabbing with the XR Device Simulator
16:48 Dangers of clicking out of the Game window
17:54 Setting up Activate UnityEvent
18:24 Testing cube activation
18:50 Simulating other controller buttons
19:14 Other controller's joysticks
19:46 Adding snap turn controls
19:59 Testing out alternate controller joystick
20:56 XR Device Simulator as a debugging tool
21:51 Outro

Instructor: Gregory Osborne
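
The cube-activation step at 17:54 is wired up with a UnityEvent in the Inspector; the same hookup can also be made in code. A minimal sketch, assuming XR Interaction Toolkit 2.x (the activated event and ActivateEventArgs type are the toolkit's; the renderer toggle and class name are our illustrative stand-ins):

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Sketch of the cube-activation step from the video, done in code
    // rather than through the Inspector's UnityEvent slot: while the cube
    // is held, pressing the trigger toggles its MeshRenderer.
    [RequireComponent(typeof(XRGrabInteractable))]
    public class ActivateToggle : MonoBehaviour
    {
        XRGrabInteractable grabInteractable;
        MeshRenderer meshRenderer;

        void Awake()
        {
            grabInteractable = GetComponent<XRGrabInteractable>();
            meshRenderer = GetComponent<MeshRenderer>();
            grabInteractable.activated.AddListener(OnActivated);
        }

        void OnDestroy()
        {
            grabInteractable.activated.RemoveListener(OnActivated);
        }

        void OnActivated(ActivateEventArgs args)
        {
            meshRenderer.enabled = !meshRenderer.enabled;
        }
    }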
Custom Input Actions With The XR Interaction Toolkit using Unity's New Input System Package (21:32)

This video shows you how to detect your own inputs, such as buttons or joystick controls, by adding custom input actions to the XRI Default Input Actions asset and subscribing to their events through a C# script.

XRTerra Links:

- Programs and Courses: http://www.xrterra.com/programs
- Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
- Your First VR Scene with the XR Interaction Toolkit in Unity: https://youtu.be/nlRzw2lCIkk
- C# Events in Unity: https://youtu.be/rhRGBTYONgY
- XR Device Simulator 2.2.0 And Earlier: https://youtu.be/Lo6_1CnycFE

Chapters:

00:00 Intro and prerequisites
00:52 VR Scene setup
01:45 Explaining our exercise
02:14 Opening the XRI Default Input Action Asset
02:51 Creating a new Input Action
03:12 Binding the button action path
03:56 Adding an interaction condition
04:45 Creating a script
05:15 using UnityEngine.InputSystem namespace
05:35 Referencing the Input Action Reference
05:50 Explaining Button action events
06:20 Subscribing to the button's events
06:58 Creating the subscribed function
07:13 Receiving InputAction.CallbackContext parameters
08:12 Toggling a Mesh Renderer component
08:40 Creating a cube to toggle
09:13 Assigning button Input Action reference in Inspector
09:46 Finding our Input Action asset in Project window
10:02 Testing out our button action
10:49 Let's use the thumbstick to move our cube around
11:13 Creating the custom thumbstick input action
11:56 Changing the Action Type
12:13 Setting the Control Type to Vector2
12:49 Binding the thumbstick action path
13:49 Declaring a new Input Action Reference
13:59 Subscribing to the thumbstick action
14:40 Extracting information from the CallbackContext type
15:07 context.ReadValue
15:39 Putting extracted vector into the Console
16:02 Assigning thumbstick Input Action reference in Inspector
16:22 Debugging thumbstick inputs with the headset
17:13 Testing out thumbstick output in the Console
17:57 We already have a reference to the cube
18:14 Describing what we're about to do
18:35 Converting Vector2 to Vector3
19:22 Moving the cube by our new vector
19:59 Testing out a very fast cube
20:28 We'd probably multiply the vector times Time.deltaTime
20:39 Summary

Instructor: Gregory Osborne
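
Here is a condensed sketch of the kind of script the video builds, assuming both custom actions have already been added to the XRI Default Input Actions asset. Field names and the move speed are illustrative, and the thumbstick is polled each frame for brevity where the video subscribes to its events:

    using UnityEngine;
    using UnityEngine.InputSystem;

    // Condensed sketch of the script built in the video: a button action
    // toggles a cube's MeshRenderer, and a Vector2 thumbstick action moves
    // the cube, scaled by Time.deltaTime as suggested at 20:28.
    public class CustomInputExample : MonoBehaviour
    {
        [SerializeField] InputActionReference buttonAction;     // Action Type: Button
        [SerializeField] InputActionReference thumbstickAction; // Action Type: Value, Control Type: Vector2
        [SerializeField] MeshRenderer cubeRenderer;
        [SerializeField] float moveSpeed = 1f;                  // illustrative value

        void OnEnable()
        {
            buttonAction.action.performed += OnButtonPressed;
            buttonAction.action.Enable();
            thumbstickAction.action.Enable();
        }

        void OnDisable()
        {
            buttonAction.action.performed -= OnButtonPressed;
            buttonAction.action.Disable();
            thumbstickAction.action.Disable();
        }

        void OnButtonPressed(InputAction.CallbackContext context)
        {
            cubeRenderer.enabled = !cubeRenderer.enabled;
        }

        void Update()
        {
            // The video subscribes to the thumbstick's events and calls
            // context.ReadValue<Vector2>(); polling here is a simpler
            // equivalent for a continuous input.
            Vector2 stick = thumbstickAction.action.ReadValue<Vector2>();
            Vector3 move = new Vector3(stick.x, 0f, stick.y); // Vector2 -> Vector3 on the XZ plane
            cubeRenderer.transform.position += move * moveSpeed * Time.deltaTime;
        }
    }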

VR Developer Foundations

This 8-class course teaches the foundations of VR development in the Unity 3D engine, introducing the XR Interaction Toolkit, collision and trigger detection, and animations, with an emphasis on C# programming fundamentals.
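
As a taste of the trigger-detection topic, here is a minimal Unity example (the class name and log message are our own, not course material):

    using UnityEngine;

    // Minimal Unity trigger-detection example: logs whenever another
    // collider enters this object's trigger volume. Requires a Collider
    // with "Is Trigger" enabled on this object and a Rigidbody on at
    // least one of the two objects involved.
    public class TriggerLogger : MonoBehaviour
    {
        void OnTriggerEnter(Collider other)
        {
            Debug.Log($"{other.name} entered the trigger on {name}");
        }
    }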

AR Playground: CoSpaces

Many school systems in the US do not have access to computers that can run more powerful programs such as Blender and Unity, and instead rely on Chromebooks to give their students internet access. In order to teach these students XR development skills, Gregory created a course that can be taught using only web-based applications like CoSpaces and Tinkercad.

AR Playground: Reality Composer

The iPad is an incredibly powerful device with several key pieces of hardware that make it a useful tool for the 21st century. This course has students take advantage of the camera, microphone, and internet access of their iPads to create their own interactive Augmented Reality experience in Apple's Reality Composer app.
