
VIDEOS

Gregory has created a wide variety of instructional videos, both in a personal capacity and for the XR Terra development bootcamp.

XR Device Simulator 2.3.0 And Later
22:05

This video explains how to use the XR Device Simulator to control a VR rig from the Unity Editor using just the keyboard and mouse. Unity updated the XR Device Simulator in the XR Interaction Toolkit package version 2.3.0, so if you're using an older version of the package, see our other video.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
XR Device Simulator 2.2.0 And Earlier: https://youtu.be/Lo6_1CnycFE
Your First VR Scene with the XR Interaction Toolkit in Unity: https://youtu.be/nlRzw2lCIkk
VR Locomotion with the XR Interaction Toolkit in Unity: https://youtu.be/sQFdjAV-dBg

Chapters:
00:00 Intro
00:23 The XR Device Simulator
01:49 Updating XRITK package to 2.3.0
02:27 Importing XR Device Simulator
03:46 Setting up a VR Scene
04:32 Adding XR Device Simulator Prefab to our scene
05:00 Looking around
05:35 Rotating only the camera
06:10 Simulating walking around
07:09 Locking the cursor
08:04 Toggling between rotation and translation
08:42 Scrolling the mouse
09:34 Selecting the controllers with toggle button
10:56 Adding locomotion to our scene
11:29 Using the controller joysticks for locomotion
12:39 Selecting the controllers with Left Shift and Spacebar
13:43 Controlling both controllers simultaneously
14:46 Controller buttons
15:19 Creating a grabbable cube
15:46 Grabbing with the XR Device Simulator
16:48 Dangers of clicking out of the Game window
17:54 Setting up Activate UnityEvent
18:24 Testing cube activation
18:50 Simulating other controller buttons
19:14 Other controller's joysticks
19:46 Adding snap turn controls
19:59 Testing out alternate controller joystick
20:56 XR Device Simulator as a debugging tool
21:51 Outro

Instructor: Gregory Osborne
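Around the 17:54 mark, the video wires up the grabbable cube's Activate behavior through a UnityEvent in the Inspector. For reference, here is a minimal sketch of the equivalent subscription done in code with XR Interaction Toolkit 2.x; the class name and log message are illustrative, not the video's exact setup:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical example: responds to the Activate event that the video
// hooks up via a UnityEvent in the Inspector.
[RequireComponent(typeof(XRGrabInteractable))]
public class CubeActivateResponder : MonoBehaviour
{
    private XRGrabInteractable grabInteractable;

    private void Awake()
    {
        grabInteractable = GetComponent<XRGrabInteractable>();
    }

    private void OnEnable()
    {
        // Fires when the user presses the activate control (usually the
        // trigger) while holding the cube.
        grabInteractable.activated.AddListener(OnActivated);
    }

    private void OnDisable()
    {
        grabInteractable.activated.RemoveListener(OnActivated);
    }

    private void OnActivated(ActivateEventArgs args)
    {
        Debug.Log("Cube activated");
    }
}
```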
Custom Input Actions With The XR Interaction Toolkit using Unity's New Input System Package
21:32

This video shows you how to detect your own inputs, such as buttons or joystick controls, by adding your own custom input action to the XRI Default Input Action asset and subscribing to its events through a C# script.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
Your First VR Scene with the XR Interaction Toolkit in Unity: https://youtu.be/nlRzw2lCIkk
C# Events in Unity: https://youtu.be/rhRGBTYONgY
XR Device Simulator 2.2.0 And Earlier: https://youtu.be/Lo6_1CnycFE

Chapters:
00:00 Intro and prerequisites
00:52 VR Scene setup
01:45 Explaining our exercise
02:14 Opening the XRI Default Input Action Asset
02:51 Creating a new Input Action
03:12 Binding the button action path
03:56 Adding an interaction condition
04:45 Creating a script
05:15 using UnityEngine.InputSystem namespace
05:35 Referencing the Input Action Reference
05:50 Explaining Button action events
06:20 Subscribing to the button's events
06:58 Creating the subscribed function
07:13 Receiving InputAction.CallbackContext parameters
08:12 Toggling a Mesh Renderer component
08:40 Creating a cube to toggle
09:13 Assigning button Input Action reference in Inspector
09:46 Finding our Input Action asset in Project window
10:02 Testing out our button action
10:49 Let's use the thumbstick to move our cube around
11:13 Creating the custom thumbstick input action
11:56 Changing the Action Type
12:13 Setting the Control Type to Vector2
12:49 Binding the thumbstick action path
13:49 Declaring a new Input Action Reference
13:59 Subscribing to the thumbstick action
14:40 Extracting information from the CallbackContext type
15:07 context.ReadValue
15:39 Putting extracted vector into the Console
16:02 Assigning thumbstick Input Action reference in Inspector
16:22 Debugging thumbstick inputs with the headset
17:13 Testing out thumbstick output in the Console
17:57 We already have a reference to the cube
18:14 Describing what we're about to do
18:35 Converting Vector2 to Vector3
19:22 Moving the cube by our new vector
19:59 Testing out a very fast cube
20:28 We'd probably multiply the vector by Time.deltaTime
20:39 Summary

Instructor: Gregory Osborne
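For a sense of what the video builds, here is a minimal sketch following its steps: subscribing to a custom button action and a custom Vector2 thumbstick action via InputActionReference fields, toggling a Mesh Renderer, and moving the cube scaled by Time.deltaTime. The field names and exact toggle/move behavior are assumptions, not the video's literal code:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch of the pattern shown in the video.
// Assign the custom actions from the XRI Default Input Action asset
// in the Inspector (an Input Action Manager is assumed to enable them).
public class CustomInputActions : MonoBehaviour
{
    [SerializeField] private InputActionReference buttonAction;     // custom Button action
    [SerializeField] private InputActionReference thumbstickAction; // custom Vector2 action
    [SerializeField] private MeshRenderer cubeRenderer;             // cube to toggle and move
    [SerializeField] private float moveSpeed = 1f;

    private void OnEnable()
    {
        buttonAction.action.performed += OnButtonPerformed;
        thumbstickAction.action.performed += OnThumbstickPerformed;
    }

    private void OnDisable()
    {
        buttonAction.action.performed -= OnButtonPerformed;
        thumbstickAction.action.performed -= OnThumbstickPerformed;
    }

    private void OnButtonPerformed(InputAction.CallbackContext context)
    {
        // Toggle the cube's visibility whenever the button action fires.
        cubeRenderer.enabled = !cubeRenderer.enabled;
    }

    private void OnThumbstickPerformed(InputAction.CallbackContext context)
    {
        // Read the 2D stick value and convert it to a horizontal 3D direction.
        Vector2 input = context.ReadValue<Vector2>();
        Vector3 move = new Vector3(input.x, 0f, input.y);

        // Scale by Time.deltaTime so the cube doesn't fly off instantly.
        cubeRenderer.transform.position += move * moveSpeed * Time.deltaTime;
    }
}
```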

Audio Tutorials

Gregory has created tutorials on how to create ambisonic files, how to implement audio-reactive visuals, and how to trigger pre-made Digital Signal Processing clips, to name a few.
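As a taste of the audio-reactive approach those tutorials cover, here is a minimal, generic Unity sketch (not code from the tutorials themselves) that samples the spectrum of a playing AudioSource and scales an object with its low-frequency energy; the sensitivity value and the number of bass bins summed are illustrative assumptions:

```csharp
using UnityEngine;

// Illustrative sketch of a basic audio-reactive visual:
// sample the audio spectrum each frame and drive an object's scale
// from the bass energy.
public class AudioReactiveScaler : MonoBehaviour
{
    [SerializeField] private AudioSource audioSource;
    [SerializeField] private float sensitivity = 50f;   // assumed tuning value
    private readonly float[] spectrum = new float[256]; // must be a power of two

    private void Update()
    {
        audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum the lowest bins as a rough measure of bass energy.
        float bass = 0f;
        for (int i = 0; i < 8; i++) bass += spectrum[i];

        // Smoothly scale toward a size driven by the bass level.
        float target = 1f + bass * sensitivity;
        transform.localScale = Vector3.Lerp(
            transform.localScale, Vector3.one * target, 10f * Time.deltaTime);
    }
}
```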

Simulation Series
Interview

Gregory was given the opportunity to talk about his work on interactive music in virtual reality with the Simulation Series in 2019 while he was still at Berklee. He has been working ever since to bring his visions to reality.

Dance-Interactive Music

Beginning during his time at Berklee College of Music, Gregory has theorized about and created dance-interactive music, in which the listener plays an integral role in the playback of the audio.

Interactive Music Videos in VR (IMVVR)

While at Berklee College of Music, Gregory started a club to discuss the possibilities of interactive music videos in VR. Though the meetings drew few attendees, they produced important conversations and brainstorming sessions that greatly influenced later projects.


Rave Gazebo Videos

As part of the release of Rave Gazebo, Gregory prepared a behind-the-scenes video for each of the six songs and created special videos covering interesting techniques used during development.

Behind The Scenes of Songs

These videos peek behind the curtain at the creation of Rave Gazebo's songs, explaining how some of the assets were made and the special techniques used to achieve fun visual effects.

Deeper Technical Dives

The next videos shine a spotlight on common techniques we used in Rave Gazebo that could be useful to those attempting to make their own dance-interactive VR songs. We talk about stencil shaders, giving information to the user, audio-reactivity, and more!
