Recipe ‐ Turn your dorm room into a VR experience
This recipe will show you how to build a 1:1 replica of your room/apartment and move inside of it both in the virtual and the physical world. You will learn how to set up XR games with Godot and run them on a Meta Quest device. In the end you will have an interactive replica of your room.
Note that some things won't be explained in much detail, so if you have no experience with the Godot Engine you might have to look some things up.
- Meta Quest 2 or newer with Developer Mode activated.
- Working connection between Headset and PC via Link Cable (see https://www.meta.com/de-de/help/quest/articles/headsets-and-accessories/oculus-link/set-up-link/).
- Godot 4.x installed, download it here: https://godotengine.org/download/ (I am using version 4.2)
- Android Studio and Open JDK17 installed for exporting the project (see https://docs.godotengine.org/en/stable/tutorials/export/exporting_for_android.html for requirements)
- Basic familiarity with the Godot Engine and GDScript is beneficial.
- Tape measure, (squared) paper, pen for measuring and sketching the room
- Two hand models (left_hand.glb and right_hand.glb), which we will later use for hand tracking. Download them from this repo: https://github.com/JoeLudwig/xrsandbox/tree/master/projects/xrsandbox/assets/models/valve_hand_models
Godot is an open-source game engine that is free to use and famous for being lightweight and having a small executable size. The basic setup for an XR game can be done in less than 2 minutes and doesn't require third-party plugins or assets. I find the setup process for Unity very tedious because of the large number of plugins and APIs for XR development; in my experience there is always some problem popping up when setting up a project. Additionally, Godot is just great for tinkering around with ideas because of its straightforward node-based system and the easy-to-use scripting language GDScript (which is syntax-wise VERY similar to Python).
Start the Godot Engine; the Project Manager should open. Click on New. Select a Project Path and Project Name of your choice; the Forward+ renderer is fine for our purposes. Click on Create & Edit. Now the engine opens.
Go into Project -> Project Settings. Under XR, click on OpenXR and set Enabled to true. Then click on Shaders and set Enabled to true as well. On the bottom right of the Project Settings window click on Save & Restart; the engine restarts and applies the changes.
In the Scene view, click on Create Node3D to create the root node. Select the Node3D node and click on the + icon. Now you can search for "XROrigin3D" and add the node. Under the XROrigin3D node add an XRCamera3D node. Now the most basic setup is done. You can try to start the game; if everything is set up correctly you should be able to look around and see the floor we added before.
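If you prefer to set this scene up from code instead of the editor, a minimal sketch of the same node tree could look like this (the node names match the editor setup above):

```gdscript
# Builds the minimal XR scene tree from code: Node3D -> XROrigin3D -> XRCamera3D.
extends Node3D

func _ready():
    var origin := XROrigin3D.new()
    origin.name = "XROrigin3D"
    add_child(origin)

    var camera := XRCamera3D.new()
    camera.name = "XRCamera3D"
    origin.add_child(camera)
```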
Attach a script to your root node (I called it Main) and add the following code. It tries to find an OpenXR interface, in our case the Meta Quest; if one is found, V-Sync is disabled and the main viewport is output to the HMD.
```gdscript
extends Node3D

var xr_interface: XRInterface

func _ready():
    xr_interface = XRServer.find_interface("OpenXR")
    if xr_interface and xr_interface.is_initialized():
        print("OpenXR initialized successfully")
        # Turn off v-sync!
        DisplayServer.window_set_vsync_mode(DisplayServer.VSYNC_DISABLED)
        # Change our main viewport to output to the HMD
        get_viewport().use_xr = true
    else:
        print("OpenXR not initialized, please check if your headset is connected")
```
Create two OpenXRHand nodes. These nodes automatically receive the hand-tracking data from the headset. Click on the first OpenXRHand node and set Hand to "Left"; do the same for the second one, but set its Hand property to "Right".
Create a new folder models in the file system and drag and drop right_hand.glb and left_hand.glb in there. When they are finished importing, drag left_hand.glb onto the left-hand OpenXRHand node, and do the same with right_hand.glb and the right-hand OpenXRHand node. Now right-click on the newly added nodes and activate Editable Children. This exposes their subchildren; you should now see a Skeleton3D node. Assign each OpenXRHand node its respective Skeleton3D node under the option Hand Skeleton.
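The skeleton assignment can also be done from a script. A sketch, assuming the node names described above (OpenXRHand with a left_hand child, OpenXRHand2 with a right_hand child, both under XROrigin3D):

```gdscript
# Point each OpenXRHand at the Skeleton3D inside its imported hand model.
# hand_skeleton is a NodePath property relative to the OpenXRHand node itself.
func _ready():
    $XROrigin3D/OpenXRHand.hand_skeleton = $XROrigin3D/OpenXRHand.get_path_to(
        $XROrigin3D/OpenXRHand/left_hand/Skeleton3D)
    $XROrigin3D/OpenXRHand2.hand_skeleton = $XROrigin3D/OpenXRHand2.get_path_to(
        $XROrigin3D/OpenXRHand2/right_hand/Skeleton3D)
```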
Now start the game. If hand tracking is activated on your Meta Quest, you should see the hand models properly tracking your hand movements. Nice job!
Now we need to step out of the virtual world for a second. Get your tape measure and start measuring your room (or your entire apartment if you are really motivated).
Measure all the walls and all of your furniture. To keep track of everything, take your piece of paper and draw a floor plan with all the widths and lengths. This should be pretty accurate, as you want a 1:1 representation of your room. Also write down your wall height and your furniture heights.
Once all measurements are done and the floor plan is ready, we can go back into the Godot editor. Start building the geometry for your room based on your floor plan. I used simple MeshInstance3D nodes for that, but if you want to add physics-based objects to interact with, you can add colliders to the meshes. You will need to calculate the placement of your wall and furniture objects based on their sizes, but with basic middle-school geometry this should be no problem.
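As a sketch of that placement math: a BoxMesh is centered on its node, so a wall measured from a corner has to be offset by half its size. The helper below is a hypothetical example; plug in your own floor-plan measurements (in meters).

```gdscript
# Hypothetical helper: builds a box-shaped wall from floor-plan measurements.
# 'corner' is the wall's near-bottom corner as measured in the room;
# BoxMesh is centered on its node, so we offset the position by half the size.
func add_wall(corner: Vector3, length: float, height: float, thickness := 0.1) -> void:
    var wall := MeshInstance3D.new()
    var box := BoxMesh.new()
    box.size = Vector3(length, height, thickness)
    wall.mesh = box
    wall.position = corner + Vector3(length / 2.0, height / 2.0, thickness / 2.0)
    add_child(wall)
```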
This is what my result looks like:
You can see that I structured my meshes a little bit. You can create simple Node3D nodes to group your meshes. I also added a SpotLight3D, but you can add whatever you want. You can already apply some textures or materials to the walls and the floor to theme it a little bit.
Now you need to set your origin, which is the starting point of your game. The starting point has to be the same in the real world as in the virtual world; if it isn't, your position in the virtual world won't correspond to your position in the physical world and everything will be off. Get your tape and mark a spot on your floor with it. This will be your origin point. Use your tape measure to determine the position of this mark, then set the position of your XROrigin3D node to exactly this position.
Sometimes when starting the game our position doesn't match the origin, or we want to recenter our origin. This is why we have to make a small adjustment to our Main script.
We have to listen for the pose_recentered signal that our XRInterface emits when the player presses the Oculus button for 2 seconds or when a reset is triggered in the menu. Connect to the signal in the _ready function:

```gdscript
xr_interface.pose_recentered.connect(_recenter_requested)
```
Now add a _recenter_requested method, which is called whenever the signal is received:

```gdscript
func _recenter_requested() -> void:
    XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, true)
```
Your Main script should now look something like this:

```gdscript
extends Node3D

var xr_interface: XRInterface

func _ready():
    xr_interface = XRServer.find_interface("OpenXR")
    if xr_interface and xr_interface.is_initialized():
        print("OpenXR initialized successfully")
        xr_interface.pose_recentered.connect(_recenter_requested)
        # Turn off v-sync!
        DisplayServer.window_set_vsync_mode(DisplayServer.VSYNC_DISABLED)
        # Change our main viewport to output to the HMD
        get_viewport().use_xr = true
    else:
        print("OpenXR not initialized, please check if your headset is connected")

func _recenter_requested() -> void:
    XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, true)
```
Let's try out whether our virtual room is perfectly mapped to the real world. For this we have two options:
- Exporting the project as a native application and running it directly on the Quest.
- Running the game via Air Link.
You can choose either of these options.
For running the game directly on the Meta Quest we have to export it to Android. Go to Editor -> Manage Export Templates and click on Download and Install; this downloads all necessary export templates. After the export templates are finished downloading, go to Project -> Install Android Build Template... A confirmation popup will open; click on Install.
Access the AssetLib and search for Godot OpenXR Vendors. Download the library and click on Install in the popup. The vendors plugin needs to be enabled before the export settings become accessible: open Project -> Project Settings..., go to the Plugins tab and enable the GodotOpenXRVendors plugin.
Now we can go to the export settings in Project -> Export... and click on Add...
You may encounter this error in the export settings: "A valid Android SDK path is required in Editor settings". To fix this, go to Editor -> Editor Settings -> Android and add your Android SDK path, which can typically be found at "C:/Users/youruser/AppData/Local/Android/Sdk".
The last step is to create an export preset for our Quest device. Open Project -> Export... and click on Add..., then select Android. Next, change the name of the export preset to the device you're setting this up for, say Meta Quest, and enable Use Gradle Build. Also make sure to enable Runnable to use one-click deploy. Under XR Features, change the XR Mode to OpenXR.
Scroll to the bottom of the list and you'll find additional Meta XR features. Set Hand Tracking to Optional; this enables us to use hand tracking in our application.
Now we can run the project with remote debug. The cool thing is that everything is now executed on the headset instead of the computer, meaning we can unplug the USB cable from the headset and walk around freely.
This is pretty straightforward, and as you have already established a connection with Quest Link this should be done quickly. Make sure that your computer and your Meta Quest are connected to the same network. Now go to the quick settings on your Quest device, open "Quest Link" and enable "Use Air Link". Your computer should now show up as a device. All that is left to do is to connect and start the game in Godot.
Yay! It works. Your view should now look similar to mine:
this_seems_to_work.mp4
But wait... the real-world position is not correctly mapped to our virtual position yet. This is where our previous origin positioning comes in handy. Put your headset on the floor on the spot that you previously marked with your tape.
Now it depends on whether you are connected via Air Link or remote debug. If you are connected via remote debug, just keep the Oculus button pressed for a while; this should trigger the recentering.
If you are connected via Air Link, you have to press the Oculus button before you put the headset on the marked spot and click the "Reset view" button in the Air Link menu. Then put the headset on the marked spot and press any button.
If everything worked correctly, your virtual world should now be aligned with the real world (some minor offsets can't be avoided). Congrats!
Now that we can finally move around in our own virtual room, we have to do something cool with it that is (probably) not possible in the physical world. I added a window to my room. Then I downloaded some cool free space assets from kenney.nl and threw them into my scene, so I can see a nice Mars-like scenery when looking out of my window. I also added a DirectionalLight3D and a WorldEnvironment to the scene to light everything properly. Make sure to create an Environment in your WorldEnvironment, set the Background Mode to Sky and add a Sky.
But what if my room were some kind of spaceship that could roam around on the planet? Let's make it happen. I renamed my Geometry node to Spaceship; this is where all our room geometry lives.
Create a new scene called Button. Give it a MeshInstance3D with a cylinder shape to make it look like a button. Then add an Area3D node and, under that, a CollisionShape3D node with the same parameters as the mesh instance's shape. We need this in order to detect an interaction with the player's hands.
Now add a script to the button; I called it Button. Click on the Area3D node and go to the Node tab beside the inspector. Connect the signals area_entered and area_exited to your scene's root Button node. Two functions should now have been added to your Button script: _on_area_3d_area_entered(area) and _on_area_3d_area_exited(area). These get triggered whenever something enters or leaves the collision shape of the button.
Change the Button script to the following implementation. It adds two signals that are emitted whenever something enters or leaves the button, signalling a button press or a button release.
```gdscript
extends Node3D

signal button_pressed()
signal button_released()

func _on_area_3d_area_entered(area):
    button_pressed.emit()

func _on_area_3d_area_exited(area):
    button_released.emit()
```
Add two buttons into the scene. I added them into Spaceship/Furniture
. Move them somewhere where you can properly press them. In my case, they are on my desk.
Now we need to add an Area3D node to our right_hand and left_hand meshes, which are subnodes of OpenXRHand and OpenXRHand2. Then add a CollisionShape3D under both Area3D nodes. Give them a cube shape with size (0.1, 0.1, 0.1) or (0.2, 0.2, 0.2); this is the bounding box of our hands. When this shape enters a button, the button_pressed and button_released signals will be emitted.
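If you'd rather create these hand areas from a script than in the editor, here is a sketch (the node paths you call it with are assumptions based on the scene described above):

```gdscript
# Attaches a small sensing Area3D with a cube CollisionShape3D to a hand mesh,
# so the hand can trigger the buttons' area_entered/area_exited signals.
func add_hand_area(hand_mesh: Node3D) -> void:
    var area := Area3D.new()
    var shape := CollisionShape3D.new()
    var box := BoxShape3D.new()
    box.size = Vector3(0.1, 0.1, 0.1)  # bounding box of the hand
    shape.shape = box
    area.add_child(shape)
    hand_mesh.add_child(area)
```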
The last thing we have to do is add a script to our Spaceship node; I called it Spaceship. It listens for the signals emitted by the buttons and moves the whole spaceship (which is actually just a room) forwards or backwards. The script should look something like this:
```gdscript
extends Node3D

var speed = 0.0

# Called when the node enters the scene tree for the first time.
func _ready():
    $Furniture/ButtonForwards.connect("button_pressed", _move_forwards)
    $Furniture/ButtonForwards.connect("button_released", _stop)
    $Furniture/ButtonBackwards.connect("button_pressed", _move_backwards)
    $Furniture/ButtonBackwards.connect("button_released", _stop)

# Called every frame. 'delta' is the elapsed time since the previous frame.
func _process(delta):
    translate(Vector3(0, 0, speed * delta))

func _move_forwards():
    speed = -1.0

func _move_backwards():
    speed = 1.0

func _stop():
    speed = 0.0
```
Now everything is set. Let's try it out!
spaceship_train_but_its_actually_my_room.mp4
My furniture looks kinda blocky, because it is literally just plain blocks. But you could add more elaborate 3D-models for chairs, desks and shelves. You could also add nice textures or images on the walls. You can try to mimic your physical room or theme it completely differently - the possibilities are endless.