Demo - VR Room - Open Dev Kit Documentation

A 3D virtual reality game room with 3D physics, built with OpenGL, Jolt Physics, OpenVR, and OpenXR. Players can move freely through a 3D environment and interact with various objects using VR controllers. They can also pick objects up and throw them at other objects, watching them react dynamically to the physics simulation.

How it Works

  • Game Initialization
      The game's main Scene initializes with a 3D environment containing various physics-enabled objects, as well as static ones. The UI Scene HUD is also loaded, displaying control instructions for movement and aiming/throwing.
  • Player Controls
      The Controls Keybinds Resource handles the camera's 3D movement as well as aiming, picking up, and throwing objects. Players can navigate the space freely using either the keyboard's WASD keys or a VR controller, rotate their view by right-clicking and dragging the mouse or via VR headset motion, and pick up or throw objects by pressing and releasing the VR controller's trigger.
  • Interaction with Objects
      When the VR controller's trigger is pressed, the Grab and Interact parameters of either gLeftController or gRightController are set to 1 (and back to 0 when the trigger is released). Since these two variables belong to the Character class, changing them in turn executes the Setter Scripts of that class's mInteract and mGrab variables. These scripts contain the actual logic for interacting with, grabbing, and throwing objects.

      The grabbing system works by having each controller equipped with a very large collision shape (a line) that does not physically push objects, because its Physics Solid property is set to 0 (unchecked). When grabbing is initiated, the controller goes through every object that is colliding with it and selects the closest one. This object is then assigned to the mHolding pointer in the controller's Character class. While an object is being held, the HoldTick function continuously updates the grabbed object's position and orientation, keeping it locked to the controller's movement.

      Interaction is handled slightly differently, as it relies on the existing Mouse Pressed and Mouse Released Event system. When the player activates an object, the controller once again checks the collided objects, finds the nearest one, and simulates a Mouse Pressed Event on it, causing it to behave as if it were clicked with a mouse. Interactable objects need to have the Selectable checkbox ticked.

  • Object Recoloring
      The VR demo Scene features two sinks filled with water: one with normal clear water and another with color-tinted water that players can modify using three pairs of clickable switches for the Red, Green, and Blue values. These switches adjust the RGB components of the colored water in real time.

      When an object is dipped into either sink, the water actor's Ticked Event checks for collisions via an Array Loop function. Every tick, it iterates over all currently intersecting objects and gradually shifts their tint toward the water's current color using a Change Tint Color function. The clear-water sink works the same way, except it resets the object's tint to its original color. Dropping an object into the water also triggers a water splash effect for visual feedback.

  • Hand Switching
      In addition to grabbing and interacting, the controller's state is also visually reflected by switching hand animations. This is managed through the mHandPose variable in the Controller class. When this value changes, it triggers its corresponding Setter Script, which updates the hand's visual pose in real time. The hand will switch to a pointing animation when hovering over an interactable object while the player is in an interaction state, to a grabbing animation when the trigger is pressed, and will otherwise default to an idle hand pose.
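The grab-and-hold flow described under "Interaction with Objects" can be sketched in plain code. The names below (`Controller`, `GameObject`, `grab`, `hold_tick`) are illustrative Python stand-ins for the engine's Character class, its mHolding pointer, and the HoldTick function, not real engine API:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class GameObject:
    name: str
    position: Tuple[float, float, float]


def distance_sq(a, b):
    """Squared distance between two points (avoids an unneeded sqrt)."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))


@dataclass
class Controller:
    position: Tuple[float, float, float]
    # Objects currently overlapping the controller's non-solid line shape.
    colliding: List[GameObject] = field(default_factory=list)
    # Stand-in for the mHolding pointer on the Character class.
    holding: Optional[GameObject] = None

    def grab(self):
        """On trigger press: select the closest colliding object."""
        if self.colliding:
            self.holding = min(
                self.colliding,
                key=lambda o: distance_sq(o.position, self.position),
            )

    def release(self):
        """On trigger release: drop the held object."""
        self.holding = None

    def hold_tick(self):
        """Per-tick update: lock the held object to the controller."""
        if self.holding is not None:
            self.holding.position = self.position
```

For example, with two colliding objects, `grab()` picks the nearer one, and subsequent `hold_tick()` calls keep it glued to the controller until `release()`.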
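Interaction can be sketched the same way: pick the nearest Selectable collided object and fire a simulated Mouse Pressed Event on it. `fire_mouse_pressed` below is a hypothetical stand-in for the engine's event dispatch, and `Prop` is an illustrative type, not engine API:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class Prop:
    name: str
    distance: float   # distance from the controller along its line shape
    selectable: bool  # mirrors the Selectable checkbox


def interact(colliding: List[Prop],
             fire_mouse_pressed: Callable[[Prop], None]) -> Optional[Prop]:
    """On an interact press: find the nearest Selectable collided object
    and simulate a Mouse Pressed Event on it. Non-selectable objects are
    skipped entirely."""
    candidates = [p for p in colliding if p.selectable]
    if not candidates:
        return None
    target = min(candidates, key=lambda p: p.distance)
    fire_mouse_pressed(target)
    return target
```

Note that a non-selectable object closer than every selectable one is simply ignored, matching the requirement that interactable objects have Selectable ticked.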
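The gradual recoloring in the sinks amounts to a per-tick interpolation of each RGB channel toward the water's current color. A minimal sketch, assuming a fixed step rate (the demo's Change Tint Color function may use a different rate or easing):

```python
from typing import Tuple

Color = Tuple[float, float, float]


def step_tint(current: Color, target: Color, rate: float = 0.05) -> Color:
    """One Ticked-Event step: move each RGB channel a fixed fraction of
    the remaining distance toward the target color. Repeated every tick,
    the tint converges smoothly on the water's color."""
    return tuple(c + (t - c) * rate for c, t in zip(current, target))
```

The clear-water sink is the same operation with `target` set to the object's stored original tint instead of the water's color.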
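The hand-switching rules reduce to a small decision function over the controller's state. The pose names and flag parameters below are assumptions for illustration; the demo drives this through the mHandPose variable and its Setter Script:

```python
def select_hand_pose(trigger_pressed: bool,
                     hovering_interactable: bool,
                     interact_mode: bool) -> str:
    """Pick the hand pose: grabbing while the trigger is held, pointing
    when hovering an interactable object in the interaction state, and
    idle otherwise."""
    if trigger_pressed:
        return "grab"
    if interact_mode and hovering_interactable:
        return "point"
    return "idle"
```

Because the trigger check comes first, a pressed trigger always shows the grabbing pose, even while hovering an interactable object.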

If you think anything is missing, please feel free to submit documentation feedback on this page.