Context
While preparing for a round of research with an immersive experience, we found a need to observe what the VR headset and controllers were outputting. Existing frameworks support this kind of input exposure, but traditionally it has been accomplished either by monitoring system state within an emulator (such as Dolphin) or by capturing input from a USB device at the OS level.
Modern consoles make this more challenging: the onboard hardware is inaccessible, so input displays are instead implemented on a per-game basis - FIFA 23, for example, offers one for online play.
VR devices are different in that they are both a self-contained console and a potential control surface. There is an accepted requirement to show a controller overlay in VR that bridges the gap between the real-world, handheld device and its virtual-world analogue. This bridging helps users understand what actions are available and provides an anchor point for their senses.
However, the disconnect between the VR device user and any external observers is a real gulf of experience. Without a secondary headset mirroring the display output of the primary, the observer is left viewing a projection on a flat screen, akin in experience to Plato’s shadows on a cave wall.
Approach
To this end, for recent research lab sessions I wanted a means of displaying controller input information on an on-screen overlay. My search brought me into the SteamVR file structure, where I found a folder of JSON files containing input mappings for a set of typical consumer VR headsets (Valve Index, etc.). This folder also contained a JSON file for ‘Oculus’; I assume the name is a dependency elsewhere that hasn’t been updated to ‘Meta’ or ‘Quest’.
I deduced that SteamVR must be reading input in some manner and using these JSON mappings to direct button presses to their corresponding actions, much like a controller configuration menu in a modern game. Through online research I discovered ‘SteamVR2WS’: a local server application that runs in parallel to SteamVR, catches VR device inputs and makes them available to query via a WebSocket server. You can then build a frontend graphic that displays the inputs, delivered as a JSON object, in a webpage. I adapted an existing HTC Vive overlay to the Meta Quest 2, covering both the controllers and general telemetry capture.
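As a rough sketch of that query step, assuming SteamVR2WS pushes JSON snapshots over a localhost WebSocket (the port and message shape below are placeholders rather than the project’s actual defaults), the browser side can be very small:

```typescript
// Minimal sketch of a browser-side listener. The port and payload shape are
// assumptions: SteamVR2WS defines its own address and message format.
const socket = new WebSocket("ws://localhost:8080");

socket.addEventListener("message", (event: MessageEvent<string>) => {
  // Assume each message is a JSON snapshot of the current device state.
  const payload = JSON.parse(event.data);
  console.log("VR input update:", payload);
});

socket.addEventListener("close", () => {
  console.warn("Connection to SteamVR2WS closed; is SteamVR still running?");
});
```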
The implementation involved configuring a mapper for the controller telemetry JSON payload and driving a computed SVG graphic on a localhost webpage that responds to the input values.
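A minimal sketch of what that mapper might look like, with hypothetical field names standing in for whatever the telemetry payload actually contains:

```typescript
// Sketch of the mapper idea: translate whatever fields the telemetry JSON
// exposes into a small, named set of controls for the graphic to consume.
// The raw field names (rightTrigger, rightThumbstickX, ...) are hypothetical.
interface ControllerState {
  trigger: number;                      // analogue pull, 0..1
  joystick: { x: number; y: number };   // each axis -1..1
  buttons: Record<string, boolean>;     // e.g. { A: true, B: false }
}

function mapPayload(raw: any): ControllerState {
  return {
    trigger: raw.rightTrigger ?? 0,
    joystick: { x: raw.rightThumbstickX ?? 0, y: raw.rightThumbstickY ?? 0 },
    buttons: { A: !!raw.buttonA, B: !!raw.buttonB },
  };
}
```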
Once I had figured out the JSON configuration, the rest was an exercise in computer graphics. I traced a basic outline of the Quest 2 controllers and then rendered the trigger, button and joystick states on top of it to represent the actual input. When a button is pressed or a joystick moved, the overlay renders the change.
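Purely as an illustration of that rendering step (the SVG element ids and colours are invented for this sketch, not taken from the actual overlay), the per-frame update can be a small function that adjusts attributes on the traced shapes:

```typescript
// Illustrative update step: given the mapped controller state, adjust the SVG
// elements drawn over the traced controller outline.
function render(state: {
  trigger: number;
  joystick: { x: number; y: number };
  buttons: Record<string, boolean>;
}): void {
  const trigger = document.getElementById("trigger-right");
  const buttonA = document.getElementById("button-a");
  const stick = document.getElementById("stick-right");

  // Fade the trigger shape in proportion to how far it is pulled.
  if (trigger) trigger.setAttribute("fill-opacity", String(state.trigger));

  // Highlight the A button while it is held.
  if (buttonA) buttonA.setAttribute("fill", state.buttons.A ? "#ff5252" : "#444444");

  // Nudge the joystick cap by a few pixels in the direction it is pushed.
  if (stick) {
    const offset = 10;
    stick.setAttribute(
      "transform",
      `translate(${state.joystick.x * offset}, ${-state.joystick.y * offset})`
    );
  }
}
```

Calling something like render(mapPayload(payload)) from the WebSocket message handler above would close the loop between device input and the on-screen graphic.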
Finally, it’s as easy as pointing video capture software such as OBS at the webpage and the on-screen gameplay, and you have a working, responsive device overlay that can be used to monitor device inputs in tandem with gameplay and spatial movement.
Next steps
This project remains a work in progress and will adapt as we learn more from further research into immersive experiences. The next step is to combine all of this realtime data into a digestible format, such as a data dashboard, that can be queried during research analysis.