A few weeks ago, the Immersive Web Emulation Runtime package was released; it is, in essence, the Immersive Web Emulator decoupled from its browser extension (though it offers more than that). This package is fantastic because it lets you simulate a headset without needing to install the extension in the browser. Moreover, it enables you to control the headset using JavaScript.
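To give an idea of what that looks like, here is a minimal sketch based on the IWER README. The device setup follows the README directly; the headset/controller manipulation calls at the end are my reading of the IWER docs, so treat them as assumptions to verify:

```ts
import { XRDevice, metaQuest3 } from 'iwer';

// Create a virtual Quest 3 and install the WebXR runtime into the page,
// so navigator.xr is backed by the emulated device (per the IWER README).
const xrDevice = new XRDevice(metaQuest3);
xrDevice.installRuntime();

// Drive the device from plain JavaScript. The property and method names
// below are assumptions based on the IWER docs; verify before relying on them.
xrDevice.position.y = 1.7; // move the headset up
xrDevice.controllers['right']?.updateButtonValue('trigger', 1); // squeeze trigger
```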
So I came up with an idea: what about using it to perform test automation on WebXR? And so my journey began… It has been challenging, but I've arrived at a solution that I want to share with you. It's important to highlight that, in order to reach this solution, I had to make some assumptions, and I am absolutely open to changing them.
In brief, the task was split into two parts:
Recording actions from the headset
I thought the best way to record actions (camera and controller movements) would be to do it directly from the Playground, so I added a "RECORD XR SESSION" option in the "Options" menu. With it enabled, once you enter the WebXR session the actions are recorded and saved to a .json file.
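The actual file format is described in the pull request; purely as a hypothetical illustration of the kind of data such a recording needs to carry, one frame might look like this (all field names here are my invention, not the real schema):

```ts
// Hypothetical shape of a single recorded frame; the real schema is in the PR.
interface RecordedFrame {
  timestamp: number; // milliseconds since the session started
  headset: {
    position: [number, number, number];
    quaternion: [number, number, number, number];
  };
  controllers: Partial<Record<'left' | 'right', {
    position: [number, number, number];
    quaternion: [number, number, number, number];
    buttons: Record<string, number>; // e.g. { trigger: 1 }
  }>>;
}
```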
Executing actions in automated tests
Once the actions are recorded, they are replayed during the automated testing phase. Only after the replay completes is control handed to Playwright, which takes the screenshot and compares the results.
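As a sketch of what such a test could look like: the replayXRSession helper below is hypothetical (standing in for whatever the page exposes to feed recorded frames back to the emulated device), while toHaveScreenshot is Playwright's built-in visual comparison:

```ts
import { test, expect } from '@playwright/test';

test('VR scene renders as expected', async ({ page }) => {
  await page.goto('http://localhost:1338/test.html'); // hypothetical test page

  // Replay the recorded session inside the page. replayXRSession is a
  // hypothetical helper that drives the emulated device frame by frame.
  await page.evaluate(async () => {
    const actions = await fetch('/recordings/session.json').then((r) => r.json());
    await (window as any).replayXRSession(actions);
  });

  // Only once the replay is done does Playwright take the screenshot
  // and compare it against the stored baseline.
  await expect(page).toHaveScreenshot('vr-scene.png');
});
```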
Seeing the result of this activity was amazing, maybe so amazing that I got confused and ended up opening the pull request on the wrong repo. I apologize for mistakenly opening the pull request on the Babylon.js repository when I thought I was opening it on my fork. I tried to delete it but was unsuccessful.
This is the pull request:
The pull request includes comments that explain the rationale behind some of the decisions made, as well as a brief overview of how to record and replay an automated test. I'm happy to hear your thoughts and feedback.
The proposed solution works very well for VR testing; for AR the situation is more complicated, because there we also need to reproduce the real-world environment. There is the @iwer/sem package, which allows simulating and injecting a synthetic environment, but it is closely tied to Three.js.
In my experiments, I managed to retrieve the scene mapping from the Oculus Quest 3 in the Meta XR Simulator Room JSON syntax, and the next step is to recreate that scene during the automated test. However, this raises the question of whether to create the scene at the WebXR level or at the Babylon level, and I would love to discuss this with you.
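For the Babylon-level option, here is a minimal sketch under the assumption that the Room JSON has first been distilled into simple labelled boxes; the RoomBox type and roomToMeshes helper are hypothetical, not part of any package:

```ts
import { Scene, MeshBuilder, Vector3 } from '@babylonjs/core';

// Hypothetical, simplified room description distilled from the
// Meta XR Simulator Room JSON; the real format is richer than this.
interface RoomBox {
  label: string; // e.g. 'floor', 'wall', 'table'
  position: [number, number, number];
  size: [number, number, number];
}

// Recreate the captured room as plain Babylon meshes before the test runs.
function roomToMeshes(scene: Scene, boxes: RoomBox[]): void {
  for (const box of boxes) {
    const mesh = MeshBuilder.CreateBox(box.label, {
      width: box.size[0],
      height: box.size[1],
      depth: box.size[2],
    }, scene);
    mesh.position = Vector3.FromArray(box.position);
  }
}
```

The WebXR-level alternative would instead inject the planes and meshes through the emulated runtime, so that Babylon's own plane and mesh detection features consume them exactly as they would on a real device.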