XR-Debugger | Debug your ExpressJS server with your VR headset in 3D

Early stage project to debug code in XR with a headset (Quest 3 or Apple Vision).

I’ve used Babylon.js 7.0 and the GreasedLine mechanism.
Thanks @roland for the GreasedLine support!

Code | GitHub - dht/xr-debugger
Web | https://xr-debugger.com


This is super cool! I can imagine configuring any server in 3D just by dropping nodes into the scene and connecting them. You could also set the nodes’ parameters in 3D, and it could generate a config or code.

You are welcome and I’m happy to be a tiny tiny part of this futuristic and super exciting tool!



Yes, exactly.
This visual approach might allow you to debug very complex systems.
Good ideas: a 3D version of react-flow, allowing both editing and exporting…
I think it could also be interesting for realtime analytics: seeing people move inside the pages of your website.


I thought about visualizing connections between babylon.js forum topics, users, etc…

Or hacking :smiling_imp:


hi @dht - i really like what you made - thank you for sharing. i worked for a company specializing in SRE (4 years ago) and made a proof of concept for linking source code changes to failing API calls and the communication between resources. similar to what you have - my goal was managing outages and SLOs/SLIs on, ie: 500 HTTP status codes. it worked in VR as well, so it was a really cool way to visualize and manage the interdependencies of microservices/releases.

there is an unrelated public demo and source code here for the graph and visualisation:
BabylonJS Directed Graph (brianzinn.github.io)

I took a big shortcut for building the graph. What I really wanted was a 3d force directed graph that was built automatically and dynamically (ie: adding a node/service would shift the objects accordingly). I’m curious how you laid out the nodes in your graph. Here was a forum post on a physics engine I made for a proof of concept to put the microservices/resources/assets into 3d space and connect - i never took it to production, so it sits in a borked semi-working state:
So, i made a physics engine - Demos and projects - Babylon.js (babylonjs.com)

edit: it looks like you put off the graphing part if I am reading the code correctly! not a bad choice :smile:
xr-debugger/web/src/components/Playground/Playground.elements.json at main · dht/xr-debugger (github.com)


Interesting, I wasn’t aware of the term SRE; I had to look it up!

I tried the demo and added a bit.ly link to make it easier to load on the Quest 3. I like it! I think I’ll borrow your implementation and experiment with it. I also experimented with the force-directed graph algorithm using Fruchterman-Reingold. It’s much easier to implement it today with the help of ChatGPT to bootstrap the code and fix any issues.
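For anyone curious, the core of Fruchterman-Reingold is quite short. Here’s a minimal 2D sketch in plain JavaScript (no Babylon.js dependency; extending it to 3D just means adding a z component everywhere). This is an illustration of the algorithm, not the actual xr-debugger code, and the node/edge shapes are assumptions:

```javascript
// Fruchterman-Reingold sketch: nodes = [{x, y}, ...], edges = [[i, j], ...]
function layout(nodes, edges, { width = 10, height = 10, iterations = 50 } = {}) {
  const area = width * height;
  const k = Math.sqrt(area / nodes.length); // ideal edge length
  let t = width / 10; // "temperature": caps displacement per step

  for (let iter = 0; iter < iterations; iter++) {
    const disp = nodes.map(() => ({ x: 0, y: 0 }));

    // Repulsive forces between every pair of nodes: f = k^2 / distance
    for (let i = 0; i < nodes.length; i++) {
      for (let j = i + 1; j < nodes.length; j++) {
        const dx = nodes[i].x - nodes[j].x;
        const dy = nodes[i].y - nodes[j].y;
        const d = Math.hypot(dx, dy) || 0.01;
        const f = (k * k) / d;
        disp[i].x += (dx / d) * f; disp[i].y += (dy / d) * f;
        disp[j].x -= (dx / d) * f; disp[j].y -= (dy / d) * f;
      }
    }

    // Attractive forces along edges: f = distance^2 / k
    for (const [a, b] of edges) {
      const dx = nodes[a].x - nodes[b].x;
      const dy = nodes[a].y - nodes[b].y;
      const d = Math.hypot(dx, dy) || 0.01;
      const f = (d * d) / k;
      disp[a].x -= (dx / d) * f; disp[a].y -= (dy / d) * f;
      disp[b].x += (dx / d) * f; disp[b].y += (dy / d) * f;
    }

    // Move each node, limited by the temperature, then cool down
    for (let i = 0; i < nodes.length; i++) {
      const d = Math.hypot(disp[i].x, disp[i].y) || 0.01;
      nodes[i].x += (disp[i].x / d) * Math.min(d, t);
      nodes[i].y += (disp[i].y / d) * Math.min(d, t);
    }
    t *= 0.95;
  }
  return nodes;
}
```

Connected nodes settle roughly `k` apart while unconnected ones drift away, which is what gives the layout its clusters.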

You can see the implementation here, and I’ll share a link to a demo on this thread soon (just need to build it and upload it).

On a different note, I came across the React BabylonJS repository. I like the direction it’s taking. I think this approach of a descriptive representation of a scene is a much-needed alternative to programmatically constructing a scene. You can even mix and match, setting up a scene with tags or with JSON and then manipulating or expanding it programmatically. Perhaps we can ask @RaananW whether such a contribution to the BabylonJS repository would be welcomed.


nice - thanks for sharing the 3d force directed graph code - it’s quite incredible what chatGPT is able to come up with already and it’s not getting any dumber… LOL.

Force Directed (demo):

Apart from the extensions repository, which is a fully community-oriented repository, all of the repos in the babylon org are maintained by the team. There are some reasons for that, which I won’t get into here. However, if it is well maintained and fits the core Babylon mindset, we can always discuss it.

It could be implemented as a loader:

SceneLoader.ImportMesh('', '', 'scene.json', scene, function (meshes) {
    // ...
});

Where scene.json is:

    {
        boxes: {
            b1: {
                id: 'b1',
                position: [0, 1, 1],
                values: {
                    size: 10
                }
            }
        }
    }
Or perhaps:

    {
        meshes: [
            {
                id: 'b1',
                type: 'box',
                position: [0, 1, 1],
                values: {
                    size: 10
                }
            }
        ]
    }
This leaves open the question of whether to define the material as part of the mesh definition, reference it by id, or perhaps allow both.
But to make this JSON loader work, we would need some adjustments in the core package, specifically in MeshBuilder, to define types like “IBox.” Currently, a box is defined only by the parameters of the CreateBox method:

export declare const MeshBuilder: {
    CreateBox: typeof CreateBox;
    // ...
};
I’m willing to submit a PR to @babylonjs/core if that’s acceptable. I believe having clear definitions for the different primitives as independent TypeScript types could be beneficial, even without the JSON loader discussed above.
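To make the idea concrete, here is a rough sketch of what such independent option types might look like. These names (IBox, ISphere, PrimitiveOptions) are assumptions for illustration, not existing Babylon.js exports; the defaulting mirrors how CreateBox treats `size` as a fallback for width/height/depth:

```typescript
// Hypothetical option types for MeshBuilder primitives.
interface IBox {
  size?: number;
  width?: number;
  height?: number;
  depth?: number;
}

interface ISphere {
  diameter?: number;
  segments?: number;
}

// A discriminated union would let a JSON loader narrow on `type`.
type PrimitiveOptions =
  | { type: "box"; options: IBox }
  | { type: "sphere"; options: ISphere };

// Example consumer: resolve defaulted box dimensions, with `size`
// acting as the fallback for each axis.
function resolveBox(o: IBox): { width: number; height: number; depth: number } {
  const size = o.size ?? 1;
  return {
    width: o.width ?? size,
    height: o.height ?? size,
    depth: o.depth ?? size,
  };
}
```

With types like these exported, both the JSON loader and ordinary TypeScript callers would get the same compile-time checking.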

I understand the project’s focus on programmatically setting up and interacting with scenes, but I think supporting a descriptive scene setup via JSON alongside the current approach is feasible. Perhaps it’s already supported through some mechanism I’m unaware of.

What do you think?

I am a bit confused as to what you are actually suggesting. Is it a file type for loading primitives?

A method for loading or setting up a scene from a JSON or a JavaScript object.

If it were available, it could support various use cases, which I can elaborate on.

We do have the .babylon file format, which is a JSON object :slight_smile:

The .babylon File Format | Babylon.js Documentation (babylonjs.com)
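For a rough sense of the shape, a heavily trimmed skeleton might look like this (see the documentation above for the actual schema; the ellipses and values here are placeholders, not real entries):

```json
{
    "cameras": [ "..." ],
    "lights": [ "..." ],
    "materials": [ "..." ],
    "meshes": [
        { "name": "box1", "id": "box1", "position": [0, 1, 1] }
    ]
}
```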


Ok, I had a feeling I was missing an important piece… :slight_smile:
I have been working with the engine for a few months now and didn’t know the .babylon format is JSON…
I’ve seen people using it in the playground but have yet to work with it.
Looking into it. Thanks!