Creating smooth PBR meshes for WebXR

I have recently started running meshes on Quest 2, at both 60 & 90 Hz. When viewing meshes with a PBR material where roughness === 0, the experience is not good unless the mesh is flat shaded. I have not finished building a custom environment texture for my scene, but regardless, there is a large amount of flickering.

Using the stock urban environment is really bad. Here is a PG using it.
In browser:

In Quest ( looks less foggy than video shows ):

Using Studio512 has less of the problem, but the chrome looks dead. PG
In browser:

In Quest ( looks less foggy than video shows ):

To me, it seems that the problem is due to the fact that NO ONE can keep their head perfectly still, and the device is too sensitive. The reason I say that is, if you pause the video, an individual frame is OK. Does anyone think it is something else?

How has anyone else solved this before, beyond not using a noisy environment texture? I see the options as (a rough sketch of both follows the list):

  • Bump up the roughness to hopefully get a compromise that works well.
  • Switch to flat shading, where it just looks like an architectural choice. Obviously, this will only work on certain meshes, and not on things like eyes.
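
Here is what both options look like in code, assuming a chrome PBR material and a pole mesh already in the scene (the names and the roughness value are mine, purely for illustration):

// option 1: trade mirror-sharp reflections for stability
const chromeMat = <BABYLON.PBRMaterial> scene.getMaterialByName("chrome");
chromeMat.roughness = 0.25; // illustrative value only; tune per scene

// option 2: flat shade the mesh, so it reads as a design choice
const poleMesh = <BABYLON.Mesh> scene.getMeshByName("pole");
poleMesh.convertToFlatShadedMesh();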

Kind of hoping someone else has already been here.


As is typical, the process of getting my thoughts together to post got me to another possible solution.

Sorry, @RaananW, but this is a code solution possibility. I specifically put this into ‘Content creation’ so you would not need to concern yourself, but oh well.

I wonder if an option to reduce the precision of the camera rig’s position / rotation (round to a number of decimals) might make things less ‘shaky’? Even smooth-shaded, or even standard, materials seem to be fuzzy at the borders. If I am right about the sensitivity, this could make things really sharp.

The camera’s position is coming from the device itself and is being updated in an internal loop in the camera itself. It will, of course, be possible to skip a frame or two, but XR should be viewed with the highest framerate possible, otherwise it will lead to unwanted effects for the viewer (such as dizziness or headaches). I would recommend not limiting this (and for that reason will not introduce this to the framework itself).
I would also not recommend filtering the movement of your head, though a properly defined Kalman filter (Kalman filter - Wikipedia) might help clean up the noise. I bet there is a reason why the runtime environment doesn’t employ one (or, better yet, they do and this is much better than what you would otherwise experience 🙂).
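
For illustration only (and noting the caveat above about lag), a rough sketch of what simple pose filtering on a rig camera could look like; it is a plain first-order low-pass rather than a Kalman filter, and rigCam, alpha, and the choice of observable are all assumptions:

// first-order low-pass on a rig camera's pose; NOT a Kalman filter,
// and it adds latency, which is exactly the concern raised above
const smoothedPos = rigCam.position.clone();
const smoothedRot = rigCam.rotationQuaternion.clone();
const alpha = 0.5; // 1 = no smoothing; smaller = smoother but laggier

scene.onBeforeRenderObservable.add(() => {
    BABYLON.Vector3.LerpToRef(smoothedPos, rigCam.position, alpha, smoothedPos);
    BABYLON.Quaternion.SlerpToRef(smoothedRot, rigCam.rotationQuaternion, alpha, smoothedRot);
    rigCam.position.copyFrom(smoothedPos);
    rigCam.rotationQuaternion.copyFrom(smoothedRot);
});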

OK, well, I did build a replacement sub-camera class, which I swapped into the rig cameras using an after-render. Not so much as a solution, but to absolutely verify that this was the cause. I put it on my actual scene, not a PG.

Can say, for sure, that camera “shake” is the problem. Precision of rotation was a far greater issue than position, especially at larger distances, as expected. Rounding rotation to 4 decimals is almost undetectable, but large gains are not had until only 3 decimals. It was a good experiment, but I am not going any further with that at this time.

Working the problem from the material.roughness angle is not only more conventional, but I got promising results. I modified the PG to add a slider to adjust the roughness. The noise on the smooth poles’ current “interior” edges goes away with a pretty minimal increase in roughness.

The outer edges still have an issue. All of this seems to be distance related. I think, at the application level, high-priority materials with single meshes could have their roughness dialed up or down based on the mesh’s distance from the camera.

That, combined with flat shading all or parts of meshes, a good environment texture, and just backing off a little on less important meshes, could get a very acceptable result. No real silver bullets, though I am thinking about implementing a PBR monitoring class with an after-render. Say:

public constructor(cam : BABYLON.WebXRCamera) { . . . }
public registerForMonitoring(mat: BABYLON.PBRMaterial, mesh: BABYLON.Mesh, lowestVal: number, closestDistance: number, highestVal: number, farthestDistance : number) { . . . }
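
A sketch of how that monitoring class might fill in, interpolating roughness by distance in an after-render; everything here beyond the two signatures above is an assumption, not a worked-out design:

class PBRRoughnessMonitor {
    private _entries: { mat: BABYLON.PBRMaterial, mesh: BABYLON.Mesh, lowestVal: number, closestDistance: number, highestVal: number, farthestDistance: number }[] = [];
    private _observer: BABYLON.Nullable<BABYLON.Observer<BABYLON.Scene>>;

    public constructor(private _cam: BABYLON.WebXRCamera) {
        // adjust after each render, ready for the next frame
        this._observer = _cam.getScene().onAfterRenderObservable.add(() => this._update());
    }

    public registerForMonitoring(mat: BABYLON.PBRMaterial, mesh: BABYLON.Mesh, lowestVal: number, closestDistance: number, highestVal: number, farthestDistance: number): void {
        this._entries.push({ mat, mesh, lowestVal, closestDistance, highestVal, farthestDistance });
    }

    private _update(): void {
        for (const e of this._entries) {
            const dist = BABYLON.Vector3.Distance(this._cam.globalPosition, e.mesh.getAbsolutePosition());
            // 0 at closestDistance or nearer, 1 at farthestDistance or beyond
            const t = Math.min(1, Math.max(0, (dist - e.closestDistance) / (e.farthestDistance - e.closestDistance)));
            e.mat.roughness = e.lowestVal + t * (e.highestVal - e.lowestVal);
        }
    }

    public dispose(): void {
        this._cam.getScene().onAfterRenderObservable.remove(this._observer);
    }
}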

I can experiment with movement smoothing and see if we can add that as an optional parameter to the WebXR camera. I have never considered it, as it somehow goes against the way VR headsets should work, but it might solve those material issues that you are mentioning. I want to be sure the user experience will stay the same, though.

Thank you for sharing that!


If it means anything, here is the camera class:

export class XRSubCamera extends BABYLON.TargetCamera {

    /**
     * more or less swiped from WebXRCamera._updateNumberOfRigCameras()
     */
    constructor(stockRigCam : BABYLON.TargetCamera) {
        super(stockRigCam.name, stockRigCam.position, stockRigCam.getScene());
        this.minZ = stockRigCam.minZ; // 0.1;
        this.rotationQuaternion = stockRigCam.rotationQuaternion;
        this.updateUpVectorFromRotation = stockRigCam.updateUpVectorFromRotation; // true;
        this.isRigCamera = true;
        this.rigParent = stockRigCam.rigParent;
        // do not compute projection matrix, provided by XR
        this.freezeProjectionMatrix();

    }

    /** round pose components each frame to damp sensor jitter; skipped while digits >= _skipNDigits */
    public _checkInputs() : void {
        if (XRSubCamera._posDigits < XRSubCamera._skipNDigits) {
            this.position.x = this.round(this.position.x, XRSubCamera._posMult);
            this.position.y = this.round(this.position.y, XRSubCamera._posMult);
            this.position.z = this.round(this.position.z, XRSubCamera._posMult);
        }

        if (XRSubCamera._rotDigits < XRSubCamera._skipNDigits) {
            this.rotationQuaternion.x = this.round(this.rotationQuaternion.x, XRSubCamera._rotMult);
            this.rotationQuaternion.y = this.round(this.rotationQuaternion.y, XRSubCamera._rotMult);
            this.rotationQuaternion.z = this.round(this.rotationQuaternion.z, XRSubCamera._rotMult);
        }
        super._checkInputs();
    }

    public round(value : number, mult : number) : number {
        return Math.round(value * mult) / mult;
    }

    //======================================================================
    private static _xrCam : BABYLON.WebXRCamera;
    private static _skipNDigits = 7;

    // handle getting & setting of digits at static level, so no rig camera looping
    private static _posDigits = XRSubCamera._skipNDigits;
    private static _posMult = Math.pow(10, XRSubCamera._posDigits);

    public static get posDigits() {return XRSubCamera._posDigits;}
    public static set posDigits(value : number) {
        XRSubCamera._posDigits = value;
        XRSubCamera._posMult = Math.pow(10, XRSubCamera._posDigits);
    }

    private static _rotDigits = XRSubCamera._skipNDigits;
    private static _rotMult = Math.pow(10, XRSubCamera._rotDigits);

    public static get rotDigits() {return XRSubCamera._rotDigits;}
    public static set rotDigits(value : number) {
        XRSubCamera._rotDigits = value;
        XRSubCamera._rotMult = Math.pow(10, XRSubCamera._rotDigits);
    }

    private static swapped = false;
    public static swap(xrCam : BABYLON.WebXRCamera) : boolean {
        if (XRSubCamera.swapped || xrCam.rigCameras.length !== 2) return false;
        XRSubCamera._xrCam = xrCam;
        for (let i = 0; i < xrCam.rigCameras.length; i++) {
            const old = <BABYLON.TargetCamera> xrCam.rigCameras[i];
            xrCam.rigCameras[i] = new XRSubCamera(old);
            old.dispose();
        }
        XRSubCamera.swapped = true;
        return true;
    }
}
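
And, for context, a sketch of how it might be hooked up; xrHelper is assumed to come from scene.createDefaultXRExperienceAsync(), and the digit values are just the ones from my experiment:

// swap() is a no-op until the session actually has its two rig cameras,
// so keep trying each frame and unhook once it succeeds
const trySwap = scene.onAfterRenderObservable.add(() => {
    if (XRSubCamera.swap(xrHelper.baseExperience.camera)) {
        XRSubCamera.posDigits = 4;
        XRSubCamera.rotDigits = 4; // 4 was nearly undetectable; 3 gave the big gains
        scene.onAfterRenderObservable.remove(trySwap);
    }
});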

I would not have thought of doing it either, but I think I just had a perfect storm. I had gotten a 568k triangle, 5 drum & 3 cymbal drum set down to 29k. In doing so, all the poles & other chrome pieces became very low poly, but still looked great on a desktop, due to smooth shading.

When I put it up on Quest, it looked like it was radioactive. Something had to be done & I cast a wide net.

If you make any progress, this will help any scene with any meshes, especially those in the distance. I wondered, as well, whether it should even be done at this level. It seems like everything on Quest would benefit, even those videos on “Oculus TV”. Having it here means we might have some control over other platforms as well, though.