Using setHardwareScalingLevel on Oculus Quest 2

I am trying to set the hardware scaling level on an Oculus Quest 2. It does affect the Oculus browser window, but when I enter immersive VR mode it always resets back to 1. Does anyone know how to upscale rendering on the Oculus?

(I am doing this to cure edge aliasing, which is a really tough problem on the Oculus…)

In WebXR, I don’t think you have control of the resolution. The Quest edge issue is a result of the camera shaking / vibrating. I have frozen the camera; individual frames look good.

I am looking for a hardware scaling level that renders to a 2x bigger framebuffer and then downsamples to the Oculus’s native resolution.
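
For what it’s worth, the raw WebXR spec does define a framebufferScaleFactor option when the XRWebGLLayer is created, which sounds like exactly this. Here is a minimal sketch at the plain WebXR level (I have not found whether Babylon exposes this option, and the browser is free to clamp the scale to what the hardware supports):

      // Plain WebXR sketch: ask the compositor for a 2x framebuffer that it
      // downsamples to the native display resolution.
      async function enterImmersiveWithScale(canvas) {
        var gl = canvas.getContext("webgl2", { xrCompatible: true });
        var session = await navigator.xr.requestSession("immersive-vr");
        session.updateRenderState({
          baseLayer: new XRWebGLLayer(session, gl, {
            antialias: true,
            framebufferScaleFactor: 2.0
          })
        });
        return session;
      }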

Also, one interesting thing about this: in the Oculus Quest 2’s native browser window, my scenes always look very well antialiased. But when immersive mode launches, it almost feels like the resolution gets halved… My knowledge of VR is still a bit limited; maybe this actually has to do with two eyes being rendered…

Hey guys, I am trying to solve this aliasing problem too (but on the Oculus Quest 1).

All the information that I have found talks about WebVR, and there isn’t a clear solution.

I have used the default rendering pipeline, but with FXAA I see very little difference and worse FPS. I tried changing the MSAA samples to 1, 4, or 8, but there is no change.
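
For reference, this is roughly what I am trying (pipeline is the DefaultRenderingPipeline instance from the docs):

      pipeline.fxaaEnabled = true; // post-process FXAA: very little difference, worse FPS
      pipeline.samples = 8;        // MSAA sample count; I tried 1, 4, and 8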

This problem is extremely distracting. Is there any other way to reduce the aliasing?

My own investigations have really led me to think that VR is just really challenging in terms of aliasing. Maybe with a very high-res headset like a Varjo you could sidestep this problem, but meanwhile one of the best findings for me has been this:

https://www.facebook.com/permalink.php?story_fbid=1818885715012604&id=100006735798590

Hello JSa,

Carmack gives very good guidelines that I have been using for a while when I develop in Unreal Engine for the Oculus Quest and PCVR. But still, I feel like there should be a “vr.pixeldensity” (like in UE4) or a “setHardwareScalingLevel” equivalent to easily scale the VR resolution.

Does anyone else have any ideas on the topic? And why isn’t MSAA working?

Thank you

By the way, at least on the Oculus Quest 2, Babylon’s MSAA definitely works, but I had to be very careful not to disable it accidentally. Many default post-processes (like volumetric rays) can block the MSAA, I guess because of how they use offscreen render targets.
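
If it helps, one sanity check is to ask the engine what it actually supports (I believe sample counts above this cap get clamped):

      // On WebGL2, the engine caps report the maximum MSAA sample count.
      console.log("Max MSAA samples:", scene.getEngine().getCaps().maxMSAASamples);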

I would also be very interested in knowing whether you get better antialiasing quality on the Oculus Quest in Unreal Engine for the same scenes. If that is the case, I guess there could be something to be done about this on the WebVR and Babylon side…

Hey there JSa,

Do you mind sharing your code or a playground showing how you use MSAA? It’s supposed to be ON by default, but I try changing the samples and I see no difference. I have tried with and without the default rendering pipeline.

And yes, you can get better antialiasing quality for the Oculus Quest with Unreal Engine: you can use MSAA x8, but most importantly you can scale the resolution; x1.25 is usually alright (for the Oculus Quest 1).

Could you possibly just use a postprocess that supersamples and does its own MSAA, or even just a small 5-tap cross-shaped neighborhood sample?

@Davidae: I did not have time to do the example yet, but some background:

I started off by comparing the Babylon.js output on the Oculus Quest 2 to this page:
https://immersive-web.github.io/webxr-samples/immersive-vr-session.html

That page gets some default antialiasing on the Oculus Quest 2. That same quality was my goal for the Babylon.js output as well. So far, I have not been able to get better quality than what I see in that pure WebGL example.

This makes me wonder if WebGL is somehow limited to 2x or 4x antialiasing. I haven’t seen any difference either when I change the MSAA sample count in Babylon.js.

This is how I create the pipeline:

      var pipeline = new BABYLON.DefaultRenderingPipeline(
        "defaultPipeline", // The name of the pipeline
        false, // Do you want the pipeline to use HDR texture?
        scene, // The scene instance
        [scene.activeCamera] // The list of cameras to be attached to
      );
      pipeline.samples = 4; // MSAA sample count

Could you possibly just use a postprocess that supersamples and does its own MSAA, or even just a small 5-tap cross-shaped neighborhood sample?

@Pryme8: This sounds like a good idea… I am wondering if you have any playground code showing how to use this kind of postprocess?

(I am afraid, though, that this would cause too much of a performance penalty on the Oculus.)
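
In the meantime, here is roughly what I imagine such a postprocess would look like; just a sketch (the crossAA shader name is made up, and I have not tested this on the Quest, especially not together with the XR rig cameras):

      // 5-tap cross-shaped neighborhood filter as a Babylon postprocess.
      BABYLON.Effect.ShadersStore["crossAAFragmentShader"] = `
        precision highp float;
        varying vec2 vUV;
        uniform sampler2D textureSampler;
        uniform vec2 screenSize;
        void main(void) {
          vec2 texel = 1.0 / screenSize;
          // Center sample plus the 4 axis-aligned neighbors.
          vec4 color = texture2D(textureSampler, vUV) * 0.4;
          color += texture2D(textureSampler, vUV + vec2(texel.x, 0.0)) * 0.15;
          color += texture2D(textureSampler, vUV - vec2(texel.x, 0.0)) * 0.15;
          color += texture2D(textureSampler, vUV + vec2(0.0, texel.y)) * 0.15;
          color += texture2D(textureSampler, vUV - vec2(0.0, texel.y)) * 0.15;
          gl_FragColor = color;
        }`;

      var crossAA = new BABYLON.PostProcess(
        "crossAA",         // name
        "crossAA",         // looks up crossAAFragmentShader in ShadersStore
        ["screenSize"],    // uniforms
        null,              // no extra samplers
        1.0,               // ratio: run at full resolution
        scene.activeCamera
      );
      crossAA.onApply = function (effect) {
        effect.setFloat2("screenSize", crossAA.width, crossAA.height);
      };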

As I said before, I found the primary cause to be a frame-to-frame issue: no matter how still you keep your head, the rotation of the camera oscillates, giving the appearance of an aliasing problem. Anti-aliasing is contained within a single frame, so I am not sure how well single-frame solutions are going to work.

I ran into this at the end of last year. I had some smooth-shaded chrome meshes, and I had more than edge problems: because the chrome was highly reflective, the shake of the camera made the interiors look pretty terrible as well. I tried a number of things.

First, I tried to replace the sub-cameras of the XR camera and dampen the minor position & rotation changes, but it always seemed that I got really good gains only after side effects started to appear.

Fortunately, I found the PBR material property enableSpecularAntiAliasing. This made the chrome look mostly white at a distance, but that is mostly what a browser showed from far away as well. I still had edge issues, but the interior now looked acceptable.
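
In code it is just one flag (chromeMaterial here stands in for whatever PBRMaterial you are using):

      // Reduces specular shimmer on glossy surfaces like this chrome.
      chromeMaterial.enableSpecularAntiAliasing = true;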

With that, I abandoned trying to do anything with the camera, but I made a scene for you to see it for yourselves: Camera Shake

The scene is just a grid floor, but when you hold down the ‘A’ button of the right controller, I freeze any changes to the position & rotation of the sub-cameras. You then get a solid image. Granted, depending on the exact spot where you freeze, parts may not look good, but it is rock solid. Even when you run the scene on the desktop, some corners look bad at certain camera locations. This scene is kind of a torture test for aliasing.
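
Roughly, the freeze amounts to caching the sub-camera transforms while the button is held and re-applying them every frame. This is only a sketch of the idea, not the exact code from my scene (xr is the WebXRDefaultExperience; the component id and observer timing are from memory, so treat them as assumptions):

      var frozen = null;
      xr.input.onControllerAddedObservable.add(function (controller) {
        controller.onMotionControllerInitObservable.add(function (motionController) {
          if (motionController.handedness !== "right") return;
          var aButton = motionController.getComponent("a-button");
          if (!aButton) return;
          aButton.onButtonStateChangedObservable.add(function (component) {
            // Capture the sub-camera transforms on press; release resumes tracking.
            frozen = component.pressed
              ? xr.baseExperience.camera.rigCameras.map(function (cam) {
                  return {
                    cam: cam,
                    position: cam.position.clone(),
                    rotation: cam.rotationQuaternion.clone()
                  };
                })
              : null;
          });
        });
      });

      scene.onBeforeRenderObservable.add(function () {
        if (!frozen) return;
        // Overwrite whatever the XR pose update just wrote.
        frozen.forEach(function (f) {
          f.cam.position.copyFrom(f.position);
          f.cam.rotationQuaternion.copyFrom(f.rotation);
        });
      });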

Making everything look fuzzy with a post-process may help, but if multi-view is ever implemented, I think you might not be able to use it. To me, that is going to be more important. Running with Babylon Native, once it runs on more than HoloLens, might also give Unreal-like results, since then OpenXR is used directly, not through a browser.

Hmmm, I wish I had a VR headset to dig into this.

@JCPalmer Great example, you really nail this “phenomenon” with the Camera Shake.

But now I would really like to emphasize one thing: on the Oculus Quest 2, even the camera shake example looks just fine outside immersive mode! Even if I move the grid around with the pointer.

I know head-tracking movement is not quite the same thing outside of immersive mode, but still, I cannot help noticing that the antialiasing seems at least 2x better when this example runs in the Oculus browser window.

So… I am still wondering, along the lines of the original post, whether there are just some limitations on hardware antialiasing in WebGL that do not limit native apps.
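
One way to at least check the sample-count part on the device itself is to query the WebGL2 context directly:

      // WebGL2 exposes the maximum MSAA sample count the context supports.
      var gl = document.createElement("canvas").getContext("webgl2");
      console.log("MAX_SAMPLES:", gl.getParameter(gl.MAX_SAMPLES));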

It would actually make sense to try @Pryme8’s idea on this example, implementing the 8x MSAA manually, and see if it makes anything better…

I have had over 6 months to think about why the 2D stuff, including the toolbar, settings area, & browser, is virtually “shake free”. I have at times thought that immersive mode in the browser is not as good as native. I have not been able to talk myself out of this with a better reason, but I still have a lot of other problems, so I have not done much.

I do wish to deploy using Babylon Native regardless. In March, I double-checked that Oculus has posted an OpenXR library, then asked a question: Get OpenXR working on Oculus Quest.

We really need an apples-to-apples comparison of the browser to Babylon Native on the Quest with the exact same scene. Any chance of that working before the end of summer, @syntheticmagus?

Sadly, I doubt I’ll be able to take point on it in that timeframe, but it’s something I would dearly like to see happen. (I love the Quest devices.) I’ll be happy to offer as much support as I can to any initiative to get that started, and it likely won’t be a complicated workstream as most of the problems should at least be roughed-out in the current OpenXR integration. There will probably be a fair bit of legwork involved, though, both in re-multiplatform-ing the OpenXR integration (it’s gone a bit Windows on us recently) and in figuring out the logistics of building with the Oculus SDK integration. Both of these workstreams may fall in naturally with refactor work that will eventually be needed for the Babylon Native XR dependency, but the timeframe on that is, I think, probably not very immediate.

Again, though, I’d love to see it happen and will be more than happy to offer as much help as I can to any initiative that wants to get this started. :wink:

Hi,
Hopefully I am not totally off topic, but I read that you have an Oculus Quest 2. I am also working on a WebXR app created with Babylon.js. The problem is that I do not own an Oculus Quest 2, and I would like to test whether the Oculus browser has Web Bluetooth support before I buy one (my app requires this feature). Can you please help me? You just need to start my web app and press the “connect” button; if it displays a dialog with Bluetooth devices, Web Bluetooth works.
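
Roughly, what the button relies on is this (a simplified sketch, not my exact app code; connectButton is a stand-in):

      // requestDevice must be called from a user gesture, and the chooser
      // dialog only appears if the browser implements Web Bluetooth.
      connectButton.addEventListener("click", async function () {
        if (!navigator.bluetooth) {
          console.log("Web Bluetooth is not supported in this browser.");
          return;
        }
        var device = await navigator.bluetooth.requestDevice({ acceptAllDevices: true });
        console.log("Chose device:", device.name);
      });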

Ergometer (ergometer-space.org)

Thanks in advance

Tijmen

Just my 2 cents here (as I don’t own a Quest 2): the Bluetooth on the Quest devices is turned on only in dev mode (AFAIK). But it would be great to know if it does support it :slight_smile:

@tijmenvangulik It gives an error, sadly.
