MultiPointerScaleBehavior not multi-point?

On the guide about mesh behaviors, the documentation for MultiPointerScaleBehavior says: “This is used to scale a mesh based on 2 pointers (eg. fingers or vr controllers)”

However, when I load the example playground in Chrome on my iPhone, or in the Oculus Quest browser, only one finger or controller pointer can use the gizmo at a time.

The documentation phrasing makes it sound like I can use both controllers to pinch and pull to scale, but only one controller ever gets the pointer “ray”. If you press the trigger button on one controller, the other controller’s ray goes away. On my iPhone I also cannot pinch or pull on the gizmo handles. Only one of the six-degree-of-freedom widget’s handles can be highlighted at a time.

Is this a bug? Unless… I’m misunderstanding how this works?

Adding @PolygonalSun

I can take a look and see what’s going on. I don’t have an iPhone or Oculus Quest so I’ll have to find a good way to replicate the issue.

I’m gonna add @RaananW for his perspective on the VR side.

As it is using pointer events, it should work out of the box. I can look into it in the next few days and see what the issue is (on the Oculus :-)).

Yes, native Babylon pointer events work out of the box on the Oculus, but… (side story)… when I tried to use the native pointer events to shoot bullets when pressing the Quest controllers’ trigger buttons, I found that only one gun could fire at a time. I don’t know if that was a feature or a bug. It kind of makes sense that a mouse can only point at one thing at a time, but modern trackpads and iPads support multi-touch, and of course the Oculus Quest has two controllers, which should let you point at two things at once.

The pointer system only sends pointer events to one hand, for performance reasons: constant pointer-move raycasting (times 2) might hurt performance. If you want both of them to be enabled, you can set a flag in the pointer selection feature: enablePointerSelectionOnAllControllers. It’s part of the initialization options, but it could actually be a flag that can be enabled/disabled at runtime. I can add that feature, but for now, if you want both hands to have pointer events, you will need to enable it using this flag.

That makes sense. I’d love to try this out. If I’m using the default XR experience helper, like this playground does, what’s the best way to initialize the pointer selection? (I looked around, I promise! But I’m still a little confused.)

Which means we don’t explain well enough :slight_smile:

You can re-initialize the pointer selection with different options. Technically, if you have the features manager from the base experience helper, you can do something like this:

featuresManager.enableFeature(WebXRFeatureName.POINTER_SELECTION, "stable" /* or "latest" */, {
  xrInput: xr.input,
  enablePointerSelectionOnAllControllers: true,
  // add other options here
});

This will recreate the pointer selection feature. If you want to do it as cleanly as possible, assign the returned value to xr.pointerSelection and then you are done.

I guess like this?

I modified the MultiPointerScaleBehavior playground:

  • changed the default VR experience to the default XR experience
  • added a ground mesh so we can teleport closer to the model
  • enabled both controllers to point
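Roughly, the setup I ended up with looks like this (a sketch only — `setupXR`, the ground size, and the variable names are my own illustrations, not the actual playground code):

```javascript
// Sketch of the modified playground setup (names are illustrative).
async function setupXR(scene) {
    // Ground mesh so we can teleport closer to the model.
    const ground = BABYLON.MeshBuilder.CreateGround(
        "ground", { width: 20, height: 20 }, scene);

    // Default *XR* experience instead of the older VR helper;
    // floorMeshes enables teleportation onto the ground.
    const xr = await scene.createDefaultXRExperienceAsync({
        floorMeshes: [ground]
    });

    // Re-create pointer selection so BOTH controllers get a ray.
    xr.baseExperience.featuresManager.enableFeature(
        BABYLON.WebXRFeatureName.POINTER_SELECTION,
        "stable",
        {
            xrInput: xr.input,
            enablePointerSelectionOnAllControllers: true
        }
    );

    return xr;
}
```

In a playground you would call `setupXR(scene)` from `createScene` and await the result.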

I think both controllers can point now, because when I hover both rays over the blue handles on the gizmo, they highlight with both controllers at the same time.

However, I think the gizmo does not actually support grabbing more than one handle at a time. Once you highlight and select one handle, the other handles disappear.

It works correctly for me. I guess the confusion is between the expected and the implemented behavior.
The multi-pointer scale behavior lets you select any two points on the mesh and use those two points to scale the object. Ignore the gizmo around it: grab two points and move them apart from each other, and you will see that the object scales.

Oh! I have to click on the mesh itself! Thanks!
