I am trying to figure out why my camera and scene never quite look the way I intend. In the code below I drew three spheres, at (1,0,0) in red, (0,1,0) in green, and (0,0,1) in blue, plus a white one at the origin, so the coordinate system is easy to see. I then try to set the camera at position (0,-1,10) and make it look at the origin. I would expect the camera to see all of these spheres (though maybe with the origin one slightly occluded).
However, when I render it, nothing is visible. If I zoom out, the objects become visible. It's as if the camera initially looks away from the objects, even though I set it to look toward them, and its initial position also seems to be much closer to the origin than (0,-1,10).
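For reference, here is a stripped-down sketch of my setup (the sphere sizes, names, and material code are illustrative, not my exact Playground code):

```javascript
var createScene = function () {
    var scene = new BABYLON.Scene(engine);

    // ArcRotateCamera(name, alpha, beta, radius, target, scene)
    var camera = new BABYLON.ArcRotateCamera("camera", 0, 0, 1, BABYLON.Vector3.Zero(), scene);
    camera.position = new BABYLON.Vector3(0, -1, 10); // this is the part that doesn't behave as I expect
    camera.setTarget(BABYLON.Vector3.Zero());
    camera.attachControl(canvas, true);

    var light = new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);

    // Four small spheres marking the axes and the origin
    var makeMarker = function (name, position, color) {
        var sphere = BABYLON.MeshBuilder.CreateSphere(name, { diameter: 0.2 }, scene);
        sphere.position = position;
        var material = new BABYLON.StandardMaterial(name + "Mat", scene);
        material.diffuseColor = color;
        sphere.material = material;
        return sphere;
    };
    makeMarker("x", new BABYLON.Vector3(1, 0, 0), BABYLON.Color3.Red());
    makeMarker("y", new BABYLON.Vector3(0, 1, 0), BABYLON.Color3.Green());
    makeMarker("z", new BABYLON.Vector3(0, 0, 1), BABYLON.Color3.Blue());
    makeMarker("origin", BABYLON.Vector3.Zero(), BABYLON.Color3.White());

    return scene;
};
```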
As @newbie123 said, for an ArcRotateCamera you position the camera with camera.radius together with camera.alpha/camera.beta. In your first PG, radius = 1 (the 4th parameter of the ArcRotateCamera constructor), which is too small.
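Something like this, where the radius is the distance from the target (the alpha/beta values below are just example angles):

```javascript
// Position the camera via its spherical parameters instead of a position vector.
var camera = new BABYLON.ArcRotateCamera(
    "camera",
    -Math.PI / 2,           // alpha: rotation around the vertical (Y) axis
    Math.PI / 2.5,          // beta: angle down from the vertical axis
    10,                     // radius: distance from the target; 1 puts the camera almost on top of it
    BABYLON.Vector3.Zero(), // target
    scene
);
```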
Part of what confuses me, then, is that the documentation says:

"The position of the camera can also be set from a vector, which will override any current value for alpha, beta and radius. This can be much easier than calculating the required angles."
I wonder whether this behavior is a bug, or whether the documentation needs changing. (Or, probably at least as likely: I'm still misunderstanding something.)
So ok, actually you can set the position with camera.setPosition().
You just have to pass it an actual BABYLON.Vector3,
like: camera.setPosition(new BABYLON.Vector3(0, 0, 10));
But be aware of your camera.upperRadiusLimit… the resulting radius can't exceed that.
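For example (the limit value here is arbitrary; upperRadiusLimit is unset by default):

```javascript
// setPosition() overrides alpha, beta and radius from the given vector,
// but the resulting radius is still clamped to the camera's radius limits.
camera.upperRadiusLimit = 50;                       // no farther than 50 units from the target
camera.setPosition(new BABYLON.Vector3(0, -1, 10)); // implied radius ≈ 10.05, within the limit
```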