We have testers reporting very slow camera speeds (I've seen it myself at times too). With the standard UniversalCamera, it seems like the worse the performance, the slower the camera movement becomes. A few people called me crazy, but recently it's become worse. Am I reading this correctly that FPS is taken into account for the camera speed?
public _computeLocalCameraSpeed(): number {
    var engine = this.getEngine();
    return this.speed * Math.sqrt((engine.getDeltaTime() / (engine.getFps() * 100.0)));
}
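For what it's worth, at a steady frame rate deltaTime ≈ 1000 / fps, so the expression under the root becomes 10 / fps², and the per-frame step is speed · √10 / fps, which integrates to a constant √10 · speed units per second. A quick sketch of that math in plain TypeScript (no Babylon.js needed; assumes getDeltaTime() returns milliseconds and getFps() the averaged frame rate, which is what the engine docs describe):

```typescript
// Per-frame step used by UniversalCamera:
// speed * sqrt(deltaTimeMs / (fps * 100))
function sqrtStep(speed: number, deltaTimeMs: number, fps: number): number {
  return speed * Math.sqrt(deltaTimeMs / (fps * 100.0));
}

// Distance covered in one second at a perfectly steady frame rate.
function distancePerSecond(speed: number, fps: number): number {
  const deltaTimeMs = 1000 / fps; // steady-state frame time
  return fps * sqrtStep(speed, deltaTimeMs, fps);
}

console.log(distancePerSecond(1, 60).toFixed(3)); // 3.162 (= sqrt(10))
console.log(distancePerSecond(1, 15).toFixed(3)); // 3.162 as well
```

So at steady state the formula is actually frame-rate independent; the slowdown you're seeing would have to come from the transient case where deltaTime spikes while the averaged FPS lags behind.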
speed * engine.getDeltaTime() / 1000 actually matches like 1 unit in 1 second. As a bonus, no square root
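To spell that out (my sketch, assuming deltaTime is in milliseconds): the linear scaling sums to exactly speed units over one second, whatever the frame rate is.

```typescript
// Frame-rate-independent step: speed * deltaTime (ms) / 1000.
function linearStep(speed: number, deltaTimeMs: number): number {
  return speed * deltaTimeMs / 1000;
}

// Simulate one second of frames at a given fps and total the distance.
function simulateOneSecond(speed: number, fps: number): number {
  const deltaTimeMs = 1000 / fps;
  let distance = 0;
  for (let i = 0; i < fps; i++) {
    distance += linearStep(speed, deltaTimeMs);
  }
  return distance;
}

console.log(simulateOneSecond(1, 60)); // ~1 unit after one second
console.log(simulateOneSecond(1, 15)); // ~1 unit, regardless of frame rate
```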
Just wondering, additional info appreciated.
(I’m switching between 1st and 3rd person cameras, trying to keep the same speed)
Scaling by deltaTime alone does the trick.
But that’s (mostly) because I assume that both deltaTime and FPS are functions of performance alone. If that function were something squared, a square root might make sense, but I just don’t see it.
Heuristics?
Either way, just trying to make sense of it.
If I’m understanding correctly, this line adjusts the local camera speed so that frames whose delta time differs drastically from the average (engine.getFps() returns _fps, which should be the average FPS over time) get scaled down, reducing the effect of major deviations in frame speed. I’ll admit I don’t fully understand the existence of the sqrt myself, though, so let me ask around a bit about it.
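That damping reading does check out numerically: since sqrt grows sublinearly, a frame whose deltaTime is k times the average only moves the camera √k times as far as a typical frame. A small sketch (plain math, no Babylon.js; the 60 fps average is just an example value):

```typescript
// Per-frame step used by UniversalCamera:
// speed * sqrt(deltaTimeMs / (fps * 100))
function sqrtStep(speed: number, deltaTimeMs: number, fps: number): number {
  return speed * Math.sqrt(deltaTimeMs / (fps * 100.0));
}

const avgFps = 60;
const avgDt = 1000 / avgFps; // ~16.7 ms average frame time

const normal = sqrtStep(1, avgDt, avgFps);    // typical frame
const spike = sqrtStep(1, 4 * avgDt, avgFps); // one frame took 4x as long

// With linear scaling the spike frame would move 4x as far;
// the sqrt halves that deviation to 2x.
console.log(spike / normal); // ~2 (i.e. sqrt(4))
```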