Accurately measuring FPS

I’ve started running some experiments to benchmark the performance of my program. I’m using engine.getFps() inside a render loop to gauge the general speed.

I’d like to record the total number of lag spikes (or fps drops) I get, where I count a spike as any frame whose duration is more than double the average frame time. Generally, when loading heavy-duty textures, I see a handful of pretty severe, very noticeable spikes.

However, when I print engine.getFps() to the console from the render loop while these occur, I still get an fps that is considerably higher than expected (something like 80 or 90 fps during a very apparent slowdown). Conversely, calculating the fps by hand with 1000 / engine.getDeltaTime() gives me around 8 to 10 fps on the exact same slowdowns.
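
For reference, this is roughly what my logging looks like (a simplified sketch, assuming engine and scene are created elsewhere in the usual way; the 30 fps threshold is just to keep the console quiet):

```js
engine.runRenderLoop(() => {
    scene.render();

    const dt = engine.getDeltaTime();     // ms spent on the previous frame
    const instantFps = 1000 / dt;         // fps derived from this one frame
    const averagedFps = engine.getFps();  // fps reported by the engine

    // Only log when the single-frame value looks like a slowdown,
    // otherwise the console gets flooded.
    if (instantFps < 30) {
        console.log(
            `getFps(): ${averagedFps.toFixed(1)} | 1000 / deltaTime: ${instantFps.toFixed(1)}`
        );
    }
});
```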

If this isn’t a bug, then what’s happening here? And more importantly, what would be the most reliable way of measuring “lag spikes” / “fps drops” in this case?

You may check for the biggest deltaTime.
Here is a small utility for scene debugging - https://playground.babylonjs.com/#938RNX#29
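
For example, a rough sketch (engine and scene assumed to already exist; the threshold and smoothing factor are arbitrary) that tracks the biggest deltaTime and counts frames that take more than double the running average, matching your spike definition:

```js
let avgDt = 16.7;     // running average frame time in ms, seeded at ~60 fps
let worstDt = 0;      // biggest deltaTime observed so far
let spikeCount = 0;   // total number of frames classified as spikes

scene.onAfterRenderObservable.add(() => {
    const dt = engine.getDeltaTime();

    // A "lag spike" here is any frame taking more than twice the running average.
    if (dt > 2 * avgDt) {
        spikeCount++;
        console.log(`Spike #${spikeCount}: ${dt.toFixed(1)} ms (avg ${avgDt.toFixed(1)} ms)`);
    }

    worstDt = Math.max(worstDt, dt);
    avgDt = 0.95 * avgDt + 0.05 * dt;   // exponential moving average of the frame time
});
```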


There’s still a pretty big discrepancy during periods of heavy computation. Calculating fps by hand using deltaTime can return quite different values from just using getFps().

Is deltaTime more reliable than the fps measurement?

You may also try The Performance Profiler | Babylon.js Documentation


getFps returns the value from the performance monitor:

Babylon.js/performanceMonitor.ts at master · BabylonJS/Babylon.js · GitHub
which in turn provides the average FPS. This is probably why there is a difference between getFps() and your per-frame calculations.
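
For intuition, here is a tiny illustration with made-up numbers of why an fps value averaged over many recent frames barely reacts to a single slow frame, while the per-frame value collapses:

```js
// 59 frames at ~16.7 ms plus one 120 ms hitch.
const frameTimes = Array(59).fill(16.7).concat([120]);

const avgDt = frameTimes.reduce((a, b) => a + b, 0) / frameTimes.length;

console.log((1000 / avgDt).toFixed(1)); // ~54 fps -- the averaged value barely moves
console.log((1000 / 120).toFixed(1));   // ~8 fps  -- the per-frame value during the hitch
```

So if individual hitches are what you care about, counting frames whose deltaTime exceeds some multiple of the average (as in the sketch above) is probably more useful than watching getFps().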