The finger meshes jitter on some mobile devices (Mali G77 & Mali G610), but they are fine on PC and iPhone.
When I visualize the bones with the Inspector, the bones themselves are not jittering.
The jittering is severe on the fingers, moderate on the forearms, and absent on the rest of the body. Could it be floating-point errors accumulating along the bone chain?
Any suggestions on how to debug this would be helpful.
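For reference, here is roughly how I put the bones on screen (a sketch; it assumes @babylonjs/inspector is installed, and `scene`, `skeleton`, and `characterMesh` stand for my existing objects):

```ts
import "@babylonjs/inspector"; // registers scene.debugLayer
import { SkeletonViewer } from "@babylonjs/core/Debug/skeletonViewer";

// Open the Inspector and overlay the skeleton so the bone transforms can be checked visually.
scene.debugLayer.show({ embedMode: true });
const viewer = new SkeletonViewer(skeleton, characterMesh, scene);
viewer.isEnabled = true;
```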
It does look like it could be a floating-point precision problem indeed.
You can try setting mesh.computeBonesUsingShaders = false on all meshes with bones, to see whether the problem comes from the GPU. When this property is false, all skinning calculations are done on the CPU.
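Something along these lines (a quick sketch; `scene` is your existing scene):

```ts
// Force CPU skinning on every skinned mesh to rule out a GPU/driver problem.
for (const mesh of scene.meshes) {
    if (mesh.skeleton) {
        mesh.computeBonesUsingShaders = false;
    }
}
```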
The fingers stop jittering, but the console logs many errors (they are basically GLSL shader code). Is it because readMatrixFromRawSampler() is called in many shaders, not just the skinning shader? And it's weird that the shader can't compile yet the game keeps running as if there were no error…
Edit: I had forgotten to add a ";" at the end of a line… TypeScript has reshaped my brain…
readMatrixFromRawSampler is not used if you set useTextureToStoreBoneMatrices = false, so any change there won't have any effect.
However, the bone matrices are then passed through uniforms, and a shader can only use a limited number of those. If you have too many bones, this method won't work as expected and you will get errors in the console log, which seems to be what you are seeing… To be sure, please post the console errors here.
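To isolate the texture path while keeping GPU skinning, the test would look like this (sketch):

```ts
// Keep computeBonesUsingShaders = true, but pass bone matrices as uniforms
// instead of a float texture. Note the uniform count limit mentioned above:
// skeletons with many bones may fail with this setting.
for (const skeleton of scene.skeletons) {
    skeleton.useTextureToStoreBoneMatrices = false;
}
```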
Yes, it is useTextureToStoreBoneMatrices = false that solved the problem, which narrows it down to two speculations:
1. Passing floats to the GPU through a texture (as opposed to uniforms) is buggy on some devices.
2. readMatrixFromRawSampler is buggy on some devices (a texture2D() driver bug?).
I skip most of the texture updates with:

```ts
// Only upload the bone matrices to the texture on ~10% of calls
// (Math.random() > 0.9), skipping the other ~90% of updates.
if (this.isUsingTextureForMatrices && mesh._transformMatrixTexture && Math.random() > 0.9) {
    mesh._transformMatrixTexture.update(mesh._bonesTransformMatrices);
}
```
When the texture is not updated, the mesh doesn't jitter. Does this mean readMatrixFromRawSampler is fairly stable, so speculation 2 can be ruled out?
Any suggestions on how to debug this?
Is mesh._transformMatrixTexture non-null? Are you exporting the model from Max? If so, can you export to a .gltf instead of a .babylon and see if that helps?
I replaced Constants.TEXTUREFORMAT_RGBA with Constants.TEXTUREFORMAT_RGBA_INTEGER in CreateRGBATexture, but the avatar does not show on screen and the console logs:
GL_INVALID_OPERATION: Mismatch between texture format and sampler type
Try TEXTURETYPE_HALF_FLOAT instead of TEXTURETYPE_FLOAT to see if it's a problem with float texture support. Normalized SHORT or INTEGER types won't work because we need the values to go outside the -1…1 range (the texture stores matrices).
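That is, roughly this kind of change where the bone-matrix texture gets created (a sketch; the real call site lives in Skeleton.prepare() and may differ slightly, and `scene`, `boneCount`, `bonesTransformMatrices` are placeholders):

```ts
import { RawTexture } from "@babylonjs/core/Materials/Textures/rawTexture";
import { Texture } from "@babylonjs/core/Materials/Textures/texture";
import { Constants } from "@babylonjs/core/Engines/constants";

// Bone-matrix texture: 4 RGBA texels per 4x4 matrix, stored as half floats for this test.
const boneTexture = RawTexture.CreateRGBATexture(
    bonesTransformMatrices,           // Float32Array holding one 4x4 matrix per bone
    (boneCount + 1) * 4,              // width: 4 texels per matrix
    1,                                // height
    scene,
    false,                            // generateMipMaps
    false,                            // invertY
    Texture.NEAREST_SAMPLINGMODE,
    Constants.TEXTURETYPE_HALF_FLOAT  // instead of Constants.TEXTURETYPE_FLOAT
);
```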
So, you are using a model exported from MAX. Can you export this model as a .glTF file (instead of a .babylon) and see if that helps? What’s more, exporting to a .glTF file means you can test it with other glTF viewers (such as https://gltf-viewer.donmccurdy.com/).
Spector.js cannot tell you which internal format the hardware is actually using. I captured this on the mobile device and it reports the boneTexture as RGBA32F, yet my last post indicates the problem is caused by the texture effectively being half float.
I read a blog post claiming that "type is a hint for precision, but GL can choose any internal precision to store the texture". I'm not sure whether that's right, but on the glTexImage2D OpenGL ES 3 reference page I found this table:
| Sized Internal Format | Format | Type | Red | Green | Blue | Alpha |
| --- | --- | --- | --- | --- | --- | --- |
| GL_RGBA16F | GL_RGBA | GL_HALF_FLOAT, GL_FLOAT | f16 | f16 | f16 | f16 |
| GL_RGBA32F | GL_RGBA | GL_FLOAT | f32 | f32 | f32 | f32 |
So both GL_HALF_FLOAT and GL_FLOAT can end up as GL_RGBA16F?
For anyone who comes across the same issue:
First, create the texture with the format Constants.TEXTUREFORMAT_RGBA_INTEGER.
Second, convert the float32 matrix data into int32 (a sketch of the conversion follows). In my project, all models are smaller than 2^11 cm, so I think this is fine.
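The kind of conversion I mean (the exact scale factor is a choice; with model coordinates below 2^11, a scale of 2^19 keeps every matrix element comfortably inside the int32 range):

```ts
// Fixed-point packing: multiply each float by a power of two and round to int32.
const FIXED_POINT_SCALE = 1 << 19;

function packMatricesAsInt32(matrices: Float32Array): Int32Array {
    const packed = new Int32Array(matrices.length);
    for (let i = 0; i < matrices.length; i++) {
        packed[i] = Math.round(matrices[i] * FIXED_POINT_SCALE);
    }
    return packed;
}

// The skinning shader then samples an isampler2D and divides by the same
// FIXED_POINT_SCALE to recover the original float values.
```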
There is another bonesDeclaration.fx in the ShadersWGSL folder:
var boneSampler : texture_2d<f32>;
Should I change it to this?
var boneSampler : texture_2d<i32>;
The docs say it is for WebGPU. My concern is: will WebGPU be enabled automatically if Babylon detects that the device supports it, so that this shader then fails to execute (because I mixed up i32/f32)?
In my understanding of the spec, if a GPU indicates that it supports FLOAT textures, it shouldn’t be allowed to switch to HALF_FLOAT if you created a texture as FLOAT…
Only if you want to use these changes in WebGPU. Babylon.js will not automatically switch to another engine. If you explicitly create a WebGL engine, you are guaranteed to work with WebGL.
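For example (sketch):

```ts
import { Engine } from "@babylonjs/core/Engines/engine";

// Constructing Engine explicitly always gives you a WebGL engine; Babylon.js will not
// silently swap it for WebGPU, so the WGSL shaders under ShadersWGSL are never used here.
const canvas = document.getElementById("renderCanvas") as HTMLCanvasElement;
const engine = new Engine(canvas, true); // second argument: antialiasing
```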
My fix has been merged into master and has taken effect, otherwise I could just give you the URL to this project…
You can reproduce it on a Mali GPU by taking any character playing an Idle animation (Idle makes it more noticeable, as in my YouTube video).
You can reproduce the jittering on PC by creating the bone texture with RawTexture.CreateRGBATexture(…, Constants.TEXTURETYPE_HALF_FLOAT).
Another, more noticeable, way is to scale the Float32Array _transformMatrices. Scaling every element of a transform matrix by the same factor makes no visible difference (the scale also ends up in the w component and cancels out in the perspective divide), unless an element exceeds the float16 maximum (about 65504) and overflows. In the video below I scale every element by t (t is time). Some bone matrices have elements around 200, so when t reaches roughly 300 the hand mesh disappears, and other parts of the mesh disappear as t increases… This is a quick way to tell whether a platform is using f16; on PC, no matter how much you scale the elements, nothing happens. A sketch of the hack follows.
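Roughly what I do in the video (a debug-only sketch, patching the same texture upload in Skeleton.prepare() shown earlier; the growth rate of t is illustrative):

```ts
// Upload a scaled copy of the bone matrices instead of the originals.
// On a real f32 texture nothing visible changes (the uniform scale also lands in the
// w component and cancels in the perspective divide); on an f16 texture, body parts
// vanish once element * t passes the float16 maximum (~65504).
if (this.isUsingTextureForMatrices && mesh._transformMatrixTexture) {
    const source = mesh._bonesTransformMatrices;
    const scaled = new Float32Array(source.length);
    const t = performance.now() / 1000; // grows with time
    for (let i = 0; i < source.length; i++) {
        scaled[i] = source[i] * t;
    }
    mesh._transformMatrixTexture.update(scaled);
}
```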
My phone prints this info in the Chrome console:
{"driver":"ANGLE (ARM, Mali-G610 MC6, OpenGL ES 3.2)","vender":"Google Inc. (ARM)","webgl":"WebGL 2.0 (OpenGL ES 3.0 Chromium)","os":"Android 10"}
Float16 is a newer feature aimed at improving performance, so I guess it is the newer GPUs that have this problem: the GL spec defaults sampler2D to lowp in the vertex shader, and the Mali GPU apparently chooses float16 for lowp.