Hello all,
So sorry for the lack of updates on this glTF bookcase texture loading problem. I have been very busy with other random life commitments, but now I am back on the job!
Today I did some investigating into this issue of the textures of the bookcase model not being ‘seen’ by the path tracer on the GPU.
The good news is that I know what the initial problem was now. The bad news is that I’m not entirely sure how to solve the remaining problems yet. And it looks like the solution will require a lot of new plumbing.
So here’s what I understand so far:
The reason my path tracer couldn't 'see' the wood textures and other textures included in the glTF bookcase model is that, unlike the damaged helmet glTF model, the bookcase model doesn't have 1 large overarching albedo color texture. My texture checks fail if material.albedoTexture is not found for the entire model. If you look at the damaged helmet albedo texture (the actual image saved on disk/server), for instance, you'll see that the modeling program nicely packed all the model pieces/components and their associated color textures into 1 large texture atlas:
albedo texture atlas for all model’s components
This is then specified as the material.albedoTexture, which is therefore no longer 'null' and passes my texture checks in the loading section. As long as your glTF model has an 'atlas'-type texture system (not only for the albedo, but also the bump/normal, emissive, metallicRoughness, etc.), we should be ok. All your textures will have a similar size and 'look' to them, because the model only has 1 set of texture UVs to use when looking up various texture data from the shader with a sampler2D, something like:
vec3 color = texture(uAlbedoTextureAtlas, uv).rgb;
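For reference, the loading-side check that the atlas path relies on is essentially a null test on the material's texture slot. Here's a minimal sketch; the property name follows Babylon.js-style PBR materials, and the helper name is mine, not from the actual loader code:

```javascript
// Hypothetical sketch of the texture check during loading. The whole-model
// path only works when one material carries the packed atlas texture for
// every component; a modular model fails this test.
function hasAtlasAlbedo(material) {
  return material != null && material.albedoTexture != null;
}

// A helmet-style material passes; a merged material with no atlas fails
console.log(hasAtlasAlbedo({ albedoTexture: { url: "helmet_albedo.png" } })); // true
console.log(hasAtlasAlbedo({ albedoTexture: null })); // false
```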
However, the bookcase model does not have an overarching master texture atlas system. Instead, it is presented as a modular system consisting of no fewer than 150 separate meshes (components/pieces), each with its own material, texture, and triangle uv data! Now, granted, the same wood texture might be assigned to 30 of the 150 pieces, but you can start to see why I'm sweating.
I have found at least the initial source of the problem: I was merging all 150 meshes into 1 giant mesh too soon - all the unique material data gets lost as soon as I do this. Then my checks fail of course because ‘material.albedoTexture’ for this new behemoth model is null: there is no nice packed texture atlas for all the pieces of the model. The model therefore shows up as all white by the time it gets to the path tracing shader. And in case you’re wondering, yes, in order for the path tracing ray caster not to crash, all models do in fact need to be merged into 1 giant mesh, for BVH reasons that we discussed earlier.
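One way out of this is to record which material each triangle came from before the flattening step destroys that information. Here's a rough sketch of the idea; all the names are hypothetical, not the actual loader's:

```javascript
// Sketch: merge meshes into one flat geometry (as the BVH requires) while
// tagging each triangle with the index of its source mesh's material, so
// material identity survives the merge.
function mergeMeshesKeepMaterials(meshes) {
  const positions = [];      // flattened vertex positions (x,y,z per vertex)
  const uvs = [];            // flattened texture coordinates
  const triMaterialIds = []; // one material index per triangle
  const materials = [];      // unique material list; index doubles as id

  for (const mesh of meshes) {
    let id = materials.indexOf(mesh.material);
    if (id === -1) {
      id = materials.length;
      materials.push(mesh.material);
    }
    positions.push(...mesh.positions);
    uvs.push(...mesh.uvs);
    const triCount = mesh.positions.length / 9; // 3 verts * xyz per triangle
    for (let t = 0; t < triCount; t++) triMaterialIds.push(id);
  }
  return { positions, uvs, triMaterialIds, materials };
}
```

The per-triangle ids would then ride along into the triangle data texture, so the shader can recover the right material after a BVH hit.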
To combat this problem, I tried intercepting the load routine and cherry picking 1 small mesh (out of 150 possible mesh component choices), then seeing what I could see. Finally I could get to some of the data. If you try this updated example, open up the browser console and take a look:
bookcase model with 1 texture successful
At least we have 1 wooden texture applied (to the whole model, but it's a start). In the console,
I can see the material id, the fact that albedoTexture is not null (yay), the lengthy URL of that texture, and the actual _buffer data. The _buffer data looks like intact 0-255 RGBA data, but I'm confused by the length of the flat array: at 256w × 256h × 4 channels it should be 262,144 elements, yet it only contains 67,000 or so. Interestingly, that's close to 65,536 (256 × 256, one value per pixel), so the buffer may be packed or compressed somehow; I may be interpreting this incorrectly though.
But in any case, if this modular model component indeed has an albedoTexture, I should be able to send it over to the GPU as a texture uniform, like I have been doing with other models. However, this plan works for just 1 mesh out of 150 (lol), so I'm assuming we need a system that looks for unique textures while loading. If 30 shelves use the same wood texture, we don't send it over to the GPU 30 times as 30 textures; rather, we mark the triangles of that component piece with some sort of unique texture id counter, which then gets handled inside the path tracing shader. I'm imagining something like
vec3 color = texture(uniqueAlbedoTextures[hitTriangle.albedoId], hitTriangle.uv).rgb;
The albedoId element number above could be anything up to the max texture units of your GPU (WebGL guarantees at least 8 fragment texture units, and you can query the actual limit with gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS)). For instance, the total unique material textures count for the bookcase model might be around 8. Therefore, an array of 8 texture uniforms needs to be sent once to the GPU during startup, then hopefully the triangle data texture lookup will tell us which of the 8 textures we need to sample while intersecting the model in the shader raycaster. (One wrinkle: GLSL ES only allows sampler arrays to be indexed with constant expressions, so the shader may need a small branch over constant indices, or the unique textures may have to be packed into one atlas or texture array at load time.)
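The loading-side half of that plan, collecting unique textures and handing each mesh an albedoId, could look something like this. It's a sketch under the assumption that textures can be deduplicated by their URL; the function and property names are hypothetical:

```javascript
// Sketch: collect unique albedo textures across all ~150 meshes and assign
// each mesh an 'albedoId' slot into that list. Keyed by texture URL, so 30
// shelves sharing one wood texture produce a single GPU upload.
function assignAlbedoIds(meshes) {
  const urlToId = new Map();  // texture url -> slot in the uniform array
  const uniqueTextures = [];  // textures to upload once at startup
  for (const mesh of meshes) {
    const tex = mesh.material.albedoTexture;
    if (!urlToId.has(tex.url)) {
      urlToId.set(tex.url, uniqueTextures.length);
      uniqueTextures.push(tex);
    }
    mesh.albedoId = urlToId.get(tex.url); // baked into the triangle data
  }
  return uniqueTextures;
}
```

If uniqueTextures.length ever exceeds the GPU's texture-unit limit, the list would need a second packing pass (atlasing several small textures into one).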
Like I mentioned, this is going to require a significant amount of new code to handle these situations, but at least I can see the light at the end of the tunnel. We will also reap the benefits of our new system: any arbitrary scene, no matter how many triangles or how many different meshes/parts and textures it has, can be squashed into 1 giant BVH mesh and path traced in real time, all while retaining the material data that was captured for each individual component at loading time. The only case where this wouldn't quite work is if the original glTF has moving parts or animations. But if it's a static scene, this ability will open up a lot of new doors for end users!