The model I have uses a PBR material. I added a node material to generate reflection, and the reflection works, but the new material overrode the properties of the existing PBR material. The model's PBRMaterial uses Metallic Roughness, Emissive, and Ambient textures.
I am able to apply the Emissive texture together with the base color by connecting an AddBlock to the PBRMetallicRoughness block's baseColor input, but I'm not sure how to add the other textures.
So my question is: how can I map the properties of the PBRMaterial to the PBRMetallicRoughness block to keep the look and feel of the original model?
The block is named PBRMetallicRoughness in the NME to emphasize that it uses the metallic/roughness workflow and not the specular/glossiness one, but it really maps to the PBRMaterial.
So:
the AmbientTexture should be linked to the ambientColor input.
the EmissiveTexture should be added as a component at the very end, between the PBRMetallicRoughness.lighting output and the FragmentOutput.rgb input. Make sure to perform the sum lighting + emissive in linear space! The lighting output is in gamma space, so you will need to convert it to linear (lighting_linear = lighting_gamma^2.2). For the emissive texture block, just select "Convert to linear space". To convert the result of the sum back to gamma space for display purposes, you can either apply a ^(1/2.2) to the result of the sum or select "Convert to gamma space" on the FragmentOutput block (see the code sketch below).
the channel corresponding to metallic in the MetallicRoughnessTexture should be linked to the PBRMetallicRoughness.metallic input, and the channel corresponding to roughness should be linked to the PBRMetallicRoughness.roughness input.
In the video, Patrick created a texture for use in the NME only, not for the regular PBRMaterial, so he could put metallic/roughness in whichever channels he wanted.
If, on the contrary, you want to recreate a node material that matches a default PBRMaterial, the metallic component is the red channel of the metallicRoughness texture and the roughness component is the alpha channel.
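For reference, here is a rough TypeScript sketch of the same wiring done by code instead of in the NME editor. It is only a sketch: the block names ("PBRMetallicRoughness", "FragmentOutput"), the texture URLs and the wirePbrTextures helper are made up for the example, and it assumes your node material already contains a PBRMetallicRoughness block and a FragmentOutput block.

```ts
import {
    AddBlock, FragmentOutputBlock, InputBlock, NodeMaterial,
    PBRMetallicRoughnessBlock, PowBlock, Scene, Texture, TextureBlock,
} from "@babylonjs/core";

// Block names and texture URLs below are placeholders: adapt them to your material/assets.
function wirePbrTextures(nodeMat: NodeMaterial, scene: Scene): void {
    const pbr = nodeMat.getBlockByName("PBRMetallicRoughness") as PBRMetallicRoughnessBlock;
    const fragOut = nodeMat.getBlockByName("FragmentOutput") as FragmentOutputBlock;

    // Ambient texture -> ambientColor input.
    const ambientTex = new TextureBlock("ambientTex");
    ambientTex.texture = new Texture("ambient.png", scene);
    ambientTex.rgb.connectTo(pbr.ambientColor);

    // Metallic/roughness texture: for a default PBRMaterial, metallic comes from
    // the red channel and roughness from the alpha channel.
    const mrTex = new TextureBlock("metallicRoughnessTex");
    mrTex.texture = new Texture("metallicRoughness.png", scene);
    mrTex.r.connectTo(pbr.metallic);
    mrTex.a.connectTo(pbr.roughness);

    // Emissive: the lighting output is in gamma space, so bring it back to
    // linear (^2.2), add the emissive map sampled in linear space, then let
    // the FragmentOutput convert the final sum to gamma space for display.
    const lightingToLinear = new PowBlock("lighting ^ 2.2");
    const gammaExponent = new InputBlock("gamma");
    gammaExponent.value = 2.2;
    pbr.lighting.connectTo(lightingToLinear.value);
    gammaExponent.output.connectTo(lightingToLinear.power);

    const emissiveTex = new TextureBlock("emissiveTex");
    emissiveTex.texture = new Texture("emissive.png", scene);
    emissiveTex.convertToLinearSpace = true; // same as ticking "Convert to linear space" in the NME

    const addEmissive = new AddBlock("lighting + emissive");
    lightingToLinear.output.connectTo(addEmissive.left);
    emissiveTex.rgb.connectTo(addEmissive.right);

    addEmissive.output.connectTo(fragOut.rgb);
    fragOut.convertToGammaSpace = true; // same as ticking "Convert to gamma space" in the NME

    nodeMat.build();
}
```

The idea is the same as in the editor: only the final sum is gamma-corrected, everything before it stays in linear space.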
Assuming that our Emissive map is gamma-corrected (since it's a color map), would simply adding it to the 'lighting' output be enough then? This would also apply to our (gamma-corrected) lightmaps.
Yes, it can be enough, but note it's not 100% accurate and you won't get the same results as performing all the computations in linear space and only gamma-correcting the result:
a^0.4545 + b^0.4545 != (a + b)^0.4545
where a and b are in linear space and ^0.4545 performs the conversion to gamma space.
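A quick numeric check (a standalone TypeScript snippet with arbitrary sample values) shows how far apart the two can be:

```ts
// Compare "add after gamma encoding" with "add in linear space, then encode".
const toGamma = (x: number): number => Math.pow(x, 1 / 2.2); // ^0.4545

const a = 0.2; // linear-space intensity (e.g. lighting)
const b = 0.3; // linear-space intensity (e.g. emissive)

const addedInGamma = toGamma(a) + toGamma(b); // ≈ 0.481 + 0.579 ≈ 1.060
const correct = toGamma(a + b);               // ≈ 0.730

console.log(addedInGamma.toFixed(3), correct.toFixed(3)); // the two clearly differ
```

Adding in gamma space tends to overshoot, so bright areas where emissive and lighting overlap will look washed out compared to the fully linear computation.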