Blender:
BJS:
All properties are the same except the radius.
I'm curious why the bright spot in Blender is so smooth. Could someone help?
Blender:
I tested the glb in the Three.js PG. The bright spot in Three.js looks the same as in BJS.
This is probably due to small differences in the shader and the way the shader reacts to light. Might also be the light properties themselves, or the type of light used.
You haven't mentioned whether it is a .babylon export, a glTF/glb, or a different format, or whether you created the demo yourself on both platforms.
I built the scene in Blender, then exported it as a glb.
I imported the mesh in the BJS PG and set the intensityMode to match Blender, because Blender expresses light intensity in different units.
BJS PG:
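For context, the light setup in the PG amounts to something like this minimal sketch (Playground-style code using the global BABYLON namespace; the light name, position, and values are placeholders, and which intensity mode best matches Blender's Watts depends on how the exporter converts units):

```ts
// Placeholder point light standing in for the green light from the scene.
const greenLight = new BABYLON.PointLight(
    "greenLight",
    new BABYLON.Vector3(0, 2, 0), // placeholder position
    scene
);

// Blender specifies point lights in Watts, while Babylon.js defaults to a
// unitless intensity, so pick an explicit photometric mode. Babylon also
// offers INTENSITYMODE_LUMINOUSINTENSITY (candela), INTENSITYMODE_ILLUMINANCE
// (lux), and INTENSITYMODE_AUTOMATIC.
greenLight.intensityMode = BABYLON.Light.INTENSITYMODE_LUMINOUSPOWER; // lumens
greenLight.intensity = 1000; // example value, not the one from my scene
```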
Sorry, I don't know how to set up an online PG together with Blender. I exported the .glb and .blend files, but it seems the BJS forum won't let me upload them.
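For anyone who wants to reproduce the comparison: the PG scene can be saved out as a .glb with the glTF serializer that the Playground already loads. A minimal sketch (the file name is arbitrary):

```ts
// Export the current scene as a .glb and trigger a browser download.
// GLTF2Export comes from the Babylon.js serializers bundle.
BABYLON.GLTF2Export.GLBAsync(scene, "scene").then((glb) => {
    glb.downloadFiles(); // downloads scene.glb
});
```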
Please export the glb from my PG and then import it into Blender. Note that we should disable the world environment in Blender.
Like this:
In my understanding, the difference between the BJS PG and Blender is caused by the light radius. But setting the green light's radius in the BJS PG (see the sketch below) does not make it match Blender. Could you give me other ideas?
One more thing: I don't want to change other properties such as the PBR material; I think those should stay the same. I just want to know how to change the green point light's properties to make it match Blender.
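Roughly, the radius adjustment I tried looks like this (the light name and value are placeholders):

```ts
// Fetch the imported light by name ("greenLight" is a stand-in name).
const greenLight = scene.getLightByName("greenLight") as BABYLON.PointLight;

// `radius` makes the PBR shader treat the point light as a small sphere
// instead of a mathematical point, which broadens and softens the specular
// highlight.
greenLight.radius = 0.25; // placeholder value
```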
pinging @PatrickRyan, hoping you have some input here?
@Aze, the difference you are seeing with Blender is due to the fact that you have enabled viewport rendering in your scene. What Blender is doing - I don’t know the exact rendering model here but I can make a guess from some obvious clues - is a progressive ray traced render. You can tell because when you move the camera around, you can see the renderer drops to a very low-quality image until the camera stops moving and then re-renders a higher quality image. This is most noticeable in the shadow of the cube:
This is not truly real-time rendering, since the camera needs to be still to get the best falloff in shadows and lights. In real-time rendering, especially on the web where browsers only have one thread to do everything, we don't have the resources to do ray tracing, so engines like Babylon and Three use approximations to save on rendering time, since we need to do everything else on the frame in the same thread. So you won't see super-smooth falloff, and there are no secondary bounces beyond what is baked into a texture. There are several examples of real-time ray tracing on the web, but these are super early prototypes that are not widely used yet. Mostly this is because we need to render at a minimum of 60 fps on a feature phone with very little power, and at 90+ fps on any HMD so that the user does not get motion sickness.
And since the shaders for Blender are expecting ray tracing and the shaders for Babylon expect approximations, you can’t rely on the renders in Blender to determine how the Babylon scene will look. As you can see here, I am able to get a closer approximation to the render from Blender by changing the position and intensity of the point light:
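In code terms, that adjustment is just hand tuning the light in the Playground until the highlight visually matches, something along these lines (the name and numbers are illustrative, not the exact values I used):

```ts
// Hand-tuned values; tweak until the highlight matches the Blender render.
const greenLight = scene.getLightByName("greenLight") as BABYLON.PointLight;
greenLight.position = new BABYLON.Vector3(0.5, 1.5, -0.5);
greenLight.intensity = 40;
```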
For the Babylon point light and PBR shader, the parameters you used in Blender to create the lighting effect you want won't carry over, because the shaders are different and are tuned to the way the lights are implemented in each engine. When I am creating scenes in Babylon, I will use the DCC tools (Blender, Maya, Max, Substance) to see what the materials will look like in a PBR environment, but I always do my final lighting in engine, as the mismatches between renderers can cause extra work adjusting the lighting for each specific renderer.
I hope this helps.
Thank you for your reply!!!