Can we achieve this kind of quality in our BabylonJS?


I have a very basic question and would appreciate your guidance.

Please look at the attached image. It is so realistic.

  1. Can we achieve this kind of quality with BabylonJS?
  2. Can we achieve this with BabylonJS in real time? I mean without pre-baking it for hours?
  3. Can we achieve this with DirectX in real time?
  4. What would I need to do, and what would I need to have, to achieve this kind of quality in real time?

My application is a web application in which the user can select a room like the one shown in the picture and start applying various textures to the floors and walls. The output should look this realistic. As it is an interactive application, we cannot afford hours of baking. If we can achieve it with BabylonJS, great. If not, and we can achieve it with DirectX, maybe I can load the model on the server using SharpDX and WARP (Windows Advanced Rasterization Platform (WARP) Guide - Windows applications | Microsoft Docs), take a screenshot, and deliver it back to the client.

Please help me on this topic. Thanks in advance.

  1. Well, it depends on what you mean by quality :slight_smile: This one is a raytraced rendering. It could be done with BabylonJS, but you have to provide prebaked lighting, for instance.
  2. Yes, with prebaked lighting. The reflections can be done with really high quality cubemaps. But prebaking for hours will be necessary. The point here is mostly about really high resolution textures and antialiasing + global illumination.
  3. Probably, on the latest 2080 Ti + RTX.
  4. My best hint would be to say: try :slight_smile: Create a simple model in Blender and try to get as close as you can to your goal using our more advanced options (lightmapping will be key: From Blender to Babylon - standard workflow)

Please educate me on this topic with some reference material/documentation/samples.

  1. What is pre-baked lighting?

In my scene, the positions of the lights are fixed.
The user selects a texture image from a list of thumbnails, clicks a mesh, and the texture is applied to that mesh. The texture image itself will not contain any reflections, but once it is applied to the mesh, reflections should appear to make it look realistic.

Pre-baked lighting is also known as a lightmap.

So no matter what your textures are, the lightmap can hold all the lighting and shadows.
For instance for our Espilit demo (Babylon.js - Glowing Espilit demo) this is the lightmap used:
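For reference, wiring a baked lightmap into a Babylon.js material only takes a couple of lines. A minimal sketch (the texture URL, mesh, and scene variables in the usage comment are placeholders, not from the demo):

```javascript
// Minimal sketch: assign a baked lightmap to a Babylon.js material.
// StandardMaterial and PBRMaterial both expose lightmapTexture;
// useLightmapAsShadowmap makes the map darken (shadow) the lit result
// instead of being added on top of it.
function applyLightmap(material, lightmapTexture) {
  material.lightmapTexture = lightmapTexture;
  material.useLightmapAsShadowmap = true;
  return material;
}

// Typical usage in a real scene (placeholder names):
// const lm = new BABYLON.Texture("espilitLightmap.jpg", scene);
// lm.coordinatesIndex = 1; // lightmaps usually live on a second UV set
// applyLightmap(wallMesh.material, lm);
```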


So, once the light map is built for a model, it can be reused, is it?
In that case, how do I build that light map (with raytraced rendering)?

This is another topic and not for this forum directly :slight_smile:
Which tool are you using? 3ds Max, Maya, or Blender?



So yeah, you need to find some documentation on how to bake light in Maya.
Let me try to see if the all mighty @PatrickRyan can give you some pointers


Please correct my understanding.

  1. The light map image is always used as the ambient texture.

  2. When we need to change the texture image of a mesh, we keep the ambientTexture of the mesh's material as is and change only the diffuseTexture. If we do so, the lighting and shadows will be preserved.

I just made public a demo we’ve made at my company, which may give you an idea of what you can expect: Virtual Staging - Apartment configurator; by Axeon Software

And yes, your lightmap will only contain lighting information, so you can change the albedo while keeping your lighting.
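In code, that texture swap only touches the diffuse/albedo channel and leaves the baked lighting alone. A minimal sketch (the material and texture variables are assumptions, not from the demo):

```javascript
// Minimal sketch: swap the surface texture while preserving baked lighting.
// The ambient/lightmap channel is deliberately left untouched, so the
// shadows and global illumination baked into it are preserved.
function swapDiffuseTexture(material, newTexture) {
  if (material.diffuseTexture) {
    material.diffuseTexture.dispose(); // free the GPU copy of the old texture
  }
  material.diffuseTexture = newTexture;
  return material;
}

// Usage in a real scene (placeholder names):
// swapDiffuseTexture(floorMesh.material, new BABYLON.Texture("oak.jpg", scene));
```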


@Subrahmanya_Chakrava the best way to think of how you need to configure your assets for Babylon is to think of them as real time game assets. You can get high quality rendering out of a real-time engine, but you can’t approach the assets as you would if you were ray tracing them.

This breaks down into two main issues when converting from pre-render workflows to real-time render workflows:

  • Shadows, including self shadowing
  • Global illumination and reflections

For shadows, the trade-off is this: to get soft shadows that are affected by bounced light (global illumination), you need to bake your shadows into a light map for your object, which means the object cannot move and must always stay in the same position in your scene. You can get soft real-time shadows, but those are more expensive to render at runtime and won’t be affected by global illumination, since we aren’t calculating any light bounces (which would lead us to real-time ray tracing, which requires a lot of compute power). Self shadowing is also a more expensive technique that is easily avoided by baking your lights. You are more performant with a baked shadow map than with real-time self shadows, but you further lock down the asset, as it can’t have animation or the baked shadows become obvious.
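For cases where you do accept the per-frame cost, Babylon.js ships a ShadowGenerator with blurred exponential shadow maps as the dynamic alternative. A sketch, with placeholder names for the light and meshes:

```javascript
// Sketch: real-time (not baked) soft shadows via a blurred exponential
// shadow map. More expensive per frame than a lightmap, and not affected
// by global illumination, but it works for moving objects.
function setupSoftShadows(BABYLON, light, casters, receiver) {
  const generator = new BABYLON.ShadowGenerator(1024, light); // 1024px map
  generator.useBlurExponentialShadowMap = true; // soften the shadow edges
  generator.blurScale = 2;
  casters.forEach((mesh) => generator.addShadowCaster(mesh));
  receiver.receiveShadows = true; // e.g. the floor mesh
  return generator;
}
```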

A great tutorial for baking light maps in Maya is available at Pluralsight. It is a little older, but the principles are still the same. This is the best way to get soft shadows affected by global illumination.

The other part, global illumination and reflection, is a harder problem to solve in real time. Baking your lights into a shadow map will add the global illumination into the shadow maps, but you still need to account for illumination from bounced light. Consider this image from Unity:

It deals with both of the challenges we face. Objects close to a colored wall will receive color contribution from the global illumination or bounced light. The sphere near the green wall takes on a green cast in its lighting calculation. And the metallic sphere in the center reflects the environment. One of the issues, the reflection, can be solved with reflection probes which bake a local cubemap which is used for reflection. The problem is that you are baking 6 images every frame to account for objects moving which can get expensive very quickly.
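In Babylon.js, the reflection-probe part looks roughly like this. A sketch under stated assumptions: the probe name, resolution, and mesh variables are placeholders, and the probe is rendered once on the assumption that nothing moves:

```javascript
// Sketch: a local reflection probe that renders the surrounding room
// into a cubemap used for reflections on a shiny object.
function setupReflectionProbe(BABYLON, scene, shinyMesh, environmentMeshes) {
  const probe = new BABYLON.ReflectionProbe("roomProbe", 256, scene);
  environmentMeshes.forEach((m) => probe.renderList.push(m));
  // If nothing moves, render the 6 cube faces once instead of every
  // frame; dynamic objects would need a higher refresh rate (expensive).
  probe.refreshRate = BABYLON.RenderTargetTexture.REFRESHRATE_RENDER_ONCE;
  shinyMesh.material.reflectionTexture = probe.cubeTexture;
  return probe;
}
```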

And for the environment lighting, you need to provide a precomputed DDS so that we can use the mip levels as a substitute for calculating the specular lobe based on roughness per pixel every frame. I’m currently talking about environment maps in another thread on the forum, so I won’t repeat it all here.
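Loading such a prefiltered environment in Babylon.js is a one-liner; the file name here is a placeholder:

```javascript
// Sketch: load a prefiltered .dds environment for image based lighting.
// The mip chain baked into the prefiltered file stands in for computing
// the specular lobe per pixel, per roughness value, every frame.
function setEnvironment(BABYLON, scene, url) {
  scene.environmentTexture =
    BABYLON.CubeTexture.CreateFromPrefilteredData(url, scene);
  return scene.environmentTexture;
}

// Usage in a real scene: setEnvironment(BABYLON, scene, "room.dds");
```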

So, if you bake an environment map of the room your scene is set in, you can get both reflection and image based lighting (IBL) out of the one file, but the user cannot interact with things in the scene, i.e. move furniture around or replace it. However this will render the fastest in real time.

If you need to make use of interaction with the objects within your room, you will likely need to make use of two tricks. The first would be to bake out an environment map of your empty room for IBL and then use reflection probes to handle all of the reflections of dynamic objects.

The downside here is that baking shadows across objects is not going to work. You can bake self shadows into an object like a chair, but you won’t be able to bake the shadow into the floor asset. The thing you will need to do is create a plane under the object that catches the shadow and when you assemble the scene, the texture on the shadow catcher uses a multiply blend mode in the material so that it will look like it affects the floor asset.

To get the quality of render you showed as your example, you will never get completely away from baking some textures until our everyday hardware catches up to top-of-the-line GPUs. Until then, you can break down the problem into manageable chunks and will have to use some smoke and mirrors and some trade-offs to get the assets to behave the way you want. I will say that this is a very typical workflow for game engines today, and there are a lot of resources available online for creating realistic game assets. If you frame your queries around game assets, you will find everything you need. Let me know if you have more questions.


@PatrickRyan Thanks a lot for the detailed solution. After reading your answer, I learned many things, which made me realize that I knew nothing.

I will try implementing your guidelines and update you on my progress.

Thanks again for helping me.
