Babylon.js scene rendering using Blender

Hello, everyone!

I’m currently working on a project that involves transferring scenes from Babylon.js to Blender, and I’m facing a bit of a challenge with accurately cloning camera settings from Babylon.js into Blender. I want to ensure that the camera’s position, rotation, field of view (FOV), and clipping planes are precisely replicated in Blender to maintain the same perspective and scene framing.

Here are the specific details and steps I’ve considered for this process:

  1. Exporting Camera Data from Babylon.js: I understand that I need to get the camera’s properties from Babylon.js, such as position, rotation (or target if using a target camera), FOV, and near/far clipping planes. Is there a recommended method or script for exporting these properties efficiently?
  2. Recreating the Camera in Blender: Given the exported properties, what’s the best approach to recreating the camera in Blender accurately? I’m familiar with Blender’s Python API but could use some advice on scripting this part, especially on converting rotation values correctly and adjusting for the different coordinate systems (Babylon.js uses a left-handed system, while Blender uses a right-handed system). I’ve included a rough draft of my current script right after this list.
  3. Adjustments and Considerations: Are there specific adjustments or considerations I should be aware of during this process, particularly regarding rotation conversion, coordinate system differences, and setting the FOV correctly in Blender?
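
For reference, here is the rough Blender Python draft I have so far for steps 2 and 3 (only partially tested). It assumes the exported camera data arrives as a dict with "position", "target", "fov" (radians, vertical, as Babylon reports it) and "minZ"/"maxZ" clip planes, with "X"/"Y"/"Z" keys on the vectors; the axis remapping is exactly the part I’m least sure about:

```python
import bpy
from mathutils import Vector

def babylon_to_blender(v):
    # Babylon.js is left-handed with Y up, Blender is right-handed with Z up.
    # Swapping Y and Z flips the handedness and the up axis in one step, but
    # I'm not certain this is the right mapping for every pipeline (some
    # exporters negate X or Z instead).
    return Vector((v["X"], v["Z"], v["Y"]))

def create_camera(cam):
    cam_data = bpy.data.cameras.new("BabylonCamera")
    cam_obj = bpy.data.objects.new("BabylonCamera", cam_data)
    bpy.context.scene.collection.objects.link(cam_obj)
    bpy.context.scene.camera = cam_obj

    cam_obj.location = babylon_to_blender(cam["position"])

    # Aim the camera at the converted target instead of translating Babylon's
    # Euler rotation directly (-Z is Blender's camera view axis, Y its up).
    direction = babylon_to_blender(cam["target"]) - cam_obj.location
    cam_obj.rotation_euler = direction.to_track_quat('-Z', 'Y').to_euler()

    # Babylon's fov is a vertical angle in radians by default.
    cam_data.sensor_fit = 'VERTICAL'
    cam_data.angle_y = cam.get("fov", 0.8)

    # Clipping planes (Babylon minZ / maxZ); the fallbacks are arbitrary.
    cam_data.clip_start = cam.get("minZ", 0.1)
    cam_data.clip_end = cam.get("maxZ", 1000.0)
    return cam_obj
```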

I aim to automate as much of this process as possible to facilitate smooth transitions between Babylon.js and Blender in my workflow. If anyone has experience with this or can offer insights into the best practices for achieving an accurate camera clone, I would greatly appreciate your advice and examples.

Thank you in advance for your help!

Hello and welcome!

I believe the GLTF / GLB format is actually all you need.

Thanks, Labris,

The main issue I’m facing is that when I push camera data to Blender through the API, I can’t pass it in a way that reproduces the render from the same angle and direction. For more detail, I’m sharing my JSON below, along with a simplified version of the Blender-side script that consumes it:

{
  "material_id": 1,
  "material_url": "url.com",
  "product_id": 1,
  "scene_id": 1,
  "cameraData": {
    "target": {
      "X": 1.732,
      "Y": 1.39,
      "Z": 0.327
    },
    "position": {
      "X": 1,
      "Y": 1.5,
      "Z": 1
    },
    "rotation": {
      "X": 0.104,
      "Y": -2.313,
      "Z": 0
    }
  }
}
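
And this is, roughly, what the Blender-side script does with that payload when a render request comes in (heavily simplified; babylon_to_blender is the same conversion helper from my first post, and the output path is only an example):

```python
import bpy
from mathutils import Vector

def babylon_to_blender(v):
    # Same left-handed Y-up -> right-handed Z-up remap as in my first post.
    return Vector((v["X"], v["Z"], v["Y"]))

def render_from_payload(payload, output_path="/tmp/render.png"):
    cam = payload["cameraData"]
    cam_obj = bpy.context.scene.camera  # reuse the scene's existing camera

    cam_obj.location = babylon_to_blender(cam["position"])

    # Orient the camera toward the converted target instead of reusing
    # Babylon's Euler rotation values directly.
    direction = babylon_to_blender(cam["target"]) - cam_obj.location
    cam_obj.rotation_euler = direction.to_track_quat('-Z', 'Y').to_euler()

    bpy.context.scene.render.filepath = output_path
    bpy.ops.render.render(write_still=True)

# Example call with the cameraData from the JSON above.
render_from_payload({
    "cameraData": {
        "target":   {"X": 1.732, "Y": 1.39, "Z": 0.327},
        "position": {"X": 1.0,   "Y": 1.5,  "Z": 1.0},
    },
})
```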

@aman0x Might be also worth checking or playing with:

  • camera.upVector
  • scene.useRightHandedSystem
  • Matching Babylon aspect ratio and Blender rendered image aspect ratio

But I agree with @labris. GLTF/GLB as an interchange format should handle this.
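
For the aspect ratio part, a minimal sketch of the Blender side (the 1280x720 resolution and the 0.8 fov are just example values; use whatever your Babylon canvas and camera report):

```python
import bpy

scene = bpy.context.scene
cam_data = scene.camera.data

# Match Blender's output resolution to the Babylon.js canvas size so both
# renders use the same aspect ratio (1280x720 is only an example).
scene.render.resolution_x = 1280
scene.render.resolution_y = 720

# Babylon's camera.fov is a vertical angle in radians by default, so fit the
# Blender sensor vertically and copy that angle across.
cam_data.sensor_fit = 'VERTICAL'
cam_data.angle_y = 0.8  # Babylon camera.fov value, in radians
```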

You may have a look at the glTF cameras tutorial: glTF-Tutorials/gltfTutorial/gltfTutorial_015_SimpleCameras.md in the KhronosGroup/glTF-Tutorials repository on GitHub.

As @inteja mentioned, you need to check other parameters, like aspectRatio, and some others as well. The glTF specification should be enough for Blender 🙂
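
If you go through GLB, the Blender side can be as small as this (just a sketch; "scene.glb" stands for whatever file you exported from Babylon.js with the camera included):

```python
import bpy

# Blender's bundled glTF importer recreates the camera (position, rotation,
# FOV, clip planes) from the exported file.
bpy.ops.import_scene.gltf(filepath="scene.glb")

# Make the imported camera the active render camera (assuming the GLB
# contains a single camera).
for obj in bpy.context.scene.objects:
    if obj.type == 'CAMERA':
        bpy.context.scene.camera = obj
        break
```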

I have attached the default camera in the GLB, and it works fine. But the core issue I’m facing is that when I change the view in Babylon.js and pass that data to Blender through the API to get the render, it doesn’t work.

I want to create an open-source library that helps get real-time renders from Blender.
