Hello all!
I’m Erich, the developer of the three.js PathTracing Renderer. @PichouPichou recently contacted me and asked if I would help build a similar renderer using the Babylon.js library. I replied, “I’m in!”
@Deltakosh I must admit that I have mainly been involved with the three.js library, going back to shortly after Ricardo first started it. But I have always respected the open-source and feature-rich Babylon.js library and, as I mentioned to Valentin, kudos to you, David, and your team! It’s been a while since I tracked its progress or sat down and tried all the demos on the example page - but wow, Babylon has come a long way over the years!
Valentin suggested I post on this thread in hopes that our public discussions and collaborations might benefit everyone interested in the world of ray/path tracing and Babylon.js. Now, I am not an expert in path tracing, the math behind it, or GPU programming, but over the last 5 years of working on my project I have been able to put the necessary pieces together to make real-time path tracing happen inside the browser, whether on desktop or mobile.
Having said that, my project does rely on some hacks that I’ve had to come up with to get it working with three.js, WebGL, or both. For instance, the GeometryShowcase demo looks like a typical demo from the core library displaying its core shapes (sphere, box, cylinder, etc.), but it actually does not use the three.js code for those shapes at all. Instead, the shapes are defined mathematically in the fragment shader. If I did use the three.js geometry-building code to define these shapes, I would end up, as you’re well aware, with a pile of triangles to feed the path tracer. And as you also know, rendering any significant number of triangles is very expensive; without a BVH, it becomes non-real-time very quickly. So all basic shapes in my entire project are intersected with the well-known math routines found in standard ray tracing libraries. Triangle models, on the other hand, do use the actual triangle data (stored in a data texture on the GPU): the fragment shader steps through a BVH pre-built in .js code and renders whatever triangles you feed it, like any major renderer out there.
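To make “well-known math routines” concrete, here is a minimal sketch of the classic analytic ray-sphere intersection test, the kind of routine the fragment shader runs instead of chewing through triangle geometry (the function name and the INFINITY sentinel value here are illustrative, not copied from either library):

```glsl
#define INFINITY 1000000.0

// Classic analytic ray-sphere intersection: solve the quadratic
// |(ro + t*rd) - pos|^2 = rad^2 for t, assuming rd is unit length.
// Returns the nearest hit distance in front of the ray, or INFINITY on a miss.
float SphereIntersect(float rad, vec3 pos, vec3 ro, vec3 rd)
{
    vec3 L = ro - pos;                 // from sphere center to ray origin
    float b = dot(rd, L);              // half of the quadratic's b term
    float c = dot(L, L) - rad * rad;
    float disc = b * b - c;            // discriminant
    if (disc < 0.0) return INFINITY;   // no real roots: ray misses the sphere
    float sqrtDisc = sqrt(disc);
    float t = -b - sqrtDisc;           // try the nearer root first
    if (t > 0.0) return t;
    t = -b + sqrtDisc;                 // ray origin may be inside the sphere
    return (t > 0.0) ? t : INFINITY;
}
```

Every other basic shape (box, cylinder, cone, and so on) gets a similar closed-form routine, so no triangles ever enter the picture for those demos.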
I suggested to Valentin that we start off simple and just try to get a basic sphere scene working first; then maybe down the road we can add true triangle-model rendering. Oh, I forgot to mention: even though the renderer’s fragment shader does not call the library’s shape-definition routines, it can still rely on the library’s notion of an object transform. In three.js this is called an Object3D, and it defines a matrix for position, scale, and rotation; I’m sure Babylon has an equivalent data structure. This allows the .js file to feed the shader the transform of every scene object, even though the .js file doesn’t even know what shape it is referring to - each object is simply an empty placeholder with a transform.
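As a sketch of how the shader side might consume such a placeholder transform (the uniform and function names here are hypothetical, and on the Babylon side the Object3D equivalent would presumably be something like TransformNode): the .js code inverts the object’s world matrix and uploads it, and the shader moves the ray into the unit shape’s local space, intersects it there with the routine sketched above, and maps the hit distance back to world space.

```glsl
// Inverse of the placeholder object's world matrix (hypothetical uniform name),
// computed and uploaded by the .js side every frame.
uniform mat4 uInvSphereTransform;

// Intersect a unit sphere that has been positioned/scaled/rotated on the .js
// side, by inverse-transforming the ray instead of transforming the geometry.
float TransformedSphereIntersect(vec3 rayOrigin, vec3 rayDirection)
{
    // move the ray into the unit sphere's object space
    vec3 ro = (uInvSphereTransform * vec4(rayOrigin, 1.0)).xyz;
    vec3 rd = (uInvSphereTransform * vec4(rayDirection, 0.0)).xyz;
    float len = length(rd);            // scale factor picked up from the transform
    // intersect the canonical unit sphere at the origin
    float t = SphereIntersect(1.0, vec3(0.0), ro, rd / len);
    if (t == INFINITY) return INFINITY;
    return t / len;                    // convert object-space distance back to world space
}
```

Inverse-transforming the ray rather than the geometry means one canonical unit-sphere routine handles every translated, rotated, and scaled instance in the scene.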
So even though the simple demos don’t rely on the core shapes of the host library, they definitely benefit from all the math/matrix routines going on in JavaScript behind the scenes as the scene is updated many times per second. A host library like Three or Babylon also provides automatic support for loading models, loading textures, user input, window resizing, WebGL management, UI, and so on. In fact, having a library to build on is the reason I ‘piggybacked’ on Three, as opposed to something like MadeByEvan’s older path tracer, which did everything from scratch. I definitely needed (and still need) a WebGL library in place to build the path tracer on!
Well, sorry for the long-winded intro, but I wanted to be as clear and open as possible about the challenges and benefits of building a renderer on top of a WebGL library. To David: if we run into issues along the way (and I’m sure we will!) while trying to implement this with Babylon, I hope you can guide us, help us, or point us in the right direction.
Looking forward to working with you all!
-Erich