Hi there, I have an RGB height map PNG here which I need to “convert” to a mesh tile with textures.
The RGB height map is this:
where each pixel height is represented by this formula:
height = -10000 + ((R * 256 * 256 + G * 256 + B) * 0.1)
The texture is this, for example:
Is there any functionality already integrated in Babylon.js to do something like this?
Babylon can generate a heightmap mesh from a grayscale image. If you want to use a different height scale you could either convert the image beforehand, or convert it in place before feeding it into the heightmap creation function.
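For example, here is a rough, untested sketch of how you might get close with MeshBuilder.CreateGroundFromHeightMap alone. It assumes the colorFilter option behaves the way I remember (the normalized R/G/B channels are weighted by the filter and the result is lerped between minHeight and maxHeight), and the file names, tile size and subdivisions are placeholders:

```js
// Range of the Mapbox-style decode:
// height = -10000 + (R * 256 * 256 + G * 256 + B) * 0.1
const minHeight = -10000;                   // R = G = B = 0
const maxHeight = -10000 + 16777215 * 0.1;  // R = G = B = 255

// Weights so that (R * 65536 + G * 256 + B) / 16777215 becomes the 0..1
// gradient Babylon lerps between minHeight and maxHeight.
const colorFilter = new BABYLON.Color3(
    (65536 * 255) / 16777215,
    (256 * 255) / 16777215,
    255 / 16777215
);

const tile = BABYLON.MeshBuilder.CreateGroundFromHeightMap("tile", "terrain-rgb.png", {
    width: 100,
    height: 100,
    subdivisions: 255, // one quad per pixel edge for a 256x256 tile
    minHeight,
    maxHeight,
    colorFilter,
    onReady: (mesh) => console.log("tile ready", mesh.name),
}, scene);

// Drape the color texture over the tile.
const material = new BABYLON.StandardMaterial("tileMaterial", scene);
material.diffuseTexture = new BABYLON.Texture("texture.png", scene);
tile.material = material;
```

One caveat: the browser may resample or color-manage the PNG when the pixels are read back, which can throw the exact decode off, so treat this as an approximation and verify the heights you get.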
You could also create a node material that would convert the color texture to a grayscale texture usable by the mesh.applyDisplacementMapFromBuffer function.
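If you want to skip the node material and just do the conversion on the CPU, a canvas read-back gives you a buffer you can feed straight in. This is only a sketch, assuming applyDisplacementMapFromBuffer expects an RGBA byte buffer laid out like ImageData.data; the function name and tile size are mine:

```js
// Load the terrain-RGB PNG and turn it into a grayscale RGBA buffer that
// applyDisplacementMapFromBuffer can consume.
async function displaceGroundFromTerrainRgb(ground, url, size) {
    const img = new Image();
    img.crossOrigin = "anonymous";
    await new Promise((resolve, reject) => {
        img.onload = resolve;
        img.onerror = reject;
        img.src = url;
    });

    const canvas = document.createElement("canvas");
    canvas.width = canvas.height = size;
    const ctx = canvas.getContext("2d");
    ctx.drawImage(img, 0, 0, size, size);
    const rgba = ctx.getImageData(0, 0, size, size).data;

    // First pass: decode the heights and find this tile's range.
    const heights = new Float32Array(size * size);
    let min = Infinity;
    let max = -Infinity;
    for (let i = 0; i < heights.length; i++) {
        const p = i * 4;
        const h = -10000 + (rgba[p] * 65536 + rgba[p + 1] * 256 + rgba[p + 2]) * 0.1;
        heights[i] = h;
        if (h < min) min = h;
        if (h > max) max = h;
    }

    // Second pass: write the normalized height back into R, G and B so the
    // buffer reads as a grayscale displacement map.
    const range = max - min || 1;
    for (let i = 0; i < heights.length; i++) {
        const p = i * 4;
        const gray = Math.round(((heights[i] - min) / range) * 255);
        rgba[p] = rgba[p + 1] = rgba[p + 2] = gray;
    }

    ground.applyDisplacementMapFromBuffer(new Uint8Array(rgba.buffer), size, size, min, max);
}
```

You would call this on a ground created with enough subdivisions (and updatable: true) so the displacement has vertices to push around.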
Wow, that looks exactly like what I was looking for!!
The only issue I am having is that the whole process is quite expensive to calculate. Since I am trying to make a map application that is supposed to generate tiles quickly, this could become a problem. Do you see any optimization potential in your code?
There are a few modifications that can be done just in the JavaScript I wrote… I think the calculations for the uvs and indices could be done in the same for loop, and the method that takes data as a slice for your custom heightmap algo could be cleaned up too.
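Something along these lines is what I mean: one pass that fills positions and uvs and emits the indices for the same quad as it goes. It is a hedged sketch; buildTileVertexData, the heights array and the tile width are stand-ins for whatever your decode step produces, and if the tile renders face-down just swap the triangle winding:

```js
// heights: Float32Array of decoded heights, length = size * size, row-major.
// Returns a BABYLON.VertexData ready to be applied to a mesh.
function buildTileVertexData(heights, size, tileWidth) {
    const positions = new Float32Array(size * size * 3);
    const uvs = new Float32Array(size * size * 2);
    const indices = new Uint32Array((size - 1) * (size - 1) * 6);

    const step = tileWidth / (size - 1);
    let idx = 0;

    for (let row = 0; row < size; row++) {
        for (let col = 0; col < size; col++) {
            const i = row * size + col;

            // Position and uv for this vertex.
            positions[i * 3 + 0] = col * step;
            positions[i * 3 + 1] = heights[i];
            positions[i * 3 + 2] = (size - 1 - row) * step;
            uvs[i * 2 + 0] = col / (size - 1);
            uvs[i * 2 + 1] = 1 - row / (size - 1);

            // Two triangles per quad, emitted in the same loop instead of a
            // second pass over the grid.
            if (row < size - 1 && col < size - 1) {
                indices[idx++] = (col + 1) + (row + 1) * size;
                indices[idx++] = (col + 1) + row * size;
                indices[idx++] = col + row * size;
                indices[idx++] = col + (row + 1) * size;
                indices[idx++] = (col + 1) + (row + 1) * size;
                indices[idx++] = col + row * size;
            }
        }
    }

    const vertexData = new BABYLON.VertexData();
    vertexData.positions = positions;
    vertexData.indices = indices;
    vertexData.uvs = uvs;

    // Normals can be derived from positions + indices in one call.
    const normals = new Float32Array(positions.length);
    BABYLON.VertexData.ComputeNormals(positions, indices, normals);
    vertexData.normals = normals;

    return vertexData;
}
```

Applying it is then just creating an empty mesh (new BABYLON.Mesh("tile", scene)) and calling vertexData.applyToMesh(tile) on it.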
But ultimately it comes down to testing. I regularly use this sort of code with 4096x4096 height maps, but I don’t usually rebuild the mesh from scratch every time the application starts up. If you save the result as a .gltf or .glb file locally on the user’s machine in IndexedDB, using something like localforage, you should see the scene load in under 3 seconds.
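As a rough sketch of that caching idea, assuming the GLTF2Export API from @babylonjs/serializers and localforage for the IndexedDB part (the cache key and file prefix are made up):

```js
import localforage from "localforage";
import { SceneLoader } from "@babylonjs/core";
import { GLTF2Export } from "@babylonjs/serializers";
import "@babylonjs/loaders/glTF"; // registers the .glb loader

const CACHE_KEY = "terrain-tiles.glb";

// Try the IndexedDB cache first; otherwise build the tiles once and cache
// the exported .glb for the next startup.
async function loadOrBuildTiles(scene, buildTiles) {
    const cached = await localforage.getItem(CACHE_KEY);
    if (cached instanceof Blob) {
        // Hand the cached Blob to the glTF loader via an object URL.
        const url = URL.createObjectURL(cached);
        await SceneLoader.AppendAsync("", url, scene, undefined, ".glb");
        URL.revokeObjectURL(url);
        return;
    }

    await buildTiles(scene); // the expensive heightmap decode + mesh build

    const exported = await GLTF2Export.GLBAsync(scene, "terrain-tiles");
    await localforage.setItem(CACHE_KEY, exported.glTFFiles["terrain-tiles.glb"]);
}
```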
The main question to keep in mind going forward is:
How many tiles?
You’ll start to notice JavaScript operations lagging when you need to do millions of calculations per frame. If you need to build a million individual mesh tiles, that is going to take quite some time and use a lot of memory. If that’s the case you’ll need to start learning about shaders, where you can work with the vertex data at a lower level and decide how to render things with code that runs in parallel on the GPU. Luckily WebGPU is out now, so managing code that lives and runs on the GPU is much easier from your JavaScript.
The heightmap calculation I mentioned above could definitely be refactored to a shader.
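For instance, here is a minimal sketch of a ShaderMaterial that samples the terrain-RGB tile in the vertex shader and does the height decode on the GPU. The sampler names, file names and the height-to-scene-units scale are placeholders, and sampling a texture in the vertex shader assumes vertex texture fetch support (a given on WebGL2/WebGPU):

```js
// Vertex shader: decode the Mapbox-style RGB height on the GPU and displace
// a flat, subdivided ground along Y.
BABYLON.Effect.ShadersStore["terrainTileVertexShader"] = `
    precision highp float;
    attribute vec3 position;
    attribute vec2 uv;
    uniform mat4 worldViewProjection;
    uniform sampler2D heightTexture;
    varying vec2 vUV;

    void main(void) {
        vec3 rgb = texture2D(heightTexture, uv).rgb * 255.0;
        float height = -10000.0 + (rgb.r * 65536.0 + rgb.g * 256.0 + rgb.b) * 0.1;
        vec3 displaced = vec3(position.x, height * 0.001, position.z); // metres -> scene units
        gl_Position = worldViewProjection * vec4(displaced, 1.0);
        vUV = uv;
    }
`;

// Fragment shader: unlit, just drapes the color texture over the tile.
BABYLON.Effect.ShadersStore["terrainTileFragmentShader"] = `
    precision highp float;
    uniform sampler2D diffuseTexture;
    varying vec2 vUV;

    void main(void) {
        gl_FragColor = texture2D(diffuseTexture, vUV);
    }
`;

const material = new BABYLON.ShaderMaterial("terrainTile", scene, "terrainTile", {
    attributes: ["position", "uv"],
    uniforms: ["worldViewProjection"],
    samplers: ["heightTexture", "diffuseTexture"],
});

// Nearest sampling so the encoded channels aren't blended between pixels.
material.setTexture("heightTexture", new BABYLON.Texture(
    "terrain-rgb.png", scene, false, false, BABYLON.Texture.NEAREST_SAMPLINGMODE));
material.setTexture("diffuseTexture", new BABYLON.Texture("texture.png", scene));

const tile = BABYLON.MeshBuilder.CreateGround("tile", {
    width: 100, height: 100, subdivisions: 255
}, scene);
tile.material = material;
```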
But again, you need to test first and determine whether this is actually an issue for you yet.