Just sharing some study I’ve been doing, inspired by the awesome Babylon.js Playground demo. @Cedric, thank you for the amazing demo and for the node material that is giving me sleepless nights. I’m still amazed by it.
Wishful thinking: I will be adding an in-game player GUI, menus, in-game collisions, particles, audio… if I don’t get lost in the math behind the track generation and the node material that is cracking me up.
I understand the track positioning and matrix computations in the src code, but the node material math, especially around the vertex output, is giving me nightmares. I tried reverse engineering the math but would love some guidance to understand it better. For example, where do the vector values used to calculate x, y, z come from, namely Vector2(32.0, 0.50), Vector2(224.0, 0.50), Vector2(32.0, 0.50), Vector2(32.0, 0.50)?
I see a BABYLON.RawTexture.CreateRGBATexture being created to populate the data for the vertex output texture samplers, but I’m still confused about how this influences the output.
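Roughly what I’m looking at, as a simplified sketch (not the exact Playground code; the texture size, the NullEngine setup and the variable names are my own placeholders):

```ts
// Simplified sketch of the part I'm trying to understand, not the exact PG code.
// Assumption: the track matrices are packed as RGBA floats into a 256x1 texture.
import { NullEngine, Scene, RawTexture, Engine } from "@babylonjs/core";

const engine = new NullEngine();              // headless engine just for this sketch
const scene = new Scene(engine);

const TEX_WIDTH = 256;                        // texels per row (assumed)
const data = new Float32Array(TEX_WIDTH * 4); // 4 floats (RGBA) per texel
// ... computeMatrix() would fill `data` with the per-segment matrix rows here ...

const matrixTexture = RawTexture.CreateRGBATexture(
  data,
  TEX_WIDTH,
  1,                                          // a single row of texels (assumed)
  scene,
  false,                                      // no mipmaps: we read exact texels
  false,                                      // don't invert Y
  Engine.TEXTURE_NEAREST_SAMPLINGMODE,        // nearest, so texels aren't blended
  Engine.TEXTURETYPE_FLOAT                    // float texture, full matrix precision
);
// This texture is then wired to the node material's texture/sampler blocks.
```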
Another problem I think I could solve with a better understanding of this: for very lengthy tracks I see the segments getting very stretched. In my ignorance I assumed that adding more segments to the track when generating the initial mesh would solve it, but that doesn’t seem to be the case. I assume there is something I’m missing in the node material.
I know I could ignore this and move along to the gameplay and other neat stuff, but I am stubborn by nature and cannot move forward until I clearly understand the logic behind it.
IIRC, these values are used to sample the matrix textures.
32/256, 224/256 give a U value between 0 and 1.
Make a layout on paper of how the data is arranged in the texture and it will get clearer.
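Something like this (from memory; the 256-texel width and the four 64-texel blocks are assumptions, not necessarily the PG’s exact layout):

```ts
// Layout on paper, assuming a 256-texel-wide texture split into four 64-texel
// blocks, one block per matrix row; the Vector2 x values are the texel columns.
//
// texel column: 0 ....... 63 | 64 ...... 127 | 128 ..... 191 | 192 ..... 255
// contents    :   x axis     |    y axis     |    z axis     |   position
// sampled at  :     32       |      96       |     160       |     224
//
// The sampler expects UVs in [0, 1], so the lookup just divides by the width:
const TEX_WIDTH = 256;
const u = (texelColumn: number) => texelColumn / TEX_WIDTH;

console.log(u(32));  // 0.125 -> first block
console.log(u(224)); // 0.875 -> fourth block
// V stays at 0.50 because (as I recall) the data lives in a single row of
// texels, and 0.5 is the vertical centre of that row.
```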
Thanks for the quick response. I still don’t think I clearly understand the behaviour here, but I’ll keep digging/experimenting.
I want to add extra info to the track in the node material (shader) for behaviours such as jumps. I also want to fully understand this stuff so I can add ‘shortcuts’ to the track and some dynamic routing.
Again, thanks for your quick answer. BabylonJS Team FTW!
Ok I think I got it. Feel free to correct me if I am wrong here.
You pass in the RGBA texture containing the information for the vertex positioning (PG src: computeMatrix); then, in the shader (node material), you extract that data from it via trackSamplerN, reading x at 32, y at 96, z at 160 and position at 224.
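Written as CPU-side TypeScript instead of node-material blocks, my mental model of the vertex output looks roughly like this (sampleTexel and the row order are my guesses; the node material does the equivalent with texture blocks):

```ts
// My mental model of the vertex output, as plain TypeScript instead of shader
// nodes. sampleTexel stands in for whatever trackSamplerN actually reads.
import { Matrix, Vector3, Vector4 } from "@babylonjs/core";

const TEX_WIDTH = 256; // assumed texture width

// Nearest-neighbour read of one RGBA texel from the packed matrix data.
function sampleTexel(data: Float32Array, u: number): Vector4 {
  const texelColumn = Math.floor(u * TEX_WIDTH);   // back from UV to a column
  return Vector4.FromArray(data, texelColumn * 4); // 4 floats per RGBA texel
}

// Rebuild the segment matrix from the four sampled rows and move a vertex with it.
function transformVertex(data: Float32Array, localPosition: Vector3): Vector3 {
  const xAxis = sampleTexel(data, 32 / TEX_WIDTH);  // row 0
  const yAxis = sampleTexel(data, 96 / TEX_WIDTH);  // row 1
  const zAxis = sampleTexel(data, 160 / TEX_WIDTH); // row 2
  const pos   = sampleTexel(data, 224 / TEX_WIDTH); // row 3 (translation)

  const segmentMatrix = Matrix.FromValues(
    xAxis.x, xAxis.y, xAxis.z, xAxis.w,
    yAxis.x, yAxis.y, yAxis.z, yAxis.w,
    zAxis.x, zAxis.y, zAxis.z, zAxis.w,
    pos.x,   pos.y,   pos.z,   pos.w
  );

  return Vector3.TransformCoordinates(localPosition, segmentMatrix);
}
```

If that mental model is roughly right, then the extra per-segment info I want for jumps and shortcuts should just be more texels packed into the same texture and sampled the same way (that’s my assumption, anyway).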
Added a small intro counter and a final race screen to test the flow.
Added device orientation handling and proper resize methods.
Running smoothly on devices (no touch controls to play with on them yet, though; not a priority right now).
Same link as the one in the original post to test it. I would love to get as much feedback on it as possible. I am all in on this project.