Antigravity study demo

Hello all.

Just sharing some study I’ve been doing inspired by the awesome PG Babylon.js Playground
@Cedric thank you for the amazing demo and the node material that is giving me sleepless nights. Still amazed by it.

Test it here: BJS Going Deeper

Wishful thinking: I'll be adding an in-game player GUI, menus, in-game collisions, particles, audio… if I don't get lost in the math behind the track generation and the node material that is cracking me up. :slight_smile:


Oh man! That's so great!
Yes, please keep up the good work.
It makes me so happy to see the PG being used.
Can't wait to see your progress :slight_smile:


I don't want to be a bother, but I would love to get an in-depth explanation of this magic if possible: https://nme.babylonjs.com/#01HFES#76 and https://nme.babylonjs.com/#01HFES#77 (Vertex Output only).

I understand the track positioning and matrix computations in the src code, but the node material math, especially around the vertex output, is giving me nightmares. I (tried to) reverse engineer the math, but would love some guidance to understand it better. For example, where do the vector values used to calculate x, y, z come from, namely Vector2(32.0, 0.50), Vector2(224.0, 0.50), Vector2(32.0, 0.50), Vector2(32.0, 0.50)?
I see a BABYLON.RawTexture.CreateRGBATexture being created to populate the information for the vertex output texture samplers, but I'm still confused about how this influences the output.

Also, a problem I think I could understand better how to solve: on very long tracks I see the segments getting very stretched. In my ignorance I assumed adding more segments when generating the initial mesh would solve this, but it doesn't seem to. I assume I am missing something in the node material.

I know I could ignore this and move along to the gameplay and other neat stuff, but I am stubborn by nature and cannot move forward until I clearly understand the logic behind this. :star_struck:

Again thanks for your amazing work on this PG.

IIRC, these values are used to sample the matrix textures.
32/256 and 224/256 give a U value between 0 and 1.
Make a layout on paper of how the data is laid out in the texture and it will get clearer.
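For example, assuming a 256-texel-wide texture (an assumption based on the numbers above, not taken from the PG source), the X components of those Vector2 values are just texel columns normalized to UV space:

```javascript
// Hypothetical sketch: converting a texel column index into a normalized
// U coordinate for a 256-texel-wide matrix texture. The column indices
// 32 and 224 come from the Vector2 values discussed in this thread.
const TEXTURE_WIDTH = 256;

function texelToU(column) {
    // Normalized U in [0, 1): texel column divided by texture width.
    return column / TEXTURE_WIDTH;
}

console.log(texelToU(32));   // 0.125
console.log(texelToU(224));  // 0.875
```

So Vector2(32.0, 0.50) would mean "sample at U = 32/256, halfway down the texture row", with the 0.50 V component hitting the vertical center of the texel.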

Thanks for the quick response. I don't think I clearly understand the behaviour yet, but I'll keep digging/experimenting :grin:

I want to add extra info to the track in the node material (shader) for behaviours such as jumps. I also want to clearly understand this stuff so I can add 'shortcuts' to the track and some dynamic routing.

Again, thanks for your quick answer. BabylonJS Team FTW!


OK, I think I got it. Feel free to correct me if I am wrong here.

You pass the RGBA texture containing the information for the vertex positioning (PG src computeMatrix); then, in the shader (node material), you extract that data from it, following trackSamplerN from (x/32, y/96, z/160, position/224).

:melting_face:

IIRC it contains orientation and position. But that's the principle, yes.
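A rough sketch of how such a layout could be packed and read back. This is a guess at the principle, not the actual PG code: the 64-texel block size, and the packSegment/sampleAt helpers, are assumptions derived from the sample points 32/96/160/224 sitting at the centers of four equal blocks in a 256-texel row.

```javascript
// Hypothetical layout: one 256-texel RGBA float row per track segment,
// split into four 64-texel blocks (x axis, y axis, z axis, position).
// Sampling at a block's center (32, 96, 160, 224) returns one vec4.
const WIDTH = 256;          // texels per row
const BLOCK = WIDTH / 4;    // 64 texels per vec4 block

// Pack one segment's orientation axes + position into a row of RGBA floats
// (the same kind of data a BABYLON.RawTexture.CreateRGBATexture would hold).
function packSegment(xAxis, yAxis, zAxis, position) {
    const row = new Float32Array(WIDTH * 4); // 4 floats (RGBA) per texel
    [xAxis, yAxis, zAxis, position].forEach((vec4, block) => {
        // Fill the whole block with the same vec4, so any sample inside
        // it (including the center texel) returns that value.
        for (let t = block * BLOCK; t < (block + 1) * BLOCK; t++) {
            row.set(vec4, t * 4);
        }
    });
    return row;
}

// Read one texel back, like the shader's texture sample at U = texel/256.
function sampleAt(row, texel) {
    return Array.from(row.slice(texel * 4, texel * 4 + 4));
}

const row = packSegment([1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [5, 2, -3, 1]);
console.log(sampleAt(row, 32));   // x axis: [1, 0, 0, 0]
console.log(sampleAt(row, 224));  // position: [5, 2, -3, 1]
```

This would explain the stretched segments on long tracks, too: if the texture row count is fixed, more track length per segment just spreads the same data thinner.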


Woooo, this is fun! :smiley:
