#include directives in shader code output from Node Material Editor

I’ve been experimenting with writing shaders in GLSL and have been able to integrate them into my code. I am now having difficulty doing the same with the generated shader output from the Node Material Editor (NME). It seems the several #include statements in the generated code are not being expanded.

I am familiar with C, and so with the use of a preprocessor, but I am not clear on how these statements work. The statement #include<lightFragmentDeclaration>[0..maxSimultaneousLights] is odd, as it adds square-bracket syntax and an extra parameter.

My attempt at integration can be found within this playground. First, I established that I could integrate the Phong Shader from CYOS. Then, I tried to integrate the shader from my NME instance. This fails. The playground includes these details in the comments as well.

To make experimenting in the playground simple, I implemented a conditional on the first variable defined, USE_NME_SHADER: if it is false, the Phong shader from CYOS is used; otherwise, the shader code generated by the NME is used.

I gather that the NME emits these statements to reuse existing code modules for commonly implemented functions, i.e., for modularity. I was able to find the relevant include code on GitHub here. I see code that should explain the use of the #include syntax, but I am not sure where it comes from.

I find that the .fx extension is used for DirectX effect files, but it seems other uses are possible too.

At this stage, I thought it’d be worthwhile to ask. Can anyone shed some light on the .fx format, and perhaps point to documentation for these include statements? I’d like to understand this code (I’m willing to work through it; I understand there will be several hundred lines) and be able to directly integrate the output from the NME into a Babylon.js material. Is there a more direct way to get the playground to run, or to have the NME generate the shader code after the preprocessor has run?

At this stage, as I’m still learning, it’s best to dig all the way down to the code instead of relying completely on systems that hide the complexity. I see lots of code and examples online, and it’d be nice to dive into the nuts and bolts directly. It’d also be nice to see exactly what the NME is doing behind the scenes. I feel that once I better understand this, I’ll be able to use the NME more effectively as well.

Thank you very much.

The shader code generated by the NME can’t be used “as is”: at the very least, you must bind the uniform and sampler variables with the right values.

In your example, u_world, u_viewProjection, u_Diffusecolor, etc. must each be bound to a value, or else they are empty by default. Even the standard matrices must be bound, because, for example, u_world is not the standard name used by Babylon (that would be world). All of this (and much more) is done for you by the NodeMaterial class, so you normally don’t have to worry about it.
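To make that concrete, here is a minimal sketch of what the manual binding could look like. The uniform names come from the NME output discussed above; the ShaderMaterial setters and onBindObservable are standard Babylon facilities, but the shader-store keys and exact uniform types here are assumptions to adapt to your generated code:

```javascript
// Minimal sketch (assumed names): binding the uniforms an NME-generated
// shader declares, since ShaderMaterial won't auto-bind non-standard names.
const mat = new BABYLON.ShaderMaterial("nme", scene,
    { vertex: "nme", fragment: "nme" }, // shaders stored under these keys
    {
        attributes: ["position", "normal", "uv"],
        uniforms: ["u_world", "u_viewProjection", "u_Diffusecolor"]
    });

// Constant uniforms can be set once (use the setter matching the declared type).
mat.setColor3("u_Diffusecolor", new BABYLON.Color3(1, 0, 0));

// Per-mesh/per-frame uniforms are refreshed each time the material is bound.
mat.onBindObservable.add((mesh) => {
    const effect = mat.getEffect();
    effect.setMatrix("u_world", mesh.getWorldMatrix());
    effect.setMatrix("u_viewProjection", mesh.getScene().getTransformMatrix());
});
```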

Even then, it still won’t work, because the ShaderMaterial class is a thin wrapper around raw shader code, and it notably does not deal with lights, even when the light declarations are added to the shader code with #include<__decl__lightFragment>[0..maxSimultaneousLights] and #include<lightFragment>[0..maxSimultaneousLights]. That’s because some JavaScript code still has to run behind the scenes to make those includes work.
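For context, here is a rough sketch of the kind of setup those includes expect from the JavaScript side. The define and uniform names (LIGHT0, HEMILIGHT0, vLightData0, …) follow the {X} pattern visible in lightFragmentDeclaration.fx, but treat the exact list as an assumption to check against the include source:

```javascript
// Rough sketch: the light includes only compile into real code when the
// matching defines are set and the per-light uniforms are declared.
// For a single hemispheric light, roughly:
const mat = new BABYLON.ShaderMaterial("lit", scene,
    { vertex: "lit", fragment: "lit" },
    {
        attributes: ["position", "normal"],
        uniforms: ["world", "viewProjection",
                   // per-light uniforms declared by lightFragmentDeclaration
                   "vLightData0", "vLightDiffuse0", "vLightGround0"],
        defines: ["#define LIGHT0", "#define HEMILIGHT0"]
    });
// Filling vLightData0 & co. with the actual light values every frame is the
// part ShaderMaterial does not do for you -- the built-in materials rely on
// helper code for that.
```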

As an exercise :), I updated your PG to make it work with the NME-generated shader code. You can compare this PG with the previous one to see what has changed.

https://playground.babylonjs.com/#KR5FLI#6

The biggest change is that I had to override the BABYLON.ShaderMaterial.prototype.isReady method to make it handle lights. Note that my changes are not very clean, but they work (the ShaderMaterial class would need more rework to handle lights cleanly).
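For readers skimming the PG, the override follows the usual monkey-patching pattern; this is only the skeleton (the real PG code also prepares the light defines and recreates the effect inside the wrapper):

```javascript
// Skeleton of the override (heavily simplified, not the actual PG code):
// keep a reference to the original method and delegate to it after doing
// the extra light preparation.
const originalIsReady = BABYLON.ShaderMaterial.prototype.isReady;
BABYLON.ShaderMaterial.prototype.isReady = function (mesh, useInstances) {
    // In the PG: compute the light defines for `mesh` here and make sure
    // the effect is rebuilt whenever they change.
    return originalIsReady.call(this, mesh, useInstances);
};
```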

As I said, it was an exercise: the shader code generated by the NME is not really meant to be fed into a ShaderMaterial afterwards… If what you are after is the ability to reuse an NME material in another one, I think some people are working on making that happen inside the NME itself (@Deltakosh will know more about this).

Regarding your other questions:

  • #include<lightFragmentDeclaration>[0..maxSimultaneousLights] means: inject the content of the lightFragmentDeclaration.fx file maxSimultaneousLights times, the first time replacing {X} with 0, then with 1, and so on (look inside lightFragmentDeclaration.fx and you will see {X} all over the place). maxSimultaneousLights is passed in by the JavaScript code when the effect is created; see the sketch after this list.
  • The .fx extension is just the extension Babylon uses for files holding GLSL code; it has no meaning by itself (except maybe for some gulp processing that expects GLSL code to be in .fx files?).
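
To make the expansion concrete, here is a small sketch, not Babylon’s actual preprocessor but an assumed equivalent, of what the [0..n] syntax does:

```javascript
// Sketch of the [0..maxSimultaneousLights] expansion: repeat the include
// body n times, substituting the loop index for every {X}.
function expandIndexedInclude(includeBody, count) {
    let expanded = "";
    for (let i = 0; i < count; i++) {
        expanded += includeBody.replace(/\{X\}/g, String(i)) + "\n";
    }
    return expanded;
}

// e.g. with maxSimultaneousLights = 2:
const body = "#ifdef LIGHT{X}\nuniform vec4 vLightData{X};\n#endif";
console.log(expandIndexedInclude(body, 2));
// -> declares vLightData0 and vLightData1, guarded by LIGHT0 and LIGHT1
```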

Awesome! This is very helpful. I’m glad I asked, as I think I would have spent a lot of time going in the wrong direction.

It’s very useful to know that ShaderMaterial won’t handle lights by itself. Essentially, here was my concern:

I wanted to see about pushing certain limits now and in the future, and needed access to the underlying GLSL code. I was interested in implementing the Game of Life now, and later I wanted to try more advanced things like genetic programming or machine-learning integration. I could see using NodeMaterial for this purpose, but it seemed better to get right into the code. As WebGPU is also coming in the near future, I wanted to get a head start on ‘what’s going on under the hood’.

It would be no good if this worked but didn’t handle lights and shadows, so I was trying to recreate everything.

I was also concerned that the output from the NME might be larger than necessary. I figured that if I knew what I was doing, I could keep only what was needed.

What would be the best path here, then? It seems I could just continue to alter ShaderMaterial's prototype for now, if everything is working. Maybe I should try to help out by writing a few modifications to ShaderMaterial? Is NodeMaterial sufficient for everything, even if I do it all within JavaScript? Are there any major pitfalls or dead ends here? (For instance, I would not have known that lights wouldn’t work if you hadn’t mentioned it.)

I’m still a bit off from the more advanced applications, but I do want to make cool-looking cubes/blocks, and on some of them I’d like to render stylistic cellular automata, so I think I can make do with the playground modifications for now.

Thanks again. The modified playground is quite helpful.

Well, WebGPU is not expected before the end of the year at best, and not in all browsers, so it will take time before it is widely supported. Also, its shader language is different from the GLSL used in WebGL and is not completely defined yet (though knowing GLSL will certainly help when writing shader code for WebGPU).

Note that you also have the CustomMaterial and PBRCustomMaterial materials, which you can use to inject custom code into the existing StandardMaterial and PBRMaterial respectively. You can only inject at certain specific locations, but you can still do a lot of new things with these materials (look them up in the doc and in the playground). And because they are extensions of existing materials, lights do work with them.
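As an illustration, here is a minimal sketch of the injection pattern. It assumes the babylonjs-materials library is loaded (as in the playground) and an existing mesh named box; Fragment_Custom_Diffuse is one of the injection points CustomMaterial exposes, and the variables available at each point are worth checking in the doc:

```javascript
// Sketch: injecting fragment code into StandardMaterial via CustomMaterial.
// Lights and shadows keep working because the rest of the pipeline is the
// regular StandardMaterial.
const mat = new BABYLON.CustomMaterial("checker", scene);

// Override the diffuse color at the dedicated injection point; vPositionW
// (world-space position) is a varying StandardMaterial already provides.
mat.Fragment_Custom_Diffuse(`
    float checker = mod(floor(vPositionW.x) + floor(vPositionW.z), 2.0);
    diffuseColor = mix(diffuseColor, vec3(0.1, 0.1, 0.1), checker);
`);

box.material = mat; // `box` is assumed to be an existing mesh
```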

It seems to generate a lot of code because there is basically a single instruction per line, but in the end, once compiled and optimized by the GPU driver, it should not be much different from something you would have written by hand.

Try the CustomMaterial and PBRCustomMaterial and see if that works for you.

Something to discuss with @Deltakosh / @sebavan I guess.

I would say the major feature you can’t use in the NME (at least not yet; I don’t know if it’s planned, but it seems hard to support) is the “loop” construct. You can work around it (by duplicating nodes) if you know how many iterations you need and the count is fairly low; otherwise it’s not really supported.

Maybe one way to extend an NME shader (for advanced users) would be to have a kind of node into which you could inject your own custom shader code afterwards… I don’t know if/how it would work; it would need some thinking.


For me, ShaderMaterial is freestyle shading :) and should stay that way. There are tons of different ways to handle lights, and supporting them would overcomplicate it. As @Evgeni_Popov said, relying on the Custom materials might be the way to go.

I also quite like the idea of a custom code node, but in this case it might be simpler to create your own node in your code and reuse it in the graph?

BR,


Yes, that’s what I was thinking, but maybe we could provide a NodeMaterialBaseCustomBlock to help in the matter(?)


It would be cool, but it will be hard to define inputs and outputs in a graphical way, so once you’re ready to write code, maybe it’s more flexible to go all in?


Yes, this block would not be used in the graph editor; it would be a base class we would subclass to write our own blocks, and it would provide some help for doing so.

For example, we could devise a more straightforward way to declare the inputs (something like: “I need the output named X from the node named Y”).
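Purely as a thought experiment (none of these names exist yet; NodeMaterialBaseCustomBlock is the hypothetical base class proposed above), such a declaration could look like this:

```javascript
// Entirely hypothetical sketch -- neither NodeMaterialBaseCustomBlock nor
// declareInput exist in Babylon today; this only illustrates declaring
// "the output named X from the node named Y" in a straightforward way.
class MyBlock extends NodeMaterialBaseCustomBlock {
    constructor(name) {
        super(name);
        this.declareInput({ output: "X", fromNode: "Y" });
    }
    // emit this block's custom shader code, with X resolved by the base class
    getCode() {
        return "float result = X * 2.0;";
    }
}
```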


Thanks for all of the input, guys. I’m starting to get a better sense of how to do things and where to put them.