yes but not for 4.1. I want to build the same system on top of Babylon.js actions
Hi DK and team. I have some curiosity about save/load and possible “packaging” used on these sharable materials. Any info about how THAT works… is welcome.
Fellow forum users… it is easy to miss the “when you click on edit” statement at 1:25 into the video. The edit button is easy to see in the playground, but DK’s pointer/action (on the EDIT button, at around 1:25) is blocked by the lower-right inset-video… within the main video. Sometimes I miss the obvious stuff, eh?
That big “edit” button is located in the Configuration group… on playground #1 Inspector panel (when a node material is selected, automatically done in THIS playground). Click on that edit button, and it will launch the NME (node-material editor). YAY! Notice that lines 5-60 of that playground… is… what one could call… NME Scripting Language v0.1, eh? This is why saving/loading NME configs… seems interesting to me.
Once launched, you can drag any of MANY nodes in the left column/sidebar… onto the NME “canvas”. FUN! Powerful. Today was my first-ever launch of the NME. My dog is wagging its tail quite vigorously. MY tail is doing the same.
Sidenote: Should the Scene Explorer close… when that playground is re-run? shrug.
The save and load use a classic JSON file format with the parameters and texture data embedded. So they are independent files that can be shared over the internet. I also plan to have a standalone version of the NME where you can save/load just using the URL, like the playground
Also please note that you can load the json file directly from your code with NodeMaterial.loadAsync
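To picture what "a classic JSON file with texture data embedded" can look like, here is a toy save/load round trip. The property names and structure below are purely illustrative — they are NOT Babylon.js's actual node-material serialization schema, just a sketch of the self-contained-file idea:

```javascript
// Toy sketch of a "parameters + embedded texture" JSON round trip.
// Property names are illustrative only — NOT Babylon.js's real
// node-material serialization schema.
const material = {
  name: "myNodeMaterial",
  blocks: [
    { type: "InputBlock", name: "position", mode: "attribute" },
    { type: "InputBlock", name: "worldViewProjection", mode: "systemValue" }
  ],
  textures: {
    // Texture bytes embedded as a base64 data URL, so the saved file
    // is one independent, shareable artifact.
    diffuse: "data:image/png;base64," + Buffer.from([1, 2, 3]).toString("base64")
  }
};

// "Save": a single JSON string that can be shared over the internet.
const saved = JSON.stringify(material);

// "Load": parse it back and recover the embedded texture bytes.
const loaded = JSON.parse(saved);
const b64 = loaded.textures.diffuse.split(",")[1];
const bytes = Buffer.from(b64, "base64");
```

Because everything (including textures) rides along inside the one JSON document, no side files need to accompany it.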
I will work on the Doc soon
The beginning of the doc:
why the choice of “setAsWellKnownValue” as the method name? Wouldn’t it be like VaryingValue or something similar to glsl convention?
The idea is to identify values known by the system (like constants or globals). Also, I do not want to be too close to glsl, as the overall idea is to provide a simpler way to create shaders without worrying about how shaders work
I agree about the simpler way, but you should use the same nomenclature so that things stay associated once they start digging deeper.
What does Unity call them?
Then we should call them uniforms, which makes no sense for beginners ;)
“wellknownvalue” does not have any meaning to a beginner… and they could not google anything about it either if they were trying to learn. I mean it has no meaning to me and I know what they are, so a beginner is even more going to take it as convention.
It would really be smart to keep the nomenclature associative so that way things are transferable and you don’t have to learn two different naming conventions to understand how to make something from references. It’s still glsl, it’s just node based.
Well I disagree :) They are well-known values of the system and different from varyings. They are uniforms that the system can fill for you.
Again, think that the audience is not only developers but also designers, or people who may have used other node editors like the ones in Unity or Unreal.
Also then I must change everything as the system uses inputs that could be attributes or uniforms. Uniforms can be either user defined or well known values.
I’m sorry but the philosophy of the node editor is not to be on par with glsl. It could generate WHLSL or SPIR-V in the future for WebGPU, for instance.
The goal is to provide a tool to describe vertex and fragment shaders in a simpler way
From the API standpoint we have:
- input.setAsAttribute which gets its value from a mesh attribute (like position or normal)
- input.value=… to directly set a user value (into a uniform)
- so it seemed that input.setAsWellKnownValue(enum) made sense to set the input (again, as a uniform and not a varying) to a system well-known value (like the world matrix or camera position)
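To make the three input modes above concrete, here is a toy model of the concept. This is a sketch only — `ToyInputBlock` is a made-up class for illustration, NOT Babylon.js's real InputBlock implementation:

```javascript
// Toy model of the three input modes described in the thread —
// a made-up class for illustration, NOT Babylon.js's real InputBlock.
class ToyInputBlock {
  constructor(name) {
    this.name = name;
    this.mode = "uniform"; // default: a user-defined uniform
    this.value = undefined;
  }
  // Mode 1: value comes from a mesh attribute (e.g. "position", "normal").
  setAsAttribute(attributeName) {
    this.mode = "attribute";
    this.source = attributeName;
  }
  // Mode 3: a uniform the system fills for you (e.g. the world matrix) —
  // the "well-known value" (later "system value") discussed above.
  setAsWellKnownValue(wellKnownName) {
    this.mode = "wellKnownValue";
    this.source = wellKnownName;
  }
}

const position = new ToyInputBlock("position");
position.setAsAttribute("position");       // fed by mesh geometry

const world = new ToyInputBlock("world");
world.setAsWellKnownValue("WorldMatrix");  // filled by the system

const scale = new ToyInputBlock("scale");
scale.value = 2.0;                         // Mode 2: plain user uniform
```

The point of the model: all three modes are inputs; two of them happen to be uniforms under the hood, and only one of those is filled automatically by the system.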
On first reading I thought a well-known value was one I should know well but didn’t. Just some possible suggestions (if WellKnownValue is up for debate): GlobalValue, SystemValue, SystemKnownValue, or even BJSValue, BJSKnownValue
I like System Value as well. I could change setAsWellKnownValue to setAsSystemValue if this helps @Pryme8 ?
I’m down for whatever as long as we have the documentation to explain it all and make the correct associations!
I just don’t want to scare off newbies with the thought of having to learn multiple different vocabularies.
So as long as we have the information for them to be taught correctly I’m all for whatever everyone else agrees on.
Completely aligned. I’ll rename the function to the System values one.
Also I will spend time on making sure the Doc is clear (and I will also do more videos so beginners can see it live)
I really believe this is a cool tool to bring more people to babylonjs
Hello, how can VS Code show tips (IntelliSense) for Babylon.js? I don’t know what to do.
Add this to the first line of your JS file:
/// <reference path="../engine/babylon.d.ts" />
Node Material Editor… yuh yuh yuh. The “NME”.
WAY BACK… when this thread came active… and we read some info… and saw some simple demos… I said… “ooookay, this looks like an interesting way to make some complex materials.”
What is a material? These days, or maybe always, it’s a shader. Works for me, I don’t need to know much more than that. Tell me how to activate it, and I’ll gladly use it sometimes.
But then, recently, @PirateJC kindly and informatively produced some Node Material demos/videos… and he has vertex positionKind data… moving all over hell, using complex formulas… and… well…
…is that still a “material”? Is it… a “Geometry Shader”? What the heck is going-on, here, guys?
I fired-up a playground of this NME-driven position-kind animator… that PirateJC showed us in part 2 of his “Mystery Demo” series. https://playground.babylonjs.com/#4I3SIR#15 NME version here: https://nme.babylonjs.com/#QHB2ME#1
Gerstner waves. More than just a sine wave. Added Gerstnerization! Gerstnerification? Gerstnerity? Trochoidalism? hmm.
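For the curious: a Gerstner wave differs from a plain sine wave by also pushing each vertex horizontally, which sharpens the crests and flattens the troughs. A minimal 1-D sketch of that idea (the amplitude, steepness, wavenumber, and frequency values here are made up for illustration, not taken from PirateJC's demo):

```javascript
// Minimal 1-D Gerstner wave: each rest position x0 is displaced both
// horizontally and vertically, unlike a plain sine wave (vertical only).
// Parameter values are illustrative, not from the Mystery Demo.
function gerstner(x0, t, { A = 1, Q = 0.5, k = 1, w = 1 } = {}) {
  const phase = k * x0 - w * t;
  return {
    x: x0 + Q * A * Math.cos(phase), // horizontal push toward the crest
    y: A * Math.sin(phase),          // the familiar sine-wave height
  };
}

const p = gerstner(0, 0); // rest position 0, time 0
// p.x === 0.5 (pushed sideways by the steepness term Q), p.y === 0
```

Setting Q to 0 degenerates it back to an ordinary sine wave, which is one way to see what the "Gerstnerization" actually adds.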
So, if I were to monitor the positionKind vertexData for that animated ground… would I see it changing world-space locations?
Or… am I viewing a GPU-caused shader-made “optical illusion”?
PositionKind animating with a MATERIAL editor… how goofy is THAT? Or is it?
I can’t tell what’s an animation and what’s a simulated animation… anymore. Am I alone?
So, who has the goods on what’s shakin’? Do tell. If I added a meshImpostor to that wavy ground, and tossed a physics-active sphere out there, would the sphere even KNOW that waves were happening, or would it just lay there, limp and flaccid… saying “I don’t feel any waves. Are you sure they’re real?”
Who is Chief Puppy Handler for the NME? Some of us NME puppies could use a little paw-holding and belly rubbing. To us, these “materials” seem to have ventured outside-of their native definition of “What IS a material”, don’t ya think? I’m SO confused. HELP! I might need diagrams. Do we have EZ-Flowchart for GUI 2d, yet? hmm. I need NME-like drag’n’wire in the playground… to make diagrams.
Then we’ll add that to playground-based tutorials (PBT) systems… as an “active diagrams” feature. YEAH! NME for GUI 2d! yuh! hah
@Wingnut I totally feel ya on the confusion.
I am very new to the world of advanced materials and shaders and have been falling further and further down the rabbit hole ever since I first started.
I can only give you my own personal perspective, which is by no means a representation of everyone. Before diving deep into NME, I understood materials to be a combination of things that generally changed the “look” of a mesh. Shadows, Normals, Diffuse, AO…stuff like that. It wasn’t until unraveling this beast that the whole world of animated materials was opened up to me. I suddenly realized that there is an entirely new world of advanced “materials” out there, and that my previous perspective/idea about what a material was, was limited.
Now the truth is that the definition of a material is likely different in the minds of different folks. For myself, my definition of what a material is has advanced quite a bit. It’s a combination of math that can be done to change the position and look of a mesh! This is something that blew my mind, but I find incredibly exciting (as you can likely tell from my enthusiasm for the subject in tutorial videos.)
To your question…are the physical properties of the mesh actually changing in world space? Or is it just an optical illusion? That’s a great question! A FANTASTIC question! Again, seeking guidance from @sebavan on this one, but from everything I’ve understood so far…geometry shaders and vertex shaders (we only have vertex shaders in WebGL land) both actually move vertices of meshes in world space…they are NOT optical illusions. Rather than moving vertices through rigging, animation, physics, or fx, in the case of these shaders, we’re moving vertices by giving mathematical operations to the GPU to move the vertices for us! How cool is that???
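One way to picture Wingnut’s “monitor the positionKind vertexData” question: a vertex shader computes the displaced positions fresh each frame, on the GPU, from the original buffer — and nothing is written back to the CPU-side vertex data. Here is a toy sketch of that one-way flow, with plain JS standing in for the shader (this is a conceptual model, not Babylon internals):

```javascript
// Toy sketch: a "vertex shader" derives displaced positions per frame
// from the original buffer, without ever mutating that buffer — which is
// why CPU-side readers (like a physics impostor) would see no waves.
const positions = [0, 1, 2, 3]; // CPU-side rest positions (1-D for brevity)

function renderFrame(t) {
  // GPU-style: compute displaced positions from scratch each frame;
  // the source array is read-only here.
  return positions.map(x => x + Math.sin(x + t));
}

const frame1 = renderFrame(0.5);
const frame2 = renderFrame(1.0);
// The two frames differ (the waves animate on screen),
// yet `positions` still holds the original [0, 1, 2, 3].
```

So both things can be true at once: the rendered geometry genuinely moves, while anything reading the stored positionKind buffer sees a flat, unchanged mesh.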
So now the real question…what is the difference between a Material, and a Shader?
I found this article: terminology - Difference between Material and Shader - Game Development Stack Exchange
which has this definition in it that I find quite helpful in understanding the space a little more:
"A material is what you apply to geometry to give it a colour and pattern. A texture is a component of a material.
A shader is a small program that allows this material to be rendered at runtime. The nice thing about shaders is that you can do everything from simply rendering the material, to adding dynamic effects like specular highlights and reflections, all the way up to extremely clever things such as rendering fake holes through walls where a bullet has hit."
I hope this is helpful in some way. I have to admit my knowledge is still very entry-level on the subject, but I think you raise some really great questions! Thanks for bringing them up! Again, hopefully @sebavan can help us more deeply understand and correct anything that I’ve misrepresented!