As the title suggests, I searched through a lot of the official website's documentation but couldn't find anything covering this. However, I tried it myself:

```javascript
let advancedTexture = BABYLON.GUI.AdvancedDynamicTexture.CreateFullscreenUI("UI", true, scene);
var button1 = BABYLON.GUI.Button.CreateSimpleButton("but1", "Click Me");
```

I can apply a material to this afterwards, but it has no effect.
Sorry, I'm trying to understand your question. The title suggests a custom shader (which is not possible in the 2D GUI), but then you show a simple code sample that creates a button (which is a standard feature of our GUI system), so I am not sure where the custom shader comes into play.
Do you want to create a playground as an example, explaining with running code what you are trying to achieve?
As said above, a PG would be best, but otherwise you could maybe just send an example (link or screenshot). Using a multi-cam approach or working with textures, there's a lot you can achieve. For example, in one of my demos, I wanted a frozen-looking GUI (with an animated frost layer over containers or controls). As for the controls themselves, honestly, I am not a big fan of these (mostly laggy) GUIs that have just too many fancy FX.
I have used the Unity and UE engines before, and recently started working with the Babylon engine. When creating UI in those engines, some dynamic effects could be implemented directly on the UI, such as a sweeping light or an animated flowing glow across the interface. However, I cannot find similar materials or examples in the Babylon.js tutorials, and the official GUI editor does not expose any related functionality.
If pure HTML has to be used to achieve a sweeping-light effect or anything more complex, then the UI component functionality of Babylon.js is indeed lacking.
Yes, well, that's precisely what does not exist in BJS and why you also cannot use a shader on the 2D GUI (layer). It's by design: the GUI layer is rendered separately from the scene in its own layer, and only a post-process from the rendering pipeline can affect it. However, if this is just to achieve a simple FX like the one shown above (a light swipe), it is still achievable (fairly easily). You can do it through animation (cells on an image). You can also 'sandwich' the GUI layer between two other layers, using layerMask and a multi-cam approach, as I did. Building it will just require a little more effort (and be fairly different) from the out-of-the-box UI FX you are accustomed to. You can call it more 'amateurish', more 'cumbersome' or… simply 'more creative'.
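For the image-cells route, something like this (a rough sketch; the sprite sheet "swipe.png" with 16 frames of 256×64 is made up for the example):

```javascript
const advancedTexture = BABYLON.GUI.AdvancedDynamicTexture.CreateFullscreenUI("UI", true, scene);

const button = BABYLON.GUI.Button.CreateSimpleButton("but1", "Click Me");
button.width = "256px";
button.height = "64px";
advancedTexture.addControl(button);

// Overlay image in sprite-sheet mode, stacked on top of the button.
const swipe = new BABYLON.GUI.Image("swipe", "swipe.png");
swipe.width = "256px";
swipe.height = "64px";
swipe.cellWidth = 256;          // size of one frame in the sheet
swipe.cellHeight = 64;
swipe.cellId = 0;
swipe.isHitTestVisible = false; // let clicks go through to the button underneath
advancedTexture.addControl(swipe);

// Step through the cells every few frames to play the swipe animation.
let tick = 0;
scene.onBeforeRenderObservable.add(() => {
    tick++;
    if (tick % 4 === 0) {
        swipe.cellId = (swipe.cellId + 1) % 16;
    }
});
```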
You can also use the texture mode of the GUI, which creates a material for the mesh you apply the GUI to. The material is a simple standard material, but you can create your own material (like a node material) and reuse the emissive texture of the original material (which is the GUI) in your new material:
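For instance, a minimal sketch of this idea (not the actual playground): create the GUI with CreateForMesh, then feed the GUI texture into your own material. The shader below is a deliberately bare pass-through; a node material would work the same way by plugging the texture into its graph.

```javascript
const plane = BABYLON.MeshBuilder.CreatePlane("guiPlane", { size: 2 }, scene);

// CreateForMesh builds a StandardMaterial whose emissiveTexture is the GUI texture.
const adt = BABYLON.GUI.AdvancedDynamicTexture.CreateForMesh(plane, 1024, 1024);
adt.addControl(BABYLON.GUI.Button.CreateSimpleButton("but1", "Click Me"));

// A bare custom shader that just displays the GUI texture; any effect would be added here.
BABYLON.Effect.ShadersStore["guiFxVertexShader"] = `
    precision highp float;
    attribute vec3 position;
    attribute vec2 uv;
    uniform mat4 worldViewProjection;
    varying vec2 vUV;
    void main() {
        vUV = uv;
        gl_Position = worldViewProjection * vec4(position, 1.0);
    }`;
BABYLON.Effect.ShadersStore["guiFxFragmentShader"] = `
    precision highp float;
    varying vec2 vUV;
    uniform sampler2D guiSampler;
    void main() {
        gl_FragColor = texture2D(guiSampler, vUV); // ...apply custom effects here
    }`;

const fxMaterial = new BABYLON.ShaderMaterial("guiFx", scene, { vertex: "guiFx", fragment: "guiFx" }, {
    attributes: ["position", "uv"],
    uniforms: ["worldViewProjection"],
    samplers: ["guiSampler"],
    needAlphaBlending: true,
});
// Reuse the GUI texture (the same object as plane.material.emissiveTexture) in the new material.
fxMaterial.setTexture("guiSampler", adt);
plane.material = fxMaterial;
```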
Just wanted to add a note: this will manipulate the entire texture (i.e. the entire GUI). To achieve the effect you pasted before, you would need to generate a GUI texture per GUI element. It is not possible to manipulate a single button in the GUI hierarchy this way.
It's possible to apply different effects to different controls in a single UI, if you know the location of the controls in the GUI and convert their coordinates/dimensions to values between 0 and 1 that you can use in the fragment shader. Of course, it's a bit cumbersome, but it's technically possible:
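A minimal sketch of that coordinate conversion (the names getControlRect01 and controlRect are made up for the example, not Babylon.js APIs), assuming a control button1, an advancedTexture, and the Effect that runs the shader are already in place:

```javascript
// Convert a control's placement in the advanced texture to a 0..1 rectangle.
function getControlRect01(control, advancedTexture) {
    const size = advancedTexture.getSize();
    const halfW = control.widthInPixels / 2;
    const halfH = control.heightInPixels / 2;
    return {
        minX: (control.centerX - halfW) / size.width,
        maxX: (control.centerX + halfW) / size.width,
        minY: (control.centerY - halfH) / size.height,
        maxY: (control.centerY + halfH) / size.height,
    };
}

// Push the rect to the shader each frame (controls may move or resize).
// "effect" is whatever Effect runs the shader (e.g. inside a post-process onApply callback),
// and "controlRect" is a vec4 uniform that shader must declare.
// Note: GUI coordinates start at the top-left while post-process UVs start at the
// bottom-left, so the Y values may need flipping depending on how the texture is sampled.
scene.onBeforeRenderObservable.add(() => {
    const r = getControlRect01(button1, advancedTexture);
    effect.setFloat4("controlRect", r.minX, r.minY, r.maxX, r.maxY);
});
```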
Oh, that's starting to become really interesting. I understood the remark from @RaananW when I actually attempted to move the control: the FX no longer displayed the same way. I noticed it when making a 'quick and dirty' transpose to use it as an FS GUI (since I suppose this is the final goal). I was wondering why the poster had not asked about this part yet.
Would you say this should be done using multi-cam and layerMask (and some sort of adaptive scaling to resize the plane and ADT resolution on canvas resize? How exactly?) And in that case, how would you translate the coordinates for NME?
I was thinking more along the lines of applying a post-process to the GUI layer, to apply all the effects at once. This means that the GUI layer should not be applied by the system (the layer component) but used as input for a post-process, which would blend the result onto the screen.
Something like:
To disable default rendering, I set advancedTexture.layer.layerMask = 0, but this also disables GUI texture refresh. That’s why I have to do this hack on lines 133-136, so that the texture is updated before post-process rendering.
We could make something quite generic this way, the pain point being that all the effects must be implemented “by hand” in the post-process…
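In outline, the wiring looks something like this (a rough sketch, not the playground itself; the names guiBlend and guiSampler are made up):

```javascript
// Fragment shader of the blending post-process: scene + GUI, with room for custom FX.
BABYLON.Effect.ShadersStore["guiBlendFragmentShader"] = `
    precision highp float;
    varying vec2 vUV;
    uniform sampler2D textureSampler; // the rendered scene
    uniform sampler2D guiSampler;     // the GUI advanced texture
    void main() {
        vec4 sceneColor = texture2D(textureSampler, vUV);
        vec4 gui = texture2D(guiSampler, vUV); // the V coordinate may need flipping
        // ...apply any custom effect to "gui" here (sweep, glow, distortion, ...)
        gl_FragColor = mix(sceneColor, vec4(gui.rgb, 1.0), gui.a); // alpha-blend GUI over scene
    }`;

const advancedTexture = BABYLON.GUI.AdvancedDynamicTexture.CreateFullscreenUI("UI", true, scene);
advancedTexture.layer.layerMask = 0; // stop the layer component from drawing the GUI itself
// (as noted above, masking the layer also stops the automatic refresh, so a small
// workaround is needed to keep the texture updated before the post-process runs)

const guiBlend = new BABYLON.PostProcess(
    "guiBlend",       // name
    "guiBlend",       // resolves to guiBlendFragmentShader in ShadersStore
    [],               // uniforms
    ["guiSampler"],   // extra samplers
    1.0,              // full-size
    camera);
guiBlend.onApply = (effect) => {
    effect.setTexture("guiSampler", advancedTexture);
};
```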
Wow, you are my hero, man. I think I told you that already… This is just so incredibly cool. I'm speechless. Bookmarked already. As said, I'm not so much into using fancy FX in the GUI, but being able to do this with the FS GUI is something that needs to be told and showcased.
Just because you don't like it doesn't mean others won't use it. As a tool, you can choose not to use a feature, but you can't do without it being available; otherwise, as a tool, it would be a failure.
For example, say I want a sweeping effect on multiple buttons using only a custom pipeline. Then it works like a program: just supply the sweep image and your own speed and direction, and swap the underlying texture and sweep mask in different places, which is very efficient. But with your approach, every button needs its own corresponding frame-sequence (sprite-sheet) texture. What if the customer asks for a change? You have to remake the frame-sequence texture, and performance on the hardware cannot be guaranteed.
Yes, you can add effects to controls displayed in a GUI with this method (see my PG), but you would have to write the effects in GLSL directly inside the post-process shader code (see lines 16-34 of my PG).
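For illustration, a hypothetical sweep written by hand in that shader could look like this (controlRect and time are made-up uniforms that the JavaScript side must declare in the post-process parameters and update every frame, e.g. with the rect helper sketched earlier):

```javascript
BABYLON.Effect.ShadersStore["guiBlendFragmentShader"] = `
    precision highp float;
    varying vec2 vUV;
    uniform sampler2D textureSampler;
    uniform sampler2D guiSampler;
    uniform vec4 controlRect; // minX, minY, maxX, maxY of the target control, in 0..1
    uniform float time;       // seconds, updated every frame from JavaScript
    void main() {
        vec4 sceneColor = texture2D(textureSampler, vUV);
        vec4 gui = texture2D(guiSampler, vUV);
        // Brighten a moving vertical band, but only inside the control's rectangle.
        if (vUV.x >= controlRect.x && vUV.x <= controlRect.z &&
            vUV.y >= controlRect.y && vUV.y <= controlRect.w) {
            float localX = (vUV.x - controlRect.x) / (controlRect.z - controlRect.x);
            float band = 1.0 - smoothstep(0.0, 0.15, abs(localX - fract(time * 0.5)));
            gui.rgb += band * gui.a; // only lighten where the GUI is actually drawn
        }
        gl_FragColor = mix(sceneColor, vec4(gui.rgb, 1.0), gui.a);
    }`;

// The post-process must now list these uniforms, e.g.:
// new BABYLON.PostProcess("guiBlend", "guiBlend", ["controlRect", "time"], ["guiSampler"], 1.0, camera);
```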
Absolutely. You're totally right there (and I also mentioned it would be great if it could be achieved). As I said, this is my personal opinion only (in terms of design and as a user/gamer).
Well, that's the part I would actually disagree with (sorry). I don't think you can call this 'a failure'. It's rather a design choice and a matter of where to put the (limited) effort and resources to deliver the best possible experience according to a vision and objectives.
Yes, indeed, I understand that. But then, let me put it this way. Just before, you said something like "It's not because I don't like it that it shouldn't be there". But if I follow your way, now you are also asking for 'just' the sweep-light FX YOU need! What about all the others? What if some other user wants 'just' a venetian-blind FX, an explode FX, a whatever FX (copy/paste here the list from your favorite app: PSD, AE, Unity, whatever)? That would mean we (the team or the community) would need to implement all of these FX 'out of the box', and of course with all the fancy properties that come with them.
Fine, I would be very happy to have all of this. But the reality is that it also comes with a cost.
Maybe you haven't noticed yet, but we don't have the same resources as the tools you are referring to (sadly… at least not at this moment).
But… (and that's MY big 'but') we do have a strong 'base' with this framework (so I think). Something that's likely more versatile and more customizable than ANY of the 'competitors' you mention (again, my opinion only). Only very rarely is there absolutely no solution for a problem, though I do agree that the solution sometimes requires more effort and/or a compromise. But a world without effort and compromise is a normalized and restrictive world, where we'd all be put 'in a box' and, in terms of design, all use the same predefined FX… Personally, I find there are already enough of these 'normalize-into-a-box' companies and visions today. I'd rather not add BJS to the list. Again, as said, my opinion only.
In conclusion, I sure hope you’ll find a suitable (acceptable) solution for your use case… and meanwhile, let me wish you a great weekend
You can wrap an advanced texture into a GUIWithEffects instance and just call setControlEffect(control, effectNumber) to set the effect you want on a control.
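A usage sketch based on that description; GUIWithEffects is the wrapper class from the playground, not a built-in Babylon.js API, and the constructor arguments and effect numbering below are assumptions:

```javascript
const advancedTexture = BABYLON.GUI.AdvancedDynamicTexture.CreateFullscreenUI("UI", true, scene);

const button1 = BABYLON.GUI.Button.CreateSimpleButton("but1", "Click Me");
advancedTexture.addControl(button1);

// Wrap the advanced texture and pick an effect per control.
// (Constructor signature and effect numbers are assumed here; check the playground.)
const gui = new GUIWithEffects(advancedTexture, scene, camera);
gui.setControlEffect(button1, 1); // e.g. effect #1 = the sweep light, only on this button
```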