(Sorry for the long title; Discourse was giving me issues with using just the function names.)
Here is the playground: Babylon.js Playground
It looks like the standard `CreateScreenshot` function works for capturing the fullscreen UI; however, `CreateScreenshotUsingRenderTarget` does not. Perhaps I am missing something, but it would be nice to be able to use the anti-aliasing option along with a few of the others.
The `CreateScreenshot` function seems to produce pretty poor results for me, regardless of the image size. Is there a better way to produce an image of what is actually shown to the user?
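For context, here is a hedged sketch of how the two helpers are typically called (the sizes, sample count, and the `engine`/`camera` variable names are my assumptions, not from the thread):

```javascript
// Sketch: the two Babylon.js screenshot helpers side by side.
// Assumes an existing scene with `engine` and `camera`.
function takeScreenshots(engine, camera) {
  // Canvas capture: includes the fullscreen GUI layer, but quality is
  // limited by the current canvas resolution.
  BABYLON.Tools.CreateScreenshot(engine, camera, { width: 1920, height: 1080 });

  // Render-target capture: re-renders the scene at the requested size,
  // so it supports higher resolutions, MSAA samples, and antialiasing,
  // but it does not capture the fullscreen GUI layer.
  BABYLON.Tools.CreateScreenshotUsingRenderTarget(
    engine,
    camera,
    { width: 3840, height: 2160 },
    undefined,    // success callback, receives a base64 data URL
    "image/png",  // mime type
    8,            // MSAA sample count
    true          // apply an antialiasing post-process
  );
}

// Only invoke when Babylon.js is actually loaded (e.g. in a playground).
if (typeof BABYLON !== "undefined" &&
    typeof engine !== "undefined" &&
    typeof camera !== "undefined") {
  takeScreenshots(engine, camera);
}
```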
Also, as a side note, is it possible to use a DynamicTexture on a mesh to create text that maintains its on-screen size (like the 2D UI)?
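On the side question: the fullscreen GUI stays a fixed pixel size because it is drawn in screen space, while a DynamicTexture lives on a mesh in world space. One workaround (my sketch, not an official feature) is to rescale the text plane every frame based on its distance from the camera:

```javascript
// Sketch: keep mesh-attached text at a constant on-screen size by rescaling
// the text plane every frame. The math is standard perspective projection;
// `scene`, `camera`, `engine`, and `textPlane` are assumed to exist.

// World-space height a plane needs to cover `desiredPx` pixels on screen,
// given its distance from the camera, the vertical FOV (radians), and the
// viewport height in pixels.
function worldHeightForPixels(distance, fovY, viewportHeightPx, desiredPx) {
  // total world-space height visible at that depth
  const frustumHeight = 2 * distance * Math.tan(fovY / 2);
  return frustumHeight * (desiredPx / viewportHeightPx);
}

// In a Babylon.js scene, apply it before each render:
if (typeof BABYLON !== "undefined" && typeof scene !== "undefined") {
  scene.onBeforeRenderObservable.add(() => {
    const dist = BABYLON.Vector3.Distance(camera.position, textPlane.position);
    const h = worldHeightForPixels(dist, camera.fov, engine.getRenderHeight(), 64);
    textPlane.scaling.setAll(h); // assumes a unit-sized plane
  });
}
```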
Sorry to bring this back up, but I have started looking at this again.
Now that I know `CreateScreenshotUsingRenderTarget` is better in terms of quality, I am trying to get it to work with my GUI elements.
I have taken your suggestion of adding the elements to a plane; however, I am struggling to get the plane with the GUI material to look like a close representation of the plain GUI.
Rats! You are right.
Well, this is where the Node Material could be super useful: you could create a vertex shader that generates the quad in screen space directly.
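Roughly, that idea could be sketched like this (an untested illustration on my part; block and input names are from my reading of the NodeMaterial API and may need adjusting):

```javascript
// Sketch: a NodeMaterial vertex path that skips world/view/projection and
// emits the quad's positions directly in clip space ([-1, 1] range), so the
// quad always covers the same screen region. Assumes `scene` exists.
function buildScreenSpaceQuadMaterial(scene) {
  const mat = new BABYLON.NodeMaterial("screenSpaceQuad", scene);

  // quad positions, expected to already be in clip-space [-1, 1]
  const position = new BABYLON.InputBlock("position");
  position.setAsAttribute("position");

  // assemble vec4(x, y, z, 1) with no camera transform
  const merger = new BABYLON.VectorMergerBlock("merger");
  position.output.connectTo(merger.xyzIn);
  const w = new BABYLON.InputBlock("w");
  w.value = 1;
  w.output.connectTo(merger.w);

  const vertexOutput = new BABYLON.VertexOutputBlock("vertexOutput");
  merger.xyzw.connectTo(vertexOutput.vector);

  // minimal fragment side: flat white (replace with a texture as needed)
  const color = new BABYLON.InputBlock("color");
  color.value = new BABYLON.Color4(1, 1, 1, 1);
  const fragmentOutput = new BABYLON.FragmentOutputBlock("fragmentOutput");
  color.output.connectTo(fragmentOutput.rgba);

  mat.addOutputNode(vertexOutput);
  mat.addOutputNode(fragmentOutput);
  mat.build();
  return mat;
}
```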
You wouldn’t happen to have a simple example that does that in a playground, would you?
I watched the YouTube video you just put out today, but since I am creating the GUI on the fly, I am unsure how I will be able to convert it to a shader.
```javascript
var Multiply1 = new BABYLON.MultiplyBlock("Multiply1");
Texture.rgba.connectTo(Multiply1.left);
var fragmentOutput = new BABYLON.FragmentOutputBlock("fragmentOutput");
Multiply1.output.connectTo(fragmentOutput.rgba);
nodeMaterial.addOutputNode(vertexOutput);
nodeMaterial.addOutputNode(fragmentOutput);
```
`Multiply1` is only connected on its left input, not its right.
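For reference, one way to satisfy that (assuming `Multiply1` is the block from the snippet above; the white tint is just a placeholder value of my choosing):

```javascript
// Sketch: give Multiply1 a right-hand input so the node graph can build.
// Multiplying by opaque white leaves the texture color unchanged.
function connectMultiplyRight(Multiply1) {
  const tint = new BABYLON.InputBlock("tint");
  tint.value = new BABYLON.Color4(1, 1, 1, 1);
  tint.output.connectTo(Multiply1.right);
  return tint;
}
```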
I am trying to add all of my GUI elements to a texture that is then used in the shader program, so that when calling `CreateScreenshotUsingRenderTarget`, my GUI elements will also be displayed.
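A hedged sketch of that setup: a standalone AdvancedDynamicTexture (which is itself a DynamicTexture, so it can feed a TextureBlock) wired into a complete node material. Control names and sizes are illustrative:

```javascript
// Sketch: draw GUI controls into a standalone AdvancedDynamicTexture and use
// it as the texture of a node material, so meshes using the material show the
// GUI when CreateScreenshotUsingRenderTarget re-renders the scene.
function buildGuiTextureMaterial(scene) {
  // standalone ADT (not the fullscreen layer); it behaves like a texture
  const adt = new BABYLON.GUI.AdvancedDynamicTexture("guiTexture", 1024, 1024, scene);
  const label = new BABYLON.GUI.TextBlock("label", "Hello");
  label.color = "white";
  adt.addControl(label);

  const mat = new BABYLON.NodeMaterial("guiMat", scene);

  // vertex side: standard worldViewProjection transform
  const position = new BABYLON.InputBlock("position");
  position.setAsAttribute("position");
  const wvp = new BABYLON.InputBlock("worldViewProjection");
  wvp.setAsSystemValue(BABYLON.NodeMaterialSystemValues.WorldViewProjection);
  const transform = new BABYLON.TransformBlock("transform");
  position.output.connectTo(transform.vector);
  wvp.output.connectTo(transform.transform);
  const vertexOutput = new BABYLON.VertexOutputBlock("vertexOutput");
  transform.output.connectTo(vertexOutput.vector);

  // fragment side: sample the GUI texture
  const uv = new BABYLON.InputBlock("uv");
  uv.setAsAttribute("uv");
  const guiTexture = new BABYLON.TextureBlock("Texture");
  uv.output.connectTo(guiTexture.uv);
  guiTexture.texture = adt;
  const fragmentOutput = new BABYLON.FragmentOutputBlock("fragmentOutput");
  guiTexture.rgba.connectTo(fragmentOutput.rgba);

  mat.addOutputNode(vertexOutput);
  mat.addOutputNode(fragmentOutput);
  mat.build();
  return mat;
}
```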