WebGL and '2D Code editor look'

I’d like to recreate, within BabylonJS, the look of code you’d see in an editor, and I’m still wondering how best to go about this. Examples of what I mean include code as seen on CodePen, or here, or here. That ‘2D code look’.

I’ve developed great looking and functional prototypes using HTML elements, but now that I’m using BabylonJS, I’d love to integrate 3D.

I’ve thought of using a DynamicTexture, but the lighting is a problem: I want the text to look flat, like a code editor or a command line, not shaded like a 3D surface. I’ve also thought of overlaying HTML elements atop the canvas that Babylon controls, but I’d like to be able to invoke 3D effects at times, like moving the camera or introducing perspective. In essence: I’d like the full power a WebGL context provides, but with the default look of a normal 2D code editor.
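A minimal sketch of the DynamicTexture idea with the lighting concern side-stepped (assumptions: `BABYLON` is loaded globally and a scene exists; `makeCodePlane`, `lineBaselines`, and all sizes/colors are my own names, not from the thread). Routing the texture through `emissiveTexture` and setting `disableLighting = true` makes the plane self-lit, so it reads as flat 2D regardless of scene lights:

```javascript
// Pure helper: (x, y) text baseline for each line of code on the texture.
function lineBaselines(lineCount, lineHeight = 30, topPad = 40, leftPad = 16) {
  return Array.from({ length: lineCount }, (_, i) => [leftPad, topPad + i * lineHeight]);
}

// Hypothetical helper sketching the "unlit DynamicTexture" approach.
function makeCodePlane(scene, codeLines) {
  const tex = new BABYLON.DynamicTexture("code", { width: 1024, height: 512 }, scene, false);
  const ctx = tex.getContext();
  ctx.fillStyle = "#1e1e1e";                 // dark editor background
  ctx.fillRect(0, 0, 1024, 512);
  ctx.font = "24px monospace";
  ctx.fillStyle = "#d4d4d4";
  lineBaselines(codeLines.length).forEach(([x, y], i) => ctx.fillText(codeLines[i], x, y));
  tex.update();

  const mat = new BABYLON.StandardMaterial("codeMat", scene);
  mat.emissiveTexture = tex;    // texture is self-lit
  mat.disableLighting = true;   // ignore scene lights entirely -> flat, 2D look

  const plane = BABYLON.MeshBuilder.CreatePlane("codePlane", { width: 4, height: 2 }, scene);
  plane.material = mat;
  return plane;
}
```

Because the material is emissive-only, you can still move the camera, animate the mesh, or spawn particles around it while the "editor" face stays uniformly shaded.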

To give an idea: I’d like it to be possible for it to look like someone’s coding away inside their IDE, and then a cube lifts up some of the text, or it bursts into particles, etc.

I’ve briefly looked at the GUI extensions. I’d prefer to keep things within the main BabylonJS code base using meshes and materials (or other features of the main branch) that I can control. If I have to make a custom shader, I’ll learn to do that.

Any ideas would be most appreciated.

Edited:
Looking at this playground regarding creating custom meshes, I guess I could create a custom mesh that’s rectangular, specify the normals so I get a homogeneous lighting effect, and use a DynamicTexture as needed to fill in the text. This would be a simple solution; it seems like it would work, but I’m not sure yet (still learning).
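The custom-mesh variant above could be sketched like this (assumptions: `buildFlatQuadData` is my own name; the exact winding and normal direction depend on your camera setup). Giving all four vertices the identical normal is what produces the homogeneous shading:

```javascript
// Vertex data for a rectangular quad whose vertices all share one normal,
// so any lighting falls uniformly across the face.
function buildFlatQuadData(width, height) {
  const w = width / 2, h = height / 2;
  return {
    positions: [-w, -h, 0,   w, -h, 0,   w, h, 0,   -w, h, 0],
    normals:   [ 0,  0, -1,  0,  0, -1,  0, 0, -1,   0, 0, -1], // identical normals
    uvs:       [ 0,  0,      1,  0,      1, 1,       0, 1],
    indices:   [ 0,  1, 2,   0,  2, 3],
  };
}

// Applying it inside a scene (sketch):
// const mesh = new BABYLON.Mesh("codeQuad", scene);
// const vd = new BABYLON.VertexData();
// Object.assign(vd, buildFlatQuadData(4, 2));
// vd.applyToMesh(mesh);
// mesh.material = someMaterialWithDynamicTexture; // text drawn via DynamicTexture
```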

It is worth a try :wink:


Alright! I’ll be giving this approach a try first. If it works halfway well, I’ll create a playground for demonstration and for requests on improvements.


I thought for a moment you were working on some sort of VR IDE :o


That’s something I do have in mind! I mean, think about it: look 50 years into the future…will we still be coding in little 2D boxes? Holographic and VR/AR interfaces will provide greater expressive capability and will begin to make an impact. Further out still, we should have neural interfaces: first indirect (EEG), and then direct (e.g., something like Neuralink). BabylonJS already supports WebGPU, and our code will continue to run in VR/AR environments with little to no modification. So, experimenting with a VR IDE and other future experiences is definitely something we can start thinking about now.

It’s a very exciting time for the web and all technology.


I am 100% looking forward to that.

I just watched a GabeN interview and he mentioned he has been largely working on human-brain interfaces :o