Introducing Polymorph: An Open Discussion

I have no idea what this is!
The article was all Greek to me.


We did some work at @naker on this subject, so I wanted to share with you how we manage the pipeline. You can see the code here: Naker Compression

The idea is to compress the model we receive in our editor, which can be huge (a creation asset), and turn it into a more web-friendly model (a consumption asset => I like those names!)

For that we use imagemin - npm to compress textures; on average we save 80% of the file size without losing quality.
Then we use gltf-pipeline - npm and obj2gltf - npm to end up with a single glb file, because we think this is easier to manage.
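The flow above could be sketched, in very rough form, as a chain of async transforms. The step functions here are stand-ins for illustration only, not the real imagemin/obj2gltf/gltf-pipeline APIs:

```javascript
// Sketch of a creation-asset -> consumption-asset pipeline as a chain of
// async steps. The real steps would call imagemin, obj2gltf, and
// gltf-pipeline; here they are stand-in functions so the flow is clear.

// Compose async steps left to right: each receives the previous result.
function chain(...steps) {
  return async (input) => {
    let result = input;
    for (const step of steps) {
      result = await step(result);
    }
    return result;
  };
}

// Stand-ins for the real tools (assumed shapes, not the actual APIs).
const compressTextures = async (asset) =>
  ({ ...asset, textureBytes: Math.round(asset.textureBytes * 0.2) }); // ~80% savings
const convertToGltf = async (asset) => ({ ...asset, format: 'gltf' });
const packToGlb = async (asset) => ({ ...asset, format: 'glb' });

const toConsumptionAsset = chain(compressTextures, convertToGltf, packToGlb);
```

With these stand-ins, `toConsumptionAsset({ format: 'obj', textureBytes: 1000 })` resolves to a glb-format asset carrying roughly 20% of the original texture bytes.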

We thought about adding a model-compression step using Draco, but we didn’t have the time to implement it. We were also very enthusiastic and met with the folks from Slight3D (Slight3D on Vimeo), but the project doesn’t seem to exist anymore, which is a shame.

Completely agree with the choice of C++
Hope this will help in your thinking :wink:


But then how is all this going to work? It’s all fine and dandy talking about it, but how are we going to actually take the first steps?

I get that at this point it’s theory, and that’s the point of the discussion. But isn’t stuff like this decided by market share and acceptance by the user base once there is something to grab onto?

If we are talking about shaping industry standards, then it’s going to take getting the industry on board. Maybe I missed some of the message, but it all seems very hypothetical at this point.






If your intent is to have this workflow run in a browser, then it seems like a ‘step’ in the process could be implemented in C++ or JS, as long as at least a C++ header was written for any step written in JS.

Also, speed is overrated for a back-office tool. I am working on a back-office tool that needs to perform a short-time Fourier transform, let me edit the result, then run an inverse to get the modified input back. The biggest problem is integrating it into JS, where the rest of the larger tool is. I do not care how long it takes. If it does its job, I can pull off something in real time which I could never do directly.
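As a toy illustration of the point that correctness matters more than speed in such a tool, here is a naive O(n²) discrete Fourier transform and its inverse. A real short-time FT would also window and hop over the signal, but the round-trip idea (transform, edit, inverse) is the same:

```javascript
// Naive discrete Fourier transform of a real-valued signal.
// Deliberately the slow O(n^2) formulation: fine for a dev tool.
function dft(signal) {
  const n = signal.length;
  const re = new Array(n).fill(0);
  const im = new Array(n).fill(0);
  for (let k = 0; k < n; k++) {
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      re[k] += signal[t] * Math.cos(angle);
      im[k] += signal[t] * Math.sin(angle);
    }
  }
  return { re, im };
}

// Inverse DFT: recovers the (real part of the) original signal.
function idft({ re, im }) {
  const n = re.length;
  const out = new Array(n).fill(0);
  for (let t = 0; t < n; t++) {
    for (let k = 0; k < n; k++) {
      const angle = (2 * Math.PI * k * t) / n;
      out[t] += re[k] * Math.cos(angle) - im[k] * Math.sin(angle);
    }
    out[t] /= n;
  }
  return out;
}
```

Slow, but `idft(dft(signal))` returns the input to within floating-point error, which is all a back-office tool needs.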

Another way of saying it: speed really counts in the product, not in a dev tool.


I read through the document posted by @PirateJC and could not help getting the feeling I was listening to a politician, seeing words like “consumption experiences” and “creation assets (CAD models, etc.) and consumption assets (glTF, etc.)”.

Then I read further, to “Alice and Bob need mesh decimation”, and wondered how that might work. There are a couple of decimators that I use when necessary: Instant Meshes and MeshLab. The latter has all kinds of options that involve two menus and a popup box. I wonder how the proposed decimator will work?
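Just to make the discussion concrete, here is a toy decimator based on uniform vertex clustering. Real tools like MeshLab and Instant Meshes are far more sophisticated, and nothing here reflects how the proposed decimator would actually work; it only illustrates the kind of step such a pipeline would run:

```javascript
// Toy mesh decimator using uniform vertex clustering: snap every vertex
// to a grid cell, merge vertices that share a cell, and drop triangles
// that collapse into degenerates.
function decimate(positions, indices, cellSize) {
  const cellOf = ([x, y, z]) =>
    `${Math.floor(x / cellSize)},${Math.floor(y / cellSize)},${Math.floor(z / cellSize)}`;

  const cellToNewIndex = new Map();
  const newPositions = [];
  const remap = positions.map((p) => {
    const key = cellOf(p);
    if (!cellToNewIndex.has(key)) {
      cellToNewIndex.set(key, newPositions.length);
      newPositions.push(p); // representative vertex for the cell
    }
    return cellToNewIndex.get(key);
  });

  const newIndices = [];
  for (let i = 0; i < indices.length; i += 3) {
    const a = remap[indices[i]];
    const b = remap[indices[i + 1]];
    const c = remap[indices[i + 2]];
    if (a !== b && b !== c && a !== c) newIndices.push(a, b, c); // keep non-degenerate triangles
  }
  return { positions: newPositions, indices: newIndices };
}
```

Even this toy version shows where the interesting questions live: the quality of the result depends entirely on parameters (here, `cellSize`) that the user has to tune, which is exactly the two-menus-and-a-popup problem.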

And then there is a chair like this

How would the decimator handle that chair, with all those tufted buttons (which I would have created as instances)?

And if you are going beyond just displaying models/items and creating scenes, there is the issue of scale if the original sources are different. And what file formats will be included for conversion to glTF?

Here is an example of a model created with 3D scanning technology; the end result is a .stl file.

And here is a simple celebration I made from it for my granddaughter’s birth: 250,000+ vertices down to 35,000 vertices using Instant Meshes, then creating a “dirty vertex” texture to emphasize the shadows.

Will such models be loadable and decimatable for Alice and Bob?

If you are going to display items on the web, then build for the web; don’t just rehash old catalogue models.

I hope that is not too negative. I just don’t see “one pipeline fits all”, but I may be very wrong :grin:

cheers, gryff :slight_smile:


@gryff I’m right with you on all of your points, so it can’t be too negative. These are the issues that need to be addressed.

It’s like getting excited and telling everyone you are going to cure cancer: it’s all good in conversation, but actually executing the process is another thing.

But it all does start with motivation and conversations.

Part of the process is identifying the problem, and we seem to have that covered. I guess now is more the time for conversations about solutions to these problems. I just see more questions than answers, and I kind of feel like others might be in the same boat.



What is the SCOPE?


  Formats/tools:
  • glTF
  • Maya
  • Unity
  • Blender
  • other?

  Operations:
  • decimation
  • compression
  • other?

Happy Holidays.

:eagle: : )


Prior art

An example of prior art is the Autodesk Model Derivative API. It supports conversion from 60 formats to an intermediary format called SVF. The purpose is displaying models on the web using their Forge Viewer (maybe there are other uses too, but that’s the one I’m familiar with).

It’s closed source, of course, but I think some people have reverse-engineered the SVF format, so it may be useful to take a look at that.

glTF as the intermediary format

I’m assuming that you are going to use an intermediary format and that all processing will be done on this format:

Input formats -> intermediary format -> pipeline operations -> output formats

In that case, have you considered using glTF for the intermediary format?


Pros:

  • That would avoid the problem of developing a new format, which is likely to constitute a huge chunk of the work here.
  • Pipeline operations would become operations on glTF files, which means they could be used anywhere that supports glTF.
  • glTF is designed to be lightweight and fast to operate on.
  • Open-source tools already exist to convert many formats to glTF.
  • One less exporter to worry about :grin:


Cons:

  • glTF may be too limited if your goal is anything beyond real-time use.
  • You would be using a format belonging to someone else (it’s arguable whether that’s a con, as they will also fix many problems for you).
  • Licensing issues?

If glTF proves to be too limited, it may still be useful as a starting point. After all, it’s designed so that people can write their own extensions.
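To illustrate that last point, here is a minimal sketch of glTF’s extension mechanism carrying pipeline metadata. The `asset`, `extensionsUsed`, and `extensions` fields are real glTF 2.0 structure; the `POLYMORPH_pipeline` extension name and its payload are purely hypothetical:

```javascript
// Tag a glTF-2.0-shaped document with hypothetical pipeline metadata via
// the spec's extension mechanism. "POLYMORPH_pipeline" is an invented
// extension name used only for illustration.
function withPipelineMetadata(gltf, steps) {
  return {
    ...gltf,
    extensionsUsed: [...(gltf.extensionsUsed || []), 'POLYMORPH_pipeline'],
    extensions: {
      ...(gltf.extensions || {}),
      POLYMORPH_pipeline: { steps }, // e.g. which operations produced this asset
    },
  };
}

const base = { asset: { version: '2.0' } };
const tagged = withPipelineMetadata(base, ['decimate', 'compressTextures']);
```

Because extensions are namespaced and optional, any glTF viewer that doesn’t know about the extra data simply ignores it, which is what makes glTF attractive as an intermediary format.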


I’d like to see a bunch of different definitions for “messy middle” and “pipeline”.

Sorry for the bad writing.


Hi all,

When I first read it, I didn’t understand anything. But after a second reading, things became clearer (not to say that it’s unclear :blush:).

Are we just talking about 3D? Or more than 3D?

I have a strong background in the furniture manufacturing industry, so my point of view on the pipeline is a bit different (I could be wrong).

I will try to briefly express my thoughts (furniture/joinery industry). So:

3D is just an intermediate stop. Manufacturers need more information than just visualization. The visualization is followed by:

“Messy Middle” Nr 1: importing user data (driven by conditions, equations…), e.g. JSON, XML, CSV…

Part info (size, oversize, edging length, edging application…)

Material info (wood, laminate… and, if laminate, what the sandwich consists of)

Hardware info (quantity depends on object size…)

Grain orientation

Machining pattern (depends on the part’s coordinate system)

And so on.

All that information can be kept in 3D.

Here begins “Messy Middle” Nr 2 (for me): exporting data (driven by conditions, equations…), e.g. JSON, XML, CSV…
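A minimal sketch of what that export step might look like. Every field name here is invented for illustration, and real joinery data would of course be far richer:

```javascript
// Sketch of "Messy Middle" Nr 2: deriving manufacturing data from a 3D
// part and exporting it as JSON. All field names are hypothetical.
function exportPartInfo(part) {
  const { width, height, thickness, material, edgedSides } = part;
  return JSON.stringify({
    size: { width, height, thickness },
    material,
    // Edging length depends on which sides get edge banding.
    edgingLength: edgedSides.reduce(
      (sum, side) => sum + (side === 'top' || side === 'bottom' ? width : height),
      0
    ),
    grainAlongWidth: width >= height, // example of a condition-driven value
  });
}
```

The point is that the export is driven by conditions and equations over the model, not just by its geometry, which is exactly what makes this part of the pipeline messy.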

That’s what I call “Polymorph”

A visual depiction of what I have explained is similar to @ycw’s diagram above.


  • Content pipelines can be either stand-alone utilities or components of other programs, and we want the output of Polymorph to be as portable as possible, so we’ve chosen C++ as Polymorph’s primary implementation language.

If this is still not set in stone, I would strongly recommend reaching out to the Rust gamedev working group on GitHub or Discord. I’ve started a discussion topic that anyone’s welcome to join in on.

The language is every bit as portable as C++ and is arguably best-in-class for WebAssembly development.

Rust is being used by Mozilla to develop one of the reference implementations of WebGPU. Microsoft has also been adopting Rust in several security initiatives lately.


@Arte, I hear you.

I work on a Pipeline also. For Cinematics. 3DWebPipeline.

@erlend_sh. LIKE.


Hi all!

Sorry I’ve been out of pocket; I was largely unplugged over the holidays. This discussion feels awesome so far: lots of great and unanswered questions being brought up. I have some thoughts on a few as well as a repo that might further clarify some of the postulations in the article, so let me apologize in advance for the lengthy post here. :smile:

The manymap type, which underpins/enables the majority of the rest of what the pipeline does, was inspired by our desire to emulate JavaScript’s more loose and flexible approach to handling objects. The specific way in which that inspiration shows through might be thought of as duck typing (a la TypeScript, also inspired/enabled by JS) to allow a large object to down-project into any possible subset image of itself. The contract type is also a key player in bringing that inspiration to life.

As for property bags, I’ve heard the term used somewhat loosely to describe a dictionary or hashmap that’s behaving like an object. That’s what the manymap does under the hood in pipeline.h, and it’s similar (as I understand it) to how objects in general behave in JavaScript.
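A rough JavaScript analogy for that down-projection idea. To be clear, this is not the pipeline.h API, just the loose-object behavior it draws inspiration from: a large bag of properties projects into any subset of itself, and a consumer declares the shape (contract) it needs:

```javascript
// JS analogy for the manymap/contract idea: a property bag down-projects
// into any subset image of itself; a contract is the list of keys a
// consumer requires. Names here are illustrative, not pipeline.h's.
function project(bag, keys) {
  const missing = keys.filter((k) => !(k in bag));
  if (missing.length) throw new Error(`contract unsatisfied: ${missing.join(', ')}`);
  return Object.fromEntries(keys.map((k) => [k, bag[k]]));
}

const bag = { width: 256, height: 256, seed: 42, author: 'alice' };
const noiseContract = project(bag, ['width', 'height', 'seed']); // subset image
```

In the C++ version, the compile-time analogue of that `missing` check is what lets the system "sternly tell you off" when a pipeline is hooked up incorrectly.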

Also, I can’t find the quote now, but I believe you asked at some point about adding some more robust comments to pipeline.h. :upside_down_face: Yep, that’s definitely a good idea, and something I plan to undertake once pipeline.h (or whatever we go with instead, since we may not end up using it at all) is in a bit more of a lasting state.

Excellent point, Arte! Part of what we’re hoping to achieve with Polymorph is enabling the kind of reuse and collaboration we’re all talking about for 3D without actually confining the ecosystem to 3D by taking direct dependencies on the domain. That way (in theory), when particular developers need to add segments to their pipelines that branch off from (or even have nothing to do with) 3D, the mechanisms under the hood should be supportive of that.

Thanks for connecting us to that, @erlend_sh! I’ve heard awesome things about Rust, but I’ve never actually worked in it, so I’ll need to have some fun learning about it before I can comment more intelligently. From a very brief look around the Internet, it looks like it might be easier to call from a C++ codebase into an independent Rust library than the other way around. Am I understanding that correctly?

I may have mentioned in the article that I wanted to put together a better example of using pipeline.h than just the toy use case in the appendix, but (1) that would have made the article even longer than it already is and (2) I ran out of time before the holidays. Today, however, I threw together a quick repo to explore a few ideas in a fun, self-contained demo. Specifically, I made this repo

which uses pipeline.h to concatenate tools and steps in order to generate procedural heightmaps for mountains like these:

(Note: the demo only makes the heightmap: the mesh and texture shown were generated from the heightmap using the excellent L3DT.) Please feel free to dive into the repo and let me know what you think, bearing in mind that this is totally experimental and everything’s up for questioning/commentary/critique. That said, there are a few things I created this to explore/exhibit in particular, so I’ll briefly call those out.

Firstly, note that this use case (currently) isn’t actually directly related to 3D at all. I think we should make 3D examples next, but I decided to start out with a use case I already had a background in (from another project) and could do very quickly, which would allow me to focus on other details. The salient point here is just that, while the initial purpose is to enable 3D workflows, I currently don’t think that doing so must necessarily confine our utilities to that domain.

Arguably, the most important thing I wanted to play with in this repo was the structure, including the folders and the build system. Note that there’s a submodules directory and a morphs directory. In hindsight, extern might have been a better name for the submodules folder because morphs also contains code that could philosophically be submodules, but hey, the whole thing’s built as an exploration anyway.

What I wanted to explore with this was how completely external, non-Polymorph utilities could be “wrapped” to make them easy to add and readily reusable in Polymorph pipelines. This project contains two examples of how this might be done. One Morph wraps the PNG library from Randy Gaul’s cute_headers project, thereby providing an example of projecting a separate submodule into a Morph with pretty minimal CMake hookup required. The other Morph actually includes its dependency—Mark A. Ropper’s OpenSimplexNoise.hpp gist—which makes the CMake overhead to include it even smaller, but I think that’s only safe to do if the dependency should never be imported by two different Morphs in the same pipeline.

Both of these Morphs depend on at least one thing that they don’t bring along for themselves—pipeline.h—and they’re informed about it by variables set in the project’s main CMakeLists.txt file. The idea I’m exploring here is that those *_DIR CMake variables would be named and established by convention. If that came to pass, and each Morph was just a submodule you could bring down and link to, and all you had to tell it was where its dependencies lived (by way of a simple convention), that might be a way to easily and scalably add Morphs to various projects. (Just to be clear: to understand the concept being explored here, the idea is that every folder in the submodules directory and every folder in the morphs directory would be completely separate repos, brought together as submodules by a simple set of build system conventions.)

I also wanted to explore what it might mean to mix and match functionality inside and outside formal Morphs. In main.cpp, three different types of Morphs are used: “formal” Morphs that are treated as dependencies and so should be easy to reuse across many different pipelines; “informal” Morphs such as TransformValues which follow the conventions of Morphs but are application-specific enough that they aren’t split out, though they could be; and “free-form” Morphs like ConvertSimplexMapToPng that exist to solve an implementation-specific logistical problem and so, from a convention standpoint, just kind of do their own thing.

I was actually pretty pleased by how easy it was to tailor my implementation to my particular use case; I was also happy with the safety the system offered (it sternly told me off at compile time whenever I hooked up something incorrectly). However, there were several things I noticed that I was less happy with.

  • When conventions/domains don’t match (as they intentionally don’t between the OpenSimplex Morph and the cute_png Morph), it can be a little awkward to rejigger the data to bridge the gap. This is the problem that will hopefully be minimized by good conventions; however, I suspect that no amount of conventional discipline will ever make it go away entirely. The process made me wish there were a way to convert one pipeline type into another; when one Morph calls it Width while another Morph calls it PixelsWidth, it feels like there could be an easier way to rectify that discrepancy. I’ll have to think about that.
  • The initialization process also felt a little clunky. Creating a “free-form” Morph to satisfy contracts that need things from arguments or app-specific hard-codes worked fine, and it didn’t take long, and it didn’t look too ugly; but it still feels like there might be a cleaner way to go about that.
  • Having to explicitly call pipeline->CreateManyMap() for no reason bugged me so much that I just changed it. That’s the only difference between the pipeline.h in this project and the one on the gist. I’ll bring over that change next time I update the gist version.
  • This weirdness. As mentioned in a comment, this is just a temporary evil until we further explore meta-Morphs: a mechanism for bundling multiple Morphs into a single bigger Morph to allow for things like loops, conditionals, and even perhaps out-of-the-box asynchrony. At the moment, though, it’s really ugly; and whether or not meta-Morphs are the right answer, we’ve got to do something about this.
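For the Width-vs-PixelsWidth discrepancy in the first bullet above, one conceivable (purely hypothetical) fix is a tiny adapter Morph that just renames keys between conventions, sketched here in JavaScript rather than C++ for brevity:

```javascript
// Hypothetical "rename morph": bridges two Morphs whose conventions
// disagree on key names (e.g. Width vs PixelsWidth). Not part of
// pipeline.h; just a sketch of one possible answer.
function renameMorph(mapping) {
  return (data) =>
    Object.fromEntries(
      Object.entries(data).map(([key, value]) => [mapping[key] || key, value])
    );
}

// Adapter from the noise Morph's convention to the PNG Morph's convention.
const toPngConvention = renameMorph({ Width: 'PixelsWidth', Height: 'PixelsHeight' });
```

An adapter like this doesn’t make the convention mismatch go away, but it confines the rejiggering to one declarative spot instead of scattering it through the pipeline.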

That’s about it! The only remaining point I wanted to make is that projects like the one I linked to are (in my imagination at least) one of the ways we can explore how some of the many still-open questions for Polymorph might be answered. This project was low-investment (the entire thing, from conception to completion, fit very comfortably within a day) and extremely informal, but it quickly showed me some things that I liked and some things that I think could use improvement.

So jump in! As I mentioned above, I definitely think we should tackle some 3D pipeline prototypes sooner rather than later; @bghgary is currently on vacation, but he’ll for sure be interested in that conversation once he gets back. Anything you think’s worth trying—3D or otherwise, pipeline.h or anything else—give it a shot and let us know how it goes!

In fact, come to think of it, it wouldn’t take much to make this project have a 3D component, would it? Let’s see; you’d need…

  • a Morph to generate a Mesh from a heightmap,
  • a Morph to generate a topography texture,
  • various Morphs to compress/optimize your asset (if that’s something you wanted), and
  • a Morph to export to 3D—like an exporter, or even a basic wrapper on syoyo’s tinygltf.

Sounds like fun. Any takers? :smiley:
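The first of those Morphs might, in spirit, look something like this sketch (in JavaScript for brevity; the output shape is an assumption, not a Polymorph contract):

```javascript
// Sketch of a heightmap -> mesh Morph: turn a width x depth grid of
// heights into vertex positions plus triangle indices. Output shape is
// illustrative only.
function heightmapToMesh(heights, width, depth) {
  const positions = [];
  for (let z = 0; z < depth; z++) {
    for (let x = 0; x < width; x++) {
      positions.push([x, heights[z * width + x], z]); // y is the height
    }
  }
  const indices = [];
  for (let z = 0; z < depth - 1; z++) {
    for (let x = 0; x < width - 1; x++) {
      const i = z * width + x;
      indices.push(i, i + width, i + 1);             // first triangle of the quad
      indices.push(i + 1, i + width, i + width + 1); // second triangle
    }
  }
  return { positions, indices };
}
```

From there, the topography-texture, compression, and export Morphs would each be another link in the same chain.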

Wow. Just looked back over that after I posted it. I can’t seem to write 'em short, can I? :upside_down_face:


LIKE. Also the mountains too.

Thank you for defining POLYMORPH.

Maybe I can figure out a way to ask questions better; sorry my writing is so bad.

This LINK is why I ask:

Does PolyMorph include OR exclude “Default Rendering Pipeline”?

How would Polymorph relate to Blender?

Sorry for bad writing during brainstorm.


Nope :smiley:


Sorry. I will try to write short too. Too much stuff removed. Hope it helps.

It seems like there are MANY PIPELINES. I like the thought of that.

One thing I would like to LEARN is how to STEP-DEBUG Python export from Blender to .babylon.

hi ALL

  1. I agree we have a “Messy Middle”, so the problem exists.
  2. I always wonder whether we should have such a rigid structure in code (same as Polymorph) or draw on other experience,
    but I notice everything has some benefits and also some harms.
  3. Is that something for me to think about? The answer is yes, if it can change the whole old structure.
  4. Let’s have a short look at the project:

Babylon.js core is for Babylon.js core developers.
Babylon.js is for old-hand programmers.
Babylon.js is for simple newbie programmers.
Babylon.js is for big projects.
Babylon.js is for simple projects or testing some PG.

Well, I think the “Messy Middle” is not the first problem at this time,
but I like that we have independent parts and that we keep our good algorithms there.
For myself, I keep my ideas (ShaderBuilder and GeometryBuilder) mostly independent, but they work well with the other parts; I have cleaned them up several times, but they are still not so easy to read.
The other parts are coded in a TREE structure;
I like a NET + CELL mechanism.


The base of a tree allows us to use any method and class from any new tool in the tree structure,
but that is dangerous and creates the “Messy Middle” (I think).
If we define the cell parts and the network between them, each network has its own actions, but they all use the cell’s core, so the core always stays independent and objective.

For example, I may just need a vector3 as {x, y, z}: that is the vector3 cell.
But I can add normalize, dim, … in the net class.
Now I can add a shader net to support vector3 as a shader vec3 too,
whereas in tree mode I must write my own structure for that.

*The NET + CELL mechanism is a design pattern in my mind; I try to write my code this way, so it is just a new term.
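A small sketch of that NET + CELL split, with illustrative names only: the cell is plain data and stays independent, while behavior lives in separate nets layered on from outside:

```javascript
// NET + CELL sketch: the cell is pure data with no methods.
const vec3 = (x, y, z) => ({ x, y, z });

// A "net" adds behavior without touching the cell's shape.
const mathNet = {
  length: (v) => Math.sqrt(v.x * v.x + v.y * v.y + v.z * v.z),
  normalize: (v) => {
    const len = mathNet.length(v);
    return vec3(v.x / len, v.y / len, v.z / len);
  },
};

// Another net reuses the same cell for a different domain, e.g. shaders.
const shaderNet = {
  toGlsl: (v) => `vec3(${v.x}, ${v.y}, ${v.z})`,
};
```

Because neither net owns the cell, a new domain (shader support here) can be added without rewriting the core data structure, which is the contrast with the tree approach described above.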

Thanks a lot; I just wrote down what I think.


I’m interested in @syntheticmagus’s take on this

Hello, I have created a small tutorial; maybe it will help.
