MakeHuman in PBR on Browser

Hey,
I made a tester for my exports from Blender. While this is just a single demo from that process, I can apply it any time I generate files out of Blender. I run them to see how the PBR materials look in a browser, on my local machine, with just a double-click of a small html file, which is also generated.
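The generated html amounts to little more than a canvas plus a short script. This is only a rough sketch of the idea, not the actual exporter output; it assumes babylon.js is included on the page, there is a `<canvas id="renderCanvas">`, and "automaton.babylon" is just a placeholder file name:

```js
// Minimal sketch of the tester script, assuming babylon.js is loaded and the
// page has a <canvas id="renderCanvas">. "automaton.babylon" is a placeholder.
const canvas = document.getElementById("renderCanvas");
const engine = new BABYLON.Engine(canvas, true);

BABYLON.SceneLoader.LoadAsync("./", "automaton.babylon", engine).then((scene) => {
    // PBR materials need an environment texture to have anything to reflect
    scene.createDefaultEnvironment();
    scene.createDefaultCameraOrLight(true, true, true);
    engine.runRenderLoop(() => scene.render());
});

window.addEventListener("resize", () => engine.resize());
```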

When the scene is of a MakeHuman, which I call an Automaton, extra things are testable. I have yet to write a function to iterate through all the finger shape keys for testing, but I have high confidence they are coming through correctly.
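When I do get to that test function, it will be roughly along these lines; this is just a sketch, assuming the finger shape keys came through as morph targets on the body mesh:

```js
// Sketch of stepping through each shape key (morph target) one at a time,
// ramping its influence up then back down. The ~1 second ramp is arbitrary.
function exerciseFingers(mesh, scene) {
    const mgr = mesh.morphTargetManager;
    if (!mgr) return;

    let index = 0;
    let goingUp = true;

    scene.onBeforeRenderObservable.add(() => {
        if (index >= mgr.numTargets) return; // all targets exercised

        const target = mgr.getTarget(index);
        const step = scene.getEngine().getDeltaTime() / 1000;

        target.influence += goingUp ? step : -step;

        if (target.influence >= 1) {
            target.influence = 1;
            goingUp = false;
        } else if (target.influence <= 0) {
            target.influence = 0;
            goingUp = true;
            index++; // move on to the next shape key
        }
    });
}
```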

What I am working on next is getting a final Kinect V2 skeleton and finishing the Kinect capture. Samples from that capture are what will make up the ‘Calisthenics’ button. I will update this demo & post an additional message here as this progresses.

Here is the Live Scene


This is excellent! Thanks mate!

whoooot this is really cool !!!

cool! is this open source?

is this more for testing materials, or animations, or both?

cheers!

:wave: Chase

Not right now, though it is in its own repo. It is kind of custom to my workflow & JS exporter, so testing both. If you are exporting to a .babylon or .gltf, the sandbox can do the same for materials with the Inspector turned on.
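If you want the same thing in your own page rather than the sandbox, popping the Inspector open is basically one call; depending on the Babylon version, the inspector bundle may need to be included as its own script, or this will fetch it for you:

```js
// Turn the Inspector on to poke at the PBR materials in your own test page.
scene.debugLayer.show({ embedMode: true });
```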

Things are not as directly assignable from the Inspector, but one limitation here is the number of materials. The GUI system does not have drop-downs, so screen space could be an issue with a large number of materials. I really only have one mesh & its children per export & a 30" display, so I do not hit it, but as a generic test system it would really need to handle more materials than can fit on a screen.
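If I ever did need to handle more materials than fit on screen, a scrolling list is probably the way to go. This is only a sketch of the idea, not what the tester does today, and it assumes the babylon.gui bundle is loaded:

```js
// Sketch of a scrolling material list with Babylon GUI, for the case where
// there are more materials than fit on screen.
const ui = BABYLON.GUI.AdvancedDynamicTexture.CreateFullscreenUI("materialsUI");

const scroller = new BABYLON.GUI.ScrollViewer();
scroller.width = "240px";
scroller.height = "60%";
scroller.horizontalAlignment = BABYLON.GUI.Control.HORIZONTAL_ALIGNMENT_LEFT;
ui.addControl(scroller);

const panel = new BABYLON.GUI.StackPanel();
scroller.addControl(panel);

scene.materials.forEach((mat) => {
    const button = BABYLON.GUI.Button.CreateSimpleButton(mat.name, mat.name);
    button.height = "30px";
    button.onPointerClickObservable.add(() => {
        console.log("selected material:", mat.name); // its controls would open here
    });
    panel.addControl(button);
});
```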

Bump. The finger exercises turned out to be just a 20-minute ‘exercise’ in themselves, so I just did them.

@sebavan, I noticed today that there is now a visible difference between the Studio 256px/face & the 512px environment textures. Did you change anything?

Not at all :frowning: Can you repro in the PG?

It is not a bad thing. The 512 is just a little shinier now on the skin. Before, I could never find any difference between the two.
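What the comparison boils down to is just swapping the prefiltered environment on the scene, roughly like this; the .env file names are placeholders, not the actual asset names:

```js
// Swap between the two prefiltered studio environments and eyeball the skin.
// File names are placeholders.
const env256 = BABYLON.CubeTexture.CreateFromPrefilteredData("studio_256.env", scene);
const env512 = BABYLON.CubeTexture.CreateFromPrefilteredData("studio_512.env", scene);

function useEnvironment(tex) {
    scene.environmentTexture = tex;
}

useEnvironment(env256); // then switch to env512 and compare
```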

Yup, there should not be. Since when did this happen?

Today is the first time I noticed it, but I only published 4 days ago. I have not run it since publishing, but I must have run it a hundred times before, and it was always the same.

I actually thought it was an improvement. Why bother having 2 if they produce the exact same result? Do you see a difference in the link from the first post?

It looks like there is a tiny bit of difference in the mip prefiltering, but nothing scary, so all good. It is weird you were not seeing it before.

Hi JCPalmer,

I have also been working recently on driving a MakeHuman avatar with Kinect V2. I would like to ask if you have any code or a demo showing how to implement this.

Thank you

I am not working on an avatar. I am working on highly animated fictitious humans, including synthetic voice. I go through Blender, and get the Kinect V2 skeleton using the MakeHuman Blender add-on. I wrote the part of the add-on which does this.

I have since modified the skeleton conversion code so that it does not swap out the hands, but keeps the original, highly detailed finger bones; however, I never pushed this into the add-on. There is a 2nd version of the add-on, but I have not moved to it.

I think you are coming at this from the wrong direction. You should look at https://forum.babylonjs.com/t/vr-avatar-support and find out what the requirements are, like which bones & bone names are required, which morph targets are needed, etc.

From there decide where you are going to source things.
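A quick way to check what an export actually contains against those requirements is just to dump the skeletons & morph targets after the scene loads, something like this sketch:

```js
// Dump bone names and morph target names so they can be checked against what
// an avatar system expects.
scene.skeletons.forEach((skeleton) => {
    console.log("skeleton:", skeleton.name);
    skeleton.bones.forEach((bone) => console.log("  bone:", bone.name));
});

scene.meshes.forEach((mesh) => {
    const mgr = mesh.morphTargetManager;
    if (!mgr) return;
    console.log("mesh:", mesh.name);
    for (let i = 0; i < mgr.numTargets; i++) {
        console.log("  morph target:", mgr.getTarget(i).name);
    }
});
```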

Hi JCPalmer,

Thank you for your advice. I will revisit the direction I am going.

Cheers

Nice, I made a green guy with a gold suit.