Best way to create our own volumetric video player?

We have access to a list of OBJ meshes and a video texture, and we want to create a player like this one:
https://www.anayi.com/include_html/4d/21ss_4d_07.html

My naive first idea was to store a list of Draco meshes inside a single file plus an MP4 video texture. The problem is that downloading that entire mesh file and splitting it into chunks could be slow, and it could take a long time before the user sees the first frame. I'm not sure whether we could stream that "big" file, and I'm also not sure whether we could sync the mesh updates (vertices and so on) with the video texture playback.
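For the sync part, what I have in mind is roughly the sketch below, assuming the Draco meshes are already decoded into one Babylon.js mesh per frame at a known frame rate; `frameMeshes`, `FRAME_RATE` and the MP4 URL are placeholders, not an existing format or API:

```ts
// Minimal sync sketch: the MP4's currentTime drives which frame mesh is visible.
// Assumption: the Draco sequence has already been decoded into `frameMeshes`,
// one mesh per captured frame, in playback order.
import { Engine, Scene, FreeCamera, Vector3, VideoTexture, StandardMaterial, Mesh } from "@babylonjs/core";

const FRAME_RATE = 30; // assumed capture rate of the mesh sequence

const canvas = document.getElementById("renderCanvas") as HTMLCanvasElement;
const engine = new Engine(canvas, true);
const scene = new Scene(engine);

const camera = new FreeCamera("cam", new Vector3(0, 1, -3), scene);
camera.setTarget(Vector3.Zero());
camera.attachControl(canvas, true);

// Hypothetical URL: the color track of the volumetric capture.
const videoTex = new VideoTexture("colorTex", "capture_color.mp4", scene, true);
const material = new StandardMaterial("volMat", scene);
material.diffuseTexture = videoTex;

// Placeholder: would be filled by decoding the Draco sequence, all meshes sharing `material`.
const frameMeshes: Mesh[] = [];
let currentIndex = -1;

scene.onBeforeRenderObservable.add(() => {
  const video = videoTex.video; // underlying HTMLVideoElement
  const index = Math.min(frameMeshes.length - 1, Math.floor(video.currentTime * FRAME_RATE));
  if (index !== currentIndex && index >= 0) {
    frameMeshes[currentIndex]?.setEnabled(false); // hide the previous frame
    frameMeshes[index].setEnabled(true);          // show the one matching the video time
    currentIndex = index;
  }
});

engine.runRenderLoop(() => scene.render());
```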

I've seen other volumetric video players, but all of them are made with Three.js and I don't know why.

Thank you in advance.

The example at https://www.anayi.com/include_html/4d/21ss_4d_07.html was created with HoloStream. Here is the Web API documentation: HoloStream Web API — HoloStream 2021.1 0.0 documentation

To be clear, I mean using Babylon.js; I've also seen other options using Three.js.

A possible solution is to take the code from https://holostream-static.akamaized.net/release/2021.1.2/remote/HoloStream-2021.1.2.min.js and port it to Babylon.js.

What are the source files in your case? In the example you mentioned, they use a special video format.

The intention is to create a custom format that works as well as possible with Babylon.js, since we have access to the raw data and can compress/store it however we want.

Thank you.

Hello @Escobar, just checking in: was your question answered?

Sorry for the laaate reply, I didn't get notified.

I didn't get a clear answer, but at least a tip on where to start (digging into the Three.js code).


Although not exactly what you are looking for, here’s my recent experiment using Babylon.js to render volumetric data from DepthKit:

(Note that this is an incomplete work in progress and currently doesn't work on Safari/iOS or Safari/Mac.)

The underlying approach is that volume/depth and color are encoded onto the image (or video). That texture is streamed to the GPU, where shaders calculate the vertex positions.
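To make that concrete, here is a rough Babylon.js sketch of the same idea (not my actual shader); it assumes each video frame packs color in the top half and a depth map in the red channel of the bottom half, which is just an example layout:

```ts
// Sketch only: displaces a dense grid using depth sampled from a combined
// color+depth video texture. Real DepthKit frames have their own layout and
// calibration, so treat this purely as an illustration.
import { Scene, MeshBuilder, VideoTexture, ShaderMaterial, Effect } from "@babylonjs/core";

Effect.ShadersStore["depthDisplaceVertexShader"] = `
  precision highp float;
  attribute vec3 position;
  attribute vec2 uv;
  uniform mat4 worldViewProjection;
  uniform sampler2D videoSampler;
  varying vec2 vColorUV;
  void main() {
    vec2 depthUV = vec2(uv.x, uv.y * 0.5);        // bottom half: depth (assumed)
    vColorUV     = vec2(uv.x, 0.5 + uv.y * 0.5);  // top half: color (assumed)
    float depth = texture2D(videoSampler, depthUV).r;
    vec3 displaced = position + vec3(0.0, depth, 0.0); // push the grid up by the sampled depth
    gl_Position = worldViewProjection * vec4(displaced, 1.0);
  }`;

Effect.ShadersStore["depthDisplaceFragmentShader"] = `
  precision highp float;
  uniform sampler2D videoSampler;
  varying vec2 vColorUV;
  void main() {
    gl_FragColor = texture2D(videoSampler, vColorUV);
  }`;

export function createDepthVideoMesh(scene: Scene, videoUrl: string) {
  // Dense grid so the vertex shader has enough vertices to displace.
  const grid = MeshBuilder.CreateGround("volumeGrid", { width: 1, height: 1, subdivisions: 255 }, scene);
  const videoTex = new VideoTexture("depthColorVideo", videoUrl, scene, true);
  const mat = new ShaderMaterial("depthDisplace", scene,
    { vertex: "depthDisplace", fragment: "depthDisplace" },
    { attributes: ["position", "uv"], uniforms: ["worldViewProjection"], samplers: ["videoSampler"] });
  mat.setTexture("videoSampler", videoTex);
  grid.material = mat;
  return grid;
}
```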

Again, that's probably not exactly what you are looking for, and DepthKit doesn't have the same fidelity as something like HoloStream. Still, it is an example of using Babylon.js for volumetric rendering. :slight_smile:


@kaliatech this is very cool nonetheless! :star_struck:

I’m in the process of converting parts of my Unity project to achieve something similar.

Currently I use a very simple custom format and haven't open-sourced it (yet), but I'm happy to change my goals if someone else needs it :grinning:
The original project was doing real-time capture and streaming, from an iPhone.
