glTF Exporter - Convert a Scene on NodeJS Server

Hi,

I am trying to convert a Babylon.js scene to glTF using the glTF exporter (glTF Exporter - Babylon.js Documentation).

The complex part is that I need to do this on my NodeJS server. To do that I am using the NullEngine for server side, as per the following example (Server Side - Babylon.js Documentation).
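For context, my setup is essentially the standard NullEngine boilerplate from the docs (simplified sketch; the sphere is just a placeholder for my actual content):

import { NullEngine, Scene, ArcRotateCamera, Vector3, MeshBuilder } from 'babylonjs';

// Headless engine: no canvas or WebGL context is needed on the server
const engine = new NullEngine();
const scene = new Scene(engine);
// A camera is still required for the scene to run
const camera = new ArcRotateCamera('camera', 0, 0.8, 100, Vector3.Zero(), scene);
const sphere = MeshBuilder.CreateSphere('sphere', { diameter: 2 }, scene);
engine.runRenderLoop(() => scene.render());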

The scene is created; however, the glTF export does not work.

The problem is that I cannot use the gltf.downloadFiles() function on the server, as it throws “ReferenceError: document is not defined”.

The second issue is that the Blob / file data object is empty and Blob.size is undefined. Does someone know if the glTF exporter works server-side?

Hope someone can help me with the issue. Cheers, Julien


Hey Julien, it is not supposed to work but we could perhaps make some arrangements to make it work.

The gltf.downloadFiles issue is easy to fix, as you can get the data directly from GLTFData.glTFFiles.
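Something along these lines should work on Node (untested sketch; you write the entries out with fs yourself instead of triggering a browser download):

import { GLTF2Export } from 'babylonjs-serializers';

const gltf = await GLTF2Export.GLTFAsync(scene, 'scene');
// glTFFiles maps file names to their content (a string for .gltf, Blob data for binaries)
for (const fileName of Object.keys(gltf.glTFFiles)) {
  const content = gltf.glTFFiles[fileName];
  // write `content` to disk here instead of calling gltf.downloadFiles()
}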

Data being empty is another problem. Is there any error in the console?

Hi Deltakosh,

Yes, the Blob object is empty => GLTFData.glTFFiles['mesh_name'].Blob = {}
The export does not seem to work. Let me know if there are other ways to make it work.

For now, another solution I found was to simply export the mesh as OBJ using the OBJExport.OBJ([mesh]) function. It works well, but there is no glTF export though!

Thanks for your help.

So, no error on the console at all?

No error in the console log.

Is it out of a simple scene like just a sphere for instance?

Nope, those are objects imported from https://furniture.3d.io/ that need to be converted to .GLB.

OK, can you recreate the scene in the PG? I will use it to test what is going wrong.

Is there any progress on that topic? I’m facing the same issue at the moment.

Hi Pascal,

Here is a snippet of code that lets you search for a 3d.io furniture item, download it, convert it to OBJ, and finally save it on your server. Hope you will find some part of the code helpful!

Cheers, Julien


// Packages used
import fs from 'fs';
import io3d from '3dio';
import express from 'express';
import { ArcRotateCamera, Vector3, StandardMaterial, NullEngine, Scene, Mesh, VertexData } from 'babylonjs';
import { OBJExport } from 'babylonjs-serializers';
import axios from 'axios';

const router = express.Router();

// ExpressJS route: search a 3d.io furniture item, rebuild its meshes in a
// NullEngine scene, export them as OBJ, and save the OBJ and textures to disk
router.get('/download/:furniture', async (req, res) => {
  try {
    const { furniture = 'Plywood DCW' } = req.params;
    console.log('Search for:', furniture);

    // Save either a texture streamed from a URL (type 'jpg') or a plain payload to disk
    const saveFile = async (name, file, type) => {
      try {
        if (!file) return;
        const filePath = `models/${name}.${type}`;
        if (type === 'jpg') {
          const stream = fs.createWriteStream(filePath);
          const response = await axios({ url: file, method: 'GET', responseType: 'stream' });
          // pipe() does not return a promise, so wait for the write stream to finish
          await new Promise((resolve, reject) => {
            stream.on('finish', resolve).on('error', reject);
            response.data.pipe(stream);
          });
        } else {
          await fs.promises.writeFile(filePath, file);
        }
      } catch (error) {
        console.log(error);
      }
    };

    const objects = await io3d.furniture.search(furniture, { limit: 1 });
    const objectsCount = objects.length;
    const engine = new NullEngine();
    const objectsData = await Promise.all(objects.map(async (object, index) => {
      const { name, description, designer } = objects[index];
      const { meshes, materials } = await io3d.utils.data3d.load(object.data3dUrl);
      const meshData = await Promise.all(Object.keys(meshes).map(async (meshId) => {
        const { positions, normals, uvs } = meshes[meshId];
        // The data3d geometry is unindexed, so generate a trivial index buffer
        const indices = [...Array(positions.length / 3).keys()];
        const scene = new Scene(engine);
        const mesh = new Mesh(name, scene);
        const vertexData = new VertexData();
        // Convert the typed arrays to plain arrays for VertexData
        const positionsFormat = Array.prototype.slice.call(positions);
        const normalsFormat = Array.prototype.slice.call(normals);
        const uvsFormat = Array.prototype.slice.call(uvs);
        vertexData.positions = positionsFormat;
        vertexData.indices = indices;
        vertexData.normals = normalsFormat;
        vertexData.uvs = uvsFormat;
        vertexData.applyToMesh(mesh);
        const material = new StandardMaterial('material', scene);
        mesh.material = material;
        const OBJFile = OBJExport.OBJ([mesh]);
        // Debug dump of the raw geometry buffers
        const debugDump = positions.toString() + normals.toString() + uvs.toString();
        await saveFile('text', debugDump, 'txt');
        await Promise.all(Object.keys(materials).map(async (materialId) => {
          const { mapDiffuseSource = null, mapNormalSource = null, mapSpecularSource = null } = materials[materialId];
          await saveFile(`${name}_diffuse`, mapDiffuseSource, 'jpg');
          await saveFile(`${name}_normal`, mapNormalSource, 'jpg');
          await saveFile(`${name}_specular`, mapSpecularSource, 'jpg');
        }));
        await saveFile(name, OBJFile, 'obj');
        return { positions: positionsFormat, normals: normalsFormat, uvs: uvsFormat, indices };
      }));
      return { ...meshData[0] };
      // return { name, designer, description, mesh: meshData };
    }));
    console.log('Files saved for:', furniture);
    const data = objectsData[0];
    const json = Buffer.from(JSON.stringify(data), 'ascii').toString('base64');
    res.send(json);
    // res.send({ count: objectsCount, objects: objectsData });
  } catch (error) {
    console.log('Error', error);
    res.status(502).send({ message: 'Error', error });
  }
});

Thanks @Julien,

that will actually help as an intermediate solution in combination with the “obj2gltf” package. Though, the overall goal is to be able to export Draco-compressed glTF/GLB files via a Node.js pipeline (if possible, everything within Babylon).
Loading and manipulating meshes in the scene works fine; only the step to export the scene causes issues at the moment.
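For anyone going the same route, the intermediate conversion itself is simple (sketch based on the obj2gltf README; the paths are placeholders):

import fs from 'fs';
import obj2gltf from 'obj2gltf';

// Convert the OBJ produced by the Babylon OBJ exporter into a binary glTF
obj2gltf('models/chair.obj', { binary: true }).then((glb) => {
  fs.writeFileSync('models/chair.glb', glb);
});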

@Deltakosh Not sure if it is planned to provide better support for server-side use cases with Babylon. If so, it would be great to offer the data as buffers for easier handling instead of blobs.

You might already have had a look at this: GitHub - AnalyticalGraphicsInc/gltf-pipeline: Content pipeline tools for optimizing glTF assets.

Yes, that’s what I’m currently using as the last step in the pipeline to Draco-compress the files 🙂

Unfortunately, it can only read glTF files/data, and writing the files to disk between every step in the pipeline kills performance for the amount of data I need to process. So the best thing would be to set up the scene in Babylon, do all manipulations, serialize the glTF data to a buffer, and process it with the gltf-pipeline lib.
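In other words, something like this final step, fed from memory rather than from disk (sketch based on the gltf-pipeline README; gltfJson stands for the serialized scene):

import gltfPipeline from 'gltf-pipeline';

const { processGltf } = gltfPipeline;

// Draco-compress a glTF JSON object entirely in memory
const results = await processGltf(gltfJson, {
  dracoOptions: { compressionLevel: 10 },
});
// results.gltf now holds the compressed glTF, ready to write out or pass along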

@pascalbayer: what kind of API are you thinking about?

I’m facing the same issue at the moment.
I think the reason is that Blob is a Web API that is not supported on Node.js. How can I resolve this problem?
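One workaround I am trying is to polyfill Blob from Node’s built-in buffer module before the serializer runs (untested sketch; Blob is available there since Node 15.7):

import { Blob } from 'buffer';

// Expose the Web Blob API globally so babylonjs-serializers can find it
if (typeof globalThis.Blob === 'undefined') {
  globalThis.Blob = Blob;
}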

Bro, have you solved it? I am also facing this problem now.

I think @roland has some nice tools running gltf pipeline on a server.


Can you specify which part doesn’t work for you? It would be even better if you could describe from scratch what you want to achieve.
