Best practices for importing models (sometimes with dependencies) from JavaScript File objects?

In our project, users can import files from cloud storage using a WebDAV filepicker. On import, we pass the received and parsed blobs as File objects to the component hosting the Babylon.js scene. We implemented some very basic automatic peer/dependency detection in that filepicker, so the files reach our viewer component as an array of objects (AssetFiles[]) where each entry follows this format:

interface AssetFiles {
    file: File;
    peerFiles: File[];
}

where, in the case of an .obj, file is the .obj itself and peerFiles would be its material (.mtl) and texture files.

Is there a tool that can help me handle these files properly (using the OBJFileLoader?): import them, make sure the materials and textures get applied to the correct meshes, run the resulting meshes through a series of transformations I already have (to rescale, remove empty vertices, and center them), and then attach them to a predefined TransformNode in our scene?
Am I mistaken in being wary of URL.createObjectURL for the memory-management side of things (keeping track of the assets and revoking their object URLs when they are no longer used)?
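
For context, here is a rough sketch of the post-import step I have in mind, assuming ES-module imports from @babylonjs/core; normalizeToUnitCube is just one way to rescale, and the naive centering plus the assetsNode parameter are placeholders for what our project actually does:

import { AbstractMesh, TransformNode, Vector3 } from '@babylonjs/core';

// Rough sketch only: rescale/center imported meshes and attach them to a predefined TransformNode.
function attachImportedMeshes(meshes: AbstractMesh[], assetsNode: TransformNode, targetSize = 1): void {
    for (const mesh of meshes) {
        mesh.normalizeToUnitCube();            // fit the mesh into a unit cube
        mesh.scaling.scaleInPlace(targetSize); // then scale it to the size we want
        mesh.position = Vector3.Zero();        // naive centering, our real pipeline does more
        mesh.parent = assetsNode;              // attach under our scene's asset node
    }
}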

Thanks in advance!

Maybe these posts can help:

(see also the answer from @bghgary below mine)


Thanks, exactly what I’m looking for!
For some reason I hadn’t explored that part of the docs.

Instead of passing files, I first tried setting Tools.UseCustomRequestHeaders and putting the proper authentication headers on Tools.CustomRequestHeaders, but I kept getting 401 errors, which I think stem from the fact that Angular is trying to handle the outgoing/incoming web requests.
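
For reference, that first attempt looked roughly like this (the Basic auth header is only an illustration of the kind of header we set):

import { Tools } from '@babylonjs/core';

// Ask Babylon to attach our auth headers to every request it makes itself.
Tools.UseCustomRequestHeaders = true;
Tools.CustomRequestHeaders['Authorization'] = 'Basic ' + btoa(`${username}:${password}`); // placeholder credentials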

So I decided to try connecting Babylon directly to our filepicker service by overriding the FileTools.RequestFile function and handling the web requests with our instance of Angular's HttpClient whenever the URL points to the specific WebDAV domain where the files are hosted. This way I could also rely on the dependency resolution done by the SceneLoader's plugins instead of the one in our filepicker. Here's how I'm routing the requests through Angular's HttpClient (filePickerService.getFile() returns an RxJS observable bound to a GET request with the proper headers):

FileTools.RequestFile = (url, onSuccess, onProgress, offlineProvider, useArrayBuffer, onError, onOpened): IFileRequest => {
  if (url.indexOf(OurCloud.baseUrl) !== -1) {
    const source = url.split(OurCloud.baseUrl).pop();
    this.filePickerService.getFile(source, OurCloud.fullnameFromPath(source), 'string').subscribe(data => {
      onSuccess(data as string);
    });
  }
  onProgress = (e) => {
    ...
  };
  onError = (err) => {
    ...
  };
  // Note: I'm not returning an IFileRequest here yet.
};

But, I’m facing this error:

ERROR Error: Uncaught (in promise): TypeError: Cannot read property 'onCompleteObservable' of undefined

I'm having a bit of trouble figuring out where/how this observable is managed. How could I bind my RxJS observable's resolution to the IFileRequest's onCompleteObservable? Should I just overwrite RequestFile's default observable?
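
For the record, here is the kind of complete override I'm aiming for, assuming an IFileRequest only needs an onCompleteObservable and an abort function, and that our getFile() can also hand back array buffers and data in a shape onSuccess accepts (both of which I still need to verify):

import { FileTools, IFileRequest, Observable } from '@babylonjs/core';
import { Subscription } from 'rxjs';

const defaultRequestFile = FileTools.RequestFile;

FileTools.RequestFile = (url, onSuccess, onProgress, offlineProvider, useArrayBuffer, onError, onOpened): IFileRequest => {
    if (url.indexOf(OurCloud.baseUrl) === -1) {
        // Not one of our WebDAV urls: fall back to Babylon's default implementation.
        return defaultRequestFile(url, onSuccess, onProgress, offlineProvider, useArrayBuffer, onError, onOpened);
    }

    let subscription: Subscription | undefined;
    const fileRequest: IFileRequest = {
        onCompleteObservable: new Observable<IFileRequest>(),
        abort: () => subscription?.unsubscribe(),
    };

    const source = url.split(OurCloud.baseUrl).pop();
    subscription = this.filePickerService
        .getFile(source, OurCloud.fullnameFromPath(source), useArrayBuffer ? 'arraybuffer' : 'string')
        .subscribe({
            next: data => {
                onSuccess(data);
                // Tell the SceneLoader this file request is complete.
                fileRequest.onCompleteObservable.notifyObservers(fileRequest);
            },
            error: err => onError?.(err), // the error shape probably needs adapting to what Babylon expects
        });

    return fileRequest;
};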

Can't you use the Tools.PreprocessUrl function to turn a standard URL into a blob URL, as described by @bghgary?

It's hard to tell what the problem is from the error message alone, without a repro.


Yeah, my bad. I understand that trying to figure things out from tiny excerpts of a much larger project is rarely fun, or even possible; sorry about that.

I did come across the post with the PreprocessUrl solution, but I'm not sure I grasp how that would work or how it could help me. Do you mean passing URLs to the SceneLoader and keeping track of which filename corresponds to which generated blob: URL, so that the preprocessor can substitute the proper blob: URL back in? As in, for example:

onImport(assets) {
    for (const asset of assets) {
        const assetUrl = URL.createObjectURL(asset);
        this.assetsUrlMap.set(asset.name, assetUrl);
        SceneLoader.Append(assetUrl, null, this.scene, null, null, null, 
            `.${getExtension(asset.name)}`);
    }
}

...

Tools.PreprocessUrl = url => this.assetsUrlMap.get(getNameFromUrl(url)) ?? url; // fall back to the original url if it isn't one of ours

That could work, but I'll keep trying to figure out how to properly return and update the fileRequest observable in the RequestFile function. That still seems like the best option, since I could benefit from the file loaders' automatic retrieval of the needed .mtl and texture files, which they handle better than our filepicker does.

Yes, your code snippet was actually what I had in mind for the Tools.PreprocessUrl usage.

Somehow the requests are now working properly using CustomRequestHeaders :slight_smile: The only problem I'm facing now is loading the textures, just like what's discussed here.

So, here’s a follow-up:

One hurdle is that our WebDAV endpoint isn't configured in a way that is compatible with how LoadImage seems to work by default (it creates a new HTMLImageElement and simply sets the passed URL as its source, without making an actual 'GET' request over XHR), nor with "use-credentials"… and since I'm not the one managing this Nextcloud instance, modifying those settings isn't an option at the moment. I tried various approaches, but the problem remains. Would there be a way to force an XHR FileRequest for images on Babylon's side, and then create the HTMLImageElement from the response blob, without having to do the requests myself and rebuild the materials afterwards to include the received images?
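
For context, this is roughly what I understand the default image loading to be doing; a simplified sketch, not Babylon's actual code, but it shows why our custom request headers never get attached:

import { Tools } from '@babylonjs/core';

// Simplified sketch of the default behaviour as I understand it: the browser fetches
// the image itself through the <img> element, so there is no XHR we could attach
// our custom headers to.
function defaultishLoadImage(url: string, onLoad: (img: HTMLImageElement) => void, onError: (e?: unknown) => void): HTMLImageElement {
    const img = new Image();
    Tools.SetCorsBehavior(url, img);   // only sets img.crossOrigin (e.g. 'use-credentials')
    img.onload = () => onLoad(img);
    img.onerror = e => onError(e);
    img.src = url;                     // plain element-driven fetch, no custom headers possible
    return img;
}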

The project is configured as follows:

WebRequest.CustomRequestHeaders = nextcloud.Headers;
Tools.CustomRequestHeaders = nextcloud.Headers;
Tools.UseCustomRequestHeaders = true;
FileTools.CorsBehavior = 'use-credentials';

and here's how I'm handling the import, where the main file (an .obj for example) is pre-downloaded and passed to the Babylon.js component as an Asset object*, containing the File along with its remote (WebDAV) URL:
*(I could also change this behavior to only pass a URL and handle all requests with Babylon's WebRequests instead)

const assetBaseUrl = nextcloud.baseUrl + asset.url.split(asset.file.name)[0];
const tempUrl = URL.createObjectURL(asset.file);
Tools.PreprocessUrl = url => {
    if (url === assetBaseUrl + asset.file.name) {
        return tempUrl;
    }
    else if (['jpg', 'jpeg', 'png', 'tiff', 'bmp'].includes(getExtension(getFullnameFromPath(url)))) {
        return url.replace('https://', `https://${nextcloud.username}:${nextcloud.password}@`);
    }
    else {
        return url;
    }
};
const dottedExtension = `.${getExtension(asset.file.name)}`;
if (Loader.IsPluginForExtensionAvailable(dottedExtension)) {
    Loader.ImportMeshAsync(
        '',
        assetBaseUrl,
        asset.file.name,
        this._scene,
        (e) => this.onSceneProgress(e),
        dottedExtension
    ).then(
        (value: ISceneLoaderAsyncResult) => {
            this._scene.assetsNode.addAssets(asset.file.name, this.filterMeshes(value.meshes));
            URL.revokeObjectURL(tempUrl);
        },
        (reason) => this.doStuff(reason)
    );
}

I thought forcing the basic-auth credentials into the URL would work, but alas.
Other than that, the .mtl file is requested properly and works just fine.

I can see there is a Tools.CorsBehavior property and a Tools.SetCorsBehavior() function; maybe they can help somewhere… However, I'm not very knowledgeable about URL requests / CORS problems, so let's see if others have some solutions for you.
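
Something along these lines, just to show where those hooks live (not a tested fix for your setup):

import { Tools } from '@babylonjs/core';

// Global: every element Babylon creates for images/textures gets this crossOrigin value.
Tools.CorsBehavior = 'use-credentials';

// Or applied manually to a single element:
const img = new Image();
Tools.SetCorsBehavior('https://example.com/texture.png', img); // applies Tools.CorsBehavior to img.crossOrigin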

Referring to this post: Another way of getting picture for new BABYLON.Texture(path, scene) - Questions & Answers - HTML5 Game Devs Forum, I think I've almost got it working by overriding the LoadImage function to force a proper XHR request:

Notes:

- I can't seem to find an onload handler on Babylon.js's WebRequest XHR wrapper, and response.onload isn't the right thing (see update).

- The LoadImage function used by the scene loader isn't accessible through FileTools.LoadImage nor Tools.LoadImage, hence the need to go through ThinEngine._FileToolsLoadImage. This doesn't seem right.

ThinEngine._FileToolsLoadImage = (url, onload, onerror, database): HTMLImageElement => {
    console.log('xhr image request : ' + url);

    const img = new Image();
    const imageRequest = new XMLHttpRequest();
    imageRequest.open('GET', (url as string));
    Object.entries(Cirrus.headersValues).forEach((header) => {
        imageRequest.setRequestHeader(header[0], header[1]);
    });
    imageRequest.responseType = 'blob';
    imageRequest.onload = () => {
        img.src = URL.createObjectURL(imageRequest.response);
        console.log(img);
        // Note: the onload callback passed in by Babylon is never invoked here.
    };
    imageRequest.send();

    // Babylonjs Webrequest wrapper version (would profit from headers/settings set in global config)
    // const img = new Image();
    // const imageRequest = new WebRequest();
    // imageRequest.open('GET', (url as string));
    // imageRequest.responseType = 'blob';
    // imageRequest.response.onload = () => {
    //     img.src = URL.createObjectURL(imageRequest.response);
    // };
    // imageRequest.send();

    return img;
};

Using this, a request is sent and it is actually successful (200), but something isn't working and the scene isn't loading afterwards. I'll keep looking into it.

Update:

Here’s how it looks now:

ThinEngine._FileToolsLoadImage = (url, onload, onerror, offlineProvider, mimeType): HTMLImageElement => {
    const img = new Image();
    const imageRequest = new WebRequest();
    imageRequest.open('GET', (url as string));
    imageRequest.responseType = 'blob';
    // WebRequest doesn't seem to expose onload directly (see note above), so hook onprogress and set onload on the underlying XHR:
    imageRequest.onprogress = function(this: XMLHttpRequest, e: ProgressEvent<EventTarget>) {
        this.onload = () => {
            img.src = URL.createObjectURL(imageRequest.response);
            console.log(img);
            onload(img);
        };
    };
    imageRequest.send();
    return img;
};

Although I think the LoadImage function would need to be async (since it returns a still-empty image at first)? As is, it leads to the errors below (a sketch of the fix I'll try next follows them):

WebGL: INVALID_VALUE: texImage2D: no image

[.WebGL-00004BAC00144980] GL_INVALID_OPERATION:
Texture format does not support mipmap generation.
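
Here's a hedged sketch of what I'll try next, assuming the fix is simply to wait for the HTMLImageElement to finish decoding the blob before handing it to Babylon (and to revoke the object URL once it has):

ThinEngine._FileToolsLoadImage = (url, onload, onerror, offlineProvider, mimeType): HTMLImageElement => {
    const img = new Image();
    const imageRequest = new WebRequest();
    imageRequest.open('GET', url as string);
    imageRequest.responseType = 'blob';
    imageRequest.onprogress = function (this: XMLHttpRequest) {
        this.onload = () => {
            const blobUrl = URL.createObjectURL(imageRequest.response);
            img.onload = () => {
                // Only notify Babylon once the element has actually decoded the blob,
                // otherwise texImage2D receives an empty image.
                URL.revokeObjectURL(blobUrl);
                onload(img);
            };
            img.onerror = () => onerror('failed to decode image blob'); // error arguments may need adapting
            img.src = blobUrl;
        };
    };
    imageRequest.send();
    return img;
};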