Hello. I am trying to compose a panorama from multiple loaded images on a canvas and initialize the PhotoDome using canvas.toDataURL('image/jpeg') as textureUrl. But after I pass the canvas to the Engine constructor, canvas.getContext('2d') returns null and I get an error.
https://jsfiddle.net/42zd8pos/15/
What am I doing wrong?
Hi and welcome to the forum!
It seems to me like you are using the same canvas for both Babylon and the 2D processing. This will fail because a canvas can only hold one rendering context: once the Engine has created a WebGL context on it, getContext('2d') returns null.
Then what is the best way to solve this?
First, I would recommend using two canvases. You don't have to show the canvas you use for the 2D processing, but it does have to be independent from the canvas used by Babylon.
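Roughly like this (a minimal sketch; the element id, canvas size, image URLs, and dome options are placeholders you would adapt to your own setup):

// Canvas used by Babylon only.
const renderCanvas = document.getElementById("renderCanvas");
const engine = new BABYLON.Engine(renderCanvas, true);
const scene = new BABYLON.Scene(engine);

// Separate, off-screen canvas reserved for 2D compositing.
const composeCanvas = document.createElement("canvas");
composeCanvas.width = 4096;
composeCanvas.height = 2048;
const ctx = composeCanvas.getContext("2d"); // works: no WebGL context was created on this canvas

// Draw each loaded tile into the compositing canvas.
function addTile(img, x, y) {
  ctx.drawImage(img, x, y);
}

// Once the panorama is composed, hand it to the PhotoDome as a data URL.
const dome = new BABYLON.PhotoDome(
  "dome",
  composeCanvas.toDataURL("image/jpeg"),
  { resolution: 32, size: 1000, useDirectMapping: false },
  scene
);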
Many thanks! Everything worked out!
https://jsfiddle.net/7j41vt2u/1/
Only the question of optimization remains. Am I updating the dome correctly on every subsequent image upload? Perhaps there is a better way than initializing a new PhotoDome object every time?
You could create an HtmlElementTexture wrapping your 2D canvas; then, each time an image has loaded, you would only need to call update() on this texture.
I seem to be doing something wrong: https://jsfiddle.net/63ymnapo/6/
You would basically need to set the dome's texture to the HtmlElementTexture one:
https://jsfiddle.net/b15cqvmp/
Ping @RaananW for info.
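Something along these lines (a minimal sketch, assuming `dome` is the PhotoDome you already created, `composeCanvas` is the off-screen 2D canvas from earlier, and that assigning an HtmlElementTexture to photoTexture is acceptable for your use case):

const ctx = composeCanvas.getContext("2d");

// Wrap the 2D canvas once and hand it to the dome instead of a URL-based texture.
const domeTexture = new BABYLON.HtmlElementTexture("domeTexture", composeCanvas, {
  scene: scene,
  engine: scene.getEngine()
});
dome.photoTexture = domeTexture;

// Whenever another tile finishes loading, draw it and refresh the texture;
// no new PhotoDome is created.
function onImageLoaded(img, x, y) {
  ctx.drawImage(img, x, y);
  domeTexture.update();
}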
Thank you!
In some browsers, for example in Safari, I get this error:
[Error] WebGL: drawElements: texture bound to texture unit 0 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering or is not ‘texture complete’, or it is a float/half-float type with linear filtering and without the relevant float/half-float linear extension enabled.
Do I understand correctly that I need to set some settings for the HtmlElementTexture?
You should either use power-of-two canvas dimensions, or avoid relying on texture wrapping and mipmapping, for instance.
Those are WebGL 1 limitations with texture sampling: WebGL and OpenGL Differences - WebGL Public Wiki
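For example (a sketch reusing the canvas and HtmlElementTexture from the earlier snippets; the option names are taken from the current Babylon API, so double-check them against your version):

// Option 1: keep the compositing canvas at power-of-two dimensions.
composeCanvas.width = 4096;
composeCanvas.height = 2048;

// Option 2: create the texture without mipmaps and with clamped wrapping,
// so WebGL 1 can sample a non-power-of-two canvas.
const domeTexture = new BABYLON.HtmlElementTexture("domeTexture", composeCanvas, {
  scene: scene,
  engine: scene.getEngine(),
  generateMipMaps: false,
  samplingMode: BABYLON.Texture.BILINEAR_SAMPLINGMODE
});
domeTexture.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
domeTexture.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;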
Now I’m trying to fade the dome out smoothly using the alpha parameter, and it works in the playground: https://www.babylonjs-playground.com/#14KRGG#245
But when I try to do the same locally, instead of becoming transparent, the dome becomes very bright. I’m confused.
My local code:
<script>
  import { onMount } from "svelte";
  import {
    Engine,
    Scene,
    ArcRotateCamera,
    Vector3,
    PhotoDome
  } from "@babylonjs/core";

  onMount(() => {
    var canvas = document.getElementById("canvas3D");
    var engine = new Engine(canvas, true);
    var scene = new Scene(engine);

    engine.runRenderLoop(function () {
      scene.render();
    });

    var camera = new ArcRotateCamera("Camera", -Math.PI / 2, Math.PI / 2, 5, Vector3.Zero(), scene);
    camera.attachControl(canvas, true);

    var dome = new PhotoDome(
      "dome",
      "https://3d-planner.ru/renders/4814da8c-e2c0-40ab-9617-e616342f508e.jpg",
      {
        resolution: 32,
        size: 10,
        useDirectMapping: false
      },
      scene
    );

    dome.mesh.material.alpha = 0;
  });
</script>

<canvas id="canvas3D" />
It works if I set dome.mesh.visibility = 0.99, lol
That’s because you need to instruct the system that your material should use alpha blending.
Try setting material.transparencyMode = BABYLON.Material.MATERIAL_ALPHABLEND;
That doesn’t work; it becomes bright as in the screenshot above.
Also try setting material.useAlphaFromDiffuseTexture = true; and material.diffuseTexture.hasAlpha = true;
Or you can override the needAlphaBlending() function of the material:
material.needAlphaBlending = () => true;
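Putting the thread's suggestions together, a minimal fade-out sketch (assuming `dome` is the PhotoDome created above; the per-frame step value is arbitrary):

const material = dome.mesh.material;
material.transparencyMode = BABYLON.Material.MATERIAL_ALPHABLEND;
material.needAlphaBlending = () => true; // force alpha blending for this material

// Simple per-frame fade; replace with a Babylon Animation if you prefer.
scene.onBeforeRenderObservable.add(() => {
  if (material.alpha > 0) {
    material.alpha = Math.max(0, material.alpha - 0.01);
  }
});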
It worked, thank you!