An easier way would be to have an alpha channel in your video and apply alpha test/blend.
Without that channel, you can only work with the RGB values and apply some computation to decide whether a pixel must be discarded, as in the PG. Using a pure green background (for example) in your video would help, because your colors are quite different from it, so the discard process would produce fewer artifacts.
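To make the "some computation" concrete: a chroma-key shader typically measures how close each pixel's color is to the key color and calls `discard` when it is within some threshold. Here is a minimal sketch of that per-pixel test as a plain JavaScript function (the function name, key color, and threshold are illustrative, not from the PG; values are in 0..1, as in GLSL):

```javascript
// Hypothetical helper mirroring what a chroma-key fragment shader does:
// discard a pixel if its color is close enough to the key color.
// r, g, b and the key components are normalized to [0, 1], as in GLSL.
function shouldDiscard(r, g, b, key = [0, 1, 0], threshold = 0.4) {
  // Squared Euclidean distance in RGB space between the pixel and the key
  const d2 = (r - key[0]) ** 2 + (g - key[1]) ** 2 + (b - key[2]) ** 2;
  return d2 < threshold * threshold;
}

shouldDiscard(0, 1, 0);       // pure green background pixel -> true
shouldDiscard(1, 0, 0);       // red foreground pixel -> false
shouldDiscard(0.1, 0.9, 0.1); // near-green pixel -> true
```

Pixels whose colors sit near the threshold boundary (e.g. green spill on the subject's edges) are exactly where the artifacts mentioned above come from, which is why a clean, saturated key color helps.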
In the PG, using the emissive texture and disabling lighting improves the rendering:
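For reference, a minimal sketch of that material setup (this assumes an existing Babylon.js scene and a `plane` mesh; the variable names and video URL are placeholders, not taken from the PG):

```javascript
// Sketch of the emissive / no-lighting setup described above.
// Assumes `scene` and `plane` already exist; the URL is a placeholder.
const videoTexture = new BABYLON.VideoTexture("video", "textures/myVideo.mp4", scene);

const videoMat = new BABYLON.StandardMaterial("videoMat", scene);
videoMat.emissiveTexture = videoTexture;        // the video drives the color directly
videoMat.disableLighting = true;                // scene lights no longer affect it
videoMat.diffuseColor = BABYLON.Color3.Black(); // keep diffuse from washing out the emissive

plane.material = videoMat;
```

With the video plugged into `emissiveTexture` and lighting disabled, the frame colors are displayed as-is instead of being modulated by the scene's lights.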
Hi - Thanks for responding. I agree that it would be better to start with an mp4/webm that has an alpha channel. That is what I was trying to do in https://www.babylonjs-playground.com/#05UVJI#38 (please see the opening question in this thread). The mp4 in that PG supposedly has an alpha channel. However, I could not seem to enable a transparent background. Is the code in the PG correct?
I tried locally on my computer and it did not work.
Looking with PIX, I could see that the texture in GPU memory that comes from the video has no alpha channel. However, I don’t know whether that’s because there is really no alpha channel in the video itself or because WebGL would not handle the alpha channel of a video…
[…] OK, the video has no alpha channel. Using this video, it does work with this PG: