What is the best approach to render a single scene across 3 connected screens and avoid view/perspective distortion on the edges?
Should I create three cameras, each with a viewport that matches a screen? Or can a single camera be used?
A PG that puts me on the right track is appreciated.
Do you mean that you extend the browser window over 2 or 3 screens? If so, why would there be distortions?
Thanks for replying! The browser will be extended over multiple screens, yes. By distortion I mean the fisheye-lens effect on the edges, as shown here, using the default FOV of 0.9.
Reducing the FOV improves this, but then the vertical field of view becomes too small, as shown here.
So I assume I should be looking for a solution with one viewport per screen on a single canvas, right? Can it be done with one camera, or do I need a camera per viewport? I am aware that this solution carries a frame-rate penalty and some artifacts where the viewports meet.
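For the camera-per-viewport idea, here is a minimal sketch of the layout math in plain JS. The Babylon.js wiring mentioned in the comments (`BABYLON.Viewport`, `scene.activeCameras`, `camera.rotation.y`) is how it would typically be applied, but treat the exact hookup as an assumption; the per-screen FOV value is made up.

```javascript
// Sketch: split the canvas into N side-by-side viewports, one camera each.
// Pure math only; in Babylon.js each entry would be applied roughly as:
//   camera.viewport = new BABYLON.Viewport(p.x, 0, p.width, 1);
//   camera.rotation.y = p.yaw;
//   scene.activeCameras.push(camera);
function planViewports(screenCount, perScreenHFov) {
  const plans = [];
  const mid = (screenCount - 1) / 2;
  for (let i = 0; i < screenCount; i++) {
    plans.push({
      x: i / screenCount,             // left edge, normalized canvas coords
      width: 1 / screenCount,         // each screen gets an equal slice
      yaw: (i - mid) * perScreenHFov, // side cameras rotate outward
    });
  }
  return plans;
}

// Example: three screens, each covering 45 degrees horizontally (assumed).
const plans = planViewports(3, Math.PI / 4);
```

Note that simply yawing the side cameras gives three flat projection planes joined at angles, so straight lines will still kink slightly at the seams unless the physical monitors are angled to match.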
I’m not sure using multiple viewports would work, because you may not be able to make the viewports match exactly at their edges.
I wonder if you can use the
We would probably need to change the projection matrix to implement this effect. See "3d - Perspective projection and ultrawide resolution" on Game Development Stack Exchange, for example.
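I can't quote the linked answer here, but a common projection-matrix fix for this is to treat all three monitors as one wide, flat projection plane and give each camera an off-center slice of the same frustum, built with a glFrustum-style asymmetric projection. A sketch of the matrix math (standard; the Babylon.js hookup via something like `camera.freezeProjectionMatrix(...)` is an assumption):

```javascript
// glFrustum-style off-axis projection, column-major, OpenGL depth range.
function offAxisProjection(left, right, bottom, top, near, far) {
  const m = new Float64Array(16); // column-major 4x4, rest stays 0
  m[0] = (2 * near) / (right - left);
  m[5] = (2 * near) / (top - bottom);
  m[8] = (right + left) / (right - left);   // horizontal off-center term
  m[9] = (top + bottom) / (top - bottom);   // vertical off-center term
  m[10] = -(far + near) / (far - near);
  m[11] = -1;
  m[14] = (-2 * far * near) / (far - near);
  return m;
}

// Near-plane slice for screen i of 3, where w is one screen's half-width:
// screen 0 gets [-3w, -w], screen 1 gets [-w, w], screen 2 gets [w, 3w].
function screenSlice(i, w) {
  return { left: (2 * i - 3) * w, right: (2 * i - 1) * w };
}
```

Because all three slices share one projection plane, straight lines stay straight across the seams; the trade-off is that the far-left and far-right screens get stretched, which is exactly the fisheye effect the off-axis split is meant to keep on a single continuous plane rather than exaggerate per camera.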
Yeah, like camera.projectionPlaneYaw. Could work.
I am also considering a shader-based approach, like the one discussed here:
VR headset displays deal with a similar issue, so why not apply the same idea to regular multi-screen setups, right?
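The VR analogy in shader terms: render with a wide FOV, then remap pixels in a post-process, the way VR compositors apply lens correction. Below is the kind of radial remap such a shader would compute, written in plain JS rather than GLSL so the math is easy to inspect; the coefficients `k1`/`k2` are made-up placeholders, and in Babylon.js this would run per pixel inside a `BABYLON.PostProcess` fragment shader that samples the scene texture at the remapped UV (treat that wiring as a sketch, not a verified snippet).

```javascript
// Polynomial radial distortion of a UV coordinate, as a lens-correction
// post-process would apply it. (u, v) are in [0, 1]; k1 and k2 are
// hypothetical distortion coefficients to be tuned for the screen rig.
function remapUV(u, v, k1, k2) {
  // center the coordinates so (0, 0) is the middle of the canvas
  const x = u - 0.5;
  const y = v - 0.5;
  const r2 = x * x + y * y;                 // squared distance from center
  const scale = 1 + k1 * r2 + k2 * r2 * r2; // radial scale factor
  return { u: 0.5 + x * scale, v: 0.5 + y * scale };
}
```

The center of the image is left untouched (`r2 = 0` gives `scale = 1`), and the correction grows toward the edges, which is where the fisheye stretching shows up.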