I have two simultaneous renders: one with the in-game perspective shifted a bit to the left for the left eye, and one shifted a bit to the right for the right eye. It works (I have to cross my eyes [a sort of stereoscopic technique] and swap the left and right renders).
The next step is to stream those two images to a 3D headset, BUT they need to be corrected in order to be seen correctly through the headset's lenses (which for the moment will be my phone, in a phone mount, streaming the whole desktop).
I can deform the two images before rendering, which should be easy, but I need the basic method (or formula); or maybe I could stream half of the desktop to each eye with some program that does that automatically. I know nothing about the topic, which is why I'm asking:
- How do I deform the two images?
- How do I stream them to my phone? (For now I'm using AnyDesk, which is just a desktop stream, with no 3D support.)
I would say it's done by putting the images on a surface as a texture with the correct dimensions, but I'm really not sure about that one. I would have thought the driver or library would take a normal video stream and reproject it for you, but again, I don't know.
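On the deformation question: headset lenses typically add pincushion distortion, so the usual fix is to pre-warp each eye's image with the opposite (barrel) warp using a radial polynomial. Here's a minimal sketch in Python/NumPy, assuming the common two-coefficient model; the `k1`/`k2` values are placeholders, since the real coefficients depend on the specific lens:

```python
import numpy as np

def predistort(image, k1=0.22, k2=0.24):
    """Barrel pre-warp so the lens's pincushion distortion cancels it.

    For each output pixel we compute a radially scaled source position
    and sample the input there (inverse mapping). k1/k2 are example
    coefficients only; real values come from the lens/headset specs.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # normalized coordinates in [-1, 1], centered on the lens axis
    x = (xs - w / 2) / (w / 2)
    y = (ys - h / 2) / (h / 2)
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2  # radial distortion polynomial
    # sample the source further out than the destination -> barrel warp
    sx = np.clip((x * scale * (w / 2) + w / 2).astype(int), 0, w - 1)
    sy = np.clip((y * scale * (h / 2) + h / 2).astype(int), 0, h - 1)
    return image[sy, sx]
```

In a real renderer you would do the same remap per eye in a fragment shader rather than on the CPU, but the formula is the same.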
UPDATE: I bought a "VR Box", a very cheap smartphone mount, and its lenses do deform the image a bit, but not much, so for this mount my game doesn't need any correction. And considering these are just experiments made for fun, that's perfect, ahahaha!