Can I render warped fisheye images directly, without warping fisheye images or creating cubic maps?
Yes. Assuming you are using a raytracer (or a rendering engine with the required capabilities), you can replace the real-life data projector with a virtual camera within your 3D scene. Place a perfectly reflective sphere in front of the camera, again with specifications that exactly match the projector/mirror parameters. If you now render your 3D scene as reflected in that mirror, the resulting images are ready to be projected directly. More information can be found here. The main objections to doing it this way are that you cannot readily share your content with other systems: in particular, planetariums that do not use the spherical mirror projection (and expect to receive content as fisheye frames), or other sites that do use a spherical mirror but have a different geometry and hence a different warping mesh.
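The geometric core of the technique above is a ray-sphere intersection followed by a mirror reflection: for each projector pixel, a ray is cast from the virtual camera, bounced off the reflective sphere, and the scene is sampled along the reflected direction. A minimal sketch of that step is shown below; the function name, the use of NumPy, and the example camera/sphere placement are illustrative assumptions, not part of any particular renderer's API.

```python
import numpy as np

def reflect_off_sphere(ray_origin, ray_dir, center, radius):
    """Intersect a ray with a mirrored sphere and return the hit point
    and reflected direction, or None if the ray misses the mirror.
    Illustrative sketch only; a real renderer does this per pixel."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    oc = ray_origin - center
    # Quadratic for |origin + t*dir - center|^2 = radius^2 (a = 1)
    b = 2.0 * np.dot(ray_dir, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                     # ray misses the sphere
    t = (-b - np.sqrt(disc)) / 2.0      # nearest intersection
    if t <= 0:
        return None                     # sphere is behind the camera
    hit = ray_origin + t * ray_dir
    normal = (hit - center) / radius    # outward surface normal
    # Mirror reflection: d' = d - 2 (d . n) n
    reflected = ray_dir - 2.0 * np.dot(ray_dir, normal) * normal
    return hit, reflected

# Hypothetical setup: camera at the origin, mirror sphere 2 units ahead.
hit, reflected = reflect_off_sphere(np.array([0.0, 0.0, 0.0]),
                                    np.array([0.0, 0.0, 1.0]),
                                    np.array([0.0, 0.0, 2.0]), 0.5)
```

Sampling the scene along `reflected` from `hit` for every pixel yields an image that is already pre-warped for the mirror, which is exactly why it can be sent straight to the projector.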