National Science Foundation
C. Jaynes. “Self-Configuring Multi-Projector Displays: Realizing the Metaverse”, IEEE Virtual Reality Clusters Workshop, 2003.
We have developed algorithms that allow any number of projectors and cameras to automatically compute their intrinsics (e.g., pixel size and focal length) as well as their positions relative to one another. Self-calibration is important to the widespread deployment of immersive environments because it allows users to reconfigure displays to fit their needs.
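A common building block for this kind of camera-projector calibration is estimating a homography from point correspondences via the direct linear transform (DLT). The sketch below is illustrative only, not the paper's algorithm; the function name and synthetic data are assumptions for the example.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography mapping src -> dst points via the
    DLT: stack two linear constraints per correspondence, then take
    the null vector of the system from the SVD."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the free scale so H[2,2] == 1

# Synthetic check: map known points through a known homography,
# then recover it from the correspondences alone.
H_true = np.array([[1.2, 0.1, 5.0],
                   [0.0, 0.9, -3.0],
                   [1e-4, 2e-4, 1.0]])
src = np.array([[0, 0], [640, 0], [640, 480], [0, 480], [320, 240]], float)
pts = np.c_[src, np.ones(len(src))] @ H_true.T
dst = pts[:, :2] / pts[:, 2:]
H_est = estimate_homography(src, dst)
print(np.allclose(H_est, H_true, atol=1e-6))  # → True
```

In practice correspondences come from projected calibration patterns observed by the cameras, and the per-device homographies are chained into a consistent global coordinate frame.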
Once calibrated, displays can cooperatively render large-format, immersive environments. Calibration information is used to present a coherent, geometrically correct view to the user, to blend overlapping regions for seamless intensity, and to drive continuous monitoring algorithms from the view of each camera in the display environment.
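Intensity blending in the overlap region is often done with per-pixel attenuation ramps that sum to one. This minimal sketch uses a linear ramp for two side-by-side projectors; the linear profile and function name are assumptions for illustration, not necessarily the method used in the paper.

```python
import numpy as np

def blend_weights(width, overlap):
    """Per-column intensity weights for two side-by-side projector
    images that overlap by `overlap` pixels.  Each projector is
    attenuated with a linear ramp inside the overlap so the summed
    intensity stays constant across the seam."""
    ramp = np.linspace(1.0, 0.0, overlap)
    left = np.concatenate([np.ones(width - overlap), ramp])
    right = np.concatenate([ramp[::-1], np.ones(width - overlap)])
    return left, right

left, right = blend_weights(width=800, overlap=200)
# Inside the 200-pixel overlap the two ramps always sum to one:
print(np.allclose(left[-200:] + right[:200], 1.0))  # → True
```

Real systems typically apply a gamma-corrected ramp per color channel, since projector output is nonlinear in pixel value.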