The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume XLVIII-2/W7-2024
https://doi.org/10.5194/isprs-archives-XLVIII-2-W7-2024-177-2024
13 Dec 2024

A Global Image Orientation Method of the Self-Rotating Pan-Tilt-Zoom Camera for Photogrammetric Applications

Teng Xiao, Qi Hu, Junhua Kang, Qi Zhang, Zhiwei Ye, and Fei Deng

Keywords: Global Image Orientation, Structure from Motion (SfM), Pan-tilt-zoom (PTZ) Camera, Pure Rotation

Abstract. Pan-tilt-zoom (PTZ) cameras are widely used in surveillance systems due to their wide field of view and high resolution. However, the lack of accurate orientation information limits their full utilization in photogrammetry. Therefore, for photogrammetric applications, the primary task for PTZ cameras is image orientation. Cameras mounted on gimbals can only rotate about their base, so the acquired images undergo nearly pure rotation. Conventional structure from motion (SfM) pipelines assume that pixel parallax exists between matching correspondences and generate object points through 3D triangulation, so applying them to estimate the interior and exterior orientation parameters of images from a self-rotating PTZ camera is challenging. To address this issue, this paper adopts the concept of global SfM and proposes an improved global image orientation method for the purely rotational motion of PTZ cameras. First, a subset of image pairs is selected and used to estimate the interior orientation parameters of the images. Next, global exterior orientation is performed on all images to estimate their exterior orientation parameters. Finally, a bundle adjustment based on the collinearity equations without object points refines both the interior and exterior orientation parameters. Experiments on synthetic and real-scene datasets demonstrate the practicality and accuracy of the method. For the synthetic datasets, the focal length estimated by our method deviates from the true value by less than 1 pixel, and the mean location error of the principal points is 0.93 pixels. For the real-scene datasets, the mean reprojection error of the checkpoints of our method is 2.72 pixels, with a maximum of 4.66 pixels; in contrast, Agisoft Metashape's mean reprojection error is 4.73 pixels, with a maximum of 8.06 pixels. This shows that our method can accurately determine the image orientation parameters of PTZ cameras and achieves higher accuracy than the popular commercial software Agisoft Metashape.
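The key observation behind the pure-rotation setting described in the abstract is that, without camera translation, matching points between two images are related by a homography that depends only on the intrinsics and the relative rotation, so no object points or triangulation are needed. The sketch below is a minimal illustration of this relation, not the authors' implementation; the intrinsics, rotation angle, and function names are hypothetical and chosen only for demonstration.

```python
import numpy as np

def rotation_homography(K, R):
    """Homography induced by a pure rotation R between two views sharing
    the intrinsic matrix K: maps reference-image pixels to rotated-image pixels."""
    return K @ R @ np.linalg.inv(K)

def project_direction(K, R, d):
    """Project a viewing direction d (no 3D object point, no depth) into the
    image of a purely rotating camera with intrinsics K and rotation R."""
    x = K @ (R @ d)
    return x[:2] / x[2]

if __name__ == "__main__":
    # Hypothetical intrinsics: focal length 1000 px, principal point (960, 540).
    K = np.array([[1000.0, 0.0, 960.0],
                  [0.0, 1000.0, 540.0],
                  [0.0, 0.0, 1.0]])
    # Pan of 5 degrees about the vertical axis.
    a = np.deg2rad(5.0)
    R = np.array([[np.cos(a), 0.0, np.sin(a)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(a), 0.0, np.cos(a)]])

    # A viewing direction observed in the reference image ...
    d = np.array([0.1, -0.05, 1.0])
    d /= np.linalg.norm(d)

    # ... reprojects consistently either through the direction model or
    # through the rotation-induced homography, with no object point involved.
    u_ref = project_direction(K, np.eye(3), d)
    u_rot = project_direction(K, R, d)
    u_h = rotation_homography(K, R) @ np.append(u_ref, 1.0)
    print(u_rot, u_h[:2] / u_h[2])  # the two projections should agree
```

Under this model, residuals can be written directly on image observations of viewing directions, which is consistent with the abstract's bundle adjustment formulated without object points.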