Conventional stitching techniques for images and videos are based on smooth warping models, and therefore often fail on multi-view images and videos with large parallax captured by cameras with wide baselines. In this paper, we propose a novel video stitching algorithm for such challenging multi-view videos. We reliably estimate the parameters of the ground plane homography, the fundamental matrix, and the vertical vanishing points, using both appearance-based and activity-based feature matches validated by geometric constraints. We alleviate parallax artifacts in stitching by adaptively warping off-plane pixels into geometrically accurate matching positions through their ground plane pixels based on the epipolar geometry. We also exploit inter-view and inter-frame correspondence matching information jointly to estimate the ground plane pixels reliably, which are then refined by energy minimization. Experimental results show that the proposed algorithm provides geometrically accurate stitching results on multi-view videos with large parallax and outperforms state-of-the-art stitching methods both qualitatively and quantitatively.
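The two geometric building blocks mentioned above can be illustrated with a minimal sketch: transferring a ground-plane pixel between views with a homography, and checking that a candidate match for an off-plane pixel satisfies the epipolar constraint x'ᵀFx = 0. The matrices H and F below are hypothetical toy values (a pure translation between views), not estimates from the paper's pipeline.

```python
# Sketch of homography transfer and the epipolar-constraint check,
# using hypothetical H and F for a purely translating camera.

def mat_vec(M, x):
    """3x3 matrix times 3-vector."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

def warp_homography(H, pt):
    """Transfer a ground-plane pixel (u, v) to the other view via H."""
    u, v = pt
    x, y, w = mat_vec(H, [u, v, 1.0])
    return (x / w, y / w)

def epipolar_residual(F, pt_src, pt_dst):
    """Residual x'^T F x; zero when a match obeys the epipolar geometry."""
    x = [pt_src[0], pt_src[1], 1.0]
    xp = [pt_dst[0], pt_dst[1], 1.0]
    Fx = mat_vec(F, x)  # epipolar line of pt_src in the target view
    return sum(xp[i] * Fx[i] for i in range(3))

# Hypothetical ground-plane homography: translation by (5, 3).
H = [[1.0, 0.0, 5.0], [0.0, 1.0, 3.0], [0.0, 0.0, 1.0]]
print(warp_homography(H, (10.0, 20.0)))  # (15.0, 23.0)

# Fundamental matrix of a horizontally translating camera:
# valid matches must lie on the same image row.
F = [[0.0, 0.0, 0.0], [0.0, 0.0, -1.0], [0.0, 1.0, 0.0]]
print(epipolar_residual(F, (10.0, 20.0), (14.0, 20.0)))  # 0.0
```

In the actual pipeline, H and F would be estimated robustly from the validated feature matches, and the residual would serve to validate or reject candidate correspondences for off-plane pixels.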
Figure 1: Multiscale saliency detection results of the proposed algorithm. From top to bottom, we visualize the vertex-wise saliency maps at the levels of semi-regular meshes of l=0, 1, 2, and 3, respectively. (a) Bunny, (b) Feline, (c) Horse, (d) Rabbit, (e) Venus, (f) Armadillo, (g) Bust, (h) Dinosaur, (i) Isis, (j) Max Planck, (k) Santa, and (l) Screwdriver.
Figure 2: Quantitative comparison of saliency detection performance in terms of precision-recall curves. (a) Bunny, (b) Feline, (c) Horse, (d) Rabbit, (e) Venus, (f) Armadillo, (g) Bust, (h) Dinosaur, (i) Isis, (j) Max Planck, (k) Santa, and (l) Screwdriver.
Se-Won Jeong and Jae-Young Sim, "Saliency detection for 3D surface geometry using semi-regular meshes," IEEE Transactions on Multimedia, vol. 19, no. 12, pp. 2692–2705, Dec. 2017.