Efficiently synthesizing virtual video

Abstract

Given a set of synchronized video sequences of a dynamic scene taken by different cameras, we address the problem of creating a virtual video of the scene from a novel viewpoint. A key aspect of our algorithm is a method for recursively propagating dense and physically accurate correspondences between two of the video sources. By exploiting temporal continuity and suitably constraining the correspondences, we provide an efficient framework for synthesizing realistic virtual video. The stability of the propagation algorithm is analyzed, and experimental results are presented.
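
The sketch below is only an illustration of the general idea of exploiting temporal continuity, not the algorithm described in the paper: it propagates a dense correspondence map between two views forward in time by composing it with per-view optical flow, so that a full correspondence search is needed only for the initial frame. The OpenCV Farneback flow estimator, the composition scheme, and all function names and parameters are assumptions made for this example.

```python
# Illustrative sketch (not the paper's method): predict the next dense
# correspondence map between view A and view B by chaining per-view motion.
import cv2
import numpy as np

def backward_flow(frame_next, frame_prev):
    """Dense optical flow from frame_next back to frame_prev (Farneback)."""
    g1 = cv2.cvtColor(frame_next, cv2.COLOR_BGR2GRAY)
    g0 = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY)
    return cv2.calcOpticalFlowFarneback(g1, g0, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

def propagate_correspondence(corr_t, flow_a_back, flow_b_fwd):
    """
    corr_t      : float32 (H, W, 2) map sending pixel (x, y) of view A at
                  time t to coordinates in view B at time t.
    flow_a_back : backward flow in view A, frame t+1 -> frame t.
    flow_b_fwd  : forward flow in view B, frame t -> frame t+1.
    Returns an estimate of the correspondence map at time t+1.
    """
    h, w = corr_t.shape[:2]
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # Where was each A-pixel of frame t+1 located at time t?
    ax = xs + flow_a_back[..., 0]
    ay = ys + flow_a_back[..., 1]
    # Look up its previous correspondence in view B (bilinear resampling).
    corr_prev = cv2.remap(corr_t, ax, ay, cv2.INTER_LINEAR)
    # Advance the matched B-point with view B's own motion.
    db = cv2.remap(flow_b_fwd, corr_prev[..., 0], corr_prev[..., 1],
                   cv2.INTER_LINEAR)
    return corr_prev + db
```

In a real pipeline such a prediction would still be refined and constrained (for instance along epipolar lines) before being used for view synthesis, in line with the abstract's remark about suitably constraining the correspondences.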

Publication

IEEE TCSVT
