GraphDeco


Video-Based Rendering of Dynamic Stationary Environments from Unsynchronized Inputs

Computer Graphics Forum (Proceedings of the Eurographics Symposium on Rendering), Volume 40, Number 4 - June 2021
Download the publication: ThonatVideoBasedRenderingSupplemental.pdf [20.2 MB] · ThonatVideoBasedRenderingUnsycnhronizedInputsCGF_EGSR21_AuthorsVersion.pdf [12 MB]
Image-Based Rendering allows users to easily capture a scene with a single camera and then navigate through it freely with realistic results. However, the resulting renderings are completely static: dynamic effects such as fire, waterfalls, or small waves cannot be reproduced. We tackle the challenging problem of enabling free-viewpoint navigation that includes such stationary dynamic effects while maintaining the simplicity of casual capture. Using a single camera, instead of the complex synchronized multi-camera setups of previous work, means that we have unsynchronized videos of the dynamic effect from multiple views, making it hard to blend them when synthesizing novel views. We present a solution that allows smooth free-viewpoint video-based rendering (VBR) of such scenes using a temporal Laplacian pyramid decomposition of the videos, enabling spatio-temporal blending. For effects such as fire and waterfalls, which are semi-transparent and occupy 3D space, we first estimate their spatial volume. This allows us to create per-video geometries and alpha-matte videos that we can blend using our frequency-dependent method. We also extend Laplacian blending to the temporal dimension to remove additional temporal seams. We show results on scenes containing fire, waterfalls, or rippling waves at the seaside, bringing these scenes to life.
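The frequency-dependent blending described above relies on decomposing each video into temporal frequency bands before combining them. The sketch below illustrates the general idea with a temporal Laplacian stack in NumPy; it is a minimal illustration under our own assumptions (a grayscale video stored as a `(T, H, W)` array, fixed blur scales, a single scalar blend weight), not the authors' implementation, which operates per level and per pixel with spatio-temporal weights.

```python
import numpy as np

def temporal_blur(video, sigma):
    """Gaussian blur along the time axis (axis 0) of a (T, H, W) array."""
    radius = max(1, int(3 * sigma))
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()
    # Pad in time with edge values, then accumulate the weighted shifts.
    pad = np.pad(video, ((radius, radius), (0, 0), (0, 0)), mode="edge")
    out = np.zeros(video.shape, dtype=np.float64)
    for i, w in enumerate(kernel):
        out += w * pad[i:i + video.shape[0]]
    return out

def temporal_laplacian_stack(video, sigmas=(1.0, 2.0, 4.0)):
    """Split a video into temporal frequency bands; the bands sum back to the input."""
    bands = []
    current = video.astype(np.float64)
    for sigma in sigmas:
        low = temporal_blur(current, sigma)
        bands.append(current - low)  # band-pass temporal detail
        current = low
    bands.append(current)            # low-frequency temporal residual
    return bands

def blend_stacks(stack_a, stack_b, weight_a):
    """Blend two decomposed videos band by band (frequency-dependent blending)."""
    return sum(weight_a * a + (1.0 - weight_a) * b
               for a, b in zip(stack_a, stack_b))
```

Because the stack telescopes, summing the bands reconstructs the original video exactly, so blending per band and summing yields a seamless result wherever the weights vary smoothly.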

See also the project webpage. Full source code and datasets will be provided soon, pending legal approval.

Acknowledgements and Funding

This research was funded by the ERC Advanced Grant FUNGRAPH No. 788065 (http://fungraph.inria.fr), a doctoral fellowship of the Région Provence-Alpes-Côte d'Azur, and the ANR project SEMAPOLIS (ANR-13-CORD-0003). The authors thank J. Tompkin for valuable feedback, S. Prakash for help with comparisons, and the anonymous reviewers for their comments.

BibTeX reference

@Article{TAAPDD21,
  author       = "Thonat, Theo and Aksoy, Yagiz and Aittala, Miika and Paris, Sylvain and Durand, Fr\'edo and Drettakis, George",
  title        = "Video-Based Rendering of Dynamic Stationary Environments from Unsynchronized Inputs",
  journal      = "Computer Graphics Forum (Proceedings of the Eurographics Symposium on Rendering)",
  number       = "4",
  volume       = "40",
  month        = "June",
  year         = "2021",
  url          = "http://www-sop.inria.fr/reves/Basilic/2021/TAAPDD21"
}
