TY - GEN
T1 - Consistent filtering of videos and dense light-fields without optic-flow
AU - Shekhar, Sumit
AU - Semmo, Amir
AU - Trapp, Matthias
AU - Tursun, Okan Tarhan
AU - Pasewaldt, Sebastian
AU - Myszkowski, Karol
AU - Döllner, Jürgen
N1 - Funding Information:
We thank Florian Wagner for his help with generating results. We thank Max Reimann for valuable discussion. We thank the reviewers for their insightful comments. The project was supported by the Fraunhofer and Max Planck cooperation program within the German pact for research and innovation (PFI), the Research School on “Service-Oriented Systems Engineering” of the Hasso Plattner Institute, and the Federal Ministry of Education and Research (BMBF), Germany (mdViPro, 01IS18092).
Publisher Copyright:
© 2019 The Author(s). Eurographics Proceedings © 2019 The Eurographics Association.
PY - 2019
Y1 - 2019
N2 - A convenient approach to post-production video processing is to apply image filters on a per-frame basis. This allows image filters originally designed for still images to be extended flexibly to videos. However, per-image filtering may lead to temporal inconsistencies perceived as unpleasant flickering artifacts, and dense light-fields suffer analogously from angular inconsistencies. In this work, we present a method for consistent filtering of videos and dense light-fields that addresses these problems. Our assumption is that inconsistencies caused by per-image filtering manifest as noise across the image sequence. We therefore perform denoising across the filtered image sequence and combine the per-image filtered results with their denoised versions, using saliency-based optimization weights to produce a consistent output while simultaneously preserving details. To control the degree of consistency in the final output, we implemented our approach in an interactive real-time processing framework. Unlike state-of-the-art inconsistency-removal techniques, our approach does not rely on optic-flow to enforce coherence. Comparisons and a qualitative evaluation indicate that our method provides better results than state-of-the-art approaches for certain types of filters and applications.
AB - A convenient approach to post-production video processing is to apply image filters on a per-frame basis. This allows image filters originally designed for still images to be extended flexibly to videos. However, per-image filtering may lead to temporal inconsistencies perceived as unpleasant flickering artifacts, and dense light-fields suffer analogously from angular inconsistencies. In this work, we present a method for consistent filtering of videos and dense light-fields that addresses these problems. Our assumption is that inconsistencies caused by per-image filtering manifest as noise across the image sequence. We therefore perform denoising across the filtered image sequence and combine the per-image filtered results with their denoised versions, using saliency-based optimization weights to produce a consistent output while simultaneously preserving details. To control the degree of consistency in the final output, we implemented our approach in an interactive real-time processing framework. Unlike state-of-the-art inconsistency-removal techniques, our approach does not rely on optic-flow to enforce coherence. Comparisons and a qualitative evaluation indicate that our method provides better results than state-of-the-art approaches for certain types of filters and applications.
UR - http://www.scopus.com/inward/record.url?scp=85088231157&partnerID=8YFLogxK
U2 - 10.2312/vmv.20191326
DO - 10.2312/vmv.20191326
M3 - Conference contribution
AN - SCOPUS:85088231157
T3 - Vision, Modeling and Visualization, VMV 2019
BT - Vision, Modeling and Visualization, VMV 2019
A2 - Schulz, Hans-Jörg
A2 - Teschner, Matthias
A2 - Wimmer, Michael
PB - Eurographics Association
T2 - 2019 Conference on Vision, Modeling and Visualization, VMV 2019
Y2 - 30 September 2019 through 2 October 2019
ER -