Depth-guided dense dynamic filtering network for bokeh effect rendering
Date Issued
01-10-2019
Author(s)
Purohit, Kuldeep
Suin, Maitreya
Kandula, Praveen
Ambasamudram, Rajagopalan
Abstract
The bokeh effect refers to the soft defocus blur of the background, which can be achieved with different aperture and shutter settings in a camera. In this work, we present a learning-based method for rendering such a synthetic depth-of-field effect on bokeh-free input images acquired with ordinary monocular cameras. The proposed network is composed of an efficient densely connected encoder-decoder backbone with a pyramid pooling module. It leverages the task-specific efficacy of joint intensity estimation and dynamic filter synthesis for the spatially-aware blurring process. Since the rendering task requires distinguishing between large foreground and background regions and estimating their relative depth, the network is further guided by pre-trained salient-region segmentation and depth-estimation modules. Experiments on diverse scenes show that our model elegantly introduces the desired effects in the input images, enhancing their aesthetic quality while maintaining a natural appearance. Along with extensive ablation analysis and visualizations that validate its components, the effectiveness of the proposed network is demonstrated by the second-highest score in the fidelity track of the AIM 2019 Bokeh Effect challenge.
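The abstract mentions dynamic filter synthesis for spatially-aware blurring. As an illustration only (not the authors' implementation), the minimal PyTorch sketch below shows one common way per-pixel dynamic filters can be applied to an image; the function name apply_dynamic_filters, the kernel size, and the softmax normalisation of the predicted filters are hypothetical assumptions for this example.

```python
# Illustrative sketch of per-pixel ("dynamic") filtering; all names and
# choices here are assumptions, not the paper's actual code.
import torch
import torch.nn.functional as F

def apply_dynamic_filters(image, kernels, k=5):
    """image:   (B, C, H, W) input photo
    kernels: (B, k*k, H, W) per-pixel filters predicted by a network
    Returns a spatially varying filtered image of shape (B, C, H, W)."""
    B, C, H, W = image.shape
    # Extract a k x k neighbourhood around every pixel: (B, C*k*k, H*W)
    patches = F.unfold(image, kernel_size=k, padding=k // 2)
    patches = patches.view(B, C, k * k, H, W)
    # Normalise each pixel's filter taps to sum to 1 (preserves brightness);
    # this softmax is an assumption, not necessarily what the paper uses.
    weights = torch.softmax(kernels, dim=1).unsqueeze(1)  # (B, 1, k*k, H, W)
    # Weighted sum over the neighbourhood dimension
    return (patches * weights).sum(dim=2)                 # (B, C, H, W)

# Example usage with random tensors
img = torch.rand(1, 3, 64, 64)
pred_kernels = torch.rand(1, 25, 64, 64)   # k*k = 25 filter taps per pixel
out = apply_dynamic_filters(img, pred_kernels, k=5)
print(out.shape)  # torch.Size([1, 3, 64, 64])
```

In such a scheme, pixels whose predicted filters are sharply peaked stay in focus while pixels with broad filters are blurred, which is one plausible way a network could realise a spatially-aware bokeh effect.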