Paper Title


ExReg: Wide-range Photo Exposure Correction via a Multi-dimensional Regressor with Attention

Paper Authors

Chiang, Tzu-Hao, Hsueh, Hao-Chien, Hsiao, Ching-Chun, Huang, Ching-Chun

Abstract


Photo exposure correction is widely investigated, but fewer studies focus on correcting under- and over-exposed images simultaneously. Three issues remain open to handling and correcting under- and over-exposed images in a unified way. First, a locally adaptive exposure adjustment may be more flexible than learning a global mapping. Second, determining the suitable exposure values locally is an ill-posed problem. Third, photos with the same content but different exposures may not reach consistent adjustment results. To this end, we propose a novel exposure correction network, ExReg, to address these challenges by formulating exposure correction as a multi-dimensional regression process. Given an input image, a compact multi-exposure generation network is introduced to generate images with different exposure conditions for multi-dimensional regression and exposure correction in the next stage. An auxiliary module is designed to predict region-wise exposure values, guiding the proposed Encoder-Decoder ANP (Attentive Neural Processes) to regress the final corrected image. The experimental results show that ExReg generates well-exposed results and outperforms the SOTA method by 1.3 dB in PSNR on extensive exposure problems. In addition, given the same image under various exposures for testing, the corrected results are more visually consistent and physically accurate.
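The abstract describes a three-stage pipeline: multi-exposure generation, region-wise exposure value prediction, and attention-guided regression of the corrected image. The sketch below is only an illustrative stand-in for that flow, not the paper's implementation: the actual ExReg network layers, the ANP regressor, and the exposure-value estimator are not specified in the abstract, so every function here (gain-based exposure synthesis, log-luminance EV estimation, Gaussian-weighted per-region blending) is a hypothetical simplification.

```python
import numpy as np

EV_OFFSETS = (-2.0, -1.0, 0.0, 1.0, 2.0)  # assumed exposure bracket

def multi_exposure_generation(image, ev_offsets=EV_OFFSETS):
    """Stage 1 (hypothetical): synthesize variants of the input at several
    exposure values via simple gain scaling, in place of the paper's compact
    multi-exposure generation network."""
    return [np.clip(image * (2.0 ** ev), 0.0, 1.0) for ev in ev_offsets]

def predict_region_exposure(image, grid=4):
    """Auxiliary module (hypothetical): estimate a region-wise exposure map
    as the log2 ratio of each grid cell's mean luminance to mid-gray."""
    h, w = image.shape[:2]
    gh, gw = h // grid, w // grid
    ev_map = np.empty((grid, grid))
    for i in range(grid):
        for j in range(grid):
            cell = image[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            ev_map[i, j] = np.log2(max(cell.mean(), 1e-6) / 0.5)
    return ev_map

def regress_corrected_image(exposure_stack, ev_map, grid=4,
                            ev_offsets=EV_OFFSETS):
    """Stand-in for the Encoder-Decoder ANP: per region, softly attend to
    the stack images whose EV offset cancels the region's exposure error."""
    h, w = exposure_stack[0].shape[:2]
    evs = np.asarray(ev_offsets)
    out = np.zeros_like(exposure_stack[0])
    for i in range(grid):
        for j in range(grid):
            # Gaussian attention weight: largest when ev + ev_map ≈ 0
            weights = np.exp(-(evs + ev_map[i, j]) ** 2)
            weights /= weights.sum()
            sl = np.s_[i * h // grid:(i + 1) * h // grid,
                       j * w // grid:(j + 1) * w // grid]
            for wgt, img in zip(weights, exposure_stack):
                out[sl] += wgt * img[sl]
    return out
```

Run end-to-end on an under-exposed input, the blend leans toward the +EV variants in dark regions and toward the -EV variants in bright ones, which mirrors the locally adaptive adjustment the abstract argues for.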
