Paper Title

Fully Quantized Image Super-Resolution Networks

Paper Authors

Hu Wang, Peng Chen, Bohan Zhuang, Chunhua Shen

Paper Abstract

With the rising popularity of intelligent mobile devices, it is of great practical significance to develop accurate, real-time and energy-efficient image Super-Resolution (SR) inference methods. A prevailing method for improving inference efficiency is model quantization, which allows for replacing the expensive floating-point operations with efficient fixed-point or bitwise arithmetic. To date, it is still challenging for quantized SR frameworks to deliver a feasible accuracy-efficiency trade-off. Here, we propose a Fully Quantized image Super-Resolution framework (FQSR) to jointly optimize efficiency and accuracy. In particular, we aim to obtain end-to-end quantized models for all layers, especially including skip connections, which is rarely addressed in the literature. We further identify training obstacles faced by low-bit SR networks and propose two novel methods accordingly. The two difficulties are caused by 1) activation and weight distributions being vastly distinct across different layers; 2) the inaccurate approximation of the quantization. We apply our quantization scheme to multiple mainstream super-resolution architectures, including SRResNet, SRGAN and EDSR. Experimental results show that our FQSR using low-bit quantization can achieve on-par performance compared with the full-precision counterparts on five benchmark datasets and surpasses state-of-the-art quantized SR methods with significantly reduced computational cost and memory consumption.
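As background for the quantization described in the abstract, the sketch below shows a generic uniform k-bit quantizer applied to a weight tensor: floating-point values are mapped to signed integer codes plus a per-tensor scale, and dequantization recovers an approximation of the originals. This is only a minimal illustration under assumed choices (symmetric per-tensor scaling, 4-bit default); the helper names `uniform_quantize` and `dequantize` are hypothetical and do not correspond to the paper's actual FQSR scheme.

```python
import numpy as np

def uniform_quantize(x, num_bits=4):
    """Map a float tensor to signed k-bit integer codes plus a per-tensor scale.

    Generic uniform quantization for illustration only; not the FQSR quantizer.
    """
    qmax = 2 ** (num_bits - 1) - 1              # e.g. 7 for 4-bit signed codes
    scale = np.max(np.abs(x)) / qmax + 1e-12    # symmetric per-tensor scale (assumed)
    q = np.clip(np.round(x / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map integer codes back to approximate floating-point values."""
    return q.astype(np.float32) * scale

# Example: quantize a random weight tensor and measure the approximation error,
# i.e. the "inaccurate approximation" that low-bit quantization introduces.
w = np.random.randn(64, 64).astype(np.float32)
q, s = uniform_quantize(w, num_bits=4)
w_hat = dequantize(q, s)
print("mean abs error:", np.mean(np.abs(w - w_hat)))
```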
