Mathematics and Computer Science
Volume 5, Issue 2, March 2020, Pages: 56-63
Received: May 11, 2020;
Accepted: May 25, 2020;
Published: Jun. 3, 2020
Abderraouf Khodja, College of Mathematics and Computer Science, Zhejiang Normal University, Jinhua, China
Zhonglong Zheng, College of Mathematics and Computer Science, Zhejiang Normal University, Jinhua, China
Yiran He, College of Mathematics and Computer Science, Zhejiang Normal University, Jinhua, China
Deep learning has attracted a lot of attention lately. Thanks to its high modeling capacity, technological advances, and the availability of big data for training, deep learning has achieved remarkable improvements in both high- and low-level vision tasks. One crucial ingredient in the success of a deep learning-based model is a sufficiently large data set to fuel the training stage, but in many cases well-labeled large data is hard to acquire. Recent works have shown that it is possible to optimize denoising models by minimizing the difference between different noise instances of the same image. Yet it is not common practice to collect data with different noise instances of the same sample. Addressing this issue, we propose a training method that enables deep convolutional neural network models for Gaussian denoising to be trained in the absence of ground truth data. More specifically, we propose to train a deep learning-based denoising model using only a single noise instance per image. With that in mind, we develop a non-local self-similarity noise training method that uses only one noise instance.
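The abstract's premise, that a denoiser can be optimized against a second noisy observation rather than a clean target, rests on the fact that for zero-mean noise the two objectives share the same minimizer in expectation. A minimal NumPy sketch of this idea (a hypothetical one-parameter shrinkage "denoiser" on synthetic data, not the paper's actual network) is:

```python
import numpy as np

rng = np.random.default_rng(0)
clean = rng.uniform(0, 1, size=10_000)           # synthetic "clean image" pixels
sigma = 0.1
y1 = clean + rng.normal(0, sigma, clean.shape)   # first noise instance (input)
y2 = clean + rng.normal(0, sigma, clean.shape)   # second noise instance (target)

# A toy one-parameter denoiser: shrink the noisy input toward its global mean,
# f(y1; a) = a * y1 + (1 - a) * mean(y1).
m = y1.mean()

def mse(a, target):
    pred = a * y1 + (1 - a) * m
    return np.mean((pred - target) ** 2)

grid = np.linspace(0.0, 1.0, 1001)
a_noisy = grid[np.argmin([mse(a, y2) for a in grid])]     # trained on a noisy target
a_clean = grid[np.argmin([mse(a, clean) for a in grid])]  # trained on the clean target

# Because the target noise is zero-mean and independent of the input,
# both objectives are minimized by (nearly) the same parameter.
print(a_noisy, a_clean)
```

Running this, the two fitted parameters nearly coincide, which is why pairs of noise instances can stand in for clean targets; the paper's contribution goes one step further by requiring only a single noise instance per image.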
Abderraouf Khodja, Zhonglong Zheng, Yiran He. Non-Local Self-Similarity Noise Training for Image Denoising. Mathematics and Computer Science, Vol. 5, No. 2, 2020, pp. 56-63.