Blur Classification Using Wavelet Transform and Feed Forward Neural Network

Full Text (PDF, 1244KB), PP.16-23


Author(s)

Shamik Tiwari 1,*, V. P. Shukla 1, S. R. Biradar 2, A. K. Singh 1

1. FET, Mody Institute of Technology & Science, Laxmangarh, India

2. SDM College of Engineering, Hubli-Dharwad, India

* Corresponding author.

DOI: https://doi.org/10.5815/ijmecs.2014.04.03

Received: 15 Jan. 2014 / Revised: 16 Feb. 2014 / Accepted: 6 Mar. 2014 / Published: 8 Apr. 2014

Index Terms

Blur, Motion, Defocus, Wavelet Transform, Neural Network

Abstract

Image restoration deals with recovering a sharp image from a blurred version. Restoration is termed blind or non-blind depending on whether the blur parameters are available for deconvolution. In blind image restoration, blur classification is highly desirable before any blur-parameter identification scheme is applied. This paper presents a novel approach to blur classification that exploits the distinctive appearance of blur patterns in the frequency domain. Features are extracted in the wavelet domain, and a feed-forward neural network is trained on these features. Simulation results demonstrate the high accuracy of the proposed algorithm.
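The pipeline the abstract describes (wavelet-domain feature extraction followed by a feed-forward classifier) can be sketched as follows. This is a minimal illustration only: the Haar wavelet, the subband-energy feature set, the network size, and the random weights are all assumptions for demonstration, not the paper's actual features or trained model.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar DWT: returns (LL, LH, HL, HH) subbands.
    Assumes even image dimensions."""
    # Rows: scaled sums / differences of adjacent column pairs
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    # Columns: same operation on adjacent row pairs
    LL = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2)
    LH = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2)
    HL = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2)
    HH = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2)
    return LL, LH, HL, HH

def subband_energy_features(img):
    """Normalized energies of the three detail subbands
    (a hypothetical stand-in for the paper's feature set)."""
    _, LH, HL, HH = haar_dwt2(img.astype(float))
    e = np.array([np.sum(LH**2), np.sum(HL**2), np.sum(HH**2)])
    return e / (e.sum() + 1e-12)

def feedforward(x, W1, b1, W2, b2):
    """One-hidden-layer feed-forward network with softmax output."""
    h = np.tanh(W1 @ x + b1)
    z = W2 @ h + b2
    p = np.exp(z - z.max())
    return p / p.sum()

# Untrained demonstration: random image, random weights,
# two output classes (e.g. motion blur vs. defocus blur).
rng = np.random.default_rng(0)
img = rng.random((64, 64))
x = subband_energy_features(img)                    # 3 features
W1, b1 = rng.standard_normal((8, 3)), np.zeros(8)
W2, b2 = rng.standard_normal((2, 8)), np.zeros(2)
p = feedforward(x, W1, b1, W2, b2)                  # class probabilities
```

In practice the network weights would be learned by back-propagation on labelled motion- and defocus-blurred training images rather than drawn at random.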

Cite This Paper

Shamik Tiwari, V. P. Shukla, S. R. Biradar, A. K. Singh, "Blur Classification Using Wavelet Transform and Feed Forward Neural Network", International Journal of Modern Education and Computer Science (IJMECS), vol.6, no.4, pp.16-23, 2014. DOI:10.5815/ijmecs.2014.04.03
