Work place: Department of Computer Science and Engineering, Jadavpur University, Kolkata- 700032, India
E-mail: cmnasipuri@cse.jdvu.ac.in
Research Interests: Multimedia Information Systems, Image Processing, Pattern Recognition, Computer Systems and Computational Processes
Biography
MITA NASIPURI received her B.E.Tel.E., M.E.Tel.E., and Ph.D. (Engg.) degrees from Jadavpur University in 1979, 1981, and 1990, respectively. Prof. Nasipuri has been a faculty member of Jadavpur University since 1987. Her current research interests include image processing, pattern recognition, and multimedia systems. She is a Senior Member of the IEEE (U.S.A.), and a Fellow of I.E. (India) and of W.B.A.S.T., Kolkata, India.
By Suranjan Ganguly, Debotosh Bhattacharjee, Mita Nasipuri
DOI: https://doi.org/10.5815/ijigsp.2015.05.03, Pub. Date: 8 Apr. 2015
In this paper, the authors propose two novel techniques for detecting occlusion in a given 3D face image and then, if occlusion is present, localizing the occluded region. For both methods, 2.5D (range) face images are first created from the input 3D face images. Two approaches are then followed for detecting occluded faces: block based and threshold based. Each has been investigated individually for localizing the occluded portion on the Bosphorus database, which contains the different types of occlusion considered in this work. Compared with 2D images, 3D images carry more reliable, accurate, and valid information in the digitized data. In a 2D image, each point (pixel) is represented by a single value on a 2D grid: one byte for grayscale images and three bytes for color images. In 3D there is no 2D grid; each point is represented by three values, X, Y, and Z. Unlike a 2D image, the Z value at a position in the X-Y plane does not encode reflected light energy; the depth of the facial surface is carried in the Z values. The threshold (cutoff) based technique detects occluded faces with an accuracy of 91.79%, while the block-based approach detects them with a success rate of 99.71%. The accuracy of the proposed occlusion detection scheme has been measured as a qualitative parameter based on subjective fidelity criteria.
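The threshold-based idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact formulation: the function name, the fixed depth cutoff, and the occluded-area fraction are all assumptions for the example; in practice the cutoff would be chosen from registered training faces.

```python
import numpy as np

def detect_occlusion_threshold(range_image, depth_cutoff, min_occluded_fraction=0.01):
    """Flag a 2.5D range image as occluded when enough valid pixels lie
    closer to the sensor than a depth cutoff (hypothetical parameters)."""
    valid = range_image > 0                      # zero marks background / missing data
    occluded_mask = valid & (range_image > depth_cutoff)
    fraction = occluded_mask.sum() / max(valid.sum(), 1)
    return fraction > min_occluded_fraction, occluded_mask

# Toy range image: a face surface near depth 100, with an occluding
# object (e.g. a hand) protruding to depth 140 over a small patch.
face = np.full((64, 64), 100.0)
face[10:20, 10:20] = 140.0
is_occluded, mask = detect_occlusion_threshold(face, depth_cutoff=120.0)
```

Here `mask` localizes the occluded patch directly, which is why a per-pixel cutoff supports both the detection and the localization steps.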
By Arindam Kar, Debotosh Bhattacharjee, Dipak Kumar Basu, Mita Nasipuri, Mahantapas Kundu
DOI: https://doi.org/10.5815/ijitcs.2013.09.03, Pub. Date: 8 Aug. 2013
This paper exploits the feature extraction capabilities of the discrete cosine transform (DCT) together with an illumination normalization approach in the logarithm domain, which increases robustness to variations in facial geometry and illumination. Secondly, entropy measures are applied to the DCT coefficients in the same domain so that the maximum-entropy-preserving pixels can be extracted as the feature vector; the informative features of a face are thus extracted in a low-dimensional space. Finally, kernel entropy component analysis (KECA), with an extension of arc-cosine kernels, is applied to the extracted DCT coefficients that contribute most to the entropy estimate, to obtain only those real kernel ECA eigenvectors associated with eigenvalues having a high positive entropy contribution. The resulting system was successfully tested on real image sequences and is robust to significant partial occlusion and illumination changes, as validated by experiments on the FERET, AR, FRAV2D, and ORL face databases. Extensive experimental comparison demonstrates the superiority of the proposed approach in terms of recognition accuracy. Using specificity and sensitivity, the best results are achieved when Renyi entropy is applied to the DCT coefficients. Moreover, the proposed approach is very simple, computationally fast, and can be implemented in any real-time face recognition system.
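The log-domain DCT plus entropy-based coefficient selection can be sketched as follows. This is an illustrative approximation, not the paper's method: the function name, the use of squared coefficients as a probability distribution, and the order-2 Renyi contribution rule are all assumptions made for the example.

```python
import numpy as np
from scipy.fft import dctn

def dct_entropy_features(image, k=64, alpha=2.0):
    """Sketch: illumination normalization in the log domain, 2D DCT, then
    keep the k coefficients contributing most to an order-alpha Renyi
    entropy estimate (hypothetical selection rule)."""
    log_img = np.log1p(image.astype(np.float64))   # logarithm-domain normalization
    coeffs = dctn(log_img, norm='ortho')           # 2D discrete cosine transform
    p = coeffs.ravel() ** 2
    p /= p.sum()                                   # squared coefficients as a distribution
    contrib = p ** alpha                           # per-coefficient Renyi-sum contribution
    top = np.argsort(contrib)[::-1][:k]            # indices of the k largest contributors
    return coeffs.ravel()[top]

rng = np.random.default_rng(0)
face = rng.uniform(0, 255, size=(32, 32))
feat = dct_entropy_features(face, k=64)            # 64-dimensional feature vector
```

The point of the selection step is dimensionality reduction: only the coefficients dominating the entropy estimate are passed on, here yielding a 64-dimensional vector from a 1024-pixel image.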