Single and Multiple Hand Gesture Recognition Systems: A Comparative Analysis


Author(s)

Siddharth Rautaray 1,*, Manjusha Pandey 1

1. School of Computer Engineering, KIIT University, Odisha, India

* Corresponding author.

DOI: https://doi.org/10.5815/ijisa.2014.11.08

Received: 17 Jan. 2014 / Revised: 10 Apr. 2014 / Accepted: 25 Jun. 2014 / Published: 8 Oct. 2014

Index Terms

Real Time, Gesture Recognition, Human Computer Interaction, SHGRS, MHGRS

Abstract

With the evolution of higher computing speeds, efficient communication technologies, and advanced display techniques, legacy HCI techniques have become obsolete and no longer support the accurate and fast flow of information required by present-day computing devices. User-friendly, real-time human-machine interfaces therefore have to be designed and developed to make human-computer interaction more intuitive. Vision-based hand gesture recognition affords users the ability to interact with computers in more natural and intuitive ways. Such gesture recognition systems generally consist of three main modules, namely hand segmentation, hand tracking, and gesture recognition from hand features, designed using different image processing techniques and then integrated with different applications. New interfaces based on hand gesture recognition are increasingly being used for interaction with computing devices. This paper provides a comparative analysis of real-time vision-based hand gesture recognition systems that support interaction using single and multiple hand gestures. Single hand gesture recognition systems (SHGRS) are less complex to implement, but they are constrained in the number of distinct gestures they can support, whereas multiple hand gesture recognition systems (MHGRS) allow a much larger gesture vocabulary through the various permutations and combinations possible with multiple hands. A thorough comparative analysis has also been carried out on various other vital parameters of the recognition systems.
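The three-module pipeline described above (hand segmentation, hand tracking, gesture recognition) can be illustrated with a minimal sketch using OpenCV in Python. The HSV skin-colour thresholds, the largest-contour tracking rule, and the convexity-defect finger counting below are illustrative assumptions, not the specific techniques evaluated in the paper.

# Minimal sketch of a single-hand, vision-based gesture recognition pipeline.
# The thresholds and the finger-counting heuristic are assumptions for illustration.
import cv2
import numpy as np

def segment_hand(frame):
    """Module 1: segment the hand with an HSV skin-colour threshold (assumed range)."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 30, 60], np.uint8)
    upper = np.array([20, 150, 255], np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

def track_hand(mask):
    """Module 2: track the hand as the largest skin-coloured contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def recognise_gesture(contour):
    """Module 3: classify a gesture by counting deep convexity defects (rough finger count)."""
    hull = cv2.convexHull(contour, returnPoints=False)
    if hull is None or len(hull) < 4:
        return "unknown"
    defects = cv2.convexityDefects(contour, hull)
    fingers = 0 if defects is None else sum(1 for d in defects if d[0][3] > 10000)
    return f"{fingers + 1}-finger gesture"

cap = cv2.VideoCapture(0)  # assumed: default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    contour = track_hand(segment_hand(frame))
    if contour is not None and cv2.contourArea(contour) > 3000:
        cv2.putText(frame, recognise_gesture(contour), (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("gesture", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()

An MHGRS variant would keep several contours instead of only the largest one and classify the combined configuration, which is what enables the larger gesture vocabulary discussed in the comparison.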

Cite This Paper

Siddharth Rautaray, Manjusha Pandey, "Single and Multiple Hand Gesture Recognition Systems: A Comparative Analysis", International Journal of Intelligent Systems and Applications (IJISA), vol. 6, no. 11, pp. 57-65, 2014. DOI: 10.5815/ijisa.2014.11.08
