Feature Tracking and Synchronous Scene Generation with a Single Camera

Author(s)

Zheng Chai 1,*, Takafumi Matsumaru 1

1. Graduate School of Information, Production and Systems, WASEDA University, Japan

* Corresponding author.

DOI: https://doi.org/10.5815/ijigsp.2016.06.01

Received: 10 Feb. 2016 / Revised: 25 Mar. 2016 / Accepted: 5 May 2016 / Published: 8 Jun. 2016

Index Terms

Tracking, Synchronous map, Camera pose update, Parallel, Tracking recovery

Abstract

This paper presents a method of tracking feature points to update the camera pose and generate a synchronous map for an AR (Augmented Reality) system. First, we use the ORB (Oriented FAST and Rotated BRIEF) [1] algorithm to detect feature points that carry depth information and serve as markers, and we track four of them with the LK (Lucas-Kanade) optical flow [2] algorithm. We then compute the rotation and translation of the moving camera from the relationship matrix between the 2D image coordinates and the 3D world coordinates, and update the camera pose accordingly. Finally, we generate the map and draw AR objects on it. If the tracked feature points are lost, we recover tracking by computing the same world coordinates as before the loss, using new corresponding 2D/3D feature points and the camera pose at that time. This study offers three novelties: an improved ORB detection that obtains depth information, a rapid camera-pose update, and tracking recovery. Following PTAM (Parallel Tracking and Mapping) [3], we divide the process into two parallel sub-processes: one thread detects and tracks the feature points (recovering them when necessary) and updates the camera pose, while the other thread generates the map and draws the AR objects. This parallel design saves time and allows the AR system to run in real time.
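
As a concrete illustration of the tracking thread described above, the following is a minimal sketch in Python using OpenCV: ORB detection of marker points, pyramidal LK optical flow tracking of four of them, and a camera-pose update from the 2D/3D correspondences via a perspective-n-point solver. The intrinsic matrix K, the distortion coefficients, and the world coordinates in world_pts are placeholder assumptions (the paper obtains the intrinsics from calibration and the 3D coordinates from depth information); the parallel map-generation thread and the full recovery procedure are omitted.

# Sketch of the detect-track-update thread, assuming OpenCV.
# K, dist, and world_pts are illustrative placeholders, not the
# paper's calibration or depth data.
import cv2
import numpy as np

K = np.array([[525.0,   0.0, 320.0],   # assumed camera intrinsics
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)                     # assume negligible lens distortion

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect ORB keypoints and keep the four strongest as markers.
orb = cv2.ORB_create()
kps = sorted(orb.detect(prev_gray, None), key=lambda k: -k.response)[:4]
pts = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)

# Known 3D world coordinates of the four markers (placeholder values).
world_pts = np.float32([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Track the four markers with pyramidal LK optical flow.
    pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    if status.sum() < 4:
        # A lost point would trigger the paper's recovery step:
        # re-detect ORB features and re-associate them with the stored
        # 3D world coordinates (omitted in this sketch).
        break

    # Update the camera pose from the 2D/3D correspondences.
    success, rvec, tvec = cv2.solvePnP(world_pts, pts.reshape(-1, 2),
                                       K, dist, flags=cv2.SOLVEPNP_P3P)
    R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation; tvec is the translation
    prev_gray = gray

The P3P solver is used here only because it takes exactly four point correspondences, matching the four tracked markers; the paper's own pose-update computation via the 2D/3D relationship matrix may differ in detail.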

Cite This Paper

Zheng Chai, Takafumi Matsumaru, "Feature Tracking and Synchronous Scene Generation with a Single Camera", International Journal of Image, Graphics and Signal Processing (IJIGSP), Vol.8, No.6, pp.1-12, 2016. DOI: 10.5815/ijigsp.2016.06.01

References

[1] Rublee, Ethan, et al. "ORB: An efficient alternative to SIFT or SURF." Computer Vision (ICCV), 2011 IEEE International Conference on. IEEE, 2011.

[2] Bouguet, Jean-Yves. "Pyramidal implementation of the affine Lucas-Kanade feature tracker: Description of the algorithm." Intel Corporation 5 (2001): 1-10.

[3] Klein, Georg, and David Murray. "Parallel tracking and mapping for small AR workspaces." Mixed and Augmented Reality, 2007. ISMAR 2007. 6th IEEE and ACM International Symposium on. IEEE, 2007.

[4] Muja, Marius, and David G. Lowe. "Fast approximate nearest neighbors with automatic algorithm configuration." International Conference on Computer Vision Theory and Applications (VISAPP'09), 2009.

[5] Zhang, Zhengyou. "Flexible camera calibration by viewing a plane from unknown orientations." Computer Vision, 1999. The Proceedings of the Seventh IEEE International Conference on. Vol. 1. IEEE, 1999.

[6] Zhang, Zhengyou. "A flexible new technique for camera calibration." Pattern Analysis and Machine Intelligence, IEEE Transactions on 22.11 (2000): 1330-1334.

[7] Davison, Andrew J., et al. "MonoSLAM: Real-time single camera SLAM." Pattern Analysis and Machine Intelligence, IEEE Transactions on 29.6 (2007): 1052-1067.

[8] Davison, Andrew J., Walterio W. Mayol, and David W. Murray. "Real-time localization and mapping with wearable active vision." Mixed and Augmented Reality, 2003. Proceedings. The Second IEEE and ACM International Symposium on. IEEE, 2003.

[9] Bailey, Tim, et al. "Consistency of the EKF-SLAM algorithm." Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on. IEEE, 2006.

[10] Chekhlov, Denis, et al. "Real-time and robust monocular SLAM using predictive multi-resolution descriptors." Advances in Visual Computing. Springer Berlin Heidelberg, 2006. 276-285.

[11] Davison, Andrew J. "Real-time simultaneous localisation and mapping with a single camera." Computer Vision, 2003. Proceedings. Ninth IEEE International Conference on. IEEE, 2003.

[12] Rao, G. Mallikarjuna, and Ch. Satyanarayana. "Visual object target tracking using particle filter: A survey." International Journal of Image, Graphics and Signal Processing 5.6 (2013): 1250.

[13] Lowe, David G. "Object recognition from local scale-invariant features." Proc. Seventh Int'l Conf. Computer Vision, 1999. 1150-1157.

[14] Rosten, Edward, and Tom Drummond. "Fusing points and lines for high performance tracking." Computer Vision, 2005. ICCV 2005. Tenth IEEE International Conference on. Vol. 2. IEEE, 2005.

[15] Rosten, Edward, and Tom Drummond. "Machine learning for high-speed corner detection." Computer Vision–ECCV 2006. Springer Berlin Heidelberg, 2006. 430-443.

[16] Castle, Robert O., Georg Klein, and David W. Murray. "Wide-area augmented reality using camera tracking and mapping in multiple regions." Computer Vision and Image Understanding 115.6 (2011): 854-867.

[17] Castle, Robert O., and David W. Murray. "Keyframe-based recognition and localization during video-rate parallel tracking and mapping." Image and Vision Computing 29.8 (2011): 524-532.

[18] Harris, Chris, and Mike Stephens. "A combined corner and edge detector." Alvey Vision Conference. Vol. 15. 1988. 50.

[19] Bay, Herbert, Tinne Tuytelaars, and Luc Van Gool. "SURF: Speeded up robust features." Computer Vision–ECCV 2006. Springer Berlin Heidelberg, 2006. 404-417.

[20] Calonder, Michael, Vincent Lepetit, Christoph Strecha, and Pascal Fua. "BRIEF: Binary robust independent elementary features." European Conference on Computer Vision, 2010.

[21] Rosin, Paul L. "Measuring corner properties." Computer Vision and Image Understanding 73.2 (1999): 291-307.

[22] Lucas, Bruce D., and Takeo Kanade. "An iterative image registration technique with an application to stereo vision." International Joint Conference on Artificial Intelligence, 1981. 674-679.

[23] Horn, Berthold K. P., and Brian G. Schunck. "Determining optical flow." 1981 Technical Symposium East. International Society for Optics and Photonics, 1981. 319-331.

[24] Brandt, J. W. "Improved accuracy in gradient-based optical flow estimation." International Journal of Computer Vision 25.1 (1997): 5-22.

[25] Cramer, Gabriel. Introduction à l'Analyse des lignes Courbes algébriques (in French). Geneva, 1750. 656-659.