An Algorithm for Detecting the Minimal Sample Frequency for Tracking a Preset Motion Scenario


Author(s)

Dmytro V. Fedasyuk 1,*, Tetyana A. Marusenkova 1

1. Lviv Polytechnic National University/Software Department, Lviv, 79013, Ukraine

* Corresponding author.

DOI: https://doi.org/10.5815/ijisa.2020.04.01

Received: 26 Nov. 2019 / Revised: 29 Dec. 2019 / Accepted: 26 Jan. 2020 / Published: 8 Aug. 2020

Index Terms

Sample frequency, motion scenario, MEMS gyroscope, MEMS inertial sensor, Shoemake’s algorithm

Abstract

Inertial sensors are used for human motion capture in a wide range of applications. Some kinds of human motion can be tracked by the inertial sensors built into smartphones or smartwatches. However, such devices are hardly suitable when misclassification of user activities is highly undesirable. In that case, electronics and embedded software engineers must design, implement and verify their own human motion capture embedded systems, often from scratch. One of the issues these engineers face is the selection of suitable components, primarily accelerometers, gyroscopes and magnetometers, after a thorough examination of commercially available parts. Among the technical characteristics of inertial sensors, the sample frequency determines whether a sensor is able to capture a specific kind of motion. We propose a novel algorithm that allows a researcher or embedded software engineer to calculate the minimal sample frequency sufficient for tracking a prescribed motion scenario without significant signal loss. The algorithm utilizes the Poisson equation describing the rotation of a rigid body, Shoemake's algorithm for interpolating quaternions on the unit hypersphere, and frequency analysis of a discrete-time signal. The proposed algorithm can serve as an argument for accepting or rejecting a gyroscope when selecting hardware components for a human motion tracking system.
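The pipeline outlined in the abstract, keyframed orientations interpolated on the unit hypersphere with Shoemake's SLERP, converted to an angular-rate signal and analysed in the frequency domain, lends itself to a compact sketch. The Python fragment below is only an illustration of that idea, not the authors' implementation: the dense resampling rate, the 99% spectral-energy threshold, the safety margin and all function names are assumptions.

```python
# Illustrative sketch: estimate a lower bound on gyroscope sample frequency
# for a motion scenario given as keyframe orientations. Not the authors'
# implementation; thresholds and names are assumptions.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp


def minimal_sample_frequency(times, quaternions, dense_rate=1000.0,
                             energy_fraction=0.99, margin=1.25):
    """Estimate the minimal sample frequency (Hz) for a preset scenario.

    times       -- 1-D array of keyframe timestamps, seconds
    quaternions -- array of shape (N, 4), scalar-last unit quaternions
    """
    # 1. Interpolate the keyframes on the unit hypersphere (Shoemake's SLERP).
    slerp = Slerp(times, Rotation.from_quat(quaternions))
    t = np.arange(times[0], times[-1], 1.0 / dense_rate)
    dense = slerp(t)

    # 2. Turn the dense orientation track into an angular-rate signal:
    #    finite differences of relative rotations approximate the rate that
    #    the Poisson kinematic equation would integrate.
    delta = (dense[:-1].inv() * dense[1:]).as_rotvec() * dense_rate  # rad/s
    omega = np.linalg.norm(delta, axis=1)

    # 3. Frequency analysis: find the bandwidth that holds the chosen
    #    fraction of the signal's spectral energy.
    spectrum = np.abs(np.fft.rfft(omega - omega.mean())) ** 2
    freqs = np.fft.rfftfreq(omega.size, d=1.0 / dense_rate)
    cumulative = np.cumsum(spectrum) / spectrum.sum()
    idx = min(np.searchsorted(cumulative, energy_fraction), freqs.size - 1)
    bandwidth = freqs[idx]

    # 4. Nyquist criterion plus a safety margin.
    return 2.0 * bandwidth * margin
```

A typical call would be `minimal_sample_frequency(times, quats)` with keyframes exported from a motion-scenario editor; the returned value can then be compared against the output data rates listed in candidate gyroscope datasheets.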

Cite This Paper

Dmytro V. Fedasyuk, Tetyana A. Marusenkova, "An Algorithm for Detecting the Minimal Sample Frequency for Tracking a Preset Motion Scenario", International Journal of Intelligent Systems and Applications (IJISA), Vol. 12, No. 4, pp. 1-12, 2020. DOI: 10.5815/ijisa.2020.04.01

References

[1]M. El-Gohary, "Joint Angle Tracking with Inertial Sensors", Ph.D. dissertation, Portland State University, Portland, USA, 2013. doi: 10.15760/etd.661.
[2]A. Bulling, U. Blanke and B. Schiele, "A tutorial on human activity recognition using body-worn inertial sensors", ACM Computing Surveys, vol. 46, no. 3, pp. 1-33, 2014. doi: 10.1145/2499621.
[3]T. Ngo, Y. Makihara, H. Nagahara, Y. Mukaigawa and Y. Yagi, "Similar gait action recognition using an inertial sensor", Pattern Recognition, vol. 48, no. 4, pp. 1289-1301, 2015. doi: 10.1016/j.patcog.2014.10.012.
[4]C. Chen, R. Jafari and N. Kehtarnavaz, "A survey of depth and inertial sensor fusion for human action recognition", Multimedia Tools and Applications, vol. 76, no. 3, pp. 4405-4425, 2015. doi: 10.1007/s11042-015-3177-1.
[5]A. Filippeschi, N. Schmitz, M. Miezal, G. Bleser, E. Ruffaldi and D. Stricker, "Survey of Motion Tracking Methods Based on Inertial Sensors: A Focus on Upper Limb Human Motion", Sensors, vol. 17, no. 6, p. 1257, 2017. doi: 10.3390/s17061257.
[6]S. Daroogheha, T. Lasky and B. Ravani, "Position Measurement Under Uncertainty Using Magnetic Field Sensing", IEEE Transactions on Magnetics, vol. 54, no. 12, pp. 1-8, 2018. doi: 10.1109/tmag.2018.2873158.
[7]Q. Mourcou, A. Fleury, C. Franco, F. Klopcic and N. Vuillerme, "Performance Evaluation of Smartphone Inertial Sensors Measurement for Range of Motion", Sensors, vol. 15, no. 9, pp. 23168-23187, 2015. doi: 10.3390/s150923168.
[8]A. Mao, X. Ma, Y. He and J. Luo, "Highly Portable, Sensor-Based System for Human Fall Monitoring", Sensors, vol. 17, no. 9, p. 2096, 2017. doi: 10.3390/s17092096.
[9]X. Chen, "Human Motion Analysis with Wearable Inertial Sensors", Ph.D. dissertation, University of Tennessee, Knoxville, USA, 2013.
[10]K. Różanowski, J. Lewandowski and M. Placha, "Determination of the Orientation of an Object within a 3D Space. Implementation and Example Applications", International Journal of Microelectronics and Computer Science, vol. 6, no. 4, pp. 124-129, 2015.
[11]D. Dinu, M. Fayolas, M. Jacquet, E. Leguy, J. Slavinski and N. Houel, "Accuracy of Postural Human-motion Tracking Using Miniature Inertial Sensors", Procedia Engineering, vol. 147, pp. 655-658, 2016. doi: 10.1016/j.proeng.2016.06.266.
[12]D. Zwartjes, T. Heida, J. van Vugt, J. Geelen and P. Veltink, "Ambulatory Monitoring of Activities and Motor Symptoms in Parkinson’s Disease", IEEE Transactions on Biomedical Engineering, vol. 57, no. 11, pp. 2778–2786, 2010.
[13]Y. Li, K. Yao and G. Zweig, "Feedback-based handwriting recognition from inertial sensor data for wearable devices", in 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brisbane, QLD, Australia, 2015. doi: 10.1109/icassp.2015.7178375.
[14]B. Fasel, "Drift reduction for inertial sensor based orientation and position estimation in the presence of high dynamic variability during competitive skiing and daily-life walking", Ph.D. dissertation, EPFL, Lausanne, Switzerland, 2017. doi: 10.5075/epfl-thesis-7803.
[15]B. Fasel, J. Spörri and K. Aminian, "Improving the accuracy of low-cost GNSS by fusion with inertial and magnetic sensors in alpine ski racing", in 34th Conference on Biomechanics in Sports 2016, Tsukuba, Japan, 2016.
[16]B. Fasel, M. Gilgien, J. Spörri and K. Aminian, "A new training assessment method for alpine ski racing: estimating center of mass trajectory by fusing inertial sensors with periodically available position anchor points", Frontiers in Physiology, vol. 9, 2018. doi: 10.3389/fphys.2018.01203.
[17]P. Siirtola, "Recognizing human activities based on wearable inertial measurements. Methods and applications", Ph.D. dissertation, University of Oulu, Oulu, Finland, 2015.
[18]R. Aylward, "Sensemble: A Wireless Inertial Sensor System for Interactive Dance and Collective Motion Analysis", Master of Science thesis, Massachusetts Institute of Technology, Cambridge, USA, 2006.
[19]Z. Tang, M. Sekine, T. Tamura, N. Tanaka, M. Yoshida and W. Chen, "Measurement and Estimation of 3D Orientation using Magnetic and Inertial Sensors", Advanced Biomedical Engineering, vol. 4, pp. 135-143, 2015. doi: 10.14326/abe.4.135.
[20]H. Gjoreski and M. Gams, "Activity/Posture Recognition using Wearable Sensors Placed on Different Body Locations", in The Fourteenth International Conference on Artificial Intelligence and Soft Computing, 2011. doi: 10.2316/p.2011.716-067.
[21]P. Gupta and T. Dallas, "Feature Selection and Activity Recognition System Using a Single Triaxial Accelerometer", IEEE Transactions on Biomedical Engineering, vol. 61, no. 6, pp. 1780-1786, 2014. doi: 10.1109/tbme.2014.2307069.
[22]J. Guiry, P. van de Ven, J. Nelson, L. Warmerdam and H. Riper, "Activity recognition with smartphone support", Medical Engineering & Physics, vol. 36, no. 6, pp. 670-675, 2014. doi: 10.1016/j.medengphy.2014.02.009.
[23]Y. Liang, X. Zhou, Z. Yu and B. Guo, "Energy-Efficient Motion Related Activity Recognition on Mobile Devices for Pervasive Healthcare", Mobile Networks and Applications, vol. 19, no. 3, pp. 303-317, 2013. doi: 10.1007/s11036-013-0448-9.
[24]Z. Yan, V. Subbaraju, D. Chakraborty, A. Misra and K. Aberer, "Energy-Efficient Continuous Activity Recognition on Mobile Phones: An Activity-Adaptive Approach", in 2012 16th International Symposium on Wearable Computers, Newcastle, 2012. doi: 10.1109/iswc.2012.23.
[25]F. Wu, H. Zhao, Y. Zhao and H. Zhong, "Development of a Wearable-Sensor-Based Fall Detection System", International Journal of Telemedicine and Applications, vol. 2015, pp. 1-11, 2015. doi: 10.1155/2015/576364.
[26]H. Gjoreski, M. Gams and M. Luštrek, "Context-based fall detection and activity recognition using inertial and location sensors", Journal of Ambient Intelligence and Smart Environments, vol. 6, no. 4, pp. 419-433, 2014.
[27]F. Destelle, A. Ahmadi, N. O'Connor, K. Moran, A. Chatzitofis, D. Zarpalas, and P. Daras, "Low-cost accurate skeleton tracking based on fusion of Kinect and wearable inertial sensors", in 22nd European Signal Processing Conference (EUSIPCO), Lisbon, Portugal, 2014. doi: 10.5281/zenodo.44122.
[28]M. Gjoreski, H. Gjoreski, M. Luštrek and M. Gams, "How Accurately Can Your Wrist Device Recognize Daily Activities and Detect Falls?", Sensors, vol. 16, no. 6, p. 800, 2016. doi: 10.3390/s16060800.
[29]C. Murphy, "Choosing the Most Suitable MEMS Accelerometer for Your Application—Part 2", Analog Dialogue, 2017.
[30]L. Landry, "Inertial Sensor Characterization for Inertial Navigation and Human Motion Tracking Applications", Master's thesis, Naval Postgraduate School, Monterey, USA, 2012.
[31]M. Kok, J. Hol and T. Schön, "Using Inertial Sensors for Position and Orientation Estimation", Foundations and Trends® in Signal Processing, vol. 11, no. 1-2, pp. 1-153, 2017. doi: 10.1561/2000000094.
[32]D. Fedasyuk, R. Holyaka and T. Marusenkova, "Method of Analyzing Dynamic Characteristics of MEMS Gyroscopes in Test Measurement Mode", in 2019 9th International Conference on Advanced Computer Information Technologies (ACIT), pp. 157–160, 2019. doi: 10.1109/acitt.2019.8780058.
[33]K. Shoemake, "Animating Rotation with Quaternion Curves", ACM SIGGRAPH Computer Graphics, vol. 19, no. 3, pp. 245-254, 1985.
[34]Y. Stets and T. Marusenkova, "The module for prescribing motion scenarios in hardware-software tools for inertial sensors assessment", in Int. Youth Science Forum «LITTERIS ET ARTIBUS», Lviv, Ukraine, Nov. 2019.