Open Science Index
Human Motion Capture: New Innovations in the Field of Computer Vision
Authors:
Abstract:
Human motion capture has become one of the major areas of interest in the field of computer vision. Rapidly evolving application areas include advanced human-computer interfaces, virtual reality, and security/surveillance systems. This study provides a brief overview of the techniques and applications of markerless human motion capture, which analyzes human motion in the form of mathematical formulations. The major contribution of this research is a taxonomy that classifies computer-vision-based human motion capture techniques into four systematically different categories: tracking, initialization, pose estimation, and recognition. Detailed descriptions, and the relationships among them, are given for the tracking and pose estimation techniques, and the subcategories of each process are further described. The various hypotheses used by researchers in this domain are surveyed, and the evolution of these techniques is explained. The survey concludes that most researchers have focused on using mathematical body models for markerless motion capture.
Digital Object Identifier (DOI):
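The four-stage decomposition named in the abstract (initialization, tracking, pose estimation, recognition) can be sketched end-to-end as a toy markerless pipeline on synthetic frames. This is a minimal illustration only: the function names, the background-subtraction tracker, and the synthetic "person" blob are assumptions for demonstration, not any surveyed system.

```python
import numpy as np

def make_frames(n=10, size=32, blob=4):
    # Synthetic "video": a bright square (a stand-in for a person) moving right.
    frames = []
    for t in range(n):
        f = np.zeros((size, size))
        x = 2 + 2 * t
        f[14:14 + blob, x:x + blob] = 1.0
        frames.append(f)
    return frames

def initialize(frames):
    # Initialization: build a background model (here, an empty static scene).
    return np.zeros_like(frames[0])

def track(frames, background, thresh=0.5):
    # Tracking: segment the foreground by background subtraction,
    # then follow the centroid of the foreground mask across frames.
    centroids = []
    for f in frames:
        ys, xs = np.nonzero(np.abs(f - background) > thresh)
        centroids.append((ys.mean(), xs.mean()))
    return centroids

def estimate_pose(frame, background, thresh=0.5):
    # Pose estimation (toy): fit a bounding box to the segmented body.
    ys, xs = np.nonzero(np.abs(frame - background) > thresh)
    return (ys.min(), xs.min(), ys.max(), xs.max())

def recognize(centroids):
    # Recognition (toy): classify the motion from net centroid displacement.
    dx = centroids[-1][1] - centroids[0][1]
    return "moving right" if dx > 0 else "moving left or static"

frames = make_frames()
bg = initialize(frames)
cents = track(frames, bg)
print(recognize(cents))  # prints: moving right
```

Real systems replace each toy stage with the techniques the survey classifies, e.g. articulated body models for pose estimation and particle filters for tracking, but the data flow between the four stages is the same.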

References:

[1] Gall, J., Rosenhahn, B., Brox, T., & Seidel, H. P. (2010). Optimization and filtering for human motion capture. International Journal of Computer Vision, 87(1-2), 75-92.
[2] Pons-Moll, G., Baak, A., Helten, T., Muller, M., Seidel, H. P., & Rosenhahn, B. (2010). Multisensor-fusion for 3d full-body human motion capture. In Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on (pp. 663-670). IEEE.
[3] Swaisaenyakorn, S., Kelly, S. W., Young, P. R., & Batchelor, J. C. (2012). Evaluation of 3D animated human model from 3D scanner and motion capture to be used in electromagnetic simulator for body-centric system. In Biomedical Engineering and Informatics (BMEI), 5th International Conference on (pp. 632-636). IEEE.
[4] Field, M., Pan, Z., Stirling, D., & Naghdy, F. (2011). Human motion capture sensors and analysis in robotics. Industrial Robot: An International Journal, 38(2), 163-171.
[5] Sigal, L., Balan, A. O., & Black, M. J. (2010). Humaneva: Synchronized video and motion capture dataset and baseline algorithm for evaluation of articulated human motion. International journal of computer vision, 87(1-2), 4-27.
[6] Keskin, C., Kıraç, F., Kara, Y. E., & Akarun, L. (2013). Real time hand pose estimation using depth sensors. In Consumer Depth Cameras for Computer Vision (pp. 119-137). Springer London.
[7] Johansson, G. (1973). Visual perception of biological motion and a model for its analysis. Perception & Psychophysics, 14(2), 201-211.
[8] Leung, M. K., & Yang, Y. H. (1995). First Sight: A human body outline labeling system. IEEE Transactions on Pattern Analysis and Machine Intelligence, 17(4), 359-377.
[9] Sapp, B., Weiss, D., & Taskar, B. (2011). Parsing human motion with stretchable models. In Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on. IEEE.
[10] Shao, L., Ji, L., Liu, Y., & Zhang, J. (2012). Human action segmentation and recognition via motion and shape analysis. Pattern Recognition Letters, 33(4), 438-445.
[11] Kuch, J. J., & Huang, T. S. (1995). Vision based hand modeling and tracking for virtual teleconferencing and telecollaboration. In International Conference on Computer Vision (ICCV), 1995. IEEE.
[12] Davis, J. W., & Bobick, A. F. The representation and recognition of action using temporal templates. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[13] Jiang, Z., Lin, Z., & Davis, L. S. (2012). Recognizing human actions by learning and matching shape-motion prototype trees. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 34(3), 533-547.
[14] Niebles, J. C., Han, B., & Fei-Fei, L. (2010, June). Efficient extraction of human motion volumes by tracking. In Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on (pp. 655-662). IEEE.
[15] Li, R., Tian, T. P., Sclaroff, S., & Yang, M. H. (2010). 3d human motion tracking with a coordinated mixture of factor analyzers. International Journal of Computer Vision, 87(1-2), 170-190.
[16] Vondrak, M., Sigal, L., & Jenkins, O. C. (2013). Dynamical simulation priors for human motion tracking. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 35(1), 52-65.
[17] Stone, E. E., & Skubic, M. (2011, May). Evaluation of an inexpensive depth camera for passive in-home fall risk assessment. In Pervasive Computing Technologies for Healthcare (PervasiveHealth), 2011 5th International Conference on (pp. 71-77). IEEE.
[18] Aggarwal, J. K., & Ryoo, M. S. (2011). Human activity analysis: A review. ACM Computing Surveys (CSUR), 43(3), 16.
[19] Raskin, L., Rudzsky, M., & Rivlin, E. (2011). Dimensionality reduction using a Gaussian Process Annealed Particle Filter for tracking and classification of articulated body motions. Computer Vision and Image Understanding, 115(4), 503-519.