Abstract
Human motion analysis, including data visualization and annotation, is crucial for understanding human behavior and intentions during various activities, aiding in the development of innovative tools that support independent living. Current wearable sensing technology provides rich 3D spatial movement data, but the resulting multimodal datasets are complex and require specialized skills to analyze effectively. Despite this need, limited research exists on tools for effective visualization and easy annotation of such complex motion data. MoViAn (Motion Data Visualization and Annotation) is an innovative 3D data analysis system offering enriched visual representations of 3D human motion data (e.g., gaze, hand movements), along with an interactive user interface for data annotation. This paper presents a user study with 11 participants and demonstrates, through both quantitative and qualitative assessments, that MoViAn effectively facilitates annotation of human motion data. Specifically, gaze visualization improved annotation accuracy by 18.23%, while hand-tracing visualization increased annotation speed by 28.05% compared with a control condition. These findings highlight MoViAn's potential for handling multimodal human motion data, paving the way for developing innovative tools that provide real-time monitoring and assessment of physical activities to support independent living.
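To make the kind of data described above concrete, the following minimal Python sketch pairs per-frame gaze and hand samples with labelled time spans of the sort an annotator would produce; the class and field names (MotionFrame, Annotation, gaze_direction, hand_position) are illustrative assumptions, not MoViAn's actual data model.

from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical per-frame sample of multimodal motion data; the field names
# are illustrative assumptions, not the format used by MoViAn.
@dataclass
class MotionFrame:
    timestamp: float                              # seconds since recording start
    gaze_direction: Tuple[float, float, float]    # unit gaze vector in world space
    hand_position: Tuple[float, float, float]     # hand position in metres

# An annotation labels a time span of the recording, e.g. "reach" or "grasp".
@dataclass
class Annotation:
    label: str
    start_time: float
    end_time: float

def frames_in_annotation(frames: List[MotionFrame], ann: Annotation) -> List[MotionFrame]:
    """Return the recorded frames covered by an annotated segment."""
    return [f for f in frames if ann.start_time <= f.timestamp <= ann.end_time]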
Acknowledgements
The human subject research was approved by the Institutional Review Board at Chapman University (IRB number: IRB-22-250).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Qi, T.D., Browen, I., Zhang, D., Camarillo-Abad, H.M., Cibrian, F.L. (2024). MoViAn: Advancing Human Motion Analysis with 3D Visualization and Annotation. In: Bravo, J., Nugent, C., Cleland, I. (eds) Proceedings of the International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI 2024). UCAmI 2024. Lecture Notes in Networks and Systems, vol 1212. Springer, Cham. https://doi.org/10.1007/978-3-031-77571-0_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-77570-3
Online ISBN: 978-3-031-77571-0
eBook Packages: Intelligent Technologies and Robotics (R0)