
MoViAn: Advancing Human Motion Analysis with 3D Visualization and Annotation

Conference paper. Published in: Proceedings of the International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI 2024).

Abstract

Human motion analysis, including data visualization and annotation, is crucial for understanding human behavior and intentions during various activities, aiding in the development of innovative tools that support independent living. Current wearable sensing technology provides rich 3D spatial movement data but generates multimodal complex datasets that require specialized skills for effective analysis. Despite the need, limited research exists on tools for effective visualization and easy annotation of such complex motion data. MoViAn (Motion Data Visualization and Annotation) is an innovative 3D data analysis system offering enriched visual representations of 3D human motion data (e.g., gaze, hand movements), along with an interactive user interface for data annotation. This paper presents a user study with 11 participants and demonstrates that MoViAn facilitates data annotation of human motion effectively through both quantitative and qualitative assessments. Specifically, gaze visualization improves annotation accuracy by 18.23%, while hand-tracing visualization increases annotation speed by 28.05% compared to a controlled setting. These findings highlight MoViAn’s potential for handling multimodal human motion data, paving the way for developing innovative tools that provide real-time monitoring and assessment of physical activities to support independent living.



Acknowledgements

The human subject research was approved by the Institutional Review Board at Chapman University with IRB number: IRB-22-250.


Corresponding author

Correspondence to Trudi Di Qi.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Qi, T.D., Browen, I., Zhang, D., Camarillo-Abad, H.M., Cibrian, F.L. (2024). MoViAn: Advancing Human Motion Analysis with 3D Visualization and Annotation. In: Bravo, J., Nugent, C., Cleland, I. (eds) Proceedings of the International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI 2024). UCAmI 2024. Lecture Notes in Networks and Systems, vol 1212. Springer, Cham. https://doi.org/10.1007/978-3-031-77571-0_2

