An EyeTap video-based featureless projective motion estimation assisted by gyroscopic tracking for wearable computer mediated reality

  • Original Article
  • Published in Personal and Ubiquitous Computing

Abstract

In this paper we present a computationally economical method for recovering the projective motion of head-mounted cameras or EyeTap devices, for use in wearable computer-mediated reality. The tracking system combines featureless vision and inertial methods in a closed loop to achieve accurate, robust head tracking with inexpensive sensors. Combining inertial and vision techniques provides both the high-accuracy visual registration needed to fit computer graphics onto real images and robustness to the large interframe camera motion caused by fast head rotations. Running on a 1.2 GHz Pentium III wearable computer with hardware-accelerated graphics, the system registers live video images with less than 2 pixels of error (0.3 degrees) at 12 frames per second. Fast image registration is achieved by offloading computer vision computation onto the graphics hardware, which is readily available on many wearable computer systems. As an application of this tracking approach, we present a system that allows wearable computer users to share views of their current environments, stabilised to another viewer's head position.
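
The closed-loop design the abstract describes starts from a gyro-predicted estimate of interframe motion. As a minimal sketch, and not the authors' VideoOrbits implementation: for a camera that only rotates (a good approximation for head rotation), a gyroscope's angular rates can be integrated over one frame interval into a rotation R, and the induced interframe homography is H = K R K⁻¹, where K is the camera intrinsic matrix. The intrinsics and rates below are assumed, illustrative values, not taken from the paper.

```python
import numpy as np

def rotation_from_gyro(omega, dt):
    """Rotation matrix from body rates omega (rad/s) integrated over dt (s),
    via Rodrigues' formula on the rotation vector theta = omega * dt."""
    theta = np.asarray(omega, dtype=np.float64) * dt
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.eye(3)
    k = theta / angle                       # unit rotation axis
    K_hat = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(angle) * K_hat \
        + (1.0 - np.cos(angle)) * (K_hat @ K_hat)

def gyro_homography(K, omega, dt):
    """Predicted interframe homography for a purely rotating camera: H = K R K^-1."""
    R = rotation_from_gyro(omega, dt)
    return K @ R @ np.linalg.inv(K)

# Assumed intrinsics (focal length and principal point are illustrative)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])

# A fast 30 deg/s yaw at the paper's 12 frames per second
H_pred = gyro_homography(K, omega=[0.0, np.radians(30.0), 0.0], dt=1.0 / 12.0)
```

Seeding the visual stage with H_pred is what gives such a system its robustness: the featureless image registration then only has to correct the small residual between the gyro prediction and the true motion, rather than search across the large displacements that fast head rotations produce.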

Notes

  1. VideoOrbits is available free of charge at www.wearcam.org/orbits/.

  2. OpenCV is available from http://sourceforge.net/projects/opencvlibrary/; a registration sketch using it follows these notes.
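
As a companion to note 2: the sketch below uses OpenCV's ECC maximisation (cv2.findTransformECC), a featureless, direct alignment method that is comparable in spirit to, but not the same algorithm as, the paper's VideoOrbits. It refines a projective warp between consecutive frames and can be seeded with a gyro-predicted homography such as H_pred from the earlier sketch; the video filename is hypothetical.

```python
import cv2
import numpy as np

def refine_homography(prev_gray, curr_gray, H_init=None):
    """Featureless (direct) projective registration of curr_gray against prev_gray."""
    warp = np.eye(3, dtype=np.float32) if H_init is None \
        else H_init.astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    # ECC maximisation over a full homography; gaussFiltSize=5 pre-smooths
    # both images (the seven-argument form matches the OpenCV 4.x API)
    _, warp = cv2.findTransformECC(prev_gray, curr_gray, warp,
                                   cv2.MOTION_HOMOGRAPHY, criteria, None, 5)
    return warp

# Usage: register two consecutive frames and stabilise the newer one
cap = cv2.VideoCapture("eyetap_sequence.avi")      # hypothetical input file
_, prev = cap.read()
_, curr = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
curr_gray = cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY)

H = refine_homography(prev_gray, curr_gray)        # seed with H_pred if available
h, w = prev_gray.shape
# WARP_INVERSE_MAP applies the recovered warp so curr lines up with prev
stabilised = cv2.warpPerspective(curr, H, (w, h),
                                 flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
```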

References

  1. Artin M (1995) Algebra. Prentice-Hall, Englewood Cliffs, NJ

  2. Welch G et al. (2001) High-performance wide-area optical tracking—the hiball tracking system. Pres Teleop Virt Environ 10(1):1-21

    Google Scholar 

  3. Horn B, Schunk B(1981) Determining optical flow. Art Intellig 17:185–203

    Google Scholar 

  4. Mann S Humanisitic intelligence/humanistic computing: "wearcomp" as a new framework for intelligent signal processing. Proceedings of the IEEE 86(11):2123–2151

  5. Mann S (2001) Intelligent image processing. Wiley, New York

  6. Mann S, Fung J (in press) EyeTap devices for augmented, deliberately diminished or otherwise altered visual perception of rigid planar patches of real world scenes. Pres: Teleop Virt Environ

  7. Mann S, Fung J and Mancrieff E (1999) EyeTap technology for wireless electronic news gathering. Mob Comput Commun Rev 3(4):19–26

    Google Scholar 

  8. Satoh K et al. (2001) Townwear: an outdoor wearable MR system with high precision registration. In: Proceedings of the International Symposium for Mixed Reality (ISMR2001), Yokohama, Japan, 14–15 March 2001

  9. Tsai RY, Huang TS (1981) Estimating three-dimensional motion parameters of a rigid planar patch I. ASSP(29):1147–1152

  10. Yokokohji Y, Sugawara T and Yoshikawa T (2000) Accurate image overlay on video see-through hmds using vision and accelerometers. In: Proceedings of IEEE Virtual Reality 2000, New Brunswick, NJ, 18–22 March 2000

  11. You S, Neumann U (2001) Fusion of vision and gyro tracking for rubust augmented reality registration. In: Proceedings of IEEE Virtual Reality 2001, Yokohama, Japan, 13–17 March 2001

  12. You S, Neumann U and Azuma R (1999) Hybrid inertial and vision tracking for augmented reality registration. In: Proceedings of IEEE Virtual Reality 1999, Houston, TX, 13–17 March 1999

Download references

Author information

Correspondence to Chris Aimone.


About this article

Cite this article

Aimone, C., Fung, J. & Mann, S. An EyeTap video-based featureless projective motion estimation assisted by gyroscopic tracking for wearable computer mediated reality. Pers Ubiquit Comput 7, 236–248 (2003). https://doi.org/10.1007/s00779-003-0239-6

