Abstract
In this paper we present a computationally economical method of recovering the projective motion of head-mounted cameras or EyeTap devices for use in wearable computer-mediated reality. The tracking system combines featureless vision and inertial methods in a closed-loop system to achieve accurate, robust head tracking with inexpensive sensors. The combination of inertial and vision techniques provides both the high-accuracy visual registration needed to fit computer graphics onto real images and robustness to the large interframe camera motion caused by fast head rotations. Running on a 1.2 GHz Pentium III wearable computer with hardware-accelerated graphics, the system registers live video images with less than 2 pixels of error (0.3 degrees) at 12 frames per second. Fast image registration is achieved by offloading computer vision computation onto the graphics hardware, which is readily available on many wearable computer systems. As an application of this tracking approach, we present a system that allows wearable computer users to share views of their current environments, stabilised to another viewer's head position.
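The abstract's key geometric fact is that, for a purely rotating camera, interframe image motion is a projective coordinate transformation that does not depend on scene depth, so a gyroscope reading can predict the warp and seed the featureless visual alignment. The following is a minimal sketch of that prediction step, not the authors' implementation; the intrinsic matrix values, angular rate, and frame interval are illustrative assumptions.

```python
import numpy as np

def rotation_homography(K, omega, dt):
    """Predict the interframe homography induced by a pure camera rotation.

    For a rotating camera the image motion is H = K R K^-1, independent
    of scene depth, so an inertial (gyro) reading can supply an initial
    estimate for the visual registration step.

    K     -- 3x3 camera intrinsic matrix (pixels)
    omega -- angular velocity (rad/s) about the x, y, z camera axes
    dt    -- frame interval (s)
    """
    theta = np.asarray(omega, dtype=float) * dt   # integrated rotation vector
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        R = np.eye(3)
    else:
        # Rodrigues' formula: R = I + sin(a) [k]x + (1 - cos(a)) [k]x^2
        kx, ky, kz = theta / angle
        S = np.array([[0.0, -kz, ky],
                      [kz, 0.0, -kx],
                      [-ky, kx, 0.0]])
        R = np.eye(3) + np.sin(angle) * S + (1.0 - np.cos(angle)) * (S @ S)
    H = K @ R @ np.linalg.inv(K)
    return H / H[2, 2]   # normalise so H[2,2] = 1

# Illustrative 640x480 camera with a 500-pixel focal length,
# yawing at 0.1 rad/s, captured at 12 frames per second.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
H = rotation_homography(K, omega=(0.0, 0.1, 0.0), dt=1.0 / 12.0)

# Where does the image centre move between frames?
p = H @ np.array([320.0, 240.0, 1.0])
p /= p[2]
```

In a closed-loop system of the kind described, this predicted warp bounds the search so the featureless (direct, pixel-based) registration only has to refine a small residual, which is what makes the combination robust to fast head rotations.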
Cite this article
Aimone, C., Fung, J. & Mann, S. An EyeTap video-based featureless projective motion estimation assisted by gyroscopic tracking for wearable computer mediated reality. Pers Ubiquit Comput 7, 236–248 (2003). https://doi.org/10.1007/s00779-003-0239-6