Abstract
Intuitive user interfaces such as touch panels and pen displays have recently become widespread in PCs and PDAs. The authors previously developed a bright pupil camera and, based on this camera and a new aspherical model of the eye, an eye-tracking pen display. In this paper, a robust gaze estimation method that uses an integrated-light-source camera is proposed for analyzing embodied interaction, and a prototype eye-tracking pen display is developed. The accuracy of the system was approximately 12 mm on a 15" pen display, which is sufficient for human interaction support.
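As a rough sense of scale, the reported 12 mm on-screen accuracy can be expressed as a visual angle once a viewing distance is assumed. The minimal sketch below uses a hypothetical 500 mm working distance, which is not stated in the paper and is chosen only for illustration.

```python
import math

# Back-of-the-envelope conversion of the reported on-screen accuracy
# (about 12 mm on a 15" pen display) into visual angle.
ERROR_ON_SCREEN_MM = 12.0
ASSUMED_VIEWING_DISTANCE_MM = 500.0  # assumption, not given in the paper

# Angular error = atan(on-screen error / viewing distance)
angular_error_deg = math.degrees(
    math.atan2(ERROR_ON_SCREEN_MM, ASSUMED_VIEWING_DISTANCE_MM)
)
print(f"~{angular_error_deg:.1f} degrees of visual angle")  # ~1.4 degrees
```

Under this assumed distance the error corresponds to roughly 1.4 degrees of visual angle; a longer working distance would yield a proportionally smaller angular error.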
Cite this paper
Yamamoto, M., Sato, H., Yoshida, K., Nagamatsu, T., Watanabe, T. (2011). Development of an Eye-Tracking Pen Display for Analyzing Embodied Interaction. In: Smith, M.J., Salvendy, G. (eds) Human Interface and the Management of Information. Interacting with Information. Human Interface 2011. Lecture Notes in Computer Science, vol 6771. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21793-7_74