Abstract
The eye-tracking heatmap is a quantitative research tool that visualizes a user's gaze points. Most eye-tracking heatmaps are 2D visualizations built from colored cells: a cell's color indicates gaze duration, and its position indicates gaze position. Eye-tracking heatmaps have often been used to evaluate the usability of web interfaces and to understand user behavior. For example, web designers have used heatmaps to obtain concrete evidence of how users actually use their websites, and collecting eye-tracking heatmap data during website viewing makes it possible to measure improvements in site usability. However, although the eye-tracking heatmap provides rich information about how users view, focus on, and interact with a site, this wealth of information substantially increases the computational burden of analysis. In many cases, the distribution of gaze points in an eye-tracking heatmap is not easily understood or interpreted, so manual evaluation of heatmaps is inefficient. This study aimed to evaluate web usability by focusing on signifiers as an interface element, using eye-tracking heatmaps and machine learning algorithms. We also applied a dimensionality reduction technique, principal component analysis (PCA), to reduce the complexity of the heatmap data. The results showed that the proposed classification model, which combines a decision tree with PCA, achieved more than 90% accuracy, outperforming nine other classical machine learning methods. This finding indicates that the machine learning process can reach the correct decision about an interface's usability.
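To make the pipeline described above concrete, the following is a minimal Python sketch, not the authors' implementation: it builds heatmap feature vectors from (x, y, duration) fixations, compresses them with PCA, and classifies strong versus weak signifier interfaces with a decision tree. The grid resolution, number of retained components, fold count, and synthetic data are all illustrative assumptions.

```python
# Minimal sketch of a PCA + decision tree pipeline over gaze heatmaps.
# All sizes, parameters, and data are assumptions, not the paper's setup.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

GRID = (48, 64)  # assumed heatmap resolution (rows, cols)
rng = np.random.default_rng(0)

def heatmap_vector(fixations, grid=GRID):
    """Accumulate gaze durations into a 2D grid and flatten it.

    fixations: array of shape (n, 3) holding normalized x, y in [0, 1]
    and a duration weight; a rendered heatmap's cell color would
    correspond to the accumulated duration stored in each grid cell.
    """
    hist, _, _ = np.histogram2d(
        fixations[:, 1], fixations[:, 0],  # y -> rows, x -> cols
        bins=grid, range=[[0, 1], [0, 1]],
        weights=fixations[:, 2],
    )
    return hist.ravel()

# Synthetic stand-in data: 100 viewing sessions, 50 fixations each.
X = np.array([heatmap_vector(rng.random((50, 3))) for _ in range(100)])
y = rng.integers(0, 2, size=100)  # 1 = strong signifier, 0 = weak

# PCA compresses the 3072-dimensional heatmap vectors before the
# decision tree makes the usability decision.
model = make_pipeline(PCA(n_components=20),
                      DecisionTreeClassifier(random_state=0))

# k-fold cross-validation estimates classification accuracy.
print(cross_val_score(model, X, y, cv=5, scoring="accuracy").mean())
```

On real data, the labels would come from the known signifier condition of each recorded session, and a mean cross-validated accuracy above 90% would correspond to the result reported in the abstract.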
Acknowledgment
This work was supported by JSPS KAKENHI Grant Number JP20K12511.
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Koonsanit, K., Tsunajima, T., Nishiuchi, N. (2021). Evaluation of Strong and Weak Signifiers in a Web Interface Using Eye-Tracking Heatmaps and Machine Learning. In: Saeed, K., Dvorský, J. (eds.) Computer Information Systems and Industrial Management. CISIM 2021. Lecture Notes in Computer Science, vol. 12883. Springer, Cham. https://doi.org/10.1007/978-3-030-84340-3_16
DOI: https://doi.org/10.1007/978-3-030-84340-3_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-84339-7
Online ISBN: 978-3-030-84340-3