Eye Tracking: Biometric Evaluations of Instructional Materials for Improved Learning
References:
[1] Boucheix, J. M., & Lowe, R. K. (2010). An eye tracking comparison of external pointing cues and internal continuous cues in learning with complex animations. Learning and Instruction, 20, 123-135.
[2] Canham, M., & Hegarty, M. (2010). Effects of knowledge and display design on comprehension of complex graphics. Learning and Instruction, 20, 155-166.
[3] De Koning, B. B., Tabbers, H. K., Rikers, R. M. J. P., & Paas, F. (2010). Attention guidance in learning from a complex animation: Seeing is understanding? Learning and Instruction, 20, 111-122.
[4] Duchowski, A. T. (2003). Eye tracking methodology: Theory and practice. London: Springer.
[5] Goldberg, J. H., & Wichansky, A. M. (2003). Eye tracking in usability evaluation: A practitioner's guide. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The mind's eye: Cognitive and applied aspects of eye movement research (pp. 493-516). Amsterdam: North-Holland.
[6] iMotions (2015). 7 most used eye tracking metrics and terms. Retrieved September 25, 2017, from https://imotions.com/blog/7-terms-metrics-eye-tracking/
[7] iMotions (2018). 7 ways to measure human behavior. Retrieved November 26, 2018, from https://imotions.com/blog/sensor-chart/
[8] iMotions (2018). Eye tracking: The complete pocket guide. Retrieved December 2, 2017, from https://imotions.com/blog/eye-tracking/
[9] Jacob, R. J. K. (1995). Eye tracking in advanced interface design. In W. Barfield & T. A. Furness (Eds.), Virtual environments and advanced interface design (pp. 258-288). New York, NY: Oxford University Press.
[10] Jacob, R. J. K. (1990). What you look at is what you get: Eye movement-based interaction techniques. Washington, DC: Human-Computer Interaction Lab, Naval Research Laboratory.
[11] Jacob, R. J. K., & Karn, K. S. (2003). Eye tracking in human-computer interaction and usability research: Ready to deliver the promises. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The mind's eye: Cognitive and applied aspects of eye movement research. Amsterdam: North-Holland.
[12] Jarodzka, H., Scheiter, K., Gerjets, P., & Van Gog, T. (2010). In the eyes of the beholder: How experts and novices interpret dynamic stimuli. Learning and Instruction, 20, 146-154.
[13] Majaranta, P., & Bulling, A. (2014). Eye tracking and eye-based human-computer interaction. In Advances in physiological computing (pp. 39-65). London: Springer.
[14] Meyer, K., Rasch, T., & Schnotz, W. (2010). Effects of animation's speed of presentation on perceptual processing and learning. Learning and Instruction, 20, 136-145.
[15] Schmidt-Weigand, F., Kohnert, A., & Glowalla, U. (2010). A closer look at split visual attention in system- and self-paced instruction in multimedia learning. Learning and Instruction, 20, 100-110.
[16] Sibert, L. E., & Jacob, R. J. K. (2000). Evaluation of eye gaze interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM.
[17] Sinclair, B. (2017). IoT Inc.: How your company can use the Internet of Things to win in the outcome economy. New York, NY: McGraw-Hill Education.
[18] Stankovic, J. A. (2014). Research directions for the Internet of Things. IEEE. Retrieved March 11, 2018, from https://www.cs.virginia.edu/~stankovic/psfiles/IOT.pdf