Research Publications by Category

Here you will find a list of research publications that utilize Gazepoint eye-tracking technology, grouped by area of research focus.

If you have published work and would like it listed here, please let us know.

If you would also like to utilize Gazepoint eye-tracking technology for your research, please contact us today to get started.


  1. Human-Computer Interaction (HCI) & Eye-Tracking Applications

Fu, B., & Chow, N. (2025).
AdaptLIL: A Real-Time Adaptive Linked Indented List Visualization for Ontology Mapping.

Dondi, P., Sapuppo, S., & Porta, M. (2024).
Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed.

Emami, P., Jiang, Y., Guo, Z., & Leiva, L. A. (2024).
Impact of Design Decisions in Scanpath Modeling.

Tedla, S. K., MacKenzie, S., & Brown, M. (2024).
LookToFocus: Image Focus via Eye Tracking.

Fu, B., Soriano, A. R., Chu, K., Gatsby, P., & Guardado, N. (2024).
Modelling Visual Attention for Future Intelligent Flight Deck.

Cho, S. M., Taylor, R. H., & Unberath, M. (2024).
Misjudging the Machine: Gaze May Forecast Human-Machine Team Performance in Surgery.

Palacios-Ibáñez, A., Castellet-Lathan, S., & Contero, M. (2024).
Exploring the user’s gaze during product evaluation.

Kurek, K., Skublewska-Paszkowska, M., & Powroznik, P. (2024).
The impact of applying universal design principles on the usability of online accommodation booking websites.

Acharya, S. (2024).
Dynamic Eye-Tracking on Large Screens: A 3D Printed Moving Guide Rail Platform.

Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019).
Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement. DOI

Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019).
A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery. DOI

Coba, L., Rook, L., Zanker, M., & Symeonidis, P. (2019).
Decision Making Strategies Differ in the Presence of Collaborative Explanations: Two Conjoint Studies. Proceedings of the 24th International Conference on Intelligent User Interfaces. DOI

  2. Cognitive Psychology, Attention, and Motivation

Murphy, T. I., Abel, L. A., Armitage, J. A., & Douglass, A. G. (2024).
Effects of tracker location on the accuracy and precision of the Gazepoint GP3 HD for spectacle wearers.

Silva, F., Garrido, M. I., & Soares, S. C. (2024).
The effect of anxiety and its interplay with social cues when perceiving aggressive behaviours.

Gabor-Siatkowska, K., Stefaniak, I., & Janicki, A. (2024).
Eye tracking data cleansing for dialogue agent.

Pah, N. D., Ngo, Q. C., McConnell, N., Polus, B., Kempster, P., Bhattacharya, A., Raghav, S., & Kumar, D. K. (2024).
Reflexive eye saccadic parameters in Parkinson’s disease.

Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019).
Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158. DOI

Villamor, M. M., & Rodrigo, Ma. M. T. (2019).
Gaze collaboration patterns of successful and unsuccessful programming pairs using cross-recurrence quantification analysis. Research and Practice in Technology Enhanced Learning, 14(1), 25. DOI

Pavisian, B., Patel, V. P., & Feinstein, A. (2019).
Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340. DOI

  3. Medical and Health Technology

Yin, R., & Neyens, D. M. (2024).
Examining how information presentation methods and a chatbot impact the use of electronic health record patient portals.

Moradizeyveh, S., Tabassum, M., Liu, S., Newport, R. A., Beheshti, A., & Ieva, A. D. (2024).
When Eye-Tracking Meets Machine Learning: A Systematic Review on Applications in Medical Image Analysis.

Cho, S. M., Taylor, R. H., & Unberath, M. (2024).
Misjudging the Machine: Gaze May Forecast Human-Machine Team Performance in Surgery.

Pah, N. D., Ngo, Q. C., McConnell, N., Polus, B., Kempster, P., Bhattacharya, A., Raghav, S., & Kumar, D. K. (2024).
Reflexive eye saccadic parameters in Parkinson’s disease.

Pietracupa, M., Ben Abdessalem, H., & Frasson, C. (2024).
Detection of Pre-error States in Aircraft Pilots Through Machine Learning.

Hsu, W.-Y., Cheng, Y.-W., & Tsai, C.-B. (2022).
An Effective Algorithm to Analyze the Optokinetic Nystagmus Waveforms from a Low-Cost Eye Tracker. DOI

Pfarr, J., Ganter, M. T., Spahn, D. R., Noethiger, C. B., & Tscholl, D. W. (2019).
Avatar-Based Patient Monitoring With Peripheral Vision: A Multicenter Comparative Eye-Tracking Study. Journal of Medical Internet Research, 21(7), e13041. DOI

Ćosić, K., Popović, S., Šarlija, M., Mijić, I., Kokot, M., Kesedžić, I., Strangman, G., Ivković, V., & Zhang, Q. (2019).
New Tools and Methods in Selection of Air Traffic Controllers Based on Multimodal Psychophysiological Measurements. IEEE Access, 7, 174873–174888. DOI

  4. Neuroscience and Biometric Applications

Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019).
Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. ICASSP 2019 – 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2562–2566. DOI

Neomániová, K., Berčík, J., & Pavelka, A. (2019).
The Use of Eye‑Tracker and Face Reader as Useful Consumer Neuroscience Tools Within Logo Creation. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, 67(4), 1061–1070. DOI

Gupta, V., Chakraborty, T., Agarwal, M., Singh, R., Arora, M., & Vatsa, M. (2019).
Bag-of-Lies: A Multimodal Dataset for Deception Detection.

Ghiţă, A., Hernández-Serrano, O., Moreno, M., Monràs, M., Gual, A., Maurage, P., Gacto-Sánchez, M., Ferrer-García, M., Porras-García, B., & Gutiérrez-Maldonado, J. (2024).
Exploring Attentional Bias toward Alcohol Content: Insights from Eye-Movement Activity.

Hinss, M. F., Jahanpour, E. S., Brock, A. M., & Roy, R. N. (2024).
A passive Brain-Computer Interface for operator mental fatigue estimation in monotonous surveillance operations.

  5. Virtual and Augmented Reality

Wibirama, S., Santosa, P. I., Widyarani, P., Brilianto, N., & Hafidh, W. (2019).
Physical discomfort and eye movements during arbitrary and optical flow-like motions in stereo 3D contents. Virtual Reality. DOI

Kannegieser, E., Atorf, D., & Meier, J. (2019).
Conducting an Experiment for Validating the Combined Model of Immersion and Flow. CSEDU. DOI

Kobylska, A., & Dzieńkowski, M. (2024).
User experience analysis in virtual museums.

Fu, B., Gatsby, P., Soriano, A. R., Chu, K., & Guardado, N. (2024).
Towards Intelligent Flight Deck–A Preliminary Study of Applied Eye Tracking in Flight Simulations.

Bagherzadeh, A., & Tehranchi, F. (2024).
Computer-Based Experiments in VR: A Virtual Reality Environment to Conduct Experiments, Collect Participants’ Data and Cognitive Modeling in VR.

  6. User Experience and Consumer Behavior

Moutinho, L., & Cerf, M. (Eds.). (2024).
Biometrics and Neuroscience Research in Business and Management: Advances and Applications.

Kobylska, A., & Dzieńkowski, M. (2024).
User experience analysis in virtual museums.

Danielkiewicz, R., & Dzieńkowski, M. (2024).
Analysis of user experience during interaction with automotive repair workshop websites.

Duwer, A., & Dzieńkowski, M. (2024).
Analysis of the usability of selected auction websites.

Constantinides, A., Fidas, C., Belk, M., & Pitsillides, A. (2019).
“I Recall This Picture”: Understanding Picture Password Selections Based on Users’ Sociocultural Experiences. IEEE/WIC/ACM International Conference on Web Intelligence, 408–412. DOI

Russell, C., & Crusse. (2019).
“I Consent”: An Eye-Tracking Study of IRB Informed Consent Forms.

Swift, D., & Schofield, D. (2019).
The Impact of Color on Secondary Task Time While Driving. International Journal of Information Technology, 4(1), 19.

  7. Social Media & Communications

Bezgin Ediş, L., Kılıç, S., & Aydın, S. (2024).
Message Appeals of Social Media Postings: An Experimental Study on Non-Governmental Organization.

Gabor-Siatkowska, K., Stefaniak, I., & Janicki, A. (2024).
A Multimodal Approach for Improving a Dialogue Agent for Therapeutic Sessions in Psychiatry.

Lobodenko, L., Matveeva, I., Shesterkina, L., & Zagoskin, E. (2024).
Eye-Tracking Study into Patterns of Attention to Environmental Media Texts among Youth Audiences in the Context of the Communicative Strategy.

  8. Educational Technology & Learning

Cheng, G., Zou, D., Xie, H., & Wang, F. L. (2024).
Exploring differences in self-regulated learning strategy use between high- and low-performing students in introductory programming.

Chvátal, R., Slezáková, J., & Popelka, S. (2024).
Analysis of problem-solving strategies for the development of geometric imagination using eye-tracking.

Bagherzadeh, A., & Tehranchi, F. (2024).
Computer-Based Experiments in VR: A Virtual Reality Environment to Conduct Experiments, Collect Participants’ Data and Cognitive Modeling in VR.

  9. AI & Machine Learning

Huang, J., Gopalakrishnan, S., Mittal, T., Zuena, J., & Pytlarz, J. (2024).
Analysis of Human Perception in Distinguishing Real and AI-Generated Faces.

Kollias, K.-F., Maraslidis, G. S., Sarigiannidis, P., & Fragulis, G. F. (2024).
Application of machine learning on eye-tracking data for autism detection.

  10. Sports and Fitness Performance

Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019).
CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6. DOI


As the first high-performance eye-tracking system available at a consumer-grade price, the GP3 delivers accurate data suitable for medical research.
Gazepoint's innovations in eye tracking allow developers to enhance behavioral research and usability study applications.
By eliminating the guesswork behind the interaction between consumer and computer, our Analysis UX Edition allows users to track human behavior through measures such as eye movement, click and scrolling behavior, and more.