Gazepoint Citations

We are frequently asked whether we can provide examples of research papers in which Gazepoint eye-tracking technologies are used. We are happy to provide a short list of the publications we have found to date. If you are interested in using our best eye-tracking software for marketers in your research and don’t have the software yet, shop now or contact us to get started!

If you have published research from a neuromarketing study that uses the Gazepoint system, please let us know and we will add a link to your work here! Our suggested reference for citing Gazepoint in your research is: Gazepoint (2024). GP3 Eye-Tracker. Retrieved from https://www.gazept.com
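If you manage references in BibTeX, an entry along the following lines should work. This is only a sketch based on the suggested reference above; the entry key and the @misc type are our own choices, not an official Gazepoint format:

```bibtex
@misc{gazepoint2024gp3,
  author       = {{Gazepoint}},          % double braces keep the company name intact
  title        = {GP3 Eye-Tracker},
  year         = {2024},
  howpublished = {\url{https://www.gazept.com}},
  note         = {Retrieved from https://www.gazept.com}
}
```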

Fu, B., & Chow, N. (2025). AdaptLIL: A Real-Time Adaptive Linked Indented List Visualization for Ontology Mapping. In G. Demartini, K. Hose, M. Acosta, M. Palmonari, G. Cheng, H. Skaf-Molli, N. Ferranti, D. Hernández, & A. Hogan (Eds.), The Semantic Web – ISWC 2024 (Vol. 15232, pp. 3–22). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-77850-6_1
Tokmovtseva, A. D., & Akelyeva, E. V. (2024). Insights into Landscape Perception and Appreciation through Eye Movement Tracking. Lurian Journal, 5(2), 38–46. https://doi.org/10.15826/Lurian.2024.5.2.2
Chow, N., & Fu, B. (2024). AdaptLIL: A Gaze-Adaptive Visualization for Ontology Mapping (No. arXiv:2411.11768). arXiv. https://doi.org/10.48550/arXiv.2411.11768
Cui, Y., & Liu, X. (2024). How condensation and non-condensation impact viewers’ processing effort and comprehension – an eye-tracking study on Chinese subtitling of English documentaries. Perspectives, 1–19. https://doi.org/10.1080/0907676X.2024.2433059
Erol Barkana, D., Bartl-Pokorny, K. D., Kose, H., Landowska, A., Milling, M., Robins, B., Schuller, B. W., Uluer, P., Wrobel, M. R., & Zorcec, T. (2024). Challenges in Observing the Emotions of Children with Autism Interacting with a Social Robot. International Journal of Social Robotics. https://doi.org/10.1007/s12369-024-01185-3
Ciukaj, M., & Skublewska-Paszkowska, M. (2024). Comparative analysis of the availability of popular social networking sites. Journal of Computer Sciences Institute, 32, 217–222. https://doi.org/10.35784/jcsi.6292
Huang, J., Gopalakrishnan, S., Mittal, T., Zuena, J., & Pytlarz, J. (2024). Analysis of Human Perception in Distinguishing Real and AI-Generated Faces: An Eye-Tracking Based Study (No. arXiv:2409.15498). arXiv. https://doi.org/10.48550/arXiv.2409.15498
Palacios-Ibáñez, A., Castellet-Lathan, S., & Contero, M. (2024). Exploring the user’s gaze during product evaluation through the semantic differential: a comparison between virtual reality and photorealistic images. Virtual Reality, 28(3), 153. https://doi.org/10.1007/s10055-024-01048-2
Huang, Z., Zhu, G., Duan, X., Wang, R., Li, Y., Zhang, S., & Wang, Z. (2024). Measuring eye-tracking accuracy and its impact on usability in Apple Vision Pro (No. arXiv:2406.00255). arXiv. https://doi.org/10.48550/arXiv.2406.00255
Wiediartini, Ciptomulyono, U., & Dewi, R. S. (2024). Evaluation of physiological responses to mental workload in n-back and arithmetic tasks. Ergonomics, 67(8), 1121–1133. https://doi.org/10.1080/00140139.2023.2284677
Lin, J.-H., Hsu, M., & Guo, L.-Y. (2024). Investigation of the Reliability of Oculomotor Assessment of Gaze and Smooth Pursuit with a Novel Approach. 2024 17th International Convention on Rehabilitation Engineering and Assistive Technology (i-CREATe), 1–6. https://doi.org/10.1109/i-CREATe62067.2024.10776135
Silva, F., Garrido, M. I., & Soares, S. C. (2024). The effect of anxiety and its interplay with social cues when perceiving aggressive behaviours. Quarterly Journal of Experimental Psychology, 17470218241258208. https://doi.org/10.1177/17470218241258209
Bezgin Ediş, L., Kılıç, S., & Aydın, S. (2024). Message Appeals of Social Media Postings: An Experimental Study on Non-Governmental Organization. Journal of Nonprofit & Public Sector Marketing, 1–21. https://doi.org/10.1080/10495142.2024.2377975
Fu, B., Soriano, A. R., Chu, K., Gatsby, P., & Guardado, N. (2024). Modelling Visual Attention for Future Intelligent Flight Deck - A Case Study of Pilot Eye Tracking in Simulated Flight Takeoff. Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 170–175. https://doi.org/10.1145/3631700.3664871
Murphy, T., Armitage, J. A., van Wijngaarden, P., Abel, L. A., & Douglass, A. (2024). Unmasking visual search: an objective framework for grouping eye tracking data. Investigative Ophthalmology & Visual Science, 65(7), 5179.
Nguyen-Ho, T.-L., Kongmeesub, O., Tran, M.-T., Nie, D., Healy, G., & Gurrin, C. (2024). EAGLE: Eyegaze-Assisted Guidance and Learning Evaluation for Lifelogging Retrieval. Proceedings of the 7th Annual ACM Workshop on the Lifelog Search Challenge, 18–23. https://doi.org/10.1145/3643489.3661115
Tedla, S. K., MacKenzie, S., & Brown, M. (2024). LookToFocus: Image Focus via Eye Tracking. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3649902.3656358
Moutinho, L., & Cerf, M. (Eds.). (2024). Biometrics and Neuroscience Research in Business and Management: Advances and Applications. De Gruyter. https://doi.org/10.1515/9783110708509
Taieb-Maimon, M., & Romanovskii-Chernik, L. (2024). Improving Error Correction and Text Editing Using Voice and Mouse Multimodal Interface. International Journal of Human–Computer Interaction, 1–24. https://doi.org/10.1080/10447318.2024.2352932
Emami, P., Jiang, Y., Guo, Z., & Leiva, L. A. (2024). Impact of Design Decisions in Scanpath Modeling. Proceedings of the ACM on Human-Computer Interaction, 8(ETRA), 1–16. https://doi.org/10.1145/3655602
Dondi, P., Sapuppo, S., & Porta, M. (2024). Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed. International Journal of Human-Computer Studies, 184, 103204. https://doi.org/10.1016/j.ijhcs.2023.103204
Kobylska, A., & Dzieńkowski, M. (2024). User experience analysis in virtual museums. Journal of Computer Sciences Institute, 30, 31–38. https://doi.org/10.35784/jcsi.5382
Moradizeyveh, S., Tabassum, M., Liu, S., Newport, R. A., Beheshti, A., & Di Ieva, A. (2024). When Eye-Tracking Meets Machine Learning: A Systematic Review on Applications in Medical Image Analysis (No. arXiv:2403.07834). arXiv. https://doi.org/10.48550/arXiv.2403.07834
Jiang, Y., Leiva, L. A., Houssel, P. R. B., Tavakoli, H. R., Kylmälä, J., & Oulasvirta, A. (2024). UEyes: An Eye-Tracking Dataset across User Interface Types (No. arXiv:2402.05202). arXiv. https://doi.org/10.48550/arXiv.2402.05202
Yin, R., & Neyens, D. M. (2024). Examining how information presentation methods and a chatbot impact the use and effectiveness of electronic health record patient portals: An exploratory study. Patient Education and Counseling, 119, 108055. https://doi.org/10.1016/j.pec.2023.108055
Murphy, T. I., Abel, L. A., Armitage, J. A., & Douglass, A. G. (2024). Effects of tracker location on the accuracy and precision of the Gazepoint GP3 HD for spectacle wearers. Behavior Research Methods, 56(1), 43–52. https://doi.org/10.3758/s13428-022-02023-y
Taieb-Maimon, M., Romanovski-Chernik, A., Last, M., Litvak, M., & Elhadad, M. (2024). Mining Eye-Tracking Data for Text Summarization. International Journal of Human–Computer Interaction, 40(17), 4887–4905. https://doi.org/10.1080/10447318.2023.2227827
Jankowski, M., & Goroncy, A. (2024). Anatomical variants of acne differ in their impact on social perception. Journal of the European Academy of Dermatology and Venereology, 38(8), 1628–1636. https://doi.org/10.1111/jdv.19798
Chvátal, R., Slezáková, J., & Popelka, S. (2024). Analysis of problem-solving strategies for the development of geometric imagination using eye-tracking. Education and Information Technologies, 29(10), 12969–12987. https://doi.org/10.1007/s10639-023-12395-z
Chhimpa, G. R., Kumar, A., Garhwal, S., & Dhiraj. (2024). Empowering individuals with disabilities: a real-time, cost-effective, calibration-free assistive system utilizing eye tracking. Journal of Real-Time Image Processing, 21(3), 97. https://doi.org/10.1007/s11554-024-01478-w
Avoyan, A., Ribeiro, M., Schotter, A., Schotter, E. R., Vaziri, M., & Zou, M. (2024). Planned vs. Actual Attention. Management Science, 70(5), 2912–2933. https://doi.org/10.1287/mnsc.2023.4834
Conijn, R., Dux Speltz, E., & Chukharev-Hudilainen, E. (2024). Automated extraction of revision events from keystroke data. Reading and Writing, 37(2), 483–508. https://doi.org/10.1007/s11145-021-10222-w
Segedinac, M., Savić, G., Zeljković, I., Slivka, J., & Konjović, Z. (2024). Assessing code readability in Python programming courses using eye‐tracking. Computer Applications in Engineering Education, 32(1), e22685. https://doi.org/10.1002/cae.22685
Sims, J. P., Haynes, A., & Lanius, C. (2024). Exploring the utility of eye tracking for sociological research on race. The British Journal of Sociology, 75(1), 65–72. https://doi.org/10.1111/1468-4446.13054
Duwer, A., & Dzieńkowski, M. (2024). Analysis of the usability of selected auction websites. Journal of Computer Sciences Institute, 31, 138–144. https://ph.pollub.pl/index.php/jcsi/article/view/6200
Byrne, M. (2024). Master of Arts [Master’s Thesis, Rice University]. https://repository.rice.edu/bitstreams/1c22e77e-df9f-4541-bbbf-04b91d20755f/download
Hahn, A. C., Riedelsheimer, J. A., Royer, Z., Frederick, J., Kee, R., Crimmins, R., Huber, B., Harris, D. H., & Jantzen, K. J. (2024). Effects of cleft lip on visual scanning and neural processing of infant faces. PLOS ONE, 19(3), e0300673. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0300673
Robertson, B. D. (2024). Relationship Between Heart Rate Variability, Saccadic Impairment, and Cognitive Performance Following Mild Traumatic Brain Injury in a Military Population [PhD Thesis, Alliant International University]. https://search.proquest.com/openview/5a86f190ba90aec459361b3cdf8ee3b0/1?pq-origsite=gscholar&cbl=18750&diss=y
Barriga, A. D. (2024). In Your Sight and in Your Mind: The Puppeteer as Cognitive Guide in Koryū Nishikawa V and Tom Lee’s Shank’s Mare. Theatre Topics, 34(3), 197–207. https://muse.jhu.edu/pub/1/article/942001/summary
Asaraf, S., Parmet, Y., & Borowsky, A. (2024). Hazard perception and attention of track safety supervisor as a function of working time. https://www.hfes-europe.org/wp-content/uploads/2024/05/Asaraf2024.pdf
Špajdel, M. (2024). Analysis of Eye Movements Reveals Longer Visual Saccades and Abnormal Preference for Social Images in Autism Spectrum Disorder. https://rediviva.sav.sk/66i1/1.pdf
Shepherd, S. S., & Kidd, C. (2024). Visual engagement is not synonymous with learning in young children. Proceedings of the Annual Meeting of the Cognitive Science Society, 46. https://escholarship.org/uc/item/0wz74769
Chavez, F. (2024). Computational Modeling of Voters’ Checking Behavior & Checking Performance [Master’s Thesis, Rice University]. https://search.proquest.com/openview/3055e7cd3118a7fd5b44379a5d5f48ba/1?pq-origsite=gscholar&cbl=18750&diss=y
Baltuttis, D., & Teubner, T. (2024). Effects of Visual Risk Indicators on Phishing Detection Behavior: An Eye-Tracking Experiment. Computers & Security, 103940. https://www.sciencedirect.com/science/article/pii/S0167404824002451
George, J. F. (2024). Discovering why people believe disinformation about healthcare. PLOS ONE, 19(3), e0300497. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0300497
Kerr, C. (2024). Seeing the science and technology pipeline. IEEE Engineering Management Review. https://ieeexplore.ieee.org/abstract/document/10529524/
Cheng, G., Zou, D., Xie, H., & Wang, F. L. (2024). Exploring differences in self-regulated learning strategy use between high-and low-performing students in introductory programming: An analysis of eye-tracking and retrospective think-aloud data from program comprehension. Computers & Education, 208, 104948. https://www.sciencedirect.com/science/article/pii/S0360131523002257
Danielkiewicz, R., & Dzieńkowski, M. (2024). Analysis of user experience during interaction with automotive repair workshop websites. Journal of Computer Sciences Institute, 30, 39–46. https://ph.pollub.pl/index.php/jcsi/article/view/5416
Ghiţă, A., Hernández-Serrano, O., Moreno, M., Monràs, M., Gual, A., Maurage, P., Gacto-Sánchez, M., Ferrer-García, M., Porras-García, B., & Gutiérrez-Maldonado, J. (2024). Exploring Attentional Bias toward Alcohol Content: Insights from Eye-Movement Activity. European Addiction Research, 30(2), 65–79. https://karger.com/ear/article/30/2/65/896035
Stimson, K. H. (2024). Zoom dysmorphia: An eye-tracking study of self-view and attention during video conferences. https://digitalcommons.dartmouth.edu/cognitive-science_senior_theses/5/