Gazepoint Citations by Publication Year
We are frequently asked for examples of research papers in which Gazepoint eye-tracking technologies are used. We are happy to provide this list of publications that we have found to date. If you are interested in using our eye-tracking software in your research and don’t have it yet, shop now or contact us to get started!
If you have published neuromarketing or other research that uses the Gazepoint system, please let us know and we will add a link to your work here! Our suggested reference for citing Gazepoint in your research is: Gazepoint (2025). GP3 Eye-Tracker. Retrieved from https://www.gazept.com
2025
Fu, B., & Chow, N. (2025). AdaptLIL: A Real-Time Adaptive Linked Indented List Visualization for Ontology Mapping. In G. Demartini, K. Hose, M. Acosta, M. Palmonari, G. Cheng, H. Skaf-Molli, N. Ferranti, D. Hernández, & A. Hogan (Eds.), The Semantic Web – ISWC 2024 (Vol. 15232, pp. 3–22). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-77850-6_1
Rodriguez, K. A., Ralph, Y. K., de la Rosa, I. M., Corro, O. P. P., Ochoa, C. D. R., & Pruden, S. M. (2025). Leveraging Eye-Tracking Technology to Understand How Young Children Solve a Mental Rotation Task. Infant and Child Development, 34(3), e70018. https://doi.org/10.1002/icd.70018
2024
Murphy, T. I., Abel, L. A., Armitage, J. A., & Douglass, A. G. (2024). Effects of tracker location on the accuracy and precision of the Gazepoint GP3 HD for spectacle wearers. Behavior Research Methods, 56(1), 43–52. https://doi.org/10.3758/s13428-022-02023-y
Yin, R., & Neyens, D. M. (2024). Examining how information presentation methods and a chatbot impact the use and effectiveness of electronic health record patient portals: An exploratory study. Patient Education and Counseling, 119, 108055. https://doi.org/10.1016/j.pec.2023.108055
Jiang, Y., Leiva, L. A., Houssel, P. R. B., Tavakoli, H. R., Kylmälä, J., & Oulasvirta, A. (2024). UEyes: An Eye-Tracking Dataset across User Interface Types (No. arXiv:2402.05202). arXiv. https://doi.org/10.48550/arXiv.2402.05202
Moradizeyveh, S., Tabassum, M., Liu, S., Newport, R. A., Beheshti, A., & Ieva, A. D. (2024). When Eye-Tracking Meets Machine Learning: A Systematic Review on Applications in Medical Image Analysis (No. arXiv:2403.07834). arXiv. https://doi.org/10.48550/arXiv.2403.07834
Kobylska, A., & Dzieńkowski, M. (2024). User experience analysis in virtual museums. Journal of Computer Sciences Institute, 30, 31–38. https://doi.org/10.35784/jcsi.5382
Dondi, P., Sapuppo, S., & Porta, M. (2024). Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed. International Journal of Human-Computer Studies, 184, 103204. https://doi.org/10.1016/j.ijhcs.2023.103204
Emami, P., Jiang, Y., Guo, Z., & Leiva, L. A. (2024). Impact of Design Decisions in Scanpath Modeling. Proceedings of the ACM on Human-Computer Interaction, 8(ETRA), 1–16. https://doi.org/10.1145/3655602
Taieb-Maimon, M., & Romanovskii-Chernik, L. (2024). Improving Error Correction and Text Editing Using Voice and Mouse Multimodal Interface. International Journal of Human–Computer Interaction, 1–24. https://doi.org/10.1080/10447318.2024.2352932
Moutinho, L., & Cerf, M. (Eds.). (2024). Biometrics and Neuroscience Research in Business and Management: Advances and Applications. De Gruyter. https://doi.org/10.1515/9783110708509
Tedla, S. K., MacKenzie, S., & Brown, M. (2024). LookToFocus: Image Focus via Eye Tracking. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3649902.3656358
Nguyen-Ho, T.-L., Kongmeesub, O., Tran, M.-T., Nie, D., Healy, G., & Gurrin, C. (2024). EAGLE: Eyegaze-Assisted Guidance and Learning Evaluation for Lifelogging Retrieval. Proceedings of the 7th Annual ACM Workshop on the Lifelog Search Challenge, 18–23. https://doi.org/10.1145/3643489.3661115
Murphy, T., Armitage, J. A., van Wijngaarden, P., Abel, L. A., & Douglass, A. (2024). Unmasking visual search: an objective framework for grouping eye tracking data. Investigative Ophthalmology & Visual Science, 65(7), 5179.
Fu, B., Soriano, A. R., Chu, K., Gatsby, P., & Guardado, N. (2024). Modelling Visual Attention for Future Intelligent Flight Deck – A Case Study of Pilot Eye Tracking in Simulated Flight Takeoff. Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 170–175. https://doi.org/10.1145/3631700.3664871
Bezgin Ediş, L., Kılıç, S., & Aydın, S. (2024). Message Appeals of Social Media Postings: An Experimental Study on Non-Governmental Organization. Journal of Nonprofit & Public Sector Marketing, 1–21. https://doi.org/10.1080/10495142.2024.2377975
Silva, F., Garrido, M. I., & Soares, S. C. (2024). The effect of anxiety and its interplay with social cues when perceiving aggressive behaviours. Quarterly Journal of Experimental Psychology, 17470218241258209. https://doi.org/10.1177/17470218241258209
Wiediartini, Ciptomulyono, U., & Dewi, R. S. (2024). Evaluation of physiological responses to mental workload in n-back and arithmetic tasks. Ergonomics, 67(8), 1121–1133. https://doi.org/10.1080/00140139.2023.2284677
Huang, Z., Zhu, G., Duan, X., Wang, R., Li, Y., Zhang, S., & Wang, Z. (2024). Measuring eye-tracking accuracy and its impact on usability in apple vision pro (No. arXiv:2406.00255). arXiv. https://doi.org/10.48550/arXiv.2406.00255
Palacios-Ibáñez, A., Castellet-Lathan, S., & Contero, M. (2024). Exploring the user’s gaze during product evaluation through the semantic differential: a comparison between virtual reality and photorealistic images. Virtual Reality, 28(3), 153. https://doi.org/10.1007/s10055-024-01048-2
Lin, J.-H., Hsu, M., & Guo, L.-Y. (2024). Investigation of the Reliability of Oculomotor Assessment of Gaze and Smooth Pursuit with a Novel Approach. 2024 17th International Convention on Rehabilitation Engineering and Assistive Technology (i-CREATe), 1–6. https://doi.org/10.1109/i-CREATe62067.2024.10776135
Huang, J., Gopalakrishnan, S., Mittal, T., Zuena, J., & Pytlarz, J. (2024). Analysis of Human Perception in Distinguishing Real and AI-Generated Faces: An Eye-Tracking Based Study (No. arXiv:2409.15498). arXiv. https://doi.org/10.48550/arXiv.2409.15498
Ciukaj, M., & Skublewska-Paszkowska, M. (2024). Comparative analysis of the availability of popular social networking sites. Journal of Computer Sciences Institute, 32, 217–222. https://doi.org/10.35784/jcsi.6292
Erol Barkana, D., Bartl-Pokorny, K. D., Kose, H., Landowska, A., Milling, M., Robins, B., Schuller, B. W., Uluer, P., Wrobel, M. R., & Zorcec, T. (2024). Challenges in Observing the Emotions of Children with Autism Interacting with a Social Robot. International Journal of Social Robotics. https://doi.org/10.1007/s12369-024-01185-3
Cui, Y., & Liu, X. (2024). How condensation and non-condensation impact viewers’ processing effort and comprehension – an eye-tracking study on Chinese subtitling of English documentaries. Perspectives, 1–19. https://doi.org/10.1080/0907676X.2024.2433059
Chow, N., & Fu, B. (2024). AdaptLIL: A Gaze-Adaptive Visualization for Ontology Mapping (No. arXiv:2411.11768). arXiv. https://doi.org/10.48550/arXiv.2411.11768
Токмовцева, А. Д., & Акельева, Е. В. (2024). Insights into Landscape Perception and Appreciation through Eye Movement Tracking. Lurian Journal, 5(2), 38–46. https://doi.org/10.15826/Lurian.2024.5.2.2
Acharya, S. (2024). Dynamic Eye-Tracking on Large Screens: A 3D Printed Moving Guide Rail Platform. https://etda.libraries.psu.edu/catalog/28993sca5357
Asaraf, S., Parmet, Y., & Borowsky, A. (2024). Hazard perception and attention of track safety supervisor as a function of working time. https://www.hfes-europe.org/wp-content/uploads/2024/05/Asaraf2024.pdf
Avoyan, A., Ribeiro, M., Schotter, A., Schotter, E. R., Vaziri, M., & Zou, M. (2024). Planned vs. Actual Attention. Management Science, 70(5), 2912–2933. https://doi.org/10.1287/mnsc.2023.4834
Bagherzadeh, A., & Tehranchi, F. (2024). Computer-Based Experiments in VR: A Virtual Reality Environment to Conduct Experiments, Collect Participants’ Data and Cognitive Modeling in VR. Proceedings of ICCM-2024-22nd International Conference on Cognitive Modeling. https://www.researchgate.net/profile/Amirreza-Bagherzadeh/publication/384463921_Computer-Based_Experiments_in_VR_A_Virtual_Reality_Environment_to_Conduct_Experiments_Collect_Participants’_Data_and_Cognitive_Modeling_in_VR/links/66fae72b9e6e82486ffc1ea2/Computer-Based-Experiments-in-VR-A-Virtual-Reality-Environment-to-Conduct-Experiments-Collect-Participants-Data-and-Cognitive-Modeling-in-VR.pdf
Baltuttis, D., & Teubner, T. (2024). Effects of Visual Risk Indicators on Phishing Detection Behavior: An Eye-Tracking Experiment. Computers & Security, 103940. https://www.sciencedirect.com/science/article/pii/S0167404824002451
Barriga, A. D. (2024). In Your Sight and in Your Mind: The Puppeteer as Cognitive Guide in Koryū Nishikawa V and Tom Lee’s Shank’s Mare. Theatre Topics, 34(3), 197–207. https://muse.jhu.edu/pub/1/article/942001/summary
Boone, N., & Hahn, A. (2024). Effects of cleft lip and palate on visual scanning and neural processing of infant faces. https://digitalcommons.humboldt.edu/cgi/viewcontent.cgi?article=1019&context=ideafest2024
Byrne, M. (2024). Master of Arts [PhD Thesis, Rice University]. https://repository.rice.edu/bitstreams/1c22e77e-df9f-4541-bbbf-04b91d20755f/download
Chavez, F. (2024). Computational Modeling of Voters’ Checking Behavior & Checking Performance [Master’s Thesis, Rice University]. https://search.proquest.com/openview/3055e7cd3118a7fd5b44379a5d5f48ba/1?pq-origsite=gscholar&cbl=18750&diss=y
Chen, A., Wong, C., Tarrit, K., & Peruma, A. (2024). Impostor Syndrome in Final Year Computer Science Students: An Eye Tracking and Biometrics Study. In D. D. Schmorrow & C. M. Fidopiastis (Eds.), Augmented Cognition (Vol. 14694, pp. 22–41). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-61569-6_2
Cheng, G., Zou, D., Xie, H., & Wang, F. L. (2024). Exploring differences in self-regulated learning strategy use between high- and low-performing students in introductory programming: An analysis of eye-tracking and retrospective think-aloud data from program comprehension. Computers & Education, 208, 104948. https://www.sciencedirect.com/science/article/pii/S0360131523002257
Chhimpa, G. R., Kumar, A., Garhwal, S., & Dhiraj. (2024). Empowering individuals with disabilities: a real-time, cost-effective, calibration-free assistive system utilizing eye tracking. Journal of Real-Time Image Processing, 21(3), 97. https://doi.org/10.1007/s11554-024-01478-w
Cho, S. M., Taylor, R. H., & Unberath, M. (2024). Misjudging the Machine: Gaze May Forecast Human-Machine Team Performance in Surgery. In M. G. Linguraru, Q. Dou, A. Feragen, S. Giannarou, B. Glocker, K. Lekadir, & J. A. Schnabel (Eds.), Medical Image Computing and Computer Assisted Intervention – MICCAI 2024 (Vol. 15006, pp. 401–410). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-72089-5_38
Chow, N. (2024). Adaptive Ontology Mapping Visualizations: Curtailing Visualizations in Real Time Through Deep Learning and Eye Gaze [Master’s Thesis, California State University, Long Beach]. https://search.proquest.com/openview/0b6f0676b42156b1cee166ea98b38eaf/1?pq-origsite=gscholar&cbl=18750&diss=y
Chung, E. S. (2024). Processing of Machine Translation Errors by Korean Learners of English: An Eye-tracking Study. 영미연구, 62, 63–86. http://builder.hufs.ac.kr/user/ibas/62/03..pdf
Chvátal, R., Slezáková, J., & Popelka, S. (2024). Analysis of problem-solving strategies for the development of geometric imagination using eye-tracking. Education and Information Technologies, 29(10), 12969–12987. https://doi.org/10.1007/s10639-023-12395-z
Cohen, M. A., Sung, S., & Alaoui, Z. (2024). Familiarity alters the bandwidth of perceptual awareness. Journal of Cognitive Neuroscience, 1–11. https://direct.mit.edu/jocn/article/doi/10.1162/jocn_a_02140/120297
Conijn, R., Dux Speltz, E., & Chukharev-Hudilainen, E. (2024). Automated extraction of revision events from keystroke data. Reading and Writing, 37(2), 483–508. https://doi.org/10.1007/s11145-021-10222-w
Danielkiewicz, R., & Dzieńkowski, M. (2024). Analysis of user experience during interaction with automotive repair workshop websites. Journal of Computer Sciences Institute, 30, 39–46. https://ph.pollub.pl/index.php/jcsi/article/view/5416
del Carmen Cabrera-Hernández, M., García-Ezquerra, C. A., Aceves-Fernández, M. A., Pedraza-Ortega, J. C., & Tovar-Arriaga, S. (2024). A dataset on eye movement tracking during the resolution of neuropsychological tests on a screen. Data in Brief, 55, 110601. https://www.sciencedirect.com/science/article/pii/S2352340924005687
Duwer, A., & Dzieńkowski, M. (2024). Analysis of the usability of selected auction websites. Journal of Computer Sciences Institute, 31, 138–144. https://ph.pollub.pl/index.php/jcsi/article/view/6200
Enhancing Website Usability Testing: Correlating Eye-Tracking, GSR, and SUS Data With Respect To Gender Preferences. (2024). Science Journal of University of Zakho. https://sjuoz.uoz.edu.krd/index.php/sjuoz/article/view/1215
Fu, B., Gatsby, P., Soriano, A. R., Chu, K., & Guardado, N. (2024). Towards Intelligent Flight Deck–A Preliminary Study of Applied Eye Tracking in The Predictions of Pilot Success and Failure During Simulated Flight Takeoff. https://ceur-ws.org/Vol-3701/paper12.pdf
Gabor-Siatkowska, K., Stefaniak, I., & Janicki, A. (2024). A Multimodal Approach for Improving a Dialogue Agent for Therapeutic Sessions in Psychiatry. In A. Marcus-Quinn, K. Krejtz, & C. Duarte (Eds.), Transforming Media Accessibility in Europe: Digital Media, Education and City Space Accessibility Contexts (pp. 397–414). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-60049-4_22
Gabor-Siatkowska, K., Stefaniak, I., & Janicki, A. (2024). Eye tracking data cleansing for dialogue agent. Biuletyn Naukowy Wrocławskiej Wyższej Szkoły Informatyki Stosowanej” Informatyka”, 10. https://repo.pw.edu.pl/info/article/WUT17516176dfd443be85b0040b7536e6a9/
Gabor-Siatkowska, K., Stefaniak, I., & Janicki, A. (2024). Gaze-dependent response activation in dialogue agent for cognitive-behavioral therapy. Procedia Computer Science, 246, 2322–2331. https://www.sciencedirect.com/science/article/pii/S1877050924025997
George, J. F. (2024). Discovering why people believe disinformation about healthcare. PLOS ONE, 19(3), e0300497. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0300497
Ghiţă, A., Hernández-Serrano, O., Moreno, M., Monràs, M., Gual, A., Maurage, P., Gacto-Sánchez, M., Ferrer-García, M., Porras-García, B., & Gutiérrez-Maldonado, J. (2024). Exploring Attentional Bias toward Alcohol Content: Insights from Eye-Movement Activity. European Addiction Research, 30(2), 65–79. https://karger.com/ear/article/30/2/65/896035
Girard, T. (2024). AOI Eye Gaze Analysis for Matrix and Linked Indented List Visualizations. https://scholarworks.calstate.edu/concern/projects/m039kd555
Hahn, A. C., Riedelsheimer, J. A., Royer, Z., Frederick, J., Kee, R., Crimmins, R., Huber, B., Harris, D. H., & Jantzen, K. J. (2024). Effects of cleft lip on visual scanning and neural processing of infant faces. PLOS ONE, 19(3), e0300673. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0300673
Hinss, M. F., Jahanpour, E. S., Brock, A. M., & Roy, R. N. (2024). A passive Brain-Computer Interface for operator mental fatigue estimation in monotonous surveillance operations: time-on-task and performance labeling issues. Journal of Neural Engineering. https://iopscience.iop.org/article/10.1088/1741-2552/ad9bed/meta
Jankowski, M., & Goroncy, A. (2024). Anatomical variants of acne differ in their impact on social perception. Journal of the European Academy of Dermatology and Venereology, 38(8), 1628–1636. https://doi.org/10.1111/jdv.19798
Jurič, I., Tomić, I., & Pál, M. (2024). Visual dynamics in digital catalogues: A comprehensive analysis of cinemagraph integration through eye-tracking technology. https://www.grid.uns.ac.rs/symposium/download/2024/28.pdf
Kerr, C. (2024). Seeing the science and technology pipeline. IEEE Engineering Management Review. https://ieeexplore.ieee.org/abstract/document/10529524/
Kollias, K.-F., Maraslidis, G. S., Sarigiannidis, P., & Fragulis, G. F. (2024). Application of machine learning on eye-tracking data for autism detection: The case of high-functioning adults. AIP Conference Proceedings, 3220. https://pubs.aip.org/aip/acp/article-abstract/3220/1/050012/3315950
Kotyńska, K., & Matulewski, J. (2024). Usability study of gaze-based control methods in a game with time pressure. Procedia Computer Science, 246, 473–481. https://www.sciencedirect.com/science/article/pii/S1877050924024670
Kurek, K., Skublewska-Paszkowska, M., & Powroznik, P. (2024). The impact of applying universal design principles on the usability of online accommodation booking websites. Applied Computer Science, 20(1). https://yadda.icm.edu.pl/baztech/element/bwmeta1.element.baztech-61566e12-6c82-4f6d-a27a-4a71ef68836d
Le Moan, S., Amiri, M., & Herglotz, C. (2024). Exploiting Change Blindness to Reduce Bitrate and Display Luminance in Video Streaming. 2024 IEEE International Conference on Image Processing (ICIP), 3661–3666. https://ieeexplore.ieee.org/abstract/document/10648096/
Lobodenko, L., Matveeva, I., Shesterkina, L., & Zagoskin, E. (2024). Eye-Tracking Study into Patterns of Attention to Environmental Media Texts among Youth Audiences in the Context of the Communicative Strategy. 2024 Communication Strategies in Digital Society Seminar (ComSDS), 83–88. https://ieeexplore.ieee.org/abstract/document/10502069/
Lopes, A., Ward, A. D., & Cecchini, M. (2024). Eye tracking in digital pathology: A comprehensive literature review. Journal of Pathology Informatics, 15, 100383. https://www.sciencedirect.com/science/article/pii/S2153353924000221
Lyu, D., Mañas-Viniegra, L., & Xu, Z. (2024). Visual attention differences toward football stadium’s naming rights: an eye tracking study. Asia Pacific Journal of Marketing and Logistics. https://www.emerald.com/insight/content/doi/10.1108/APJML-03-2024-0281/full/html
Osińska, V., Szalach, A., & Piotrowski, D. M. (2024). Eye tracking as a tool for analysing human-AI image interactions. 2024 Progress in Applied Electrical Engineering (PAEE), 1–3. https://ieeexplore.ieee.org/abstract/document/10701449/
Pah, N. D., Ngo, Q. C., McConnell, N., Polus, B., Kempster, P., Bhattacharya, A., Raghav, S., & Kumar, D. K. (2024). Reflexive eye saccadic parameters in Parkinson’s disease. Frontiers in Medical Technology, 6, 1477502. https://www.frontiersin.org/journals/medical-technology/articles/10.3389/fmedt.2024.1477502/full
Pietracupa, M., Ben Abdessalem, H., & Frasson, C. (2024). Detection of Pre-error States in Aircraft Pilots Through Machine Learning. In A. Sifaleras & F. Lin (Eds.), Generative Intelligence and Intelligent Tutoring Systems (pp. 124–136). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-63031-6_11
Pillai, P., Balasingam, B., Jaekel, A., & Biondi, F. N. (2024). Comparison of concurrent cognitive load measures during n-back tasks. Applied Ergonomics, 117, 104244. https://www.sciencedirect.com/science/article/pii/S0003687024000218
Pyeon, J., Bagherzadeh, A., Koshani, R., Sheikhi, A., & Tehranchi, F. (2024). Understanding Human Behavior and a Cognitive Model in an Image Labeling Task. https://www.researchgate.net/profile/Amirreza-Bagherzadeh/publication/385624253_Understanding_Human_Behavior_and_a_Cognitive_Model_in_an_Image_Labeling_Task/links/672cf2ae77f274616d626681/Understanding-Human-Behavior-and-a-Cognitive-Model-in-an-Image-Labeling-Task.pdf
R, R., Jacob, L., R, R., R S, A., V L, D., M S, S., Prakash, V., Mathew, A., & A L, H. (2024). Computer Aided Detection of Strabismus in Humans Using Computer Vision Techniques. Proceedings of the 1st International Conference on Artificial Intelligence, Communication, IoT, Data Engineering and Security (IACIDS 2023), 23–25 November 2023, Lavasa, Pune, India. https://doi.org/10.4108/eai.23-11-2023.2343335
Rebreikina, A., Zakharchenko, D., Shaposhnikova, A., Korotkov, N., Klimov, Y., & Batysheva, T. (2024). Voluntary Attention Assessing Tests in Children with Neurodevelopmental Disorders Using Eye Tracking. Children, 11(11), 1333. https://www.mdpi.com/2227-9067/11/11/1333
Reshetniak, V., & Faure, E. (2024). Eye-tracking as a tool for researching user behavior. COMPUTER-INTEGRATED TECHNOLOGIES: EDUCATION, SCIENCE, PRODUCTION, 55, 181–190. http://cit-journal.com.ua/index.php/cit/article/view/576
Reyes, A. (2024). Identifying the Most Relevant Eye Gaze Features When Predicting Pilot Success and Failure During an ILS Approach [Master’s Thesis, California State University, Long Beach]. https://search.proquest.com/openview/7ba6e1f3b548c0f4c2fc5bebfc614016/1?pq-origsite=gscholar&cbl=18750&diss=y
Riegel Correia, S., Pinto-Albuquerque, M., Espinha Gasiba, T., & Iosif, A.-C. (2024). Improving Industrial Cybersecurity Training: Insights into Code Reviews Using Eye-Tracking. OASIcs, Volume 122, ICPEC 2024, 122, 17:1-17:9. https://doi.org/10.4230/OASICS.ICPEC.2024.17
Robertson, B. D. (2024). Relationship Between Heart Rate Variability, Saccadic Impairment, and Cognitive Performance Following Mild Traumatic Brain Injury in a Military Population [PhD Thesis, Alliant International University]. https://search.proquest.com/openview/5a86f190ba90aec459361b3cdf8ee3b0/1?pq-origsite=gscholar&cbl=18750&diss=y
Rymarkiewicz, W., Cybulski, P., & Horbiński, T. (2024). Measuring Efficiency and Accuracy in Locating Symbols on Mobile Maps Using Eye Tracking. ISPRS International Journal of Geo-Information, 13(2), 42. https://www.mdpi.com/2220-9964/13/2/42
Segedinac, M., Savić, G., Zeljković, I., Slivka, J., & Konjović, Z. (2024). Assessing code readability in Python programming courses using eye‐tracking. Computer Applications in Engineering Education, 32(1), e22685. https://doi.org/10.1002/cae.22685
Shepherd, S. S., & Kidd, C. (2024). Visual engagement is not synonymous with learning in young children. Proceedings of the Annual Meeting of the Cognitive Science Society, 46. https://escholarship.org/uc/item/0wz74769
Silva, F., Ribeiro, S., Silva, S., Garrido, M. I., & Soares, S. C. (2024). Exploring the use of visual predictions in social scenarios while under anticipatory threat. Scientific Reports, 14(1), 10913. https://www.nature.com/articles/s41598-024-61682-3
Sims, J. P., Haynes, A., & Lanius, C. (2024). Exploring the utility of eye tracking for sociological research on race. The British Journal of Sociology, 75(1), 65–72. https://doi.org/10.1111/1468-4446.13054
Stimson, K. H. (2024). Zoom dysmorphia: An eye-tracking study of self-view and attention during video conferences. https://digitalcommons.dartmouth.edu/cognitive-science_senior_theses/5/
Taieb-Maimon, M., Romanovski-Chernik, A., Last, M., Litvak, M., & Elhadad, M. (2024). Mining Eye-Tracking Data for Text Summarization. International Journal of Human–Computer Interaction, 40(17), 4887–4905. https://doi.org/10.1080/10447318.2023.2227827
Teixeira, A. R., Brito-Costa, S., & De Almeida, H. (2024). Optimizing Reading Experience: An Eye Tracking Comparative Analysis of Single-Column, Two-Column, and Three-Column Formats. In H. Mori & Y. Asahi (Eds.), Human Interface and the Management of Information (Vol. 14689, pp. 51–59). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-60107-1_5
Tripp, D. D. (2024). Differentiating Pilot Expertise Through Gaze Behavior Analysis in Flight Simulations [Master’s Thesis, California State University, Long Beach]. https://search.proquest.com/openview/6e0daa4f0b8cb9579e7fc8299a3dde2f/1?pq-origsite=gscholar&cbl=18750&diss=y
Tural, A., & Tural, E. (2024). Exploring sense of spaciousness in interior settings: Screen-based assessments with eye tracking, and virtual reality evaluations. Frontiers in Psychology, 15, 1473520. https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2024.1473520/full
Valenko, S. I., Možanić, G., Zorko, A., & Morić, M. (2024). The impact of the model’s gaze direction on the user experience. https://www.grid.uns.ac.rs/symposium/download/2024/21.pdf
Vasta, N., Jajo, N., Graf, F., Li, Y., Zhang, L., & Biondi, F. N. (2024). Evaluating a camera-based approach to assess cognitive load during manufacturing computer tasks. https://www.researchsquare.com/article/rs-4979457/latest
Wang, X., Yu, F., Liu, J., Garcia, D., Stahl, I., Vondenberger, A., & Song, L. (2024). Mapping the Landscape of Eye-Tracking Research: a Systematic Bibliometric and Thematic Analysis of Studies in ACM CHI and CHIIR Proceedings. https://easychair.org/publications/preprint/MhfM/download
Wijaszka, M., & Dzieńkowski, M. (2024). A usability analysis of e-commerce systems: Prestashop, Magento and Joomla. Journal of Computer Sciences Institute, 32, 239–245. https://ph.pollub.pl/index.php/jcsi/article/view/6313
Wu, Y., & Pruden, S. M. (2024). Karinna A. Rodriguez, Nick Mattox, Carlos Desme, LaTreese V. Hall. Advances in Child Development and Behavior, 67, 237. https://books.google.com/books?hl=en&lr=&id=E7QVEQAAQBAJ&oi=fnd&pg=PA237&dq=gazepoint+gp3&ots=_4Jq4iaF25&sig=i6U3rcYv2CuQchvh5RCOVxbuAcA
Wu, D., Huang, X., Chen, L., Hou, P., Liu, L., & Yang, G. (2024). Integrating artificial intelligence in strabismus management: current research landscape and future directions. Experimental Biology and Medicine, 249, 10320. https://pmc.ncbi.nlm.nih.gov/articles/PMC11625544/
Yu, Y.-C., Shyntassov, H., Kaushik, P., & Gabel, L. (2024). Saccadic Detection in Virtual Gaming for Dyslexia Classification. 2024 IEEE International Symposium on Biomedical Imaging (ISBI), 1–4. https://ieeexplore.ieee.org/abstract/document/10635745/
Špajdel, M. (2024). Analysis of Eye Movements Reveals Longer Visual Saccades and Abnormal Preference for Social Images in Autism Spectrum Disorder. https://rediviva.sav.sk/66i1/1.pdf
Šutinienė, L., Česnulevičius, A., & Bautrėnas, A. (2024). Investigating the Readability of School Geographic Map Symbols Using Eye-Tracking Technology. Pedagogika, 153(1), 32–49. https://ejournals.vdu.lt/index.php/Pedagogika/article/view/5254
2023
Calle, A., Ortega, P., Argudo-Vásconez, A., Cobos, M., & Alvarado, O. (2023). Exploring the Role of Visual Attention in Aggressive Behavior: Evidence from Eye-Tracking Measurements. https://doi.org/10.54941/ahfe1003024
Cui, Y., Liu, X., & Cheng, Y. (2023). A Comparative Study on the Effort of Human Translation and Post-Editing in Relation to Text Types: An Eye-Tracking and Key-Logging Experiment. SAGE Open, 13(1), 21582440231155849. https://doi.org/10.1177/21582440231155849
Messaraa, C., Mangan, M., & Crowe, M. (2023). Is There an Additive Effect of Makeup Upon Gaze and Perception? A Pilot Study. Journal of Cosmetic Science. https://openurl.ebsco.com/contentitem/gcd:171871744?sid=ebsco:plink:crawler&id=ebsco:gcd:171871744
Yaneva, V., Ha, L. A., Eraslan, S., Yesilada, Y., & Mitkov, R. (2023). Chapter 3 – Reading differences in eye-tracking data as a marker of high-functioning autism in adults and comparison to results from web-related tasks. In A. S. El-Baz & J. S. Suri (Eds.), Neural Engineering Techniques for Autism Spectrum Disorder (pp. 63–79). Academic Press. https://doi.org/10.1016/B978-0-12-824421-0.00011-4
Han, E. (2023). Comparing Perceptual Effects of Perspectival Convergence in Architectural Images. Art & Perception, 11(1), 54–87. https://doi.org/10.1163/22134913-bja10044
Katona, J. (2023). An Eye Movement Study in Unconventional Usage of Different Software Tools. Sensors, 23(8), 3823. https://doi.org/10.3390/s23083823
Viautour, J., Naegeli, L., Braun, J., Bergauer, L., Roche, T. R., Tscholl, D. W., & Akbas, S. (2023). The Visual Patient Avatar ICU Facilitates Information Transfer of Written Information by Visualization: A Multicenter Comparative Eye-Tracking Study. Diagnostics, 13(22), 3432. https://doi.org/10.3390/diagnostics13223432
Santos, S. M. P., Fernandes, N. L., & Pandeirada, J. N. S. (2023). Same but different: The influence of context framing on subjective disgust, eye movements and pupillary responses. Consciousness and Cognition, 108, 103462. https://doi.org/10.1016/j.concog.2022.103462
Sung, B., Butcher, L., & Easton, J. (2023). Elevating Food Perceptions Through Luxury Verbal Cues: An Eye-Tracking and Electrodermal Activity Experiment. Australasian Marketing Journal, 31(1), 25–35. https://doi.org/10.1177/18393349211028676
Mani, R., Asper, L., Arunachalam, V., & Khuu, S. K. (2023). The impact of traumatic brain injury on inhibitory control processes assessed using a delayed antisaccade task. Neuroscience Letters, 797, 137081. https://doi.org/10.1016/j.neulet.2023.137081
Cui, Y., Liu, X., & Cheng, Y. (2023). Attention-consuming or attention-saving: an eye tracking study on punctuation in Chinese subtitling of English trailers. Multilingua. https://doi.org/10.1515/multi-2022-0138
Mahanama, B., Sunkara, M., Ashok, V., & Jayarathna, S. (2023). DisETrac: Distributed Eye-Tracking for Online Collaboration. Proceedings of the 2023 Conference on Human Information Interaction and Retrieval, 427–431. https://doi.org/10.1145/3576840.3578292
Shamy, M., & Feitelson, D. G. (2023). Identifying Lines and Interpreting Vertical Jumps in Eye Tracking Studies of Reading Text and Code. ACM Transactions on Applied Perception, 20(2), 6:1-6:20. https://doi.org/10.1145/3579357
Jiang, Y., Leiva, L. A., Rezazadegan Tavakoli, H., R. B. Houssel, P., Kylmälä, J., & Oulasvirta, A. (2023). UEyes: Understanding Visual Saliency across User Interface Types. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–21. https://doi.org/10.1145/3544548.3581096
Lobodenko, L., Cheredniakova, A., Shesterkina, L., & Kharitonova, O. (2023). Eye-Tracking Technologies in the Analysis of Environmental Advertising and Journalistic Texts Perception by Youth. 2023 Communication Strategies in Digital Society Seminar (ComSDS), 78–85. https://doi.org/10.1109/ComSDS58064.2023.10130433
Hahn, A., Riedelsheimer, J., Royer, Z., Frederick, J., Kee, R., Crimmins, R., Huber, B., Harris, D., & Jantzen, K. (2023). Effects of Cleft Lip on Visual Scanning and Neural Processing of Infant Faces [Preprint]. Preprints. https://doi.org/10.22541/au.168455102.24287447/v1
Moreira, C., Alvito, D. M., Sousa, S. C., Nobre, I. M. G. B., Ouyang, C., Kopper, R., Duchowski, A., & Jorge, J. (2023). Comparing Visual Search Patterns in Chest X-Ray Diagnostics. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–6. https://doi.org/10.1145/3588015.3588403
Warchol-Jakubowska, A., Krejtz, I., & Krejtz, K. (2023). An irrelevant look of novice tram driver: Visual attention distribution of novice and expert tram drivers. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–3. https://doi.org/10.1145/3588015.3589514
Foroughi, C. K., Devlin, S., Pak, R., Brown, N. L., Sibley, C., & Coyne, J. T. (2023). Near-Perfect Automation: Investigating Performance, Trust, and Visual Attention Allocation. Human Factors, 65(4), 546–561. https://doi.org/10.1177/00187208211032889
Pillai, P., Balasingam, B., & Biondi, F. N. (2023). Model-Based Estimation of Mental Workload in Drivers Using Pupil Size Measurements. 2023 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 815–821. https://doi.org/10.1109/AIM46323.2023.10196230
Han, E. (2023). Comparing the Perception of In-Person and Digital Monitor Viewing of Paintings. Empirical Studies of the Arts, 41(2), 465–496. https://doi.org/10.1177/02762374231158520
Eye Tracking as a Research and Training Tool for Ensuring Quality Education. (2023). https://doi.org/10.1007/978-3-031-30498-9_28
Hong, W. C. H., Ngan, H. F. B., Yu, J., & Arbouw, P. (2023). Examining cultural differences in Airbnb naming convention and user reception: an eye-tracking study. Journal of Travel & Tourism Marketing, 40(6), 475–489. https://doi.org/10.1080/10548408.2023.2263764
Koutsogiorgi, C. C., & Michaelides, M. P. (2023). Response Tendencies to Positively and Negatively Worded Items of the Rosenberg Self-Esteem Scale With Eye-Tracking Methodology. European Journal of Psychological Assessment, 39(4), 307–315. https://doi.org/10.1027/1015-5759/a000772
Brancucci, A., Ferracci, S., D’Anselmo, A., & Manippa, V. (2023). Hemispheric functional asymmetries and sex effects in visual bistable perception. Consciousness and Cognition, 113, 103551. https://doi.org/10.1016/j.concog.2023.103551
Huang, J., Raja, J., Cantor, C., Marx, W., Galgano, S., Zarzour, J., Caridi, T., Gunn, A., Morgan, D., & Smith, A. (2023). Eye Motion Tracking for Medical Image Interpretation Training. Current Problems in Diagnostic Radiology. https://doi.org/10.1067/j.cpradiol.2023.08.013
Rocca, F., Dave, M., Duvivier, V., Van Daele, A., Demeuse, M., Derobertmasure, A., Mancas, M., & Gosselin, B. (2023). Designing an Assistance Tool for Analyzing and Modeling Trainer Activity in Professional Training Through Simulation. Proceedings of the 2023 ACM International Conference on Interactive Media Experiences, 180–187. https://doi.org/10.1145/3573381.3596475
Jaśkowiec, M., & Kowalska-Chrzanowska, M. (2023). The Use of Games in Citizen Science Based on Findings from the EyeWire User Study. Games and Culture, 15554120231196260. https://doi.org/10.1177/15554120231196260
Furukado, R., & Hagiwara, G. (2023). Gaze and Electroencephalography (EEG) Parameters in Esports: Examinations Considering Genres and Skill Levels. IEICE Proceedings Series, 77(SS-2). https://www.ieice.org/publications/proceedings/summary.php?iconf=SISA&session_num=SS&number=SS-2&year=2023
Abeysinghe, Y., Mahanama, B., Jayawardena, G., Sunkara, M., Ashok, V., & Jayarathna, S. (2023). Gaze Analytics Dashboard for Distributed Eye Tracking. 2023 IEEE 24th International Conference on Information Reuse and Integration for Data Science (IRI), 140–145. https://doi.org/10.1109/IRI58017.2023.00031
Lu, H.-Y., Lin, Y.-C., Chen, C.-H., Wang, C.-C., Han, I.-W., & Liang, W.-L. (2023). Detecting Children with Autism Spectrum Disorder Based on Eye-tracking and Machine Learning. 2023 IEEE 6th International Conference on Knowledge Innovation and Invention (ICKII), 372–375. https://doi.org/10.1109/ICKII58656.2023.10332630
Biondi, F. N., Graf, F., Pillai, P., & Balasingam, B. (2023). On Validating a Generic Video-Based Blink Detection System for Cognitive Load Detection. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 67(1), 1425–1430. https://doi.org/10.1177/21695067231192924
Hatzithomas, L., Theodorakioglou, F., Margariti, K., & Boutsouki, C. (2023). Cross-media advertising strategies and brand attitude: the role of cognitive load. International Journal of Advertising, 1–33. https://doi.org/10.1080/02650487.2023.2249342
Liu, Y., Ghaiumy Anaraky, R., Aly, H., & Byrne, K. (2023). The Effect of Privacy Fatigue on Privacy Decision-Making Behavior. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 67(1), 2428–2433. https://doi.org/10.1177/21695067231193670
Prahm, C., Konieczny, J., Bressler, M., Heinzel, J., Daigeler, A., Kolbenschlag, J., & Lauer, H. (2023). Influence of colored face masks on judgments of facial attractiveness and gaze patterns. Acta Psychologica, 239, 103994. https://doi.org/10.1016/j.actpsy.2023.103994
Yao, J., Su, S., & Liu, S. (2023). The effect of key audit matters reviewing on loan approval decisions? Finance Research Letters, 104467. https://doi.org/10.1016/j.frl.2023.104467
Aslan, M., Baykara, M., & Alakuş, T. B. (2023). LSTMNCP: lie detection from EEG signals with novel hybrid deep learning method. Multimedia Tools and Applications, 83(11), 31655–31671. https://doi.org/10.1007/s11042-023-16847-z
Sims, J. P., Haynes, A., & Lanius, C. (2023). Exploring the utility of eye tracking for sociological research on race. The British Journal of Sociology. https://doi.org/10.1111/1468-4446.13054
Kamal, M., Möbius, M., Bartella, A. K., & Lethaus, B. (2023). Perception of aesthetic features after surgical treatment of craniofacial malformations by observers of the same age: An eye-tracking study. Journal of Cranio-Maxillofacial Surgery. https://doi.org/10.1016/j.jcms.2023.09.009
Biondi, F. N., Graf, F., Pillai, P., & Balasingam, B. (2023). On validating a generic camera-based blink detection system for cognitive load assessment. Cognitive Computation and Systems. https://doi.org/10.1049/ccs2.12088
Contemori, G., Oletto, C. M., Battaglini, L., Motterle, E., & Bertamini, M. (2023). Foveal feedback in perceptual processing: Contamination of neural representations and task difficulty effects. PLOS ONE, 18(10), e0291275. https://doi.org/10.1371/journal.pone.0291275
Hwang, E., & Lee, J. (2023). Attention-based automatic editing of virtual lectures for reduced production labor and effective learning experience. International Journal of Human-Computer Studies, 103161. https://doi.org/10.1016/j.ijhcs.2023.103161
Novia, R., Titis, W., & Mirwan, U. (2023). An eye tracking study of customers’ visual attention to the fast-food chain’s page on Instagram. AIP Conference Proceedings, 2510(1), 030042. https://doi.org/10.1063/5.0129351
Segedinac, M., Savić, G., Zeljković, I., Slivka, J., & Konjović, Z. (2023). Assessing code readability in Python programming courses using eye-tracking. Computer Applications in Engineering Education. https://doi.org/10.1002/cae.22685
Cheng, G., Zou, D., Xie, H., & Wang, F. L. (2023). Exploring differences in self-regulated learning strategy use between high- and low-performing students in introductory programming: An analysis of eye-tracking and retrospective think-aloud data from program comprehension. Computers & Education, 104948. https://doi.org/10.1016/j.compedu.2023.104948
S Kumar, D., Sahadev, S., & Purani, K. (2023). Visual Aesthetic Quotient: Establishing the Effects of Computational Aesthetic Measures for Servicescape Design. Journal of Service Research, 10946705231205000. https://doi.org/10.1177/10946705231205000
Inoue, M., Nishiyama, M., & Iwai, Y. (2023). Age group identification using gaze-guided feature extraction. 2023 IEEE 12th Global Conference on Consumer Electronics (GCCE), 708–711. https://doi.org/10.1109/GCCE59613.2023.10315305
Cui, Y., Liu, X., & Cheng, Y. (2023). Reader perception of and attitude to English-Chinese advertising posters: an eye tracking study. SN Social Sciences, 3(11), 192. https://doi.org/10.1007/s43545-023-00782-9
Cybulski, P., Medyńska-Gulij, B., & Horbiński, T. (2023). Users’ Visual Experience During Temporal Navigation in Forecast Weather Maps on Mobile Devices. Journal of Geovisualization and Spatial Analysis, 7(2), 32. https://doi.org/10.1007/s41651-023-00160-2
Lee, S., Byun, G., & Ha, M. (2023). Exploring the association between environmental factors and fear of crime in residential streets: an eye-tracking and questionnaire study. Journal of Asian Architecture and Building Engineering, 1–18. https://doi.org/10.1080/13467581.2023.2278449
Kusumo, A. H. (2023). Has Website Design using Website Builder Fulfilled Usability Aspects? A Study Case of Three Website Builders. 545–557. https://doi.org/10.2991/978-94-6463-288-0_45
Dang, A., & Nichols, B. S. (2023). The effects of size referents in user-generated photos on online review helpfulness. Journal of Consumer Behaviour. https://doi.org/10.1002/cb.2281
Chvátal, R., Slezáková, J., & Popelka, S. (2023). Analysis of problem-solving strategies for the development of geometric imagination using eye-tracking. Education and Information Technologies. https://doi.org/10.1007/s10639-023-12395-z
Khairunnisa, G., & Sari, H. (2023). Eye Tracking-based Analysis of Customer Interest on The Effectiveness of Eco-friendly Product Advertising Content. Jurnal Optimasi Sistem Industri, 22, 153–164. https://doi.org/10.25077/josi.v22.n2.p153-164.2023
Cieśla, M., & Dzieńkowski, M. (2023). An Analysis Of The Implementation Of Accessibility Tools On Websites. Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska, 13(4), 51–56. https://doi.org/10.35784/iapgos.4459
Chang, Y.-C., Gandi, N., Shin, K., Mun, Y.-J., Driggs-Campbell, K., & Kim, J. (2023). Specifying Target Objects in Robot Teleoperation Using Speech and Natural Eye Gaze. 2023 IEEE-RAS 22nd International Conference on Humanoid Robots (Humanoids), 1–7. https://doi.org/10.1109/Humanoids57100.2023.10375186
Kowalewski, S. J., & Williamson, B. (2023). Fostering Advocacy, Developing Empathetic UX Bricoleurs: Ongoing Programmatic Assessment and Responsive Curriculum Design. IEEE Transactions on Professional Communication, 66(4), 382–396. https://doi.org/10.1109/TPC.2023.3320530
Mok, S., Park, S., & Whang, M. (2023). Examining the Impact of Digital Human Gaze Expressions on Engagement Induction. Biomimetics, 8(8), 610. https://doi.org/10.3390/biomimetics8080610
Sun, L., Zhang, M., Qiu, Y., & Zhang, C. (2023). Effects of Sleep Deprivation and Hazard Types on the Visual Search Patterns and Hazard Response Times of Taxi Drivers. Behavioral Sciences, 13(12), 1005. https://doi.org/10.3390/bs13121005
Zhang, C., Tian, C., Han, T., Li, H., Feng, Y., Chen, Y., Proctor, R. W., & Zhang, J. (2023). Evaluation of Infrastructure-based Warning System on Driving Behaviors – A Roundabout Study.
Alcocer, J. P. S., Cossio-Chavalier, A., Rojas-Stambuk, T., & Merino, L. (2023). An Eye-Tracking Study on the Use of Split/Unified Code Change Views for Bug Detection. IEEE Access, 11, 136195–136205. https://doi.org/10.1109/ACCESS.2023.3336859
Arnaud, C. (2023). A Design-Based Approach to Studying Algorithmic Practices: A Case Study on the Explicit Controllability of a Recommender System. https://dial.uclouvain.be/pr/boreal/object/boreal:278314
Collins, A., Pillai, P., Balasingam, B., & Jaekel, A. (2023). Machine Learning Technique for Data Fusion and Cognitive Load Classification Using an Eye Tracker. In K. Daimi & A. Al Sadoon (Eds.), Proceedings of the 2023 International Conference on Advances in Computing Research (ACR’23) (pp. 86–95). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-33743-7_7
Eyes are the Windows to AI Reliance: Toward Real-Time Human-AI Reliance Assessment. (2023).
Fu, B., Austin, A., & Garcia, M. (2023). Visualizing Mappings Between Pairwise Ontologies – An Empirical Study of Matrix and Linked Indented List in Their User Support During Class Mapping Creation and Evaluation. In T. R. Payne, V. Presutti, G. Qi, M. Poveda-Villalón, G. Stoilos, L. Hollink, Z. Kaoudi, G. Cheng, & J. Li (Eds.), The Semantic Web – ISWC 2023 (pp. 579–598). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-47240-4_31
Inoue, M., Iwasaki, F., Nishiyama, M., & Iwai, Y. (2023). Heatmap Overlay Using Neutral Body Model for Visualizing the Measured Gaze Distributions of Observers. In H. Lu, M. Blumenstein, S.-B. Cho, C.-L. Liu, Y. Yagi, & T. Kamiya (Eds.), Pattern Recognition (pp. 102–114). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-47637-2_8
Kim, S. K., Liersch, J., & Kirchner, E. A. (2023). Classification of Error-Related Potentials Evoked During Observation of Human Motion Sequences. In D. D. Schmorrow & C. M. Fidopiastis (Eds.), Augmented Cognition (pp. 142–152). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-35017-7_10
Li, Z., Li, Z., & Li, F. (2023). Visual Attention Analytics for Individual Perception Differences and Task Load-Induced Inattentional Blindness. In P.-L. P. Rau (Ed.), Cross-Cultural Design (pp. 71–83). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-35939-2_6
Mora-Salinas, R. J., Perez-Rojas, D., & De La Trinidad-Rendon, J. S. (2023). Real-Time Sensory Adaptive Learning for Engineering Students. In M. E. Auer, W. Pachatz, & T. Rüütmann (Eds.), Learning in the Age of Digital and Green Transition (pp. 820–831). Springer International Publishing. https://doi.org/10.1007/978-3-031-26876-2_78
Pacheco-González, D., Argudo-Vasconez, A., Ortega-Chasi, P., Cobos-Cali, M., & Alvarado-Cando, O. (2023). Fixation Analysis of Affective Picture Processing in Aggressive Adolescent. Physical Ergonomics and Human Factors.
Rodrigo, M. M. T., & Tablatin, C. L. S. (2023). How Do Programming Students Read and Act upon Compiler Error Messages? In D. D. Schmorrow & C. M. Fidopiastis (Eds.), Augmented Cognition (pp. 153–168). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-35017-7_11
Shekh Khalil, N., Eraslan, S., & Yesilada, Y. (2023). Predicting Trending Elements on Web Pages Using Machine Learning. International Journal of Human–Computer Interaction, 1–16. https://doi.org/10.1080/10447318.2023.2261677
Taha, B., Seha, S. N. A., Hwang, D. Y., & Hatzinakos, D. (2023). EyeDrive: A Deep Learning Model for Continuous Driver Authentication. IEEE Journal of Selected Topics in Signal Processing, 1–11. https://doi.org/10.1109/JSTSP.2023.3235302
Taieb-Maimon, M., Romanovski-Chernik, A., Last, M., Litvak, M., & Elhadad, M. (2023). Mining Eye-Tracking Data for Text Summarization. International Journal of Human–Computer Interaction, 1–19. https://doi.org/10.1080/10447318.2023.2227827
Tang, P., Yuen, I., Demuth, K., & Xu Rattanasone, N. (2023). The acquisition of contrastive focus during online sentence-comprehension by children learning Mandarin Chinese. Developmental Psychology, 59(5), 845–861. https://doi.org/10.1037/dev0001498
Timme, S., Brand, R., & Raboldt, M. (2023). Exercise or not? An empirical illustration of the role of behavioral alternatives in exercise motivation and resulting theoretical considerations. Frontiers in Psychology, 14. https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1049356
Wiediartini, Ciptomulyono, U., & Dewi, R. S. (2023). Evaluation of physiological responses to mental workload in n-back and arithmetic tasks. Ergonomics, 1–13. https://doi.org/10.1080/00140139.2023.2284677
Xu, J., Guo, K., Zhang, X., & Sun, P. Z. H. (2023). Left Gaze Bias between LHT and RHT: A Recommendation Strategy to Mitigate Human Errors in Left- and Right-Hand Driving. IEEE Transactions on Intelligent Vehicles, 1–12. https://doi.org/10.1109/TIV.2023.3298481
2022
Calvo, L., Christel, I., Terrado, M., Cucchietti, F., & Pérez-Montoro, M. (2022). Users’ Cognitive Load: A Key Aspect to Successfully Communicate Visual Climate Information. Bulletin of the American Meteorological Society, 103(1), E1–E16. https://doi.org/10.1175/BAMS-D-20-0166.1
López, P., Camba, J. D., & Contero, M. (2022). An Analysis of Influencer Marketing Effectiveness in Luxury Brands using Eye Tracking Technology. https://doi.org/10.54941/ahfe1002051
White, A., & O’Hare, D. (2022). In plane sight: Inattentional blindness affects visual detection of external targets in simulated flight. Applied Ergonomics, 98, 103578. https://doi.org/10.1016/j.apergo.2021.103578
Coyne, J. T., Dollinger, S., Brown, N., Foroughi, C., Sibley, C., & Phillips, H. (2022). Limitations of current spatial ability testing for military aviators. Military Psychology, 34(1), 33–46. https://doi.org/10.1080/08995605.2021.1965786
Koutsogiorgi, C. C., & Michaelides, M. P. (2022). Response tendencies due to item wording using eye-tracking methodology accounting for individual differences and item characteristics. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01719-x
Katona, J. (2022). Measuring Cognition Load Using Eye-Tracking Parameters Based on Algorithm Description Tools. Sensors, 22(3), 912. https://doi.org/10.3390/s22030912
Porta, M., Dondi, P., Zangrandi, N., & Lombardi, L. (2022). Gaze-Based Biometrics From Free Observation of Moving Elements. IEEE Transactions on Biometrics, Behavior, and Identity Science, 4(1), 85–96. https://doi.org/10.1109/TBIOM.2021.3130798
Salaken, S. M., Hettiarachchi, I., Munia, A. A., Hasan, M. M., Khosravi, A., Mohamed, S., & Rahman, A. (2022). Predicting Cognitive Load of an Individual With Knowledge Gained From Others: Improvements in Performance Using Crowdsourcing. IEEE Systems, Man, and Cybernetics Magazine, 8(1), 4–15. https://doi.org/10.1109/MSMC.2021.3103498
Ivančić Valenko, S., Keček, D., Čačić, M., & Slanec, K. (2022). The Impact of a Web Banner Position on the Webpage User Experience. Tehnički Glasnik, 16(1), 93–97. https://doi.org/10.31803/tg-20211119110843
Zhu, H., Salcudean, S., & Rohling, R. (2022). Gaze-Guided Class Activation Mapping: Leveraging Human Attention for Network Attention in Chest X-rays Classification. arXiv:2202.07107 [Cs, Eess]. http://arxiv.org/abs/2202.07107
Dang, A., & Nichols, B. S. (2022). Consumer response to positive nutrients on the facts up front (FUF) label: A comparison between healthy and unhealthy foods and the role of nutrition motivation. Journal of Marketing Theory and Practice, 1–20. https://doi.org/10.1080/10696679.2021.2020662
Han, E. (2022). Representing Hierarchies of Visual Regard in Eye-Tracking Analysis. Leonardo, 55(1), 51–56. https://doi.org/10.1162/leon_a_02096
Srinivasan, R., Turpin, A., & McKendrick, A. M. (2022). Developing a Screening Tool for Areas of Abnormal Central Vision Using Visual Stimuli With Natural Scene Statistics. Translational Vision Science & Technology, 11(2), 34. https://doi.org/10.1167/tvst.11.2.34
D’Anselmo, A., Pisani, A., & Brancucci, A. (2022). A tentative I/O curve with consciousness: Effects of multiple simultaneous ambiguous figures presentation on perceptual reversals and time estimation. Consciousness and Cognition, 99, 103300. https://doi.org/10.1016/j.concog.2022.103300
Singh, G., Maurya, A., & Goel, R. (2022). Integrating New Technologies in International Business: Opportunities and Challenges. CRC Press.
Hidalgo, C., Mohamed, I., Zielinski, C., & Schön, D. (2022). The effect of speech degradation on the ability to track and predict turn structure in conversation. Cortex. https://doi.org/10.1016/j.cortex.2022.01.020
Pietras, K., & Ganczarek, J. (2022). Aesthetic Reactions to Violations in Contemporary Art: The Role of Expertise and Individual Differences. Creativity Research Journal, 1–15. https://doi.org/10.1080/10400419.2022.2046909
Katona, J. (2022). Examination of the Advantage of the Clean Code Technique by Analyzing Eye Movement Parameters.
Cuve, H. C., Stojanov, J., Roberts-Gaal, X., Catmur, C., & Bird, G. (2022). Validation of Gazepoint low-cost eye-tracking and psychophysiology bundle. Behavior Research Methods, 54(2), 1027–1049. https://doi.org/10.3758/s13428-021-01654-x
Maniglia, M., Contemori, G., Marini, E., & Battaglini, L. (2022). Contrast adaptation of flankers reduces collinear facilitation and inhibition. Vision Research, 193, 107979. https://doi.org/10.1016/j.visres.2021.107979
Stojmenović, M., Spero, E., Stojmenović, M., & Biddle, R. (2022). What is Beautiful is Secure. ACM Transactions on Privacy and Security. https://doi.org/10.1145/3533047
Kävrestad, J., Hagberg, A., Nohlberg, M., Rambusch, J., Roos, R., & Furnell, S. (2022). Evaluation of Contextual and Game-Based Training for Phishing Detection. Future Internet, 14(4), 104. https://doi.org/10.3390/fi14040104
Veerabhadrappa, R., Hettiarachchi, I. T., & Bhatti, A. (2022). Gaze Convergence Based Collaborative Performance Prediction in a 3-Member Joint Activity Setting. 2022 IEEE International Systems Conference (SysCon), 1–7. https://doi.org/10.1109/SysCon53536.2022.9773865
Veerabhadrappa, R., Hettiarachchi, I. T., & Bhatti, A. (2022). Using Eye-tracking To Investigate The Effect of Gaze Co-occurrence and Distribution on Collaborative Performance. 2022 IEEE International Systems Conference (SysCon), 1–8. https://doi.org/10.1109/SysCon53536.2022.9773860
Cybulski, P. (2022). An Empirical Study on the Effects of Temporal Trends in Spatial Patterns on Animated Choropleth Maps. ISPRS International Journal of Geo-Information, 11(5), 273. https://doi.org/10.3390/ijgi11050273
Spitzer, L., & Mueller, S. (2022). Using a test battery to compare three remote, video-based eye-trackers. 2022 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3517031.3529644
Robison, M. K., Coyne, J. T., Sibley, C., Brown, N. L., Neilson, B., & Foroughi, C. (2022). An examination of relations between baseline pupil measures and cognitive abilities. Psychophysiology, e14124. https://doi.org/10.1111/psyp.14124
Destyanto, T. Y. R., & Lin, R. F. (2022). Evaluating the Effectiveness of Complexity Features of Eye Movement on Computer Activities Detection. Healthcare, 10(6), 1016. https://doi.org/10.3390/healthcare10061016
Kollias, K.-F., Syriopoulou-Delli, C. K., & Sarigiannidis, P. (2022). Autism detection in High-Functioning Adults with the application of Eye-Tracking technology and Machine Learning.
Veerabhadrappa, R., Hettiarachchi, I. T., Hanoun, S., Jia, D., Hosking, S. G., & Bhatti, A. (2022). Evaluating Operator Training Performance Using Recurrence Quantification Analysis of Autocorrelation Transformed Eye Gaze Data. Human Factors, 00187208221116953. https://doi.org/10.1177/00187208221116953
Gallant, S. N., Kennedy, B. L., Bachman, S. L., Huang, R., Cho, C., Lee, T.-H., & Mather, M. (2022). Behavioral and fMRI evidence that arousal enhances bottom-up selectivity in young but not older adults. Neurobiology of Aging. https://doi.org/10.1016/j.neurobiolaging.2022.08.006
Gawade, V., Bifulco, C., & (Grace) Guo, W. (2022). Lessons Learned to Effectively Teach and Evaluate Undergraduate Engineers in Work Design and Ergonomics Laboratory from a World Before, During, and After COVID-19. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 66(1), 756–760. https://doi.org/10.1177/1071181322661505
Lewandowska, A., Dziśko, M., & Jankowski, J. (2022). Investigation the role of contrast on habituation and sensitisation effects in peripheral areas of graphical user interfaces. Scientific Reports, 12(1), 15281. https://doi.org/10.1038/s41598-022-16284-2
Zyrianov, V., Peterson, C. S., Guarnera, D. T., Behler, J., Weston, P., Sharif, B., & Maletic, J. I. (2022). Deja Vu: semantics-aware recording and replay of high-speed eye tracking and interaction data to support cognitive studies of software engineering tasks—methodology and analyses. Empirical Software Engineering, 27(7), 168. https://doi.org/10.1007/s10664-022-10209-3
Antoine, M., Abdessalem, H. B., & Frasson, C. (2022). Cognitive Workload Assessment of Aircraft Pilots. Journal of Behavioral and Brain Science, 12(10), 474–484. https://doi.org/10.4236/jbbs.2022.1210027
Pillai, P., Balasingam, B., & Biondi, F. (2022). Using signal-to-noise ratio to explore the cognitive cost of the detection response task. https://doi.org/10.1177/1071181322661481
Steffens, J., & Himmelein, H. (2022). Induced cognitive load influences unpleasantness judgments of modulated noise.
Souza, A., & Freitas, D. (2022). Towards the Improvement of the Cognitive Process of the Synthesized Speech of Mathematical Expression in MathML: An Eye-Tracking. 2022 International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET), 1–8. https://doi.org/10.1109/IMET54801.2022.9929541
Zhou, H., Doggett, E. V., Qi, K., Tang, B., Wolak, A., Nahavandi, S., & Nguyen, D. T. (2022). Image Saliency Prediction in Novel Production Scenarios. 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 3367–3372. https://doi.org/10.1109/SMC53654.2022.9945490
Menzel, T., Teubner, T., Adam, M. T. P., & Toreini, P. (2022). Home is where your Gaze is – Evaluating effects of embedding regional cues in user interfaces. Computers in Human Behavior, 136, 107369. https://doi.org/10.1016/j.chb.2022.107369
Cao, S., & Huang, C.-M. (2022). Understanding User Reliance on AI in Assisted Decision-Making. Proc. ACM Hum.-Comput. Interact., 6(CSCW2), 471:1-471:23. https://doi.org/10.1145/3555572
Contemori, G., Oletto, C. M., Cessa, R., Marini, E., Ronconi, L., Battaglini, L., & Bertamini, M. (2022). Investigating the role of the foveal cortex in peripheral object discrimination. Scientific Reports, 12(1), 19952. https://doi.org/10.1038/s41598-022-23720-w
Salminen, J., Jung, S., Nielsen, L., Şengün, S., & Jansen, B. J. (2022). How does varying the number of personas affect user perceptions and behavior? Challenging the ‘small personas’ hypothesis! International Journal of Human-Computer Studies, 168, 102915. https://doi.org/10.1016/j.ijhcs.2022.102915
Sethi, T., & Ziat, M. (2022). Dark mode vogue: Do light-on-dark displays have measurable benefits to users? Ergonomics, 1–15. https://doi.org/10.1080/00140139.2022.2160879
Chmal, J., Ptasińska, M., & Skublewska-Paszkowska, M. (2022). Analysis of the ergonomics of e-commerce websites. Journal of Computer Sciences Institute, 25, 330–336. https://doi.org/10.35784/jcsi.3016
Cui, Z., Tang, Y.-Y., & Kim, M.-K. (2022). The Effects of Different Kinds of Smooth Pursuit Exercises on Center of Pressure and Muscle Activities during One Leg Standing. Healthcare, 10(12), 2498. https://doi.org/10.3390/healthcare10122498
Beşer, A., Sengewald, J., & Lackes, R. (2022). Drawing Attention on (Visually) Competitive Online Shopping Platforms – An Eye-Tracking Study Analysing the Effects of Visual Cues on the Amazon Marketplace. In Ē. Nazaruka, K. Sandkuhl, & U. Seigerroth (Eds.), Perspectives in Business Informatics Research (pp. 159–174). Springer International Publishing. https://doi.org/10.1007/978-3-031-16947-2_11
Edughele, H. O., Zhang, Y., Muhammad-Sukki, F., Vien, Q.-T., Morris-Cafiero, H., & Opoku Agyeman, M. (2022). Eye-Tracking Assistive Technologies for Individuals With Amyotrophic Lateral Sclerosis. IEEE Access, 10, 41952–41972. https://doi.org/10.1109/ACCESS.2022.3164075
Gao, H., Fan, W., Qiu, L., Yang, X., Li, Z., Zuo, X., Li, Y., Meng, M. Q.-H., & Ren, H. (2022). SAVAnet: Surgical Action-Driven Visual Attention Network for Autonomous Endoscope Control. IEEE Transactions on Automation Science and Engineering, 1–13. https://doi.org/10.1109/TASE.2022.3203631
Kim, M., Jeong, H., Kantharaju, P., Yoo, D., Jacobson, M., Shin, D., Han, C., & Patton, J. L. (2022). Visual guidance can help with the use of a robotic exoskeleton during human walking. Scientific Reports, 12(1), 3881. https://doi.org/10.1038/s41598-022-07736-w
Li, H. X., Mancuso, V., & McGuire, S. (2022). Integrated Sensors Platform. In D. Harris & W.-C. Li (Eds.), Engineering Psychology and Cognitive Ergonomics (pp. 64–73). Springer International Publishing. https://doi.org/10.1007/978-3-031-06086-1_5
Mariam, K., Afzal, O. M., Hussain, W., Javed, M. U., Kiyani, A., Rajpoot, N., Khurram, S. A., & Khan, H. A. (2022). On Smart Gaze based Annotation of Histopathology Images for Training of Deep Convolutional Neural Networks. IEEE Journal of Biomedical and Health Informatics, 1–1. https://doi.org/10.1109/JBHI.2022.3148944
Thang, S. M., Priyadarshini, M., Tan, J. P. S., Wong, H. K., Wong, H., Iman, A. N., Arshad, N., & Sue, C. H. (2022). Is There a Relationship between Prereaders’ Visual Attention and Their Storytelling Performance? Evidence from Eye-Tracking and Qualitative Data.
Xu, J., Guo, K., & Sun, P. Z. H. (2022). Driving Performance Under Violations of Traffic Rules: Novice Vs. Experienced Drivers. IEEE Transactions on Intelligent Vehicles, 1–10. https://doi.org/10.1109/TIV.2022.3200592
Yu-Wen, H., Yu-Ju, Y., & Wei, J. (2022). User Perception and Eye Movement on A Pandemic Data Visualization Dashboard. Proceedings of the Association for Information Science and Technology, 59(1), 121–131. https://doi.org/10.1002/pra2.610
2021
Katona, J. (2021). Analyse the Readability of LINQ Code using an Eye-Tracking-based Evaluation. Acta Polytechnica Hungarica, 18, 193–215. https://doi.org/10.12700/APH.18.1.2021.1.12
Seha, S. N. A., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. E. (2021). Improving eye movement biometrics in low frame rate eye-tracking devices using periocular and eye blinking features. Image and Vision Computing, 104124. https://doi.org/10.1016/j.imavis.2021.104124
Sulikowski, P., Zdziebko, T., Coussement, K., Dyczkowski, K., Kluza, K., & Sachpazidu-Wójcicka, K. (2021). Gaze and Event Tracking for Evaluation of Recommendation-Driven Purchase. Sensors, 21, 1381. https://doi.org/10.3390/s21041381
Ghiţă, A., Hernández Serrano, O., Fernández-Ruiz, J., Moreno, M., Monras, M., Ortega, L., Mondon, S., Teixidor, L., Gual, A., Gacto-Sanchez, M., Porras Garcia, B., Ferrer-García, M., & Gutiérrez-Maldonado, J. (2021). Attentional Bias, Alcohol Craving, and Anxiety Implications of the Virtual Reality Cue-Exposure Therapy in Severe Alcohol Use Disorder: A Case Report. Frontiers in Psychology, 12, 543586. https://doi.org/10.3389/fpsyg.2021.543586
Karargyris, A., Kashyap, S., Lourentzou, I., Wu, J. T., Sharma, A., Tong, M., Abedin, S., Beymer, D., Mukherjee, V., Krupinski, E. A., & Moradi, M. (2021). Creation and validation of a chest X-ray dataset with eye-tracking and report dictation for AI development. Scientific Data, 8(1), 92. https://doi.org/10.1038/s41597-021-00863-5
Avoyan, A., Ribeiro, M., Schotter, A., Schotter, E. R., Vaziri, M., & Zou, M. (2021). Planned vs. Actual Attention (SSRN Scholarly Paper No. 3836157). Social Science Research Network. https://doi.org/10.2139/ssrn.3836157
Hu, X., Nakatsuru, S., Ban, Y., Fukui, R., & Warisawa, S. (2021). A Physiology-Based Approach for Estimation of Mental Fatigue Levels With Both High Time Resolution and High Level of Granularity. Informatics in Medicine Unlocked, 100594. https://doi.org/10.1016/j.imu.2021.100594
Moriishi, C., Maeda, S., Ogishima, H., & Shimada, H. (2021). Effects of cortisol on retrieval of extinction memory in individuals with social anxiety. Comprehensive Psychoneuroendocrinology, 100060. https://doi.org/10.1016/j.cpnec.2021.100060
Ranalli, J. (2021). L2 student engagement with automated feedback on writing: Potential for learning and issues of trust. Journal of Second Language Writing, 52, 100816. https://doi.org/10.1016/j.jslw.2021.100816
Srinivasan, R., Turpin, A., & McKendrick, A. M. (2021). Developing a new method to assess central vision using visual stimuli with natural scene statistics. Investigative Ophthalmology & Visual Science, 62(8), 3366.
Corsi, M., Giaconi, C., & Perry, V. (2021). The special pedagogy between research and training during Covid-19. The possible inclusion after pandemic. Education Sciences & Society – Open Access, 12(1). https://doi.org/10.3280/ess1-2021oa12056
Čisar, S., Pinter, R., Kővári, A., & Miklos, P. (2021). Application of Eye Movement Monitoring Technique in Teaching Process. Transactions on Advanced Research, 17, 32–36.
Planke, L. J., Gardi, A., Sabatini, R., Kistan, T., & Ezer, N. (2021). Online Multimodal Inference of Mental Workload for Cognitive Human Machine Systems. Computers, 10(6), 81. https://doi.org/10.3390/computers10060081
Nassar, A. A., & Elsamahy, E. (2021). Towards creating air force pilots’ selection model: A comparison between most accurate mental stress markers in contact and non-contact techniques. 2021 International Telecommunications Conference (ITC-Egypt), 1–4. https://doi.org/10.1109/ITC-Egypt52936.2021.9513942
Tomris Küçün, N., & Gönenç Güler, E. (2021). Examination of Consumer Purchase Decisions via Neuromarketing Methods: A Social Psychology Approach. Prizren Social Science Journal, 5(2), 14–29. https://doi.org/10.32936/pssj.v5i2.245
Gambiraža, M., Kesedžić, I., Šarlija, M., Popović, S., & Ćosić, K. (2021). Classification of Cognitive Load based on Oculometric Features. https://doi.org/10.23919/MIPRO52101.2021.9597067
Mossad, O., Diab, K., Amer, I., & Hefeeda, M. (2021). DeepGame: Efficient Video Encoding for Cloud Gaming. Proceedings of the 29th ACM International Conference on Multimedia, 1387–1395. https://doi.org/10.1145/3474085.3475594
Šola, H. M., Steidl, P., Mikac, M., Qureshi, F., & Khawaja, S. (2021). How Neuroscience-Based Research Methodologies Can Deliver New Insights to Marketers. International Journal of Social Science and Human Research, 4(10), 2963–2972. https://doi.org/10.47191/ijsshr/v4-i10-41
Zhou, Y. (2021). Eyes Move, Drones Move: Explore the Feasibility of Various Eye Movement Control Intelligent Drones. 2021 IEEE International Conference on Data Science and Computer Application (ICDSCA), 508–513. https://doi.org/10.1109/ICDSCA53499.2021.9650336
Bayani, K. Y. T., Natraj, N., Khresdish, N., Pargeter, J., Stout, D., & Wheaton, L. A. (2021). Emergence of perceptuomotor relationships during paleolithic stone toolmaking learning: intersections of observation and practice. Communications Biology, 4(1), 1–12. https://doi.org/10.1038/s42003-021-02768-w
Conijn, R., Dux Speltz, E., & Chukharev-Hudilainen, E. (2021). Automated extraction of revision events from keystroke data. Reading and Writing. https://doi.org/10.1007/s11145-021-10222-w
Shen, Y., Wijayaratne, N., Sriram, P., Hasan, A., Du, P., & Driggs-Campbell, K. (2021). CoCAtt: A Cognitive-Conditioned Driver Attention Dataset (No. arXiv:2111.10014). arXiv. http://arxiv.org/abs/2111.10014
Patel, A. N., Chau, G., Chang, C., Sun, A., Huang, J., Jung, T.-P., & Gilja, V. (2021). Affective response to volitional input perturbations in obstacle avoidance and target tracking games. 2021 43rd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC), 6679–6682. https://doi.org/10.1109/EMBC46164.2021.9630523
Nakamura, G., Tatsukawa, S., Omori, K., Fukui, K., Sagara, J., & Chin, T. (2021). Evaluation and Training System of PC Operation for Elderly, Using Gazing Point and Mouse Operation. 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET), 1–5. https://doi.org/10.1109/ICECET52533.2021.9698528
Bažantová, S., Štiková, E., Novák, M., & Gunina, D. (2021). Erotic appeals in advertising: visual attention and perceived appropriateness. Media Studies, 12(24), 21–39. https://hrcak.srce.hr/ojs/index.php/medijske-studije/article/view/14371
Bhowmick, S., Arjunan, S. P., Sarossy, M., Radcliffe, P., & Kumar, D. K. (2021). Pupillometric recordings to detect glaucoma eyes. Physiological Measurement. https://doi.org/10.1088/1361-6579/abf05c
Capellini, S. A., Metzner, I. P., Bianco, N. D., D’Angelo, I., Caldarelli, A., & Giaconi, C. (2021). Perceptual-visual-motor measures, reading and properties of eye movements of students with attention deficit hyperactivity disorder. Education Sciences and Society, 2021/1. https://doi.org/10.3280/ess1-2021oa11927
Constantinides, A., Belk, M., Fidas, C., & Pitsillides, A. (2021). Understanding Insider Attacks in Personalized Picture Password Schemes. In C. Ardito, R. Lanzilotti, A. Malizia, H. Petrie, A. Piccinno, G. Desolda, & K. Inkpen (Eds.), Human-Computer Interaction – INTERACT 2021 (pp. 722–731). Springer International Publishing. https://doi.org/10.1007/978-3-030-85610-6_42
Contero-López, P., Torrecilla-Moreno, C., Escribá-Pérez, C., & Contero, M. (2021). Understanding Fashion Brand Awareness Using Eye-Tracking: The Mix-and-Match Approach. In E. Markopoulos, R. S. Goonetilleke, A. G. Ho, & Y. Luximon (Eds.), Advances in Creativity, Innovation, Entrepreneurship and Communication of Design (pp. 432–440). Springer International Publishing. https://doi.org/10.1007/978-3-030-80094-9_51
Intelligent Tutor Assistant: Predicting Confusion from Pupillometry Data with Multiple Classification Models. (2021). ProQuest. https://www.proquest.com/openview/d32ef32bbaa911d171dc6982ec92e5ed/1?pq-origsite=gscholar&cbl=51908
Kannegieser, E., Atorf, D., & Herold, J. (2021). Measuring Flow, Immersion and Arousal/Valence for Application in Adaptive Learning Systems. In R. A. Sottilare & J. Schwarz (Eds.), Adaptive Instructional Systems. Adaptation Strategies and Methods (pp. 62–78). Springer International Publishing. https://doi.org/10.1007/978-3-030-77873-6_5
Kuravsky, L. S., Yuryev, G. A., Zlatomrezhev, V. I., Greshnikov, I. I., & Polyakov, B. Yu. (2021). Assessing the Aircraft Crew Activity Basing on Video Oculography Data. Experimental Psychology (Russia), 14(1), 204–222. https://doi.org/10.17759/exppsy.2021140110
Leonidou, P., Constantinides, A., Belk, M., Fidas, C., & Pitsillides, A. (2021). Eye Gaze and Interaction Differences of Holistic Versus Analytic Users in Image-Recognition Human Interaction Proof Schemes. In A. Moallem (Ed.), HCI for Cybersecurity, Privacy and Trust (pp. 66–75). Springer International Publishing. https://doi.org/10.1007/978-3-030-77392-2_5
Mahanama, B., Jayawardena, G., & Jayarathna, S. (2021). Analyzing Unconstrained Reading Patterns of Digital Documents Using Eye Tracking. 2021 ACM/IEEE Joint Conference on Digital Libraries (JCDL), 282–283. https://doi.org/10.1109/JCDL52503.2021.00036
Obadă, D.-R. (2021). Pretesting Flow Questionnaire Design Using Eye-Tracking: An Exploratory Study.
Rosenlacher, P., & Tichý, J. (2021). Design of corporate logo from the perspective of eye tracking method. 10(01).
Sibley, C., Foroughi, C., Brown, N., Drollinger, S., Phillips, H., & Coyne, J. (2021). Augmenting Traditional Performance Analyses with Eye Tracking Metrics. In H. Ayaz & U. Asgher (Eds.), Advances in Neuroergonomics and Cognitive Engineering (pp. 118–125). Springer International Publishing. https://doi.org/10.1007/978-3-030-51041-1_17
Sun, X., & Balasingam, B. (2021). Reading Line Classification Using Eye-Trackers. IEEE Transactions on Instrumentation and Measurement, 70, 1–10. https://doi.org/10.1109/TIM.2021.3094817
Wang, M., Sharmin, S., Wang, M., & Yu, F. (2021). A Mixed-Method Usability Study on User Experience with Systematic Review Software. Proceedings of the Association for Information Science and Technology, 58(1), 346–356. https://doi.org/10.1002/pra2.462
2020
Kővári, A., Katona, J., & Pop, C. (2020). Quantitative Analysis of Relationship Between Visual Attention and Eye-Hand Coordination. Acta Polytechnica Hungarica, 17, 77–95. https://doi.org/10.12700/APH.17.2.2020.2.5
Kővári, A., Katona, J., & Pop, C. (2020). Evaluation of Eye-Movement Metrics in a Software Debugging Task using GP3 Eye Tracker. Acta Polytechnica Hungarica, 17, 57–76. https://doi.org/10.12700/APH.17.2.2020.2.4
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13. https://doi.org/10.3389/fnins.2019.01418
Holubova, R., Krčmářová, A., Richterek, L., & Říha, J. (2020, January 13). Analysis of Some Selected FCI Tasks Using Eye-Tracking and Correlation with Scientific Reasoning Skills.
Eraslan, S., Yesilada, Y., Yaneva, V., & Ha, L. A. (2020). “Keep it simple!”: an eye-tracking study for exploring complexity and distinguishability of web pages for people with autism. Universal Access in the Information Society. https://doi.org/10.1007/s10209-020-00708-9
Kim, S., Pollanen, M., Reynolds, M. G., & Burr, W. S. (2020). Problem Solving as a Path to Comprehension. Mathematics in Computer Science. https://doi.org/10.1007/s11786-020-00457-1
Park, S., Nguyen, B. N., & McKendrick, A. M. (2020). Ageing elevates peripheral spatial suppression of motion regardless of divided attention. Ophthalmic and Physiological Optics, n/a(n/a). https://doi.org/10.1111/opo.12674
Kato, N., Inoue, M., Nishiyama, M., & Iwai, Y. (2020). Comparing the Recognition Accuracy of Humans and Deep Learning on a Simple Visual Inspection Task. 14.
Sulikowski, P., & Zdziebko, T. (2020). Deep Learning-Enhanced Framework for Performance Evaluation of a Recommending Interface with Varied Recommendation Position and Intensity Based on Eye-Tracking Equipment Data Processing. Electronics, 9(2), 266. https://doi.org/10.3390/electronics9020266
Cox, D. J., Owens, J. M., Barnes, L., Moncrief, M., Boukhechba, M., Buckman, S., Banton, T., & Wotring, B. (2020). A Pilot Study Comparing Newly Licensed Drivers With and Without Autism and Experienced Drivers in Simulated and On-Road Driving. Journal of Autism and Developmental Disorders, 50(4), 1258–1268. https://doi.org/10.1007/s10803-019-04341-1
Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610. https://doi.org/10.1016/j.appet.2020.104610
Ganczarek, J., Pietras, K., & Rosiek, R. (2020). Perceived cognitive challenge predicts eye movements while viewing contemporary paintings. PsyCh Journal, 9. https://doi.org/10.1002/pchj.365
Pires, L. de F. (2020). Master’s students’ post-editing perception and strategies: Exploratory study. FORUM. Revue Internationale d’interprétation et de Traduction / International Journal of Interpretation and Translation, 18(1), 26–44. https://doi.org/10.1075/forum.19014.pir
Biondi, F. N., Balasingam, B., & Ayare, P. (2020). On the Cost of Detection Response Task Performance on Cognitive Load. Human Factors, 0018720820931628. https://doi.org/10.1177/0018720820931628
Ebaid, D., & Crewther, S. G. (2020). The Contribution of Oculomotor Functions to Rates of Visual Information Processing in Younger and Older Adults. Scientific Reports, 10(1), 10129. https://doi.org/10.1038/s41598-020-66773-5
Doerflinger, J. T., & Gollwitzer, P. M. (2020). Emotion emphasis effects in moral judgment are moderated by mindsets. Motivation and Emotion. https://doi.org/10.1007/s11031-020-09847-1
Kuo, C.-F., Bavik, A., Ngan, H. F. B., & Yu, C.-E. (2020). The sweet spot in the eye of the beholder? Exploring the sweet sour spots of Asian restaurant menus. Journal of Hospitality Marketing & Management, 0(0), 1–16. https://doi.org/10.1080/19368623.2020.1790076
Bhowmik, S., Motin, M. A., Sarossy, M., Radcliffe, P., & Kumar, D. (2020). Sample entropy analysis of pupillary signals in glaucoma patients and control via light-induced pupillometry. 2020 42nd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC), 280–283. https://doi.org/10.1109/EMBC44109.2020.9176558
Bristol, S., Agostine, S., Dallman, A., Harrop, C., Crais, E., Baranek, G., & Watson, L. (2020). Visual Biases and Attentional Inflexibilities Differentiate Those at Elevated Likelihood of Autism: An Eye-Tracking Study. American Journal of Occupational Therapy, 74(4_Supplement_1), 7411505221p1-7411505221p1. https://doi.org/10.5014/ajot.2020.74S1-PO8133
Volonte, M., Anaraky, R. G., Venkatakrishnan, R., Venkatakrishnan, R., Knijnenburg, B. P., Duchowski, A. T., & Babu, S. V. (2020). Empirical evaluation and pathway modeling of visual attention to virtual humans in an appearance fidelity continuum. Journal on Multimodal User Interfaces. https://doi.org/10.1007/s12193-020-00341-z
Anaraky, R. G., Bahirat, P., & Nasiri, M. (2020). Effect of Priming on Smart Home Privacy Preferences. 5.
Shi, M., Ming, H., Liu, Y., Mao, T., Zhu, D., Wang, Z., & Zhang, F. (2020). Saliency-dependent adaptive remeshing for cloth simulation. Textile Research Journal, 0040517520944248. https://doi.org/10.1177/0040517520944248
Ngan, H. F. B., Bavik, A., Kuo, C.-F., & Yu, C.-E. (2020). Where You Look Depends on What You Are Willing to Afford: Eye Tracking in Menus. Journal of Hospitality & Tourism Research, 1096348020951226. https://doi.org/10.1177/1096348020951226
Clark, S., & Jasra, S. K. (2020). Detecting Differences Between Concealed and Unconcealed Emotions Using iMotions EMOTIENT. Journal of Emerging Forensic Sciences Research, 5(1), 1–24. https://jefsr.uwindsor.ca/index.php/jefsr/article/view/6376
Pirruccio, M., Monaco, S., Della Libera, C., & Cattaneo, L. (2020). Gaze direction influences grasping actions towards unseen, haptically explored, objects. Scientific Reports, 10(1), 15774. https://doi.org/10.1038/s41598-020-72554-x
Millán, Y. A., Chaves, M. L., & Barrero, J. C. (2020). A Review on Biometric Devices to be Applied in ASD Interventions. 2020 Congreso Internacional de Innovación y Tendencias En Ingeniería (CONIITI), 1–6. https://doi.org/10.1109/CONIITI51147.2020.9240291
Pritalia, G. L., Wibirama, S., Adji, T. B., & Kusrohmaniah, S. (2020). Classification of Learning Styles in Multimedia Learning Using Eye-Tracking and Machine Learning. 2020 FORTEI-International Conference on Electrical Engineering (FORTEI-ICEE), 145–150. https://doi.org/10.1109/FORTEI-ICEE50915.2020.9249875
Pillai, P., Ayare, P., Balasingam, B., Milne, K., & Biondi, F. (2020). Response Time and Eye Tracking Datasets for Activities Demanding Varying Cognitive Load. Data in Brief, 106389. https://doi.org/10.1016/j.dib.2020.106389
Chauhan, H., Prasad, A., & Shukla, J. (2020). Engagement Analysis of ADHD Students using Visual Cues from Eye Tracker. Companion Publication of the 2020 International Conference on Multimodal Interaction, 27–31. https://doi.org/10.1145/3395035.3425256
Karpova, V., Popenova, P., Glebko, N., Lyashenko, V., & Perepelkina, O. (2020). “Was It You Who Stole 500 Rubles?” – The Multimodal Deception Detection. Companion Publication of the 2020 International Conference on Multimodal Interaction, 112–119. https://doi.org/10.1145/3395035.3425638
Crameri, L., Hettiarachchi, I., & Hanoun, S. (2020). Feasibility Study of Skin Conductance Response for Quantifying Individual Dynamic Resilience. 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 1764–1771. https://doi.org/10.1109/SMC42975.2020.9283300
Jovančić, K., Milić Keresteš, N., & Nedeljković, U. (2020). Influence of white space on text scanning. Proceedings – The Tenth International Symposium GRID 2020, 699–706. https://doi.org/10.24867/GRID-2020-p79
Vladić, G., Mijatović, S., Bošnjaković, G., Jurič, I., & Dimovski, V. (2020). Analysis of the loading animation performance and viewer perception. Proceedings – The Tenth International Symposium GRID 2020, 667–675. https://doi.org/10.24867/GRID-2020-p76
Destyanto, T. Y. R., & Lin, R. F. (2020). Detecting computer activities using eye-movement features. Journal of Ambient Intelligence and Humanized Computing. https://doi.org/10.1007/s12652-020-02683-8
Hong, W. C. H., Ngan, H. F. B., Yu, J., & Zhao, Y. (2020). An eye-tracking study of exoticism in intra-national destinations in the Greater Bay area of China. Tourism Recreation Research, 0(0), 1–14. https://doi.org/10.1080/02508281.2020.1846431
Brand, J., Diamond, S. G., Thomas, N., & Gilbert-Diamond, D. (2020). Evaluating the data quality of the Gazepoint GP3 low-cost eye tracker when used independently by study participants. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01504-2
Zuo, C., Ding, L., & Meng, L. (2020). A Feasibility Study of Map-Based Dashboard for Spatiotemporal Knowledge Acquisition and Analysis. ISPRS International Journal of Geo-Information, 9(11), 636. https://doi.org/10.3390/ijgi9110636
Lee, T. L., Yeung, M. K., Sze, S. L., & Chan, A. S. (2020). Computerized Eye-Tracking Training Improves the Saccadic Eye Movements of Children with Attention-Deficit/Hyperactivity Disorder. Brain Sciences, 10(12). https://doi.org/10.3390/brainsci10121016
Acero-Mondragon, E. J., Chaustre-Nieto, L. C., Urdaneta-Paredes, D. A., Cortes-Cabrera, J. A., & Gallego-Correa, J. J. (2020). Left-Right Pupil Diameter Difference During Radiographic Reading of Bronchopulmonary Carcinoma: An Exploration with Cognitive Load Among Novices and Experts. The FASEB Journal, 34(S1), 1–1. https://doi.org/10.1096/fasebj.2020.34.s1.09819
Bottos, S., & Balasingam, B. (2020). Tracking the Progression of Reading Using Eye-Gaze Point Measurements and Hidden Markov Models. IEEE Transactions on Instrumentation and Measurement, 1–1. https://doi.org/10.1109/TIM.2020.2983525
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Springer. https://doi.org/10.1007/978-981-15-1564-4_20
Furukado, R., Hagiwara, G., Ito, T., & Isogai, H. (2020). Comparison of EEG biofeedback and visual search strategies during e-sports play according to skill level. https://doi.org/10.14198/jhse.2020.15.Proc4.13
Gwizdka, J., & Dillon, A. (2020). Eye-Tracking as a Method for Enhancing Research on Information Search. In W. T. Fu & H. van Oostendorp (Eds.), Understanding and Improving Information Search: A Cognitive Approach (pp. 161–181). Springer International Publishing. https://doi.org/10.1007/978-3-030-38825-6_9
Inoue, M., Nishiyama, M., & Iwai, Y. (2020). Gender Classification using the Gaze Distributions of Observers on Privacy-Protected Training Images. 8.
Knogler, V. (2020). Viewing Behaviour and Task Performance on Austrian Destination Websites: Comparing Generation Y and the Baby Boomers. In M. Rainoldi & M. Jooss (Eds.), Eye Tracking in Tourism (pp. 225–241). Springer International Publishing. https://doi.org/10.1007/978-3-030-49709-5_14
Malhotra, A., Sankaran, A., Vatsa, M., Singh, R., Morris, K. B., & Noore, A. (2020). Understanding ACE-V Latent Fingerprint Examination Process via Eye-Gaze Analysis. IEEE Transactions on Biometrics, Behavior, and Identity Science, 1–1. https://doi.org/10.1109/TBIOM.2020.3027144
Prichard, C., & Atkins, A. (2020). Online Research Strategies of L2 Readers: Evaluating Strategic Competence through Mixed Methods. The Reading Matrix: An International Online Journal, 17.
2019
Jeong, H., & Liu, Y. (2019). Effects of non-driving-related-task modality and road geometry on eye movements, lane-keeping performance, and workload while driving. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 157–171. https://doi.org/10.1016/j.trf.2018.10.015
Bottos, S., & Balasingam, B. (2019). An Approach to Track Reading Progression Using Eye-Gaze Fixation Points. arXiv:1902.03322 [cs]. http://arxiv.org/abs/1902.03322
Hienert, D., Kern, D., Mitsui, M., Shah, C., & Belkin, N. J. (2019). Reading Protocol: Understanding what has been Read in Interactive Information Retrieval Tasks. arXiv:1902.04262 [cs]. http://arxiv.org/abs/1902.04262
Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement. https://doi.org/10.1016/j.measurement.2019.03.032
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158. https://doi.org/10.1016/j.appet.2018.11.015
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery. https://doi.org/10.1007/s11548-019-01964-8
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. ICASSP 2019 – 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2562–2566. https://doi.org/10.1109/ICASSP.2019.8683757
Wibirama, S., Santosa, P. I., Widyarani, P., Brilianto, N., & Hafidh, W. (2019). Physical discomfort and eye movements during arbitrary and optical flow-like motions in stereo 3D contents. Virtual Reality. https://doi.org/10.1007/s10055-019-00386-w
Bottos, S., & Balasingam, B. (2019). A Novel Slip-Kalman Filter to Track the Progression of Reading Through Eye-Gaze Measurements. arXiv:1907.07232 [cs, eess]. http://arxiv.org/abs/1907.07232
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6. https://doi.org/10.7575/aiac.ijkss.v.7n.3p.6
Prichard, C., & Atkins, A. (2019). Selective attention of L2 learners in task-based reading online. 22.
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019. https://doi.org/10.1063/1.5137973
Villamor, M. M., & Rodrigo, Ma. M. T. (2019). Gaze collaboration patterns of successful and unsuccessful programming pairs using cross-recurrence quantification analysis. Research and Practice in Technology Enhanced Learning, 14(1), 25. https://doi.org/10.1186/s41039-019-0118-z
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340. https://doi.org/10.1186/s12883-019-1543-8
Calado, J., Marcelino-Jesus, E., Ferreira, F., & Sarraipa, J. (2019). EYE-TRACKING STUDENT’S BEHAVIOUR FOR E-LEARNING IMPROVEMENT. 8978–8986. https://doi.org/10.21125/edulearn.2019.2221
Coba, L., Rook, L., Zanker, M., & Symeonidis, P. (2019). Decision Making Strategies Differ in the Presence of Collaborative Explanations: Two Conjoint Studies. Proceedings of the 24th International Conference on Intelligent User Interfaces, 291–302. https://doi.org/10.1145/3301275.3302304
Coba, L., Zanker, M., & Rook, L. (2019). Decision Making Based on Bimodal Rating Summary Statistics – An Eye-Tracking Study of Hotels. In J. Pesonen & J. Neidhardt (Eds.), Information and Communication Technologies in Tourism 2019 (pp. 40–51). Springer International Publishing. https://doi.org/10.1007/978-3-030-05940-8_4
Constantinides, A., Fidas, C., Belk, M., & Pitsillides, A. (2019). “I Recall This Picture”: Understanding Picture Password Selections Based on Users’ Sociocultural Experiences. IEEE/WIC/ACM International Conference on Web Intelligence, 408–412. https://doi.org/10.1145/3350546.3352557
Constantinides, A., Belk, M., Fidas, C., & Pitsillides, A. (2019). On the Accuracy of Eye Gaze-driven Classifiers for Predicting Image Content Familiarity in Graphical Passwords. Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization, 201–205. https://doi.org/10.1145/3320435.3320474
Duchowski, A., Krejtz, K., Zurawska, J., & House, D. (2019). Using Microsaccades to Estimate Task Difficulty During Visual Search of Layered Surfaces. IEEE Transactions on Visualization and Computer Graphics, 1–1. https://doi.org/10.1109/TVCG.2019.2901881
Gupta, V., Chakraborty, T., Agarwal, M., Singh, R., Arora, M., & Vatsa, M. (2019). Bag-of-Lies: A Multimodal Dataset for Deception Detection. 8.
Kannegieser, E., Atorf, D., & Meier, J. (2019). Conducting an Experiment for Validating the Combined Model of Immersion and Flow. CSEDU. https://doi.org/10.5220/0007688902520259
Matthews, O., Eraslan, S., Yaneva, V., Davies, A., Yesilada, Y., Vigo, M., & Harper, S. (2019). Combining Trending Scan Paths with Arousal to Model Visual Behaviour on the Web: A Case Study of Neurotypical People vs People with Autism. Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization, 86–94. https://doi.org/10.1145/3320435.3320446
Neomániová, K., Berčík, J., & Pavelka, A. (2019). The Use of Eye‑Tracker and Face Reader as Useful Consumer Neuroscience Tools Within Logo Creation. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, 67(4), 1061–1070. https://doi.org/10.11118/actaun201967041061
Obaidellah, U., Raschke, M., & Blascheck, T. (2019). Classification of Strategies for Solving Programming Problems using AoI Sequence Analysis. 10.
Ohligs, M., Pereira, C., Voigt, V., Koeny, M., Janß, A., Rossaint, R., & Czaplik, M. (2019). Evaluation of an Anesthesia Dashboard Functional Model Based on a Manufacturer-Independent Communication Standard: Comparative Feasibility Study. JMIR Human Factors, 6(2), e12553. https://doi.org/10.2196/12553
Pfarr, J., Ganter, M. T., Spahn, D. R., Noethiger, C. B., & Tscholl, D. W. (2019). Avatar-Based Patient Monitoring With Peripheral Vision: A Multicenter Comparative Eye-Tracking Study. Journal of Medical Internet Research, 21(7), e13041. https://doi.org/10.2196/13041
Russell, C. (2019). “I Consent”: An Eye-Tracking Study of IRB Informed Consent Forms.
Swift, D., & Schofield, D. (2019). The Impact of Color on Secondary Task Time While Driving. International Journal of Information Technology, 4(1), 19.
Volonte, M., Duchowski, A. T., & Babu, S. V. (2019). Effects of a Virtual Human Appearance Fidelity Continuum on Visual Attention in Virtual Reality. Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents, 141–147. https://doi.org/10.1145/3308532.3329461
Yaneva, V., & Eraslan, S. (2019). Adults with High-functioning Autism Process Web Pages With Similar Accuracy but Higher Cognitive Effort Compared to Controls. 4.
Ćosić, K., Popović, S., Šarlija, M., Mijić, I., Kokot, M., Kesedžić, I., Strangman, G., Ivković, V., & Zhang, Q. (2019). New Tools and Methods in Selection of Air Traffic Controllers Based on Multimodal Psychophysiological Measurements. IEEE Access, 7, 174873–174888. https://doi.org/10.1109/ACCESS.2019.2957357
2018
Ranalli, J., Feng, H.-H., & Chukharev-Hudilainen, E. (2018). The affordances of process-tracing technologies for supporting L2 writing instruction. Language Learning & Technology. https://lib.dr.iastate.edu/engl_pubs/236
Hefley, M., Wethor, G., & Hale, M. L. (2018). Multimodal Data Fusion and Behavioral Analysis Tooling for Exploring Trust, Trust-propensity, and Phishing Victimization in Online Environments. Hawaii International Conference on System Sciences 2018 (HICSS-51). https://aisel.aisnet.org/hicss-51/da/behavioral_data_analytics/3
Selivanova, A., & Krabbe, P. F. M. (2018). Eye tracking to explore attendance in health-state descriptions. PLOS ONE, 13(1), e0190111. https://doi.org/10.1371/journal.pone.0190111
Obaidellah, U., Al Haek, M., & Cheng, P. C.-H. (2018). A Survey on the Usage of Eye-Tracking in Computer Programming. ACM Comput. Surv., 51(1), 5:1-5:58. https://doi.org/10.1145/3145904
Durkee, P. K., Goetz, A. T., & Lukaszewski, A. W. (2018). Formidability assessment mechanisms: Examining their speed and automaticity. Evolution and Human Behavior, 39(2), 170–178. https://doi.org/10.1016/j.evolhumbehav.2017.12.006
Malakhova, E. Y., Shelepin, E. Y., & Malashin, R. O. (2018). Temporal data processing from webcam eye tracking using artificial neural networks. Journal of Optical Technology, 85(3), 186–188. https://doi.org/10.1364/JOT.85.000186
Zandi, A. S., Quddus, A., Comeau, F. J. E., & Fogel, S. (2018). Novel non-intrusive approach to assess drowsiness based on eye movements and blinking (Patent No. US20180055354A1). https://patents.google.com/patent/US20180055354A1/en
Hanna, K. J. (2018). Combining Driver Alertness With Advanced Driver Assistance Systems (ADAS) (Patent No. US20180086339A1). https://patents.google.com/patent/US20180086339A1/en
Tkach, B. (2018). Neuropsychological Characteristics of People with Deviant Behaviour. Psychological Journal, 13(3), 156–171. https://doi.org/10.31108/2018vol13iss3pp156-171
Nassar, A., Elsamahy, E., Awadallah, A., & Elmahlawy, M. (2018). Behavior of Different Physiological Markers in Relation to Computer Based Mental Activities. The International Conference on Electrical Engineering, 11(11), 1–8. https://doi.org/10.21608/iceeng.2018.30244
Iskander, J., Hettiarachchi, I., Hanoun, S., Hossny, M., Nahavandi, S., & Bhatti, A. (2018). A classifier approach to multi-screen switching based on low cost eye-trackers. 2018 Annual IEEE International Systems Conference (SysCon), 1–6. https://doi.org/10.1109/SYSCON.2018.8369597
Iskander, J., Hanoun, S., Hettiarachchi, I., Hossny, M., Saleh, K., Zhou, H., Nahavandi, S., & Bhatti, A. (2018). Eye behaviour as a hazard perception measure. 2018 Annual IEEE International Systems Conference (SysCon), 1–6. https://doi.org/10.1109/SYSCON.2018.8369509
Edelman, B. J., Meng, J., Gulachek, N., Cline, C. C., & He, B. (2018). Exploring Cognitive Flexibility With a Noninvasive BCI Using Simultaneous Steady-State Visual Evoked Potentials and Sensorimotor Rhythms. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 26(5), 936–947. https://doi.org/10.1109/TNSRE.2018.2817924
Wijayanto, T., Marcilia, S. R., & Lufityanto, G. (2018). Visual Attention, Driving Behavior and Driving Performance among Young Drivers in Sleep-deprived Condition. KnE Life Sciences, 4(5), 424–434. https://doi.org/10.18502/kls.v4i5.2573
Tkach, B. (2018). Neuropsychological features personalities with deviant behavior. Fundamental and Applied Researches in Practice of Leading Scientific Schools, 27(3), 201–206. https://doi.org/10.33531/farplss.2018.3.24
Antunes, J., & Santana, P. (2018). A Study on the Use of Eye Tracking to Adapt Gameplay and Procedural Content Generation in First-Person Shooter Games. Multimodal Technologies and Interaction, 2(2), 23. https://doi.org/10.3390/mti2020023
Aliyev, F., Ürkmez, T., & Wagner, R. (2018). Luxury brands do not glitter equally for everyone. Journal of Brand Management, 25(4), 337–350. https://doi.org/10.1057/s41262-017-0085-x
Jaikumar, S. (2018). How Do Consumers Choose Sellers In E-Marketplaces?: The Role of Display Price And Sellers’ Review Volume. Journal of Advertising Research, JAR-2018-028. https://doi.org/10.2501/JAR-2018-028
Ngan, H. F. B., & Yu, C.-E. (2018). To smile or not to smile – an eye-tracking study on service recovery. Current Issues in Tourism, 0(0), 1–6. https://doi.org/10.1080/13683500.2018.1502260
Chang, C., Chen, C., & Lin, Y. (2018). A Visual Interactive Reading System Based on Eye Tracking Technology to Improve Digital Reading Performance. 2018 7th International Congress on Advanced Applied Informatics (IIAI-AAI), 182–187. https://doi.org/10.1109/IIAI-AAI.2018.00043
Morasco Junior, M. A. (2018). Parâmetros gráfico-inclusivos para o desenvolvimento de objetos de aprendizagem digitais voltados ao público infantil [Inclusive graphic parameters for the development of digital learning objects aimed at children]. https://repositorio.unesp.br/handle/11449/157459
Nishiyama, M., Matsumoto, R., Yoshimura, H., & Iwai, Y. (2018). Extracting discriminative features using task-oriented gaze maps measured from observers for personal attribute classification. Pattern Recognition Letters, 112, 241–248. https://doi.org/10.1016/j.patrec.2018.08.001
Sibley, C., Foroughi, C., Brown, N., & Coyne, J. T. (2018). Low Cost Eye Tracking: Ready for Individual Differences Research? Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 62(1), 741–745. https://doi.org/10.1177/1541931218621168
Kar, A., & Corcoran, P. (2018). Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations. Sensors, 18(9), 3151. https://doi.org/10.3390/s18093151
Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., Vince, J., & Bolia, R. (2018). Avionics Human-Machine Interfaces and Interactions for Manned and Unmanned Aircraft. Progress in Aerospace Sciences, 102, 1–46. https://doi.org/10.1016/j.paerosci.2018.05.002
Raveh, E., Friedman, J., & Portnoy, S. (2018). Visuomotor behaviors and performance in a dual-task paradigm with and without vibrotactile feedback when using a myoelectric controlled hand. Assistive Technology, 30(5), 274–280. https://doi.org/10.1080/10400435.2017.1323809
Yaman, C., Küçün, N. T., Güngör, S., & Eroğlu, S. (2018). The contextual effect and measurement of attention to advertisements via eye tracking method. Journal of Life Economics, 5(4), 221–232. https://doi.org/10.15637/jlecon.271
Iskander, J., Jia, D., Hettiarachchi, I., Hossny, M., Saleh, K., Nahavandi, S., Best, C., Hosking, S., Rice, B., Bhatti, A., & Hanoun, S. (2018). Age-Related Effects of Multi-screen Setup on Task Performance and Eye Movement Characteristics. 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 3480–3485. https://doi.org/10.1109/SMC.2018.00589
Saleh, K., Iskander, J., Jia, D., Hossny, M., Nahavandi, S., Best, C., Hosking, S., Rice, B., Bhatti, A., & Hanoun, S. (2018). Reliable Switching Mechanism for Low Cost Multi-screen Eye Tracking Devices via Deep Recurrent Neural Networks. 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 3492–3497. https://doi.org/10.1109/SMC.2018.00591
Zhou, H., Wei, L., Cao, R., Hanoun, S., Bhatti, A., Tai, Y., & Nahavandi, S. (2018). The Study of Using Eye Movements to Control the Laparoscope Under a Haptically-Enabled Laparoscopic Surgery Simulation Environment. 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 3022–3026. https://doi.org/10.1109/SMC.2018.00513
Beattie, K. L., & Morrison, B. W. (2018). Navigating the Online World: Gaze, Fixations, and Performance Differences between Younger and Older Users. International Journal of Human–Computer Interaction, 0(0), 1–14. https://doi.org/10.1080/10447318.2018.1541545
Meng, J., Streitz, T., Gulachek, N., Suma, D., & He, B. (2018). Three-Dimensional Brain–Computer Interface Control Through Simultaneous Overt Spatial Attentional and Motor Imagery Tasks. IEEE Transactions on Biomedical Engineering, 65(11), 2417–2427. https://doi.org/10.1109/TBME.2018.2872855
Natraj, N., Alterman, B., Basunia, S., & Wheaton, L. A. (2018). The Role of Attention and Saccades on Parietofrontal Encoding of Contextual and Grasp-specific Affordances of Tools: An ERP Study. Neuroscience, 394, 243–266. https://doi.org/10.1016/j.neuroscience.2018.10.019
Notaro, G. M., & Diamond, S. G. (2018). Simultaneous EEG, eye-tracking, behavioral, and screen-capture data during online German language learning. Data in Brief, 21, 1937–1943. https://doi.org/10.1016/j.dib.2018.11.044
Eraslan, S., Yaneva, V., Yesilada, Y., & Harper, S. (2018). Web users with autism: eye tracking evidence for differences. Behaviour & Information Technology, 0(0), 1–23. https://doi.org/10.1080/0144929X.2018.1551933
Koury, H. F., Leonard, C. J., Carry, P. M., & Lee, L. M. J. (2018). An Expert Derived Feedforward Histology Module Improves Pattern Recognition Efficiency in Novice Students. Anatomical Sciences Education, 0(0). https://doi.org/10.1002/ase.1854
Flynn, S. R., Quartuccio, J. S., Sibley, C., & Coyne, J. T. (2018). Variation in Pupil Diameter by Day and Time of Day. In D. Harris (Ed.), Engineering Psychology and Cognitive Ergonomics (pp. 296–305). Springer International Publishing.
Hauser, F., Mottok, J., & Gruber, H. (2018). Eye Tracking Metrics in Software Engineering. Proceedings of the 3rd European Conference of Software Engineering Education, 39–44. https://doi.org/10.1145/3209087.3209092
Husvogt, L., Moult, E., Waheed, N., Fujimoto, J. G., & Maier, A. (2018). Abstract: Efficient Labeling of Optical Coherence Tomography Angiography Data using Eye Tracking. In A. Maier, T. M. Deserno, H. Handels, K. H. Maier-Hein, C. Palm, & T. Tolxdorff (Eds.), Bildverarbeitung für die Medizin 2018 (pp. 20–21). Springer Berlin Heidelberg.
Notaro, G. M., & Diamond, S. G. (2018). Development and Demonstration of an Integrated EEG, Eye-tracking, and Behavioral Data Acquisition System to Assess Online Learning. Proceedings of the 10th International Conference on Education Technology and Computers, 105–111. https://doi.org/10.1145/3290511.3290526
Qvarfordt, P., & Lee, M. (2018). Gaze Patterns During Remote Presentations While Listening and Speaking. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 33:1-33:9. https://doi.org/10.1145/3204493.3204540
Rajanna, V., & Hammond, T. (2018). A Fitts’ Law Evaluation of Gaze Input on Large Displays Compared to Touch and Mouse Inputs. Proceedings of the Workshop on Communication by Gaze Interaction, 8:1-8:5. https://doi.org/10.1145/3206343.3206348
Spinelli, L., Pandey, M., & Oney, S. (2018). Attention Patterns for Code Animations: Using Eye Trackers to Evaluate Dynamic Code Presentation Techniques. Conference Companion of the 2Nd International Conference on Art, Science, and Engineering of Programming, 99–104. https://doi.org/10.1145/3191697.3214338
Unno, M., Ohba, H., Suzuki, Y., & Mizuno, S. (2018). Use Timestamp and Eye Tracking to Improve the Quality of Video Content. Proceedings of the 20th International Conference on Information Integration and Web-Based Applications & Services, 230–233. https://doi.org/10.1145/3282373.3282388
Yadav, D., Kohli, N., Kalsi, E., Vatsa, M., Singh, R., & Noore, A. (2018). Unraveling Human Perception of Facial Aging Using Eye Gaze. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2140–2147. http://openaccess.thecvf.com/content_cvpr_2018_workshops/w41/html/Yadav_Unraveling_Human_Perception_CVPR_2018_paper.html
Yaneva, V., Ha, L. A., Eraslan, S., Yesilada, Y., & Mitkov, R. (2018). Detecting Autism Based on Eye-Tracking Data from Web Searching Tasks. Proceedings of the Internet of Accessible Things, 16:1-16:10. https://doi.org/10.1145/3192714.3192819
2017
Krejtz, K., Duchowski, A., Zhou, H., Jörg, S., & Niedzielska, A. (2017). Perceptual evaluation of synthetic gaze jitter. Computer Animation and Virtual Worlds. Advance online publication. https://doi.org/10.1002/cav.1745
Chen, Y., & Huang, B. (2017). The Impacts of Bonus and Penalty on Creativity: Insights from an Eye-Tracking Study (SSRN Scholarly Paper No. ID 2900669). Social Science Research Network. https://papers.ssrn.com/abstract=2900669
Naicker, P., Anoopkumar-Dukie, S., Grant, G. D., Modenese, L., & Kavanagh, J. J. (2017). Medications influencing central cholinergic pathways affect fixation stability, saccadic response time and associated eye movement dynamics during a temporally-cued visual reaction time task. Psychopharmacology, 234(4), 671–680. https://doi.org/10.1007/s00213-016-4507-3
Wibirama, S., Mahesa, R. R., Nugroho, H. A., & Hamamoto, K. (2017). Estimating 3D gaze in physical environment: A geometric approach on consumer-level remote eye tracker. Proceedings of SPIE, 10225, 102251H. https://doi.org/10.1117/12.2266109
Salomao, L. A. T., Mahfouf, M., El-Samahy, E., & Ting, C. H. (2017). Psychophysiologically Based Real-Time Adaptive General Type 2 Fuzzy Modeling and Self-Organizing Control of Operator’s Performance Undertaking a Cognitive Task. IEEE Transactions on Fuzzy Systems, 25(1), 43–57. https://doi.org/10.1109/TFUZZ.2016.2598363
Murrugarra-Llerena, N., & Kovashka, A. (2017). Learning Attributes from Human Gaze. 2017 IEEE Winter Conference on Applications of Computer Vision (WACV), 510–519. https://doi.org/10.1109/WACV.2017.63
Li, S., Webb, J., Zhang, X., & Nelson, C. A. (2017). User evaluation of a novel eye-based control modality for robot-assisted object retrieval. Advanced Robotics, 31(7), 382–393. https://doi.org/10.1080/01691864.2016.1271748
Jeong, H., & Liu, Y. (2017, June 1). Modeling of stimulus-response secondary tasks with different modalities while driving in a computational cognitive architecture.
Kersbergen, I., & Field, M. (2017). Visual attention to alcohol cues and responsible drinking statements within alcohol advertisements and public health campaigns: Relationships with drinking intentions and alcohol consumption in the laboratory. Psychology of Addictive Behaviors: Journal of the Society of Psychologists in Addictive Behaviors, 31(4), 435–446. https://doi.org/10.1037/adb0000284
Coyne, J. T., Sibley, C., Sherwood, S., Foroughi, C. K., Olson, T., & Vorm, E. (2017). Assessing Workload with Low Cost Eye Tracking During a Supervisory Control Task. Augmented Cognition. Neurocognition and Machine Learning, 139–147. https://doi.org/10.1007/978-3-319-58628-1_12
Foroughi, C. K., Coyne, J. T., Sibley, C., Olson, T., Moclaire, C., & Brown, N. (2017). Pupil Dilation and Task Adaptation. Augmented Cognition. Neurocognition and Machine Learning, 304–311. https://doi.org/10.1007/978-3-319-58628-1_24
Harrison, A., Livingston, M. A., Brock, D., Decker, J., Perzanowski, D., Dolson, C. V., Mathews, J., Lulushi, A., & Raglin, A. (2017). The Analysis and Prediction of Eye Gaze When Viewing Statistical Graphs. Augmented Cognition. Neurocognition and Machine Learning, 148–165. https://doi.org/10.1007/978-3-319-58628-1_13
Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., & Coyne, J. T. (2017). Performance Evaluation of the Gazepoint GP3 Eye Tracking Device Based on Pupil Dilation. Augmented Cognition. Neurocognition and Machine Learning, 166–175. https://doi.org/10.1007/978-3-319-58628-1_14
Sibley, C., Foroughi, C. K., Olson, T., Moclaire, C., & Coyne, J. T. (2017). Practical Considerations for Low-Cost Eye Tracking: An Analysis of Data Loss and Presentation of a Solution. Augmented Cognition. Neurocognition and Machine Learning, 236–250. https://doi.org/10.1007/978-3-319-58628-1_19
Craig, T. (2017). Interacting with the Human Eye: Gaze Vector Shape Based Recognition and the Design of an Improved Episcleral Venomanometer. Embargoed Master’s Theses. http://digitalcommons.unl.edu/embargotheses/111
Bellisle, R., Steele, P., Bartels, R., Ding, L., Sunderam, S., & Besio, W. (2017). Identifying the effects of microsaccades in tripolar EEG signals. 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 4151–4154. https://doi.org/10.1109/EMBC.2017.8037770
Saisara, U., Boonbrahm, P., & Chaiwiriya, A. (2017). Strabismus screening by Eye Tracker and games. 2017 14th International Joint Conference on Computer Science and Software Engineering (JCSSE), 1–5. https://doi.org/10.1109/JCSSE.2017.8025956
Stoica, B., Florea, L., Bădeanu, A., Racoviţeanu, A., Felea, I., & Florea, C. (2017). Visual saliency analysis in paintings. 2017 International Symposium on Signals, Circuits and Systems (ISSCS), 1–4. https://doi.org/10.1109/ISSCS.2017.8034906
Li, S., & Zhang, X. (2017). Implicit Intention Communication in Human-Robot Interaction Through Visual Behavior Studies. IEEE Transactions on Human-Machine Systems, 47(4), 437–448. https://doi.org/10.1109/THMS.2017.2647882
Coyne, J. T., Foroughi, C., & Sibley, C. (2017). Pupil Diameter and Performance in a Supervisory Control Task: A Measure of Effort or Individual Differences? Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 61(1), 865–869. https://doi.org/10.1177/1541931213601689
Gomes, K. M., & Riggs, S. L. (2017). Analyzing Visual Search Techniques using Eye Tracking for a Computerized Provider Order Entry (CPOE) Task. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 61(1), 691–695. https://doi.org/10.1177/1541931213601659
Zorko, A., Ivančić Valenko, S., Tomiša, M., Keček, D., & Čerepinko, D. (2017). The impact of the text and background color on the screen reading experience. Tehnički Glasnik, 11(3), 78–82. https://hrcak.srce.hr/index.php?show=clanak&id_clanak_jezik=275271
Breeden, K., & Hanrahan, P. (2017). Gaze Data for the Analysis of Attention in Feature Films. ACM Trans. Appl. Percept., 14(4), 23:1-23:14. https://doi.org/10.1145/3127588
Foroughi, C. K., Sibley, C., & Coyne, J. T. (2017). Pupil size as a measure of within-task learning. Psychophysiology, 54(10), 1436–1443. https://doi.org/10.1111/psyp.12896
Andreu, L., & Sanz-Torrent, M. (2017). The Visual World Paradigm in Children with Spoken Language Disorders. In Eye-Tracking Technology Applications in Educational Research (pp. 262–282). IGI Global. http://books.google.com/books?id=ca80DQAAQBAJ
Barik, T., Smith, J., Lubick, K., Holmes, E., Feng, J., Murphy-Hill, E., & Parnin, C. (2017). Do Developers Read Compiler Error Messages? Proceedings of the 39th International Conference on Software Engineering, 575–585. https://doi.org/10.1109/ICSE.2017.59
Burch, I., Li, X., McDade, S., & Swanson, K. (2017). Tracking Gaze Patterns in Human Facial Recognition.
Durkee, P. (2017). Examining the Speed and Automaticity of Formidability Assessment Mechanisms. California State University, Fullerton.
Eraslan, S., Yaneva, V., Yesilada, Y., & Harper, S. (2017). Do Web Users with Autism Experience Barriers When Searching for Information Within Web Pages? Proceedings of the 14th Web for All Conference on The Future of Accessible Work, 20:1-20:4. https://doi.org/10.1145/3058555.3058566
Frydman, C., & Mormann, M. (2017). The Role of Salience and Attention in Choice Under Risk: An Experimental Investigation.
Halwani, Y., Salcudean, S. E., Lessoway, V. A., & Fels, S. S. (2017). Enhancing Zoom and Pan in Ultrasound Machines with a Multimodal Gaze-based Interface. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, 1648–1654. https://doi.org/10.1145/3027063.3053174
Knoll, M. A. (2017). Developing a bicyclist hazard perception test: Explorative research comparing adult and adolescent cyclists on a visual scanning and a key press measure.
Lim, H., & Fussell, S. R. (2017). Understanding How People Attend to and Engage with Foreign Language Posts in Multilingual Newsfeeds. ICWSM, 588–591.
Ni, Y. (2017). A Study of Danmaku Video on Attention Allocation, Social Presence, Transportation to Narrative, Cognitive Workload and Enjoyment. Syracuse University.
Nugter, A. (2017). The effect of driving experience on hazard perception in relation to visual attention.
Sarkar, A. R., Sanyal, G., & Majumder, S. (2017). Performance Comparison of Different Low Cost Cameras for Development of an Eye-Tracking System. International Journal of Applied Engineering Research, 12(9), 1819–1825.
Sibley, C., Coyne, J., & Sherwood, S. (2017). Research Considerations and Tools for Evaluating Human-Automation Interaction with Future Unmanned Systems. In Autonomy and Artificial Intelligence: A Threat or Savior? (pp. 157–178). Springer, Cham. https://doi.org/10.1007/978-3-319-59719-5_7
Uribe-Quevedo, A., Valdivia, S., Prada, E., Navia, M., Rincon, C., Ramos, E., Ortiz, S., & Perez, B. (2017). Development of an Occupational Health Care Exergaming Prototype Suite. In Recent Advances in Technologies for Inclusive Well-Being (pp. 127–145). Springer, Cham. https://doi.org/10.1007/978-3-319-49879-9_7
Yedla, N. (2017). An Eye Tracking Study Assessing Code Readability [Youngstown State University]. https://etd.ohiolink.edu/pg_10?0::NO:10:P10_ACCESSION_NUM:ysu149512047317961
2016
Leifman, G., Rudoy, D., Swedish, T., Bayro-Corrochano, E., & Raskar, R. (2016). Learning Gaze Transitions from Depth to Improve Video Saliency Estimation. arXiv:1603.03669 [cs]. http://arxiv.org/abs/1603.03669
Nguyen, A. N., & Sheridan, D. (2016, June 28). Eye-Tracking: A Cost-Effective Workstation for Usability Studies. https://researchspace.auckland.ac.nz/handle/2292/30191
Jankowski, J., Saganowski, S., & Bródka, P. (2016). Evaluation of TRANSFoRm Mobile eHealth Solution for Remote Patient Monitoring during Clinical Trials. Mobile Information Systems, 2016, e1029368. https://doi.org/10.1155/2016/1029368
Torres-Salomao, L. A., Mahfouf, M., El-Samahy, E., & Ting, C. (2016). Psycho-Physiologically-Based Real Time Adaptive General Type 2 Fuzzy Modelling and Self-Organising Control of Operator’s Performance Undertaking a Cognitive Task. IEEE Transactions on Fuzzy Systems. https://doi.org/10.1109/TFUZZ.2016.2598363
Craig, T. L., Nelson, C. A., Li, S., & Zhang, X. (2016). Human gaze commands classification: A shape based approach to interfacing with robots. 2016 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA), 1–6. https://doi.org/10.1109/MESA.2016.7587154
Coyne, J., & Sibley, C. (2016). Investigating the Use of Two Low Cost Eye Tracking Systems for Detecting Pupillary Response to Changes in Mental Workload. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1), 37–41. https://doi.org/10.1177/1541931213601009
Naicker, P., Anoopkumar-Dukie, S., Grant, G. D., & Kavanagh, J. J. (2016). Medications influencing central cholinergic neurotransmission affect saccadic and smooth pursuit eye movements in healthy young adults. Psychopharmacology. https://doi.org/10.1007/s00213-016-4436-1
Naicker, P., Anoopkumar-Dukie, S., Grant, G. D., Neumann, D. L., & Kavanagh, J. J. (2016). Central cholinergic pathway involvement in the regulation of pupil diameter, blink rate and cognitive function. Neuroscience, 334, 180–190. https://doi.org/10.1016/j.neuroscience.2016.08.009
Battle, D., Meaders, G., Link, N., Fitt, C., & Duchowski, A. T. (2016). Effects of Time Pressure on Fixation Clustering During a Seek and Find Game.
Best, D. S., & Duchowski, A. T. (2016). A Rotary Dial for Gaze-based PIN Entry. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, 69–76. https://doi.org/10.1145/2857491.2857527
Duchowski, A. T., Jörg, S., Allen, T. N., Giannopoulos, I., & Krejtz, K. (2016). Eye Movement Synthesis. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, 147–154. https://doi.org/10.1145/2857491.2857528
Filip, J., Havran, V., & Myszkowski, K. (2016). Gaze Analysis of BRDF Distortions. http://library.utia.cas.cz/separaty/2016/RO/filip-0462548.pdf
Gomes, J., Marques, F., Lourenço, A., Mendonça, R., Santana, P., & Barata, J. (2016). Gaze-Directed Telemetry in High Latency Wireless Communications: The Case of Robot Teleoperation. https://www.researchgate.net/profile/Andre_Lourenco4/publication/307936647_Gaze-Directed_Telemetry_in_High_Latency_Wireless_Communications_The_Case_of_Robot_Teleoperation/links/57d2a73d08ae6399a38d7a95.pdf
Havran, V., Filip, J., & Myszkowski, K. (2016). Perceptually motivated BRDF comparison using single image. Computer Graphics Forum, 35, 1–12. http://onlinelibrary.wiley.com/doi/10.1111/cgf.12944/full
Jaikumar, S., & Sahay, A. (2016). Effect of Overlapping Price Ranges on Price Perception: Revisiting the Range Theory of Price Perception. Indian Institute of Management Ahmedabad, Research and Publication Department. http://64.207.185.160/assets/snippets/workingpaperpdf/16266030742016-02-02.pdf
Kawase, S., & Obata, S. (2016). Audience gaze while appreciating a multipart musical performance. Consciousness and Cognition, 46, 15–26. http://www.sciencedirect.com/science/article/pii/S1053810016302975
Prichard, C., & Atkins, A. (2016). Evaluating L2 Readers’ Previewing Strategies Using Eye Tracking. The Reading Matrix: An International Online Journal, 16(2). http://www.readingmatrix.com/files/15-992935s1.pdf
Rodríguez, F., Gustavo, A., Perezchica, M., Guadalupe, M., & Alvarado Herrera, A. (2016). Desmitificando el valor de las imágenes de celebridades en sitios web de establecimientos de alojamiento turístico [Demystifying the value of celebrity images on tourist accommodation websites]. https://digitum.um.es/xmlui/handle/10201/51419
Ward, N. G., Jurado, C. N., Garcia, R. A., & Ramos, F. A. (2016). On the Possibility of Predicting Gaze Aversion to Improve Video-chat Efficiency. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, 267–270. https://doi.org/10.1145/2857491.2857497
2015
Dünser, A., Lochner, M., Engelke, U., & Fernández, D. R. (2015). Visual and Manual Control for Human-Robot Teleoperation. IEEE Computer Graphics and Applications, 35(3), 22–32. https://doi.org/10.1109/MCG.2015.4
Radecký, M., Vykopal, J., & Smutný, P. (2015). Analysis of syntactic elements and structure of web pages using eye-tracking technology. Carpathian Control Conference (ICCC), 2015 16th International, 420–425. https://doi.org/10.1109/CarpathianCC.2015.7145116
Nelson, C. A., Zhang, X., Webb, J., & Li, S. (2015). Fuzzy Control for Gaze-Guided Personal Assistance Robots: Simulation and Experimental Application. International Journal On Advances in Intelligent Systems, 8(1 and 2), 77–84. https://www.thinkmind.org/index.php?view=article&articleid=intsys_v8_n12_2015_7
Wibirama, S., Wijayanto, T., Nugroho, H. A., Bahit, M., & Winadi, M. N. (2015). Quantifying visual attention and visually induced motion sickness during day-night driving and sleep deprivation. 2015 International Conference on Data and Software Engineering (ICoDSE), 191–194. https://doi.org/10.1109/ICODSE.2015.7436996
El-Samahy, E., Mahfouf, M., Torres-Salomao, L. A., & Anzurez-Marin, J. (2015). A new computer control system for mental stress management using fuzzy logic. 2015 IEEE International Conference on Evolving and Adaptive Intelligent Systems (EAIS), 1–7. https://doi.org/10.1109/EAIS.2015.7368785
Burke, T., & Welter, J. (2015). Quantitative Analysis of Reading Comprehension and Reading Speed Based on Serif and Sans-serif Fonts. https://www.semanticscholar.org/paper/Quantitative-Analysis-of-Reading-Comprehension-and-Burke-Welter/b2a17cbaf8f1c74bb25ac6be3b03fd069a8aa96e
Folk, E. (2015). Quantitative Analysis of the Effects of Stage Lighting on Attention. https://www.semanticscholar.org/paper/Quantitative-Analysis-of-the-Effects-of-Stage-Folk/c9abf0946133098ae4e75df80f28aaa9c28cae05
Fong, S. S. M. (2015). Single-channel Electroencephalographic Recording in Children with Developmental Coordination Disorder: Validity and influence of Eye Blink Artifacts. Journal of Novel Physiotherapies, 05(04). https://doi.org/10.4172/2165-7025.1000270
Sarkar, A. R., Sanyal, G., & Majumder, S. (2015). Methodology for a Low-Cost Vision-Based Rehabilitation System for Stroke Patients. In S. Gupta, S. Bag, K. Ganguly, I. Sarkar, & P. Biswas (Eds.), Advancements of Medical Electronics (pp. 365–377). Springer India. https://doi.org/10.1007/978-81-322-2256-9_34
Taylor, P., Bilgrien, N., He, Z., & Siegelmann, H. T. (2015). EyeFrame: Real-time memory aid improves human multitasking via domain-general eye tracking procedures. Frontiers in ICT, 2, 17. https://doi.org/10.3389/fict.2015.00017
Yaneva, V., Temnikova, I., & Mitkov, R. (2015). Accessible Texts for Autism: An Eye-Tracking Study. Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, 49–57. https://doi.org/10.1145/2700648.2809852
2014
Zugal, S., & Pinggera, J. (2014). Low–Cost Eye–Trackers: Useful for Information Systems Research? In L. Iliadis, M. Papazoglou, & K. Pohl (Eds.), Advanced Information Systems Engineering Workshops (pp. 159–170). Springer International Publishing. https://doi.org/10.1007/978-3-319-07869-4_14
Pence, T. B., Dukes, L. C., Hodges, L. F., Meehan, N. K., & Johnson, A. (2014). An Eye Tracking Evaluation of a Virtual Pediatric Patient Training System for Nurses. In T. Bickmore, S. Marsella, & C. Sidner (Eds.), Intelligent Virtual Agents (pp. 329–338). Springer International Publishing. https://doi.org/10.1007/978-3-319-09767-1_43
Barber, T., Bertrand, J., Christ, C., Melloy, B. J., & Neyens, D. M. (2014). Comparing the Use of Active versus Passive Navigational Tools In a Virtual Desktop Environment via Eye Tracking. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 58(1), 1954–1958. https://doi.org/10.1177/1541931214581408
2013
Bolte, T., Nolan, A., & Andujar, M. (2013). Eye Scan-Path Training with Voice Dialog on Chest X-Rays and Analyzing Results using Gridinger’s Classification.
2010
Hennessey, C., & Duchowski, A. T. (2010). An open source eye-gaze interface: Expanding the adoption of eye-gaze in everyday applications. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, 81–84. http://dl.acm.org/citation.cfm?id=1743686