Gazepoint Citations
We are frequently asked whether we can provide examples of research papers in which Gazepoint eye-tracking technologies are used, and we are happy to share the publications we have found to date. If you are interested in using our best eye-tracking software for marketers in your research and don't have the software yet, shop now or contact us to get started!
If you have published research from a neuromarketing or other study that uses the Gazepoint system, please let us know and we will add a link to your work here! Our suggested reference for citing Gazepoint in your research is: Gazepoint (2024). GP3 Eye-Tracker. Retrieved from https://www.gazept.com
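For LaTeX-based manuscripts, the same suggested reference can be expressed as a BibTeX entry. This is a minimal sketch: the entry type (@misc) and the key gazepoint2024gp3 are illustrative choices of ours, not an official format; the author, title, year, and URL fields simply mirror the suggested reference above.

% Illustrative BibTeX entry; entry type and key are assumptions, fields follow the suggested reference above.
@misc{gazepoint2024gp3,
  author       = {{Gazepoint}},
  title        = {GP3 Eye-Tracker},
  year         = {2024},
  howpublished = {https://www.gazept.com},
  note         = {Retrieved from https://www.gazept.com}
}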
Fu, B., & Chow, N. (2025). AdaptLIL: A Real-Time Adaptive Linked Indented List Visualization for Ontology Mapping. In G. Demartini, K. Hose, M. Acosta, M. Palmonari, G. Cheng, H. Skaf-Molli, N. Ferranti, D. Hernández, & A. Hogan (Eds.), The Semantic Web – ISWC 2024 (Vol. 15232, pp. 3–22). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-77850-6_1

Токмовцева, А. Д., & Акельева, Е. В. (2024). Insights into Landscape Perception and Appreciation through Eye Movement Tracking. Lurian Journal, 5(2), 38–46. https://doi.org/10.15826/Lurian.2024.5.2.2

Chow, N., & Fu, B. (2024). AdaptLIL: A Gaze-Adaptive Visualization for Ontology Mapping (No. arXiv:2411.11768). arXiv. https://doi.org/10.48550/arXiv.2411.11768

Cui, Y., & Liu, X. (2024). How condensation and non-condensation impact viewers' processing effort and comprehension – an eye-tracking study on Chinese subtitling of English documentaries. Perspectives, 1–19. https://doi.org/10.1080/0907676X.2024.2433059

Erol Barkana, D., Bartl-Pokorny, K. D., Kose, H., Landowska, A., Milling, M., Robins, B., Schuller, B. W., Uluer, P., Wrobel, M. R., & Zorcec, T. (2024). Challenges in Observing the Emotions of Children with Autism Interacting with a Social Robot. International Journal of Social Robotics. https://doi.org/10.1007/s12369-024-01185-3

Ciukaj, M., & Skublewska-Paszkowska, M. (2024). Comparative analysis of the availability of popular social networking sites. Journal of Computer Sciences Institute, 32, 217–222. https://doi.org/10.35784/jcsi.6292

Huang, J., Gopalakrishnan, S., Mittal, T., Zuena, J., & Pytlarz, J. (2024). Analysis of Human Perception in Distinguishing Real and AI-Generated Faces: An Eye-Tracking Based Study (No. arXiv:2409.15498). arXiv. https://doi.org/10.48550/arXiv.2409.15498

Palacios-Ibáñez, A., Castellet-Lathan, S., & Contero, M. (2024). Exploring the user's gaze during product evaluation through the semantic differential: A comparison between virtual reality and photorealistic images. Virtual Reality, 28(3), 153. https://doi.org/10.1007/s10055-024-01048-2

Huang, Z., Zhu, G., Duan, X., Wang, R., Li, Y., Zhang, S., & Wang, Z. (2024). Measuring eye-tracking accuracy and its impact on usability in Apple Vision Pro (No. arXiv:2406.00255). arXiv. https://doi.org/10.48550/arXiv.2406.00255

Wiediartini, Ciptomulyono, U., & Dewi, R. S. (2024). Evaluation of physiological responses to mental workload in n-back and arithmetic tasks. Ergonomics, 67(8), 1121–1133. https://doi.org/10.1080/00140139.2023.2284677

Lin, J.-H., Hsu, M., & Guo, L.-Y. (2024). Investigation of the Reliability of Oculomotor Assessment of Gaze and Smooth Pursuit with a Novel Approach. 2024 17th International Convention on Rehabilitation Engineering and Assistive Technology (i-CREATe), 1–6. https://doi.org/10.1109/i-CREATe62067.2024.10776135

Silva, F., Garrido, M. I., & Soares, S. C. (2024). The effect of anxiety and its interplay with social cues when perceiving aggressive behaviours. Quarterly Journal of Experimental Psychology, 17470218241258208. https://doi.org/10.1177/17470218241258209

Bezgin Ediş, L., Kılıç, S., & Aydın, S. (2024). Message Appeals of Social Media Postings: An Experimental Study on Non-Governmental Organization. Journal of Nonprofit & Public Sector Marketing, 1–21. https://doi.org/10.1080/10495142.2024.2377975

Fu, B., Soriano, A. R., Chu, K., Gatsby, P., & Guardado, N. (2024). Modelling Visual Attention for Future Intelligent Flight Deck – A Case Study of Pilot Eye Tracking in Simulated Flight Takeoff. Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 170–175. https://doi.org/10.1145/3631700.3664871

Murphy, T., Armitage, J. A., van Wijngaarden, P., Abel, L. A., & Douglass, A. (2024). Unmasking visual search: An objective framework for grouping eye tracking data. Investigative Ophthalmology & Visual Science, 65(7), 5179.

Nguyen-Ho, T.-L., Kongmeesub, O., Tran, M.-T., Nie, D., Healy, G., & Gurrin, C. (2024). EAGLE: Eyegaze-Assisted Guidance and Learning Evaluation for Lifeloging Retrieval. Proceedings of the 7th Annual ACM Workshop on the Lifelog Search Challenge, 18–23. https://doi.org/10.1145/3643489.3661115

Tedla, S. K., MacKenzie, S., & Brown, M. (2024). LookToFocus: Image Focus via Eye Tracking. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3649902.3656358

Moutinho, L., & Cerf, M. (Eds.). (2024). Biometrics and Neuroscience Research in Business and Management: Advances and Applications. De Gruyter. https://doi.org/10.1515/9783110708509

Taieb-Maimon, M., & Romanovskii-Chernik, L. (2024). Improving Error Correction and Text Editing Using Voice and Mouse Multimodal Interface. International Journal of Human–Computer Interaction, 1–24. https://doi.org/10.1080/10447318.2024.2352932

Emami, P., Jiang, Y., Guo, Z., & Leiva, L. A. (2024). Impact of Design Decisions in Scanpath Modeling. Proceedings of the ACM on Human-Computer Interaction, 8(ETRA), 1–16. https://doi.org/10.1145/3655602
2%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Luis%20A.%22%2C%22lastName%22%3A%22Leiva%22%7D%5D%2C%22abstractNote%22%3A%22Modeling%20visual%20saliency%20in%20graphical%20user%20interfaces%20%28GUIs%29%20allows%20to%20understand%20how%20people%20perceive%20GUI%20designs%20and%20what%20elements%20attract%20their%20attention.%20One%20aspect%20that%20is%20often%20overlooked%20is%20the%20fact%20that%20computational%20models%20depend%20on%20a%20series%20of%20design%20parameters%20that%20are%20not%20straightforward%20to%20decide.%20We%20systematically%20analyze%20how%20different%20design%20parameters%20affect%20scanpath%20evaluation%20metrics%20using%20a%20state-of-the-art%20computational%20model%20%28DeepGaze%2B%2B%29.%20We%20particularly%20focus%20on%20three%20design%20parameters%3A%20input%20image%20size%2C%20inhibition-of-return%20decay%2C%20and%20masking%20radius.%20We%20show%20that%20even%20small%20variations%20of%20these%20design%20parameters%20have%20a%20noticeable%20impact%20on%20standard%20evaluation%20metrics%20such%20as%20DTW%20or%20Eyenalysis.%20These%20effects%20also%20occur%20in%20other%20scanpath%20models%2C%20such%20as%20UMSS%20and%20ScanGAN%2C%20and%20in%20other%20datasets%20such%20as%20MASSVIS.%20Taken%20together%2C%20our%20results%20put%20forward%20the%20impact%20of%20design%20decisions%20for%20predicting%20users%27%20viewing%20behavior%20on%20GUIs.%22%2C%22date%22%3A%222024-05-20%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1145%5C%2F3655602%22%2C%22ISSN%22%3A%222573-0142%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdl.acm.org%5C%2Fdoi%5C%2F10.1145%5C%2F3655602%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-21T03%3A40%3A27Z%22%7D%7D%2C%7B%22key%22%3A%22C8VT35L7%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Dondi%20et%20al.%22%2C%22parsedDate%22%3A%222024-04-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EDondi%2C%20P.%2C%20Sapuppo%2C%20S.%2C%20%26amp%3B%20Porta%2C%20M.%20%282024%29.%20Leyenes%3A%20A%20gaze-based%20text%20entry%20method%20using%20linear%20smooth%20pursuit%20and%20target%20speed.%20%3Ci%3EInternational%20Journal%20of%20Human-Computer%20Studies%3C%5C%2Fi%3E%2C%20%3Ci%3E184%3C%5C%2Fi%3E%2C%20103204.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.ijhcs.2023.103204%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.ijhcs.2023.103204%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Leyenes%3A%20A%20gaze-based%20text%20entry%20method%20using%20linear%20smooth%20pursuit%20and%20target%20speed%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Piercarlo%22%2C%22lastName%22%3A%22Dondi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Samuel%22%2C%22lastName%22%3A%22Sapuppo%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marco%22%2C%22lastName%22%3A%22Porta%22%7D%5D%2C%22abstractNote%22%3A%22Gaze-based%20writing%20is%20one%20of%20the%20most%20widespread%20eye%20tracking%20applications%20for%20human%5Cu2013computer%20interaction.%20While%20eye%20tracking%20communication%20has%20traditionally%20been%20employed%20as%20an%20assistive%20technology%2C%20declining%20prices%20of%20eye%20trackers%20now%20make%20it%20a%20
feasible%20alternative%20to%20keyboards%20or%20touchscreens%20in%20many%20contexts%20%28for%20example%2C%20the%20interaction%20with%20public%20info%20points%29.%20In%20this%20paper%20we%20propose%20Leyenes%2C%20a%20text%20entry%20method%20based%20on%20smooth%20pursuit%2C%20a%20natural%20eye%20movement%20that%20occurs%20when%20the%20gaze%20follows%20a%20moving%20target.%20Our%20approach%20requires%20no%20explicit%20calibration%20by%20the%20user%2C%20allowing%20for%20more%20spontaneous%20interaction%20and%20enabling%20eye%20input%20even%20when%20calibration%20is%20difficult%20to%20achieve%20or%20maintain.%20To%20the%20best%20of%20our%20knowledge%2C%20Leyenes%20is%20the%20first%20text%20entry%20technique%20based%20on%20smooth%20pursuit%20that%20considers%20both%20%28approximate%29%20gaze%20speed%20and%20position%20and%20employs%20a%20linear%20interface%20instead%20of%20the%20more%20common%20circular%20layouts.%20The%20results%20of%20the%20user%20study%20we%20conducted%20show%20that%20the%20proposed%20solution%20is%20slow%20but%20robust%2C%20with%20a%20very%20low%20error%20rate%2C%20which%20makes%20it%20particularly%20suitable%20for%20extemporaneous%20writing%20of%20short%20text.%22%2C%22date%22%3A%222024-04-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.ijhcs.2023.103204%22%2C%22ISSN%22%3A%221071-5819%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS1071581923002136%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-03T21%3A45%3A48Z%22%7D%7D%2C%7B%22key%22%3A%2266IV7ZI2%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Kobylska%20and%20Dzie%5Cu0144kowski%22%2C%22parsedDate%22%3A%222024-03-20%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EKobylska%2C%20A.%2C%20%26amp%3B%20Dzie%26%23x144%3Bkowski%2C%20M.%20%282024%29.%20User%20experience%20analysis%20in%20virtual%20museums.%20%3Ci%3EJournal%20of%20Computer%20Sciences%20Institute%3C%5C%2Fi%3E%2C%20%3Ci%3E30%3C%5C%2Fi%3E%2C%2031%26%23x2013%3B38.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.35784%5C%2Fjcsi.5382%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.35784%5C%2Fjcsi.5382%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22User%20experience%20analysis%20in%20virtual%20museums%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aleksandra%22%2C%22lastName%22%3A%22Kobylska%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mariusz%22%2C%22lastName%22%3A%22Dzie%5Cu0144kowski%22%7D%5D%2C%22abstractNote%22%3A%22The%20paper%20presents%20an%20analysis%20of%20user%20experience%20in%20virtual%20museums.%20The%20objects%20of%20interest%20are%20museums%20that%20offer%20a%5Cu00a0virtual%20walk%20which%20allows%20the%20user%20to%20visit%20and%20view%20exhibitions%20without%20leaving%20home.%20Among%20the%20selected%20objects%20for%20the%20study%20were%20the%20Museum%20of%20Auschwitz-Birkenau%20and%20the%20Malbork%20Castle%20Museum.%20The%20interfaces%20of%20these%20two%20virtual%20museums%20were%20subjected%20to%20eye%20tracking%20analysis%20using%20the%20Gazepoint%20GP3%20HD%20eye%20tracker%20and%20an%20expert%20analysis%20using%20Nielsen%20heuristics.%20Additionally%2C%20a%20survey%20consisting%
20of%20questions%20from%20the%20System%20Usability%20Scale%20and%20self-reported%20questions%20was%20then%20conducted%20to%20help%20gather%20information%20on%20usability%20and%20to%20collect%20opinions%20about%20online%20museums.%20The%20research%20group%20consisted%20of%20sixteen%20students%20from%20the%20Lublin%20University%20of%20Technology%2C%20who%20were%20presented%20with%20the%20same%20tasks%20to%20perform%20in%20the%20virtual%20museums.%20As%20a%20result%20of%20the%20research%2C%20it%20turned%20out%20that%20both%20according%20to%20the%20Nielsen%20heuristics%20and%20the%20System%20Usability%20Scale%20survey%2C%20the%20Auschwitz-Birkenau%20Museum%20was%20rated%20better%2C%20while%20in%20the%20eye%20tracking%20experiment%20both%20museums%20obtained%20similar%20results.%22%2C%22date%22%3A%222024-03-20%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.35784%5C%2Fjcsi.5382%22%2C%22ISSN%22%3A%222544-0764%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fph.pollub.pl%5C%2Findex.php%5C%2Fjcsi%5C%2Farticle%5C%2Fview%5C%2F5382%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-18T19%3A39%3A20Z%22%7D%7D%2C%7B%22key%22%3A%22FXPHLJMJ%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Moradizeyveh%20et%20al.%22%2C%22parsedDate%22%3A%222024-03-12%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EMoradizeyveh%2C%20S.%2C%20Tabassum%2C%20M.%2C%20Liu%2C%20S.%2C%20Newport%2C%20R.%20A.%2C%20Beheshti%2C%20A.%2C%20%26amp%3B%20Ieva%2C%20A.%20D.%20%282024%29.%20%3Ci%3EWhen%20Eye-Tracking%20Meets%20Machine%20Learning%3A%20A%20Systematic%20Review%20on%20Applications%20in%20Medical%20Image%20Analysis%3C%5C%2Fi%3E%20%28No.%20arXiv%3A2403.07834%29.%20arXiv.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2403.07834%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2403.07834%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22When%20Eye-Tracking%20Meets%20Machine%20Learning%3A%20A%20Systematic%20Review%20on%20Applications%20in%20Medical%20Image%20Analysis%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sahar%22%2C%22lastName%22%3A%22Moradizeyveh%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mehnaz%22%2C%22lastName%22%3A%22Tabassum%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sidong%22%2C%22lastName%22%3A%22Liu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Robert%20Ahadizad%22%2C%22lastName%22%3A%22Newport%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Amin%22%2C%22lastName%22%3A%22Beheshti%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Antonio%20Di%22%2C%22lastName%22%3A%22Ieva%22%7D%5D%2C%22abstractNote%22%3A%22Eye-gaze%20tracking%20research%20offers%20significant%20promise%20in%20enhancing%20various%20healthcare-related%20tasks%2C%20above%20all%20in%20medical%20image%20analysis%20and%20interpretation.%20Eye%20tracking%2C%20a%20technology%20that%20monitors%20and%20records%20the%20movement%20of%20the%20eyes%2C%20provides%20valuable%20insights%20into%20human%20visual%20attention%20patterns.%20This%20technology%20can%20transform%20how%20healthcare%20professionals%20and%20medical%20specialists%20engage%20with%20and%20analyze%20d
iagnostic%20images%2C%20offering%20a%20more%20insightful%20and%20efficient%20approach%20to%20medical%20diagnostics.%20Hence%2C%20extracting%20meaningful%20features%20and%20insights%20from%20medical%20images%20by%20leveraging%20eye-gaze%20data%20improves%20our%20understanding%20of%20how%20radiologists%20and%20other%20medical%20experts%20monitor%2C%20interpret%2C%20and%20understand%20images%20for%20diagnostic%20purposes.%20Eye-tracking%20data%2C%20with%20intricate%20human%20visual%20attention%20patterns%20embedded%2C%20provides%20a%20bridge%20to%20integrating%20artificial%20intelligence%20%28AI%29%20development%20and%20human%20cognition.%20This%20integration%20allows%20novel%20methods%20to%20incorporate%20domain%20knowledge%20into%20machine%20learning%20%28ML%29%20and%20deep%20learning%20%28DL%29%20approaches%20to%20enhance%20their%20alignment%20with%20human-like%20perception%20and%20decision-making.%20Moreover%2C%20extensive%20collections%20of%20eye-tracking%20data%20have%20also%20enabled%20novel%20ML%5C%2FDL%20methods%20to%20analyze%20human%20visual%20patterns%2C%20paving%20the%20way%20to%20a%20better%20understanding%20of%20human%20vision%2C%20attention%2C%20and%20cognition.%20This%20systematic%20review%20investigates%20eye-gaze%20tracking%20applications%20and%20methodologies%20for%20enhancing%20ML%5C%2FDL%20algorithms%20for%20medical%20image%20analysis%20in%20depth.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2403.07834%22%2C%22date%22%3A%222024-03-12%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2403.07834%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2403.07834%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-18T19%3A53%3A19Z%22%7D%7D%2C%7B%22key%22%3A%22BVJ5L4RW%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Jiang%20et%20al.%22%2C%22parsedDate%22%3A%222024-02-07%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EJiang%2C%20Y.%2C%20Leiva%2C%20L.%20A.%2C%20Houssel%2C%20P.%20R.%20B.%2C%20Tavakoli%2C%20H.%20R.%2C%20Kylm%26%23xE4%3Bl%26%23xE4%3B%2C%20J.%2C%20%26amp%3B%20Oulasvirta%2C%20A.%20%282024%29.%20%3Ci%3EUEyes%3A%20An%20Eye-Tracking%20Dataset%20across%20User%20Interface%20Types%3C%5C%2Fi%3E%20%28No.%20arXiv%3A2402.05202%29.%20arXiv.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2402.05202%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2402.05202%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22UEyes%3A%20An%20Eye-Tracking%20Dataset%20across%20User%20Interface%20Types%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yue%22%2C%22lastName%22%3A%22Jiang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Luis%20A.%22%2C%22lastName%22%3A%22Leiva%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Paul%20R.%20B.%22%2C%22lastName%22%3A%22Houssel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hamed%20R.%22%2C%22lastName%22%3A%22Tavakoli%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Julia%22%2C%22lastName%22%3A%22Kylm%5Cu00e4l%5Cu00e4%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Antti%2
2%2C%22lastName%22%3A%22Oulasvirta%22%7D%5D%2C%22abstractNote%22%3A%22Different%20types%20of%20user%20interfaces%20differ%20significantly%20in%20the%20number%20of%20elements%20and%20how%20they%20are%20displayed.%20To%20examine%20how%20such%20differences%20affect%20the%20way%20users%20look%20at%20UIs%2C%20we%20collected%20and%20analyzed%20a%20large%20eye-tracking-based%20dataset%2C%20UEyes%20%2862%20participants%2C%201%2C980%20UI%20screenshots%2C%20near%2020K%20eye%20movement%20sequences%29%2C%20covering%20four%20major%20UI%20types%3A%20webpage%2C%20desktop%20UI%2C%20mobile%20UI%2C%20and%20poster.%20Furthermore%2C%20we%20analyze%20and%20discuss%20the%20differences%20in%20important%20factors%2C%20such%20as%20color%2C%20location%2C%20and%20gaze%20direction%20across%20UI%20types%2C%20individual%20viewing%20strategies%20and%20potential%20future%20directions.%20This%20position%20paper%20is%20a%20derivative%20of%20our%20recent%20paper%20with%20a%20particular%20focus%20on%20the%20UEyes%20dataset.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2402.05202%22%2C%22date%22%3A%222024-02-07%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2402.05202%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2402.05202%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-18T19%3A53%3A19Z%22%7D%7D%2C%7B%22key%22%3A%22Y7SP8ZNQ%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Yin%20and%20Neyens%22%2C%22parsedDate%22%3A%222024-02-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EYin%2C%20R.%2C%20%26amp%3B%20Neyens%2C%20D.%20M.%20%282024%29.%20Examining%20how%20information%20presentation%20methods%20and%20a%20chatbot%20impact%20the%20use%20and%20effectiveness%20of%20electronic%20health%20record%20patient%20portals%3A%20An%20exploratory%20study.%20%3Ci%3EPatient%20Education%20and%20Counseling%3C%5C%2Fi%3E%2C%20%3Ci%3E119%3C%5C%2Fi%3E%2C%20108055.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.pec.2023.108055%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.pec.2023.108055%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Examining%20how%20information%20presentation%20methods%20and%20a%20chatbot%20impact%20the%20use%20and%20effectiveness%20of%20electronic%20health%20record%20patient%20portals%3A%20An%20exploratory%20study%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rong%22%2C%22lastName%22%3A%22Yin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22David%20M.%22%2C%22lastName%22%3A%22Neyens%22%7D%5D%2C%22abstractNote%22%3A%22Objectives%5CnExamining%20information%20presentation%20strategies%20that%20may%20facilitate%20patient%20education%20through%20patient%20portals%20is%20important%20for%20effective%20health%20education.%5CnMethods%5CnA%20randomized%20exploratory%20study%20evaluated%20information%20presentation%20%28text%20or%20videos%29%20and%20a%20chatbot%20in%20patient%20education%20and%20examined%20several%20performance%20and%20outcome%20variables%20%28e.g.%2C%20search%20duration%2C%20Decisional%20Conflict%20Scale%2C%20and%20eye-tracking%20measures%29%2C%20along%20with%20a%20simple%20descriptive%20q
ualitative%20content%20analysis%20of%20the%20transcript%20of%20chatbot.%5CnResults%5CnOf%20the%2092%20participants%2C%20those%20within%20the%20text%20conditions%20%28n%5Cu00a0%3D%5Cu00a046%2C%20p%5Cu00a0%3C%5Cu00a00.001%29%2C%20had%20chatbot%20experiences%20%28B%20%3D%5Cu221274.85%2C%20p%5Cu00a0%3D%5Cu00a00.046%29%2C%20knew%20someone%20with%20IBD%20%28B%20%3D%5Cu221298.66%2C%20p%5Cu00a0%3D%5Cu00a00.039%29%2C%20and%20preferred%20to%20engage%20in%20medical%20decision-making%20%28B%20%3D102.32%2C%20p%5Cu00a0%3D%5Cu00a00.006%29%20were%20more%20efficient%20in%20information-searching.%20Participants%20with%20videos%20spent%20longer%20in%20information-searching%20%28mean%3D666.5%20%28SD%3D171.6%29%20VS%20480.3%20%28SD%3D159.5%29%20seconds%2C%20p%5Cu00a0%3C%5Cu00a00.001%29%20but%20felt%20more%20informed%20%28mean%20score%3D18.8%20%28SD%3D17.6%29%20VS%2027.4%20%28SD%3D18.9%29%2C%20p%5Cu00a0%3D%5Cu00a00.027%29.%20The%20participants%5Cu2019%20average%20eye%20fixation%20duration%20with%20videos%20was%20significantly%20higher%20%28mean%3D%20473.8%5Cu00a0ms%2C%20SD%3D52.9%2C%20p%5Cu00a0%3C%5Cu00a00.001%29.%5CnConclusions%5CnParticipants%20in%20video%20conditions%20were%20less%20efficient%20but%20more%20effective%20in%20information%20seeking.%20Exploring%20the%20trade-offs%20between%20efficiency%20and%20effectiveness%20for%20user%20interface%20designs%20is%20important%20to%20appropriately%20deliver%20education%20within%20patient%20portals.%5CnPractice%20implications%5CnThis%20study%20suggests%20that%20user%20interface%20designs%20and%20chatbots%20impact%20health%20information%5Cu2019s%20efficiency%20and%20effectiveness.%22%2C%22date%22%3A%222024-02-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.pec.2023.108055%22%2C%22ISSN%22%3A%220738-3991%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS0738399123004366%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-03T19%3A16%3A12Z%22%7D%7D%2C%7B%22key%22%3A%22IPN8GYUD%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Murphy%20et%20al.%22%2C%22parsedDate%22%3A%222024-01-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EMurphy%2C%20T.%20I.%2C%20Abel%2C%20L.%20A.%2C%20Armitage%2C%20J.%20A.%2C%20%26amp%3B%20Douglass%2C%20A.%20G.%20%282024%29.%20Effects%20of%20tracker%20location%20on%20the%20accuracy%20and%20precision%20of%20the%20Gazepoint%20GP3%20HD%20for%20spectacle%20wearers.%20%3Ci%3EBehavior%20Research%20Methods%3C%5C%2Fi%3E%2C%20%3Ci%3E56%3C%5C%2Fi%3E%281%29%2C%2043%26%23x2013%3B52.%20%3Ca%20class%3D%27zp-ItemURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.3758%5C%2Fs13428-022-02023-y%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.3758%5C%2Fs13428-022-02023-y%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Effects%20of%20tracker%20location%20on%20the%20accuracy%20and%20precision%20of%20the%20Gazepoint%20GP3%20HD%20for%20spectacle%20wearers%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Timothy%20I.%22%2C%22lastName%22%3A%22Murphy%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Larry%20A.%22%2C%22lastName%22%3A%22Abel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22James%20A.%22%2C%22lastName%22%3A%22Ar
mitage%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Amanda%20G.%22%2C%22lastName%22%3A%22Douglass%22%7D%5D%2C%22abstractNote%22%3A%22Remote%20view%20eye-tracking%20systems%20are%20prone%20to%20errors%20when%20used%20on%20spectacle%20wearers%20due%20to%20reflections%20from%20the%20lenses%20and%20frame%20that%20result%20in%20inaccurate%20tracking.%20Traditionally%2C%20these%20trackers%20are%20situated%20below%20a%20computer%20monitor%20and%20the%20viewer%5Cu2019s%20eye%20moments%20are%20recorded%20while%20they%20view%20the%20screen.%20Reflections%20may%20be%20influenced%20by%20the%20pantoscopic%20tilt%20of%20the%20spectacles%2C%20whereby%20the%20tilt%20angle%20causes%20incident%20light%20to%20be%20reflected%20to%20the%20camera.%20To%20overcome%20this%20problem%2C%20we%20propose%20mounting%20the%20tracker%20above%20the%20monitor%20to%20avoid%20these%20reflections%20and%20test%20the%20accuracy%20and%20precision%20of%20subjects%20with%20single%20vision%20spectacles%2C%20multifocals%2C%20and%20no%20correction%2C%20using%20both%20mounting%20positions.%20Experimental%20results%20showed%20that%20this%20alternate%20position%20had%20overall%20worse%20accuracy%20%284.06%5Cu00b0%20%5Cu00b1%200.13%29%20and%20precision%20%280.67%5Cu00b0%20%5Cu00b1%200.05%29%20compared%20to%20the%20standard%20configuration%20%282.15%5Cu00b0%20%5Cu00b1%200.06%20vs.%200.50%5Cu00b0%20%5Cu00b1%200.03%29%2C%20with%20more%20invalid%20readings%20%285.91%20vs.%2019.19%25%29%20for%20single%20vision%20lens%20wearers.%20Multifocals%20performed%20better%20for%20the%20top-mounting%20position%20for%20the%20top%20portion%20of%20the%20monitor%2C%20suggesting%20higher-order%20aberrations%20from%20the%20bottom%20portion%20of%20the%20lens%20negatively%20impact%20data%20quality.%20Higher%20pantoscopic%20tilt%20angles%20displayed%20an%20improved%20accuracy%20for%20this%20alternate%20position%20%28r%289%29%20%3D%20%5Cu2212%200.69%2C%20p%20%3D%200.02%29%2C%20with%20superior%20accuracy%20for%20tilt%20angles%20greater%20than%2014%5Cu00b0%20compared%20to%20the%20standard%20configuration.%20This%20study%20quantifies%20the%20impact%20of%20spectacle%20wear%20on%20eye-tracking%20performance%20and%20suggests%20other%20alternate%20mounting%20positions%20may%20be%20viable%20in%20certain%20situations.%22%2C%22date%22%3A%222024-01-01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.3758%5C%2Fs13428-022-02023-y%22%2C%22ISSN%22%3A%221554-3528%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.3758%5C%2Fs13428-022-02023-y%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-18T19%3A20%3A07Z%22%7D%7D%2C%7B%22key%22%3A%22KCFKRMVU%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Taieb-Maimon%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ETaieb-Maimon%2C%20M.%2C%20Romanovski-Chernik%2C%20A.%2C%20Last%2C%20M.%2C%20Litvak%2C%20M.%2C%20%26amp%3B%20Elhadad%2C%20M.%20%282024%29.%20Mining%20Eye-Tracking%20Data%20for%20Text%20Summarization.%20%3Ci%3EInternational%20Journal%20of%20Human%26%23x2013%3BComputer%20Interaction%3C%5C%2Fi%3E%2C%20%3Ci%3E40%3C%5C%2Fi%3E%2817%29%2C%204887%26%23x2013%3B4905.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F10447318.2023.2227827%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F10
447318.2023.2227827%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Mining%20Eye-Tracking%20Data%20for%20Text%20Summarization%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Meirav%22%2C%22lastName%22%3A%22Taieb-Maimon%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aleksandr%22%2C%22lastName%22%3A%22Romanovski-Chernik%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mark%22%2C%22lastName%22%3A%22Last%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marina%22%2C%22lastName%22%3A%22Litvak%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%22%2C%22lastName%22%3A%22Elhadad%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%2209%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1080%5C%2F10447318.2023.2227827%22%2C%22ISSN%22%3A%221044-7318%2C%201532-7590%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.tandfonline.com%5C%2Fdoi%5C%2Ffull%5C%2F10.1080%5C%2F10447318.2023.2227827%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-21T04%3A03%3A51Z%22%7D%7D%2C%7B%22key%22%3A%22K6MA3PYT%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Jankowski%20and%20Goroncy%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EJankowski%2C%20M.%2C%20%26amp%3B%20Goroncy%2C%20A.%20%282024%29.%20Anatomical%20variants%20of%20acne%20differ%20in%20their%20impact%20on%20social%20perception.%20%3Ci%3EJournal%20of%20the%20European%20Academy%20of%20Dermatology%20and%20Venereology%3C%5C%2Fi%3E%2C%20%3Ci%3E38%3C%5C%2Fi%3E%288%29%2C%201628%26%23x2013%3B1636.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1111%5C%2Fjdv.19798%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1111%5C%2Fjdv.19798%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Anatomical%20variants%20of%20acne%20differ%20in%20their%20impact%20on%20social%20perception%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marek%22%2C%22lastName%22%3A%22Jankowski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Agnieszka%22%2C%22lastName%22%3A%22Goroncy%22%7D%5D%2C%22abstractNote%22%3A%22Abstract%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Background%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Acne%20negatively%20affects%20quality%20of%20life%2C%20however%20quality%5Cu2010of%5Cu2010life%20scores%20poorly%20correlate%20with%20disease%20severity%20scores.%20Previous%20research%20demonstrated%20existence%20of%20facial%20areas%20in%20which%20skin%20lesions%20have%20greater%20impact%20on%20gaze%20patterns.%20Therefore%2C%20we%20hypothesized%20that%20anatomical%20variants%20of%20acne%20may%20be%20perceived%20differently.%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Objectives%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20The%20aim%20was%20to%20investigate%20effect%20of%20anatomical%20variants%20of%20acne%20on%20natural%20gaze%20patterns%20and%20resulting%20impact%20on%20social%20perception%20of%20acne%20patients.%5Cn%20%20%20%20%20%20%20%20%20%
20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Methods%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20We%20tracked%20eye%20movements%20of%20participants%20viewing%20neutral%20and%20emotional%20faces%20with%20acne.%20Images%20were%20rated%20for%20acne%5Cu2010related%20visual%20disturbance%2C%20and%20emotional%20faces%20were%20rated%20for%20valence%20intensity.%20Respondents%20of%20an%20online%20survey%20were%20asked%20to%20rate%20their%20perception%20of%20pictured%20individuals%27%20personality%20traits.%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Results%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20All%20faces%20with%20acne%20were%20perceived%20as%20less%20attractive%20and%20received%20poorer%20personality%20judgements%20with%20mid%5Cu2010facial%20acne%20presenting%20smallest%20deviation%20from%20healthy%20faces.%20T%5Cu2010zone%20and%20mixed%20acne%20exhibited%20the%20least%20significant%20difference%20in%20respondents%20gaze%20behaviour%20pattern%20from%20each%20other.%20In%20addition%2C%20there%20was%20no%20significant%20difference%20in%20respondents%27%20grading%20of%20acne%20visual%20disturbance%20or%20ratings%20for%20attractiveness%2C%20success%20and%20trustworthiness.%20U%5Cu2010zone%20adult%20female%20acne%20was%20rated%20as%20the%20most%20visually%20disturbing%20and%20received%20the%20lowest%20scores%20for%20attractiveness.%20Happy%20faces%20with%20adult%20female%20acne%20were%20rated%20as%20less%20happy%20compared%20to%20other%20acne%20variants%20and%20clear%5Cu2010skin%20faces.%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Conclusions%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Anatomic%20variants%20of%20acne%20have%20a%20distinct%20impact%20on%20gaze%20patterns%20and%20social%20perception.%20Adult%20female%20acne%20has%20the%20strongest%20negative%20effect%20on%20recognition%20of%20positive%20emotions%20in%20affected%20individuals%2C%20attractiveness%20ratings%20and%20forming%20social%20impressions.%20If%20perioral%20acne%20lesions%20are%20absent%2C%20frontal%20lesions%20determine%20impact%20of%20acne%20on%20social%20perception%20irrespective%20of%20the%20presence%20of%20mid%5Cu2010facial%20lesions.%20This%20perceptive%20hierarchy%20should%20be%20taken%20into%20consideration%20while%20deciding%20treatment%20goals%20in%20acne%20patients%2C%20prioritizing%20achieving%20remission%20in%20perioral%20and%20frontal%20area.%22%2C%22date%22%3A%2208%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1111%5C%2Fjdv.19798%22%2C%22ISSN%22%3A%220926-9959%2C%201468-3083%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fonlinelibrary.wiley.com%5C%2Fdoi%5C%2F10.1111%5C%2Fjdv.19798%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-21T03%3A40%3A27Z%22%7D%7D%2C%7B%22key%22%3A%22SNRN8LS6%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Chv%5Cu00e1tal%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EChv%26%23xE1%3Btal%2C%20R.%2C%20Slez%26%23xE1%3Bkov%26%23xE1%3B%2C%20J.%2C%20%26amp%3B%20Popelka%2C%20S.%20%282024%29.%20Analysis%20of%20problem-solving%20strategies%20for%20the%20development%20of%20geometric%20imaginatio
n%20using%20eye-tracking.%20%3Ci%3EEducation%20and%20Information%20Technologies%3C%5C%2Fi%3E%2C%20%3Ci%3E29%3C%5C%2Fi%3E%2810%29%2C%2012969%26%23x2013%3B12987.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs10639-023-12395-z%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs10639-023-12395-z%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Analysis%20of%20problem-solving%20strategies%20for%20the%20development%20of%20geometric%20imagination%20using%20eye-tracking%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Roman%22%2C%22lastName%22%3A%22Chv%5Cu00e1tal%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jana%22%2C%22lastName%22%3A%22Slez%5Cu00e1kov%5Cu00e1%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stanislav%22%2C%22lastName%22%3A%22Popelka%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%2207%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1007%5C%2Fs10639-023-12395-z%22%2C%22ISSN%22%3A%221360-2357%2C%201573-7608%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Flink.springer.com%5C%2F10.1007%5C%2Fs10639-023-12395-z%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-18T19%3A56%3A25Z%22%7D%7D%2C%7B%22key%22%3A%22JY77W8PQ%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Chhimpa%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EChhimpa%2C%20G.%20R.%2C%20Kumar%2C%20A.%2C%20Garhwal%2C%20S.%2C%20%26amp%3B%20Dhiraj.%20%282024%29.%20Empowering%20individuals%20with%20disabilities%3A%20a%20real-time%2C%20cost-effective%2C%20calibration-free%20assistive%20system%20utilizing%20eye%20tracking.%20%3Ci%3EJournal%20of%20Real-Time%20Image%20Processing%3C%5C%2Fi%3E%2C%20%3Ci%3E21%3C%5C%2Fi%3E%283%29%2C%2097.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs11554-024-01478-w%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs11554-024-01478-w%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Empowering%20individuals%20with%20disabilities%3A%20a%20real-time%2C%20cost-effective%2C%20calibration-free%20assistive%20system%20utilizing%20eye%20tracking%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Govind%20Ram%22%2C%22lastName%22%3A%22Chhimpa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ajay%22%2C%22lastName%22%3A%22Kumar%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sunita%22%2C%22lastName%22%3A%22Garhwal%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22name%22%3A%22Dhiraj%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%2206%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1007%5C%2Fs11554-024-01478-w%22%2C%22ISSN%22%3A%221861-8200%2C%201861-8219%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Flink.springer.com%5C%2F10.1007%5C%2Fs11554-024-01478-w%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-18T19%3A53%3A19Z%22%7D%7D%2C%7B%22key%22%3A%22Z5HSD6PE%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Avoyan%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildr
en%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EAvoyan%2C%20A.%2C%20Ribeiro%2C%20M.%2C%20Schotter%2C%20A.%2C%20Schotter%2C%20E.%20R.%2C%20Vaziri%2C%20M.%2C%20%26amp%3B%20Zou%2C%20M.%20%282024%29.%20Planned%20vs.%20Actual%20Attention.%20%3Ci%3EManagement%20Science%3C%5C%2Fi%3E%2C%20%3Ci%3E70%3C%5C%2Fi%3E%285%29%2C%202912%26%23x2013%3B2933.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1287%5C%2Fmnsc.2023.4834%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1287%5C%2Fmnsc.2023.4834%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Planned%20vs.%20Actual%20Attention%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ala%22%2C%22lastName%22%3A%22Avoyan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mauricio%22%2C%22lastName%22%3A%22Ribeiro%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andrew%22%2C%22lastName%22%3A%22Schotter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elizabeth%20R.%22%2C%22lastName%22%3A%22Schotter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mehrdad%22%2C%22lastName%22%3A%22Vaziri%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Minghao%22%2C%22lastName%22%3A%22Zou%22%7D%5D%2C%22abstractNote%22%3A%22People%20often%20need%20to%20plan%20how%20to%20allocate%20their%20attention%20across%20different%20tasks.%20In%20this%20paper%2C%20we%20run%20two%20experiments%20to%20study%20a%20stylized%20version%20of%20this%20attention-allocation%20problem%20between%20strategic%20tasks.%20More%20specifically%2C%20we%20present%20subjects%20with%20pairs%20of%202%20%5Cu00d7%202%20games%2C%20and%20for%20each%20pair%2C%20we%20give%20them%2010%5Cu2009seconds%20to%20decide%20how%20they%20would%20split%20a%20fixed%20time%20budget%20between%20the%20two%20games.%20Then%2C%20subjects%20play%20both%20games%20without%20time%20constraints%2C%20and%20we%20use%20eye-tracking%20to%20estimate%20the%20fraction%20of%20time%20they%20spend%20on%20each%20game.%20We%20find%20that%20subjects%5Cu2019%20planned%20and%20actual%20attention%20allocation%20differ%20and%20identify%20the%20determinants%20of%20this%20mismatch.%20Further%2C%20we%20argue%20that%20misallocations%20can%20be%20relevant%20in%20games%20in%20which%20a%20player%5Cu2019s%20strategy%20choice%20is%20sensitive%20to%20the%20time%20taken%20to%20reach%20a%20decision.%5Cn%20%20%20%20%20%20%20%20%20%20%20%20This%20paper%20was%20accepted%20by%20Yan%20Chen%2C%20behavioral%20economics%20and%20decision%20analysis.%5Cn%20%20%20%20%20%20%20%20%20%20%20%20Funding%3A%20Work%20on%20this%20project%20was%20provided%20by%20the%20National%20Science%20Foundation%20%5BGrant%20SES%201724550%5D%20%5Cu201cCollaborative%20Research%3A%20Attention%20in%20Games%20and%20Decisions%2C%5Cu201d%20awarded%20to%20A.%20Schotter%20and%20E.%20R.%20Schotter.%20The%20work%20of%20M.%20Ribeiro%20and%20M.%20Zou%20was%20supported%20by%20the%20Center%20for%20Experimental%20Economics%20Social%20Science%20at%20New%20York%20University.%5Cn%20%20%20%20%20%20%20%20%20%20%20%20Supplemental%20Material%3A%20The%20online%20appendix%20and%20data%20are%20available%20at%20https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1287%5C%2Fmnsc.2023.4834%20.%22%2C%22date%22%3A%2205%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI
%22%3A%2210.1287%5C%2Fmnsc.2023.4834%22%2C%22ISSN%22%3A%220025-1909%2C%201526-5501%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fpubsonline.informs.org%5C%2Fdoi%5C%2F10.1287%5C%2Fmnsc.2023.4834%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-21T03%3A44%3A52Z%22%7D%7D%2C%7B%22key%22%3A%22AVBB8CSI%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Conijn%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EConijn%2C%20R.%2C%20Dux%20Speltz%2C%20E.%2C%20%26amp%3B%20Chukharev-Hudilainen%2C%20E.%20%282024%29.%20Automated%20extraction%20of%20revision%20events%20from%20keystroke%20data.%20%3Ci%3EReading%20and%20Writing%3C%5C%2Fi%3E%2C%20%3Ci%3E37%3C%5C%2Fi%3E%282%29%2C%20483%26%23x2013%3B508.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs11145-021-10222-w%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs11145-021-10222-w%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Automated%20extraction%20of%20revision%20events%20from%20keystroke%20data%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rianne%22%2C%22lastName%22%3A%22Conijn%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emily%22%2C%22lastName%22%3A%22Dux%20Speltz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Evgeny%22%2C%22lastName%22%3A%22Chukharev-Hudilainen%22%7D%5D%2C%22abstractNote%22%3A%22Abstract%5Cn%20%20%20%20%20%20%20%20%20%20%20%20Revision%20plays%20an%20important%20role%20in%20writing%2C%20and%20as%20revisions%20break%20down%20the%20linearity%20of%20the%20writing%20process%2C%20they%20are%20crucial%20in%20describing%20writing%20process%20dynamics.%20Keystroke%20logging%20and%20analysis%20have%20been%20used%20to%20identify%20revisions%20made%20during%20writing.%20Previous%20approaches%20include%20the%20manual%20annotation%20of%20revisions%2C%20building%20nonlinear%20S-notations%2C%20and%20the%20automated%20extraction%20of%20backspace%20keypresses.%20However%2C%20these%20approaches%20are%20time-intensive%2C%20vulnerable%20to%20construct%2C%20or%20restricted.%20Therefore%2C%20this%20article%20presents%20a%20computational%20approach%20to%20the%20automatic%20extraction%20of%20full%20revision%20events%20from%20keystroke%20logs%2C%20including%20both%20insertions%20and%20deletions%2C%20as%20well%20as%20the%20characters%20typed%20to%20replace%20the%20deleted%20text.%5Cu00a0Within%20this%20approach%2C%20revision%20candidates%20are%20first%20automatically%20extracted%2C%20which%20allows%20for%20a%20simplified%20manual%20annotation%20of%20revision%20events.%20Second%2C%20machine%20learning%20is%20used%20to%20automatically%20detect%20revision%20events.%20For%20this%2C%207120%20revision%20events%20were%20manually%20annotated%20in%20a%20dataset%20of%20keystrokes%20obtained%20from%2065%20students%20conducting%20a%20writing%20task.%20The%20results%20showed%20that%20revision%20events%20could%20be%20automatically%20predicted%20with%20a%20relatively%20high%20accuracy.%20In%20addition%2C%20a%20case%20study%20proved%20that%20this%20approach%20could%20be%20easily%20applied%20to%20a%20new%20dataset.%20To%20conclude%2C%20computational%20approaches%20can%20be%20beneficial%20in%20providing
%20automated%20insights%20into%20revisions%20in%20writing.%22%2C%22date%22%3A%2202%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1007%5C%2Fs11145-021-10222-w%22%2C%22ISSN%22%3A%220922-4777%2C%201573-0905%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Flink.springer.com%5C%2F10.1007%5C%2Fs11145-021-10222-w%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-21T04%3A03%3A51Z%22%7D%7D%2C%7B%22key%22%3A%22XYRBV3J4%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Segedinac%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ESegedinac%2C%20M.%2C%20Savi%26%23x107%3B%2C%20G.%2C%20Zeljkovi%26%23x107%3B%2C%20I.%2C%20Slivka%2C%20J.%2C%20%26amp%3B%20Konjovi%26%23x107%3B%2C%20Z.%20%282024%29.%20Assessing%20code%20readability%20in%20Python%20programming%20courses%20using%20eye%26%23x2010%3Btracking.%20%3Ci%3EComputer%20Applications%20in%20Engineering%20Education%3C%5C%2Fi%3E%2C%20%3Ci%3E32%3C%5C%2Fi%3E%281%29%2C%20e22685.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1002%5C%2Fcae.22685%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1002%5C%2Fcae.22685%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Assessing%20code%20readability%20in%20Python%20programming%20courses%20using%20eye%5Cu2010tracking%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Milan%22%2C%22lastName%22%3A%22Segedinac%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Goran%22%2C%22lastName%22%3A%22Savi%5Cu0107%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ivana%22%2C%22lastName%22%3A%22Zeljkovi%5Cu0107%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jelena%22%2C%22lastName%22%3A%22Slivka%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Zora%22%2C%22lastName%22%3A%22Konjovi%5Cu0107%22%7D%5D%2C%22abstractNote%22%3A%22Abstract%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20Code%20readability%20models%20are%20typically%20based%20on%20the%20code%27s%20structural%20and%20textual%20features%2C%20considering%20code%20readability%20as%20an%20objective%20category.%20However%2C%20readability%20is%20inherently%20subjective%20and%20dependent%20on%20the%20knowledge%20and%20experience%20of%20the%20reader%20analyzing%20the%20code.%20This%20paper%20assesses%20the%20readability%20of%20Python%20code%20statements%20commonly%20used%20in%20undergraduate%20programming%20courses.%20Our%20readability%20model%20is%20based%20on%20tracking%20the%20reader%27s%20eye%20movement%20during%20the%20while%5Cu2010read%20phase.%20It%20uses%20machine%20learning%20%28ML%29%20techniques%20and%20relies%20on%20a%20novel%20set%20of%20features%5Cu2014observational%20features%5Cu2014that%20capture%20how%20the%20readers%20read%20the%20code.%20We%20experimented%20by%20tracking%20the%20eye%20movement%20of%2090%20undergraduate%20students%20while%20assessing%20the%20readability%20of%2048%20Python%20code%20snippets.%20We%20trained%20an%20ML%20model%20that%20predicts%20readability%20based%20on%20the%20collected%20observational%20data%20and%20the%20code%20snippet%27s%20structural%20and%20textual%20features.%20In%20our%20experiments%2C%20the%20XGBoost%20clas
sifier%20trained%20using%20observational%20features%20exclusively%20achieved%20the%20best%20results%20%280.85%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20F%5Cn%20%20%20%20%20%20%20%20%20%20%20%20%20%20%5Cu2010measure%29.%20Using%20correlation%20analysis%2C%20we%20identified%20Python%20statements%20most%20affecting%20readability%20for%20undergraduate%20students%20and%20proposed%20implications%20for%20teaching%20Python%20programming.%20In%20line%20with%20findings%20for%20Java%20language%2C%20we%20found%20that%20constructs%20related%20to%20the%20code%27s%20size%20and%20complexity%20hurt%20the%20code%27s%20readability.%20Numerous%20comments%20also%20hindered%20readability%2C%20potentially%20due%20to%20their%20association%20with%20less%20readable%20code.%20Some%20Python%5Cu2010specific%20statements%20%28list%20comprehension%2C%20lambda%20function%2C%20and%20dictionary%20comprehension%29%20harmed%20code%20readability%2C%20even%20though%20they%20were%20part%20of%20the%20curriculum.%20Tracking%20students%27%20gaze%20indicated%20some%20additional%20factors%2C%20most%20notably%20nonlinearity%20introduced%20by%20if%2C%20for%2C%20while%2C%20try%2C%20and%20function%20call%20statements.%22%2C%22date%22%3A%2201%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1002%5C%2Fcae.22685%22%2C%22ISSN%22%3A%221061-3773%2C%201099-0542%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fonlinelibrary.wiley.com%5C%2Fdoi%5C%2F10.1002%5C%2Fcae.22685%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-12-21T03%3A44%3A51Z%22%7D%7D%2C%7B%22key%22%3A%22X9ZPV39R%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Sims%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ESims%2C%20J.%20P.%2C%20Haynes%2C%20A.%2C%20%26amp%3B%20Lanius%2C%20C.%20%282024%29.%20Exploring%20the%20utility%20of%20eye%20tracking%20for%20sociological%20research%20on%20race.%20%3Ci%3EThe%20British%20Journal%20of%20Sociology%3C%5C%2Fi%3E%2C%20%3Ci%3E75%3C%5C%2Fi%3E%281%29%2C%2065%26%23x2013%3B72.%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1111%5C%2F1468-4446.13054%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1111%5C%2F1468-4446.13054%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Exploring%20the%20utility%20of%20eye%20tracking%20for%20sociological%20research%20on%20race%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jennifer%20Patrice%22%2C%22lastName%22%3A%22Sims%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alex%22%2C%22lastName%22%3A%22Haynes%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Candice%22%2C%22lastName%22%3A%22Lanius%22%7D%5D%2C%22abstractNote%22%3A%22Abstract%5Cn%20%20%20%20%20%20%20%20%20%20%20%20One%20part%20of%20the%20social%20construction%20of%20race%20is%20the%20symbolic%20association%20of%20given%20physical%20features%20with%20different%20races.%20This%20research%20note%20explores%20the%20utility%20of%20eye%20tracking%20for%20sociological%20research%20on%20racial%20perception%2C%20that%20is%2C%20for%20determining%20what%20race%20someone%20%5Cu2018looks%20like.%5Cu2019%20Results%20reveal%20that%20participants%20gave%20greatest%20attention%20to%20targets%27%20hair.%20This%20was%
Fu, B., & Chow, N. (2025). AdaptLIL: A Real-Time Adaptive Linked Indented List Visualization for Ontology Mapping. In G. Demartini, K. Hose, M. Acosta, M. Palmonari, G. Cheng, H. Skaf-Molli, N. Ferranti, D. Hernández, & A. Hogan (Eds.), The Semantic Web – ISWC 2024 (Vol. 15232, pp. 3–22). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-77850-6_1
Токмовцева, А. Д., & Акельева, Е. В. (2024). Insights into Landscape Perception and Appreciation through Eye Movement Tracking. Lurian Journal, 5(2), 38–46. https://doi.org/10.15826/Lurian.2024.5.2.2
Chow, N., & Fu, B. (2024). AdaptLIL: A Gaze-Adaptive Visualization for Ontology Mapping (No. arXiv:2411.11768). arXiv. https://doi.org/10.48550/arXiv.2411.11768
Cui, Y., & Liu, X. (2024). How condensation and non-condensation impact viewers’ processing effort and comprehension – an eye-tracking study on Chinese subtitling of English documentaries. Perspectives, 1–19. https://doi.org/10.1080/0907676X.2024.2433059
Erol Barkana, D., Bartl-Pokorny, K. D., Kose, H., Landowska, A., Milling, M., Robins, B., Schuller, B. W., Uluer, P., Wrobel, M. R., & Zorcec, T. (2024). Challenges in Observing the Emotions of Children with Autism Interacting with a Social Robot. International Journal of Social Robotics. https://doi.org/10.1007/s12369-024-01185-3
Ciukaj, M., & Skublewska-Paszkowska, M. (2024). Comparative analysis of the availability of popular social networking sites. Journal of Computer Sciences Institute, 32, 217–222. https://doi.org/10.35784/jcsi.6292
Huang, J., Gopalakrishnan, S., Mittal, T., Zuena, J., & Pytlarz, J. (2024). Analysis of Human Perception in Distinguishing Real and AI-Generated Faces: An Eye-Tracking Based Study (No. arXiv:2409.15498). arXiv. https://doi.org/10.48550/arXiv.2409.15498
Palacios-Ibáñez, A., Castellet-Lathan, S., & Contero, M. (2024). Exploring the user’s gaze during product evaluation through the semantic differential: a comparison between virtual reality and photorealistic images. Virtual Reality, 28(3), 153. https://doi.org/10.1007/s10055-024-01048-2
Huang, Z., Zhu, G., Duan, X., Wang, R., Li, Y., Zhang, S., & Wang, Z. (2024). Measuring eye-tracking accuracy and its impact on usability in Apple Vision Pro (No. arXiv:2406.00255). arXiv. https://doi.org/10.48550/arXiv.2406.00255
Wiediartini, Ciptomulyono, U., & Dewi, R. S. (2024). Evaluation of physiological responses to mental workload in n-back and arithmetic tasks. Ergonomics, 67(8), 1121–1133. https://doi.org/10.1080/00140139.2023.2284677
Lin, J.-H., Hsu, M., & Guo, L.-Y. (2024). Investigation of the Reliability of Oculomotor Assessment of Gaze and Smooth Pursuit with a Novel Approach. 2024 17th International Convention on Rehabilitation Engineering and Assistive Technology (i-CREATe), 1–6. https://doi.org/10.1109/i-CREATe62067.2024.10776135
Silva, F., Garrido, M. I., & Soares, S. C. (2024). The effect of anxiety and its interplay with social cues when perceiving aggressive behaviours. Quarterly Journal of Experimental Psychology, 17470218241258208. https://doi.org/10.1177/17470218241258209
Bezgin Ediş, L., Kılıç, S., & Aydın, S. (2024). Message Appeals of Social Media Postings: An Experimental Study on Non-Governmental Organization. Journal of Nonprofit & Public Sector Marketing, 1–21. https://doi.org/10.1080/10495142.2024.2377975
Fu, B., Soriano, A. R., Chu, K., Gatsby, P., & Guardado, N. (2024). Modelling Visual Attention for Future Intelligent Flight Deck - A Case Study of Pilot Eye Tracking in Simulated Flight Takeoff. Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 170–175. https://doi.org/10.1145/3631700.3664871
Murphy, T., Armitage, J. A., van Wijngaarden, P., Abel, L. A., & Douglass, A. (2024). Unmasking visual search: an objective framework for grouping eye tracking data. Investigative Ophthalmology & Visual Science, 65(7), 5179.
Nguyen-Ho, T.-L., Kongmeesub, O., Tran, M.-T., Nie, D., Healy, G., & Gurrin, C. (2024). EAGLE: Eyegaze-Assisted Guidance and Learning Evaluation for Lifeloging Retrieval. Proceedings of the 7th Annual ACM Workshop on the Lifelog Search Challenge, 18–23. https://doi.org/10.1145/3643489.3661115
Tedla, S. K., MacKenzie, S., & Brown, M. (2024). LookToFocus: Image Focus via Eye Tracking. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3649902.3656358
Moutinho, L., & Cerf, M. (Eds.). (2024). Biometrics and Neuroscience Research in Business and Management: Advances and Applications. De Gruyter. https://doi.org/10.1515/9783110708509
Taieb-Maimon, M., & Romanovskii-Chernik, L. (2024). Improving Error Correction and Text Editing Using Voice and Mouse Multimodal Interface. International Journal of Human–Computer Interaction, 1–24. https://doi.org/10.1080/10447318.2024.2352932
Emami, P., Jiang, Y., Guo, Z., & Leiva, L. A. (2024). Impact of Design Decisions in Scanpath Modeling. Proceedings of the ACM on Human-Computer Interaction, 8(ETRA), 1–16. https://doi.org/10.1145/3655602
Dondi, P., Sapuppo, S., & Porta, M. (2024). Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed. International Journal of Human-Computer Studies, 184, 103204. https://doi.org/10.1016/j.ijhcs.2023.103204
Kobylska, A., & Dzieńkowski, M. (2024). User experience analysis in virtual museums. Journal of Computer Sciences Institute, 30, 31–38. https://doi.org/10.35784/jcsi.5382
Moradizeyveh, S., Tabassum, M., Liu, S., Newport, R. A., Beheshti, A., & Ieva, A. D. (2024). When Eye-Tracking Meets Machine Learning: A Systematic Review on Applications in Medical Image Analysis (No. arXiv:2403.07834). arXiv. https://doi.org/10.48550/arXiv.2403.07834
Jiang, Y., Leiva, L. A., Houssel, P. R. B., Tavakoli, H. R., Kylmälä, J., & Oulasvirta, A. (2024). UEyes: An Eye-Tracking Dataset across User Interface Types (No. arXiv:2402.05202). arXiv. https://doi.org/10.48550/arXiv.2402.05202
Yin, R., & Neyens, D. M. (2024). Examining how information presentation methods and a chatbot impact the use and effectiveness of electronic health record patient portals: An exploratory study. Patient Education and Counseling, 119, 108055. https://doi.org/10.1016/j.pec.2023.108055
Murphy, T. I., Abel, L. A., Armitage, J. A., & Douglass, A. G. (2024). Effects of tracker location on the accuracy and precision of the Gazepoint GP3 HD for spectacle wearers. Behavior Research Methods, 56(1), 43–52. https://doi.org/10.3758/s13428-022-02023-y
Taieb-Maimon, M., Romanovski-Chernik, A., Last, M., Litvak, M., & Elhadad, M. (2024). Mining Eye-Tracking Data for Text Summarization. International Journal of Human–Computer Interaction, 40(17), 4887–4905. https://doi.org/10.1080/10447318.2023.2227827
Jankowski, M., & Goroncy, A. (2024). Anatomical variants of acne differ in their impact on social perception. Journal of the European Academy of Dermatology and Venereology, 38(8), 1628–1636. https://doi.org/10.1111/jdv.19798
Chvátal, R., Slezáková, J., & Popelka, S. (2024). Analysis of problem-solving strategies for the development of geometric imagination using eye-tracking. Education and Information Technologies, 29(10), 12969–12987. https://doi.org/10.1007/s10639-023-12395-z
Chhimpa, G. R., Kumar, A., Garhwal, S., & Dhiraj. (2024). Empowering individuals with disabilities: a real-time, cost-effective, calibration-free assistive system utilizing eye tracking. Journal of Real-Time Image Processing, 21(3), 97. https://doi.org/10.1007/s11554-024-01478-w
Avoyan, A., Ribeiro, M., Schotter, A., Schotter, E. R., Vaziri, M., & Zou, M. (2024). Planned vs. Actual Attention. Management Science, 70(5), 2912–2933. https://doi.org/10.1287/mnsc.2023.4834
Conijn, R., Dux Speltz, E., & Chukharev-Hudilainen, E. (2024). Automated extraction of revision events from keystroke data. Reading and Writing, 37(2), 483–508. https://doi.org/10.1007/s11145-021-10222-w
Segedinac, M., Savić, G., Zeljković, I., Slivka, J., & Konjović, Z. (2024). Assessing code readability in Python programming courses using eye‐tracking. Computer Applications in Engineering Education, 32(1), e22685. https://doi.org/10.1002/cae.22685
Sims, J. P., Haynes, A., & Lanius, C. (2024). Exploring the utility of eye tracking for sociological research on race. The British Journal of Sociology, 75(1), 65–72. https://doi.org/10.1111/1468-4446.13054
Duwer, A., & Dzieńkowski, M. (2024). Analysis of the usability of selected auction websites. Journal of Computer Sciences Institute, 31, 138–144. https://ph.pollub.pl/index.php/jcsi/article/view/6200
Byrne, M. (2024). Master of Arts [PhD Thesis, Rice University]. https://repository.rice.edu/bitstreams/1c22e77e-df9f-4541-bbbf-04b91d20755f/download
Hahn, A. C., Riedelsheimer, J. A., Royer, Z., Frederick, J., Kee, R., Crimmins, R., Huber, B., Harris, D. H., & Jantzen, K. J. (2024). Effects of cleft lip on visual scanning and neural processing of infant faces. Plos One, 19(3), e0300673. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0300673
Robertson, B. D. (2024). Relationship Between Heart Rate Variability, Saccadic Impairment, and Cognitive Performance Following Mild Traumatic Brain Injury in a Military Population [PhD Thesis, Alliant International University]. https://search.proquest.com/openview/5a86f190ba90aec459361b3cdf8ee3b0/1?pq-origsite=gscholar&cbl=18750&diss=y
Barriga, A. D. (2024). In Your Sight and in Your Mind: The Puppeteer as Cognitive Guide in Koryū Nishikawa V and Tom Lee’s Shank’s Mare. Theatre Topics, 34(3), 197–207. https://muse.jhu.edu/pub/1/article/942001/summary
Asaraf, S., Parmet, Y., & Borowsky, A. (2024). Hazard perception and attention of track safety supervisor as a function of working time. https://www.hfes-europe.org/wp-content/uploads/2024/05/Asaraf2024.pdf
Špajdel, M. (2024). Analysis of Eye Movements Reveals Longer Visual Saccades and Abnormal Preference for Social Images in Autism Spectrum Disorder. https://rediviva.sav.sk/66i1/1.pdf
Shepherd, S. S., & Kidd, C. (2024). Visual engagement is not synonymous with learning in young children. Proceedings of the Annual Meeting of the Cognitive Science Society, 46. https://escholarship.org/uc/item/0wz74769
Chavez, F. (2024). Computational Modeling of Voters’ Checking Behavior & Checking Performance [Master’s Thesis, Rice University]. https://search.proquest.com/openview/3055e7cd3118a7fd5b44379a5d5f48ba/1?pq-origsite=gscholar&cbl=18750&diss=y
Baltuttis, D., & Teubner, T. (2024). Effects of Visual Risk Indicators on Phishing Detection Behavior: An Eye-Tracking Experiment. Computers & Security, 103940. https://www.sciencedirect.com/science/article/pii/S0167404824002451
George, J. F. (2024). Discovering why people believe disinformation about healthcare. Plos One, 19(3), e0300497. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0300497
Kerr, C. (2024). Seeing the science and technology pipeline. IEEE Engineering Management Review. https://ieeexplore.ieee.org/abstract/document/10529524/
Cheng, G., Zou, D., Xie, H., & Wang, F. L. (2024). Exploring differences in self-regulated learning strategy use between high- and low-performing students in introductory programming: An analysis of eye-tracking and retrospective think-aloud data from program comprehension. Computers & Education, 208, 104948. https://www.sciencedirect.com/science/article/pii/S0360131523002257
Danielkiewicz, R., & Dzieńkowski, M. (2024). Analysis of user experience during interaction with automotive repair workshop websites. Journal of Computer Sciences Institute, 30, 39–46. https://ph.pollub.pl/index.php/jcsi/article/view/5416
Ghiţă, A., Hernández-Serrano, O., Moreno, M., Monràs, M., Gual, A., Maurage, P., Gacto-Sánchez, M., Ferrer-García, M., Porras-García, B., & Gutiérrez-Maldonado, J. (2024). Exploring Attentional Bias toward Alcohol Content: Insights from Eye-Movement Activity. European Addiction Research, 30(2), 65–79. https://karger.com/ear/article/30/2/65/896035
Stimson, K. H. (2024). Zoom dysmorphia: An eye-tracking study of self-view and attention during video conferences. https://digitalcommons.dartmouth.edu/cognitive-science_senior_theses/5/