100+ datasets found
  1. Eye Tracking Dataset

    • universe.roboflow.com
    zip
    Updated Aug 4, 2023
    Cite
    Wheelchaireyedetection (2023). Eye Tracking Dataset [Dataset]. https://universe.roboflow.com/wheelchaireyedetection/eye-tracking-ztsqw/dataset/1
    Explore at:
    Available download formats: zip
    Dataset updated
    Aug 4, 2023
    Dataset authored and provided by
    Wheelchaireyedetection
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Variables measured
    Pupil Bounding Boxes
    Description

    Eye Tracking

    ## Overview
    
    Eye Tracking is a dataset for object detection tasks - it contains Pupil annotations for 301 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
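    As an illustration, a minimal download sketch using the roboflow Python package (the workspace and project slugs are read off the dataset URL above; the API key is a placeholder, and the export format is one of several choices):

    ```python
    from roboflow import Roboflow  # pip install roboflow

    rf = Roboflow(api_key="YOUR_API_KEY")  # placeholder key
    project = rf.workspace("wheelchaireyedetection").project("eye-tracking-ztsqw")
    dataset = project.version(1).download("coco")  # e.g. COCO-format export
    print(dataset.location)  # local folder with images and annotations
    ```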
    
    ## License

    This dataset is available under the [MIT license](https://opensource.org/licenses/MIT).
    
  2. GazeCapture Dataset

    • paperswithcode.com
    Updated Jun 13, 2016
    Cite
    Kyle Krafka; Aditya Khosla; Petr Kellnhofer; Harini Kannan; Suchendra Bhandarkar; Wojciech Matusik; Antonio Torralba (2016). GazeCapture Dataset [Dataset]. https://paperswithcode.com/dataset/gazecapture
    Explore at:
    Dataset updated
    Jun 13, 2016
    Authors
    Kyle Krafka; Aditya Khosla; Petr Kellnhofer; Harini Kannan; Suchendra Bhandarkar; Wojciech Matusik; Antonio Torralba
    Description

    From scientific research to commercial applications, eye tracking is an important tool across many domains. Despite its range of applications, eye tracking has yet to become a pervasive technology. We believe that we can put the power of eye tracking in everyone's palm by building eye tracking software that works on commodity hardware such as mobile phones and tablets, without the need for additional sensors or devices. We tackle this problem by introducing GazeCapture, the first large-scale dataset for eye tracking, containing data from over 1,450 people and almost 2.5M frames. Using GazeCapture, we train iTracker, a convolutional neural network for eye tracking, which achieves a significant reduction in error over previous approaches while running in real time (10-15 fps) on a modern mobile device. Our model achieves a prediction error of 1.7 cm and 2.5 cm without calibration on mobile phones and tablets, respectively. With calibration, this is reduced to 1.3 cm and 2.1 cm. Further, we demonstrate that the features learned by iTracker generalize well to other datasets, achieving state-of-the-art results.

  3. Jetris - An eyetracking dataset from a Tetris-like task

    • zenodo.org
    bin, zip
    Updated Jan 24, 2020
    Cite
    Kshitij Sharma; Luis P. Prieto (2020). Jetris - An eyetracking dataset from a Tetris-like task [Dataset]. http://doi.org/10.5281/zenodo.16516
    Explore at:
    Available download formats: zip, bin
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo: http://zenodo.org/
    Authors
    Kshitij Sharma; Luis P. Prieto
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This dataset contains eye-tracking data from a set of 16 subjects, playing a series of short games of Tetris (for up to 5 minutes each), in different conditions (e.g., collaborative vs competitive).

    This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset in that paper is available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration

  4. Data from: Eye-Tracking Dataset to Support the Research on Autism Spectrum...

    • figshare.com
    zip
    Updated May 30, 2023
    Cite
    Federica Cilia; Romuald Carette; Mahmoud Elbattah; Jean-Luc Guérin; Gilles Dequen (2023). Eye-Tracking Dataset to Support the Research on Autism Spectrum Disorder [Dataset]. http://doi.org/10.6084/m9.figshare.20113592.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    May 30, 2023
    Dataset provided by
    figshare
    Authors
    Federica Cilia; Romuald Carette; Mahmoud Elbattah; Jean-Luc Guérin; Gilles Dequen
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Abstract: This study aims to publish an eye-tracking dataset developed for the purpose of autism diagnosis. Eye-tracking methods are used intensively in that context, as abnormalities of eye gaze are widely recognised as a hallmark of autism. As such, it is believed that the dataset can allow for developing useful applications or discovering interesting insights. Machine learning is also a potential application, for developing diagnostic models that can help detect autism at an early stage of development.

    Dataset Description: The dataset is distributed over 25 CSV-formatted files. Each file represents the output of an eye-tracking experiment, and a single experiment usually included multiple participants. The participant ID is provided in each record in the ‘Participant’ column, which can be used to identify the class of participant (i.e., Typically Developing or ASD). Furthermore, a set of metadata files is included. The main metadata file, Participants.csv, describes the key characteristics of participants (e.g. gender, age, CARS). Every participant was also assigned a unique ID.
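    As a sketch of how the files might be combined (the file-listing pattern and merge-key spelling in the metadata file are assumptions; only the ‘Participant’ column is stated above):

    ```python
    import glob
    import pandas as pd

    # Concatenate the 25 per-experiment CSV files (naming pattern assumed).
    paths = [p for p in glob.glob("*.csv") if not p.endswith("Participants.csv")]
    gaze = pd.concat((pd.read_csv(p) for p in paths), ignore_index=True)

    # Join the per-participant metadata (gender, age, CARS) on the participant ID.
    participants = pd.read_csv("Participants.csv")
    merged = gaze.merge(participants, on="Participant", how="left")
    print(merged.head())
    ```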

    Dataset Citation: Cilia, F., Carette, R., Elbattah, M., Guérin, J., & Dequen, G. (2022). Eye-Tracking Dataset to Support the Research on Autism Spectrum Disorder. In Proceedings of the IJCAI–ECAI Workshop on Scarce Data in Artificial Intelligence for Healthcare (SDAIH).

  5. Eyetracking 2018. Dataset 1 and 2.

    • figshare.com
    txt
    Updated Jul 30, 2018
    Cite
    Blinded Researcher (2018). Eyetracking 2018. Dataset 1 and 2. [Dataset]. http://doi.org/10.6084/m9.figshare.6876455.v1
    Explore at:
    Available download formats: txt
    Dataset updated
    Jul 30, 2018
    Dataset provided by
    figshare
    Authors
    Blinded Researcher
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Datasets described in the manuscript 'Empathy Modulates the Temporal Structure of Social Attention'.

    Dataset1.txt column names:
    1. X coordinate
    2. Y coordinate
    3. Timestamp (ms)
    4. Participant
    5. Trial
    6. Codes whether the stimulus is intact or scrambled (1 = intact, 2 = scrambled)
    7. Codes whether gaze is in the social AOI (boolean)
    8. Codes whether gaze is in the nonsocial AOI (boolean)
    9. Codes the presence of trackloss (boolean)
    10. The observer's EQ score

    Dataset2.txt column names:
    1. X coordinate
    2. Y coordinate
    3. Codes the side of the social stimulus
    4. Timestamp (ms)
    5. Participant
    6. Trial
    7. Codes whether gaze is in the left AOI (boolean)
    8. Codes whether gaze is in the right AOI (boolean)
    9. Codes whether the stimulus is intact or scrambled
    10. Codes the AOI that gaze is directed in (see next 2 columns)
    11. Whether the gaze is in the social AOI (boolean)
    12. Whether the gaze is in the nonsocial AOI (boolean)
    13. A column indicating the presence of trackloss (boolean)
    14. The observer's EQ score
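    A loading sketch follows (assuming whitespace-delimited text with no header row; the separator and the short column names are assumptions, with meanings transcribed from the list above):

    ```python
    import pandas as pd

    # Short column names corresponding to the Dataset1.txt description.
    cols1 = ["x", "y", "timestamp_ms", "participant", "trial", "intact_scrambled",
             "in_social_aoi", "in_nonsocial_aoi", "trackloss", "eq_score"]

    # Assumes whitespace-delimited values and no header row; adjust if needed.
    d1 = pd.read_csv("Dataset1.txt", sep=r"\s+", header=None, names=cols1)

    # Example: per-participant proportion of samples in the social AOI, intact trials only.
    intact = d1[d1["intact_scrambled"] == 1]
    print(intact.groupby("participant")["in_social_aoi"].mean())
    ```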

  6. DELANA - An eyetracking dataset from facilitating a series of laptop-based...

    • zenodo.org
    • data.niaid.nih.gov
    • +1more
    bin, zip
    Updated Jan 21, 2020
    Cite
    Luis P. Prieto; Kshitij Sharma (2020). DELANA - An eyetracking dataset from facilitating a series of laptop-based lessons [Dataset]. http://doi.org/10.5281/zenodo.16514
    Explore at:
    Available download formats: zip, bin
    Dataset updated
    Jan 21, 2020
    Dataset provided by
    Zenodo: http://zenodo.org/
    Authors
    Luis P. Prieto; Kshitij Sharma
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This dataset contains eye-tracking data from two subjects (an expert and a novice teacher), facilitating three collaborative learning lessons (2 for the expert, 1 for the novice) in a classroom with laptops and a projector, with real master-level students. These sessions were recorded during a course on the topic of digital education and learning analytics at [EPFL](http://epfl.ch).

    This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration

  7. JDC2014 - An eyetracking dataset from facilitating a semi-authentic...

    • data.niaid.nih.gov
    • zenodo.org
    Updated Jan 21, 2020
    Cite
    Prieto, Luis P. (2020). JDC2014 - An eyetracking dataset from facilitating a semi-authentic multi-tabletop lesson [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_16515
    Explore at:
    Dataset updated
    Jan 21, 2020
    Dataset provided by
    Sharma, Kshitij
    Prieto, Luis P.
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This dataset contains eye-tracking data from a single subject (a researcher), facilitating three collaborative learning lessons in a multi-tabletop classroom, with real 10-12 year old students. These sessions were recorded during an "open doors day" at the CHILI Lab.

    This dataset has been used in several scientific works, such as the CSCL 2015 conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration

  8. Ps6 Eyetracking Dataset

    • universe.roboflow.com
    zip
    Updated Apr 12, 2024
    Cite
    EyeTracking (2024). Ps6 Eyetracking Dataset [Dataset]. https://universe.roboflow.com/eyetracking-u8yw1/ps6-eyetracking
    Explore at:
    Available download formats: zip
    Dataset updated
    Apr 12, 2024
    Dataset authored and provided by
    EyeTracking
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Object In Hospital Room Bounding Boxes
    Description

    Ps6 Eyetracking

    ## Overview
    
    Ps6 Eyetracking is a dataset for object detection tasks - it contains Object In Hospital Room annotations for 826 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
    ## License

    This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
    
  9. Eye Tracker Data

    • figshare.com
    zip
    Updated Oct 15, 2022
    Cite
    Pedro Lencastre (2022). Eye Tracker Data [Dataset]. http://doi.org/10.6084/m9.figshare.19729636.v2
    Explore at:
    Available download formats: zip
    Dataset updated
    Oct 15, 2022
    Dataset provided by
    Figshare: http://figshare.com/
    Authors
    Pedro Lencastre
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Eye-tracker data as described in the paper.

  10. ISL2015NOVEL - An eyetracking dataset from facilitating secondary...

    • zenodo.org
    • explore.openaire.eu
    • +1more
    bin, zip
    Updated Jan 24, 2020
    Cite
    Luis P. Prieto; Kshitij Sharma (2020). ISL2015NOVEL - An eyetracking dataset from facilitating secondary multi-tabletop classrooms [Dataset]. http://doi.org/10.5281/zenodo.198681
    Explore at:
    Available download formats: bin, zip
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo: http://zenodo.org/
    Authors
    Luis P. Prieto; Kshitij Sharma
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    IMPORTANT NOTE: One of the files in this dataset is incorrect, see this dataset's erratum at https://zenodo.org/record/203958

    This dataset contains eye-tracking data from a single subject (an experienced teacher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using tangible paper tabletops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).

    This dataset has been used in several scientific works, such as a submitted journal paper, "Orchestration Load Indicators and Patterns: In-the-wild Studies Using Mobile Eye-tracking", by Luis P. Prieto, Kshitij Sharma, Lukasz Kidzinski & Pierre Dillenbourg (the analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/paper-IEEETLT-orchestrationload).

  11. Eyetracking in a Virtual Gallery

    • repod.icm.edu.pl
    tsv, txt
    Updated Jul 1, 2025
    Cite
    Osinska, Veslava; Szalach, Adam; Piotrowski, Dominik; Gross, Tomasz (2025). Eyetracking in a Virtual Gallery [Dataset]. http://doi.org/10.18150/QCVQNA
    Explore at:
    Available download formats: txt(905), txt(901), tsv(45744316), tsv(55107013), txt(893), tsv(62719294)
    Dataset updated
    Jul 1, 2025
    Dataset provided by
    RepOD
    Authors
    Osinska, Veslava; Szalach, Adam; Piotrowski, Dominik; Gross, Tomasz
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Dataset funded by
    Narodowe Centrum Nauki
    Description

    File #1: real.csv (CSV format). A table of 8 columns by 1,807,194 rows. All data about subjects were fully anonymized.
    Columns: ID, TrialSequence, TrialID, Time, PupilDiaX, PupilDiaY, GazePosX, GazePosY — user ID, stimulus sequence, stimulus ID, timestamp, pupil diameter X, pupil diameter Y, gaze position X, gaze position Y. The measurement was performed with an HTC VIVE Pro Eye headset. The file contains the results of eye-tracking studies of the visual perception of a set of real-style images in VR.

    File #2: modern.csv (CSV format). A table of 8 columns by 1,588,084 rows, with the same columns, anonymization, and headset as above. The file contains the results of eye-tracking studies of the visual perception of a set of modern images of various styles in VR.

    File #3: graphics.csv (CSV format). A table of 8 columns by 1,318,860 rows, with the same columns, anonymization, and headset as above. The file contains the results of eye-tracking studies of the visual perception of a set of graphics-style images in VR.

    Researchers: Veslava Osinska, Adam Szalach, Dominik Piotrowski, Tomasz Gross
    Time: 12.2024-02.2025
    Keywords: eye tracking, images, visual perception, headset
    Sharing and access information: the data is available under a CC0 license and was made available on June 30, 2025.
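    Since the header is spelled out above, a loading sketch is straightforward (the download listing shows tsv files, so the delimiter may be a tab rather than a comma; check one line first):

    ```python
    import pandas as pd

    EXPECTED = ["ID", "TrialSequence", "TrialID", "Time",
                "PupilDiaX", "PupilDiaY", "GazePosX", "GazePosY"]

    real = pd.read_csv("real.csv")         # try sep="\t" if the file is tab-separated
    assert list(real.columns) == EXPECTED  # header as given in the description

    # Example: mean horizontal pupil diameter per stimulus.
    print(real.groupby("TrialID")["PupilDiaX"].mean())
    ```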

  12. eSEEd: emotional State Estimation based on Eye-tracking dataset

    • zenodo.org
    • explore.openaire.eu
    • +1more
    zip
    Updated Feb 13, 2024
    Cite
    Vasileios Skaramagkas; Emmanouil Ktistakis; Dimitris Manousos; Eleni Kazantzaki; Nikolaos S. Tachos; Evanthia Tripoliti; Dimitrios I. Fotiadis; Manolis Tsiknakis (2024). eSEEd: emotional State Estimation based on Eye-tracking dataset [Dataset]. http://doi.org/10.5281/zenodo.7794625
    Explore at:
    Available download formats: zip
    Dataset updated
    Feb 13, 2024
    Dataset provided by
    Zenodo: http://zenodo.org/
    Authors
    Vasileios Skaramagkas; Emmanouil Ktistakis; Dimitris Manousos; Eleni Kazantzaki; Nikolaos S. Tachos; Evanthia Tripoliti; Dimitrios I. Fotiadis; Manolis Tsiknakis
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    We present eSEEd, an emotional State Estimation based on Eye-tracking database. Eye movements of 48 participants were recorded as they watched 10 emotion-evoking videos, each followed by a neutral video. Participants rated five emotions (tenderness, anger, disgust, sadness, neutral) on a scale from 0 to 10, later translated into emotional arousal and valence levels. Furthermore, each participant completed 3 self-assessment questionnaires. An extensive analysis of the participants' self-assessment questionnaire scores as well as their ratings during the experiments is presented. Moreover, eye and gaze features were extracted from the low-level recorded eye metrics, and their correlations with the participants' ratings are investigated. Finally, analysis and results are presented for machine learning approaches to classifying various arousal and valence levels based solely on eye and gaze features. The dataset is made publicly available, and we encourage other researchers to use it for testing new methods and analytic pipelines for the estimation of an individual's affective state.
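    To make the last point concrete, here is a toy classification sketch (this is not the paper's pipeline; the file names, features, and classifier are all placeholders):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Placeholder arrays: one row per participant/video of extracted eye and gaze
    # features (e.g. fixation durations, saccade statistics) and one arousal label.
    X = np.load("eye_gaze_features.npy")  # hypothetical file
    y = np.load("arousal_levels.npy")     # hypothetical file

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())  # mean cross-validated accuracy
    ```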

    TO USE THIS DATASET PLEASE CITE:
    Skaramagkas, V.; Ktistakis, E.; Manousos, D.; Kazantzaki, E.; Tachos, N.S.; Tripoliti, E.; Fotiadis, D.I.; Tsiknakis, M. eSEE-d: Emotional State Estimation Based on Eye-Tracking Dataset. Brain Sci. 2023, 13, 589. https://doi.org/10.3390/brainsci13040589

  13. Youth of Utrecht. (2024). Dual Eyetracking [Data set]. Utrecht University....

    • data.individualdevelopment.nl
    Updated Oct 17, 2024
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Youth of Utrecht (2024). Dual Eyetracking [Data set]. Utrecht University. https://doi.org/10.60641/cqgb-r705 [Dataset]. https://data.individualdevelopment.nl/dataset/4078266adfb9691830a623bfc7bfee37
    Explore at:
    Dataset updated
    Oct 17, 2024
    Area covered
    Utrecht
    Description

    A dual eye-tracking set-up was used that is capable of concurrently recording eye movements, frontal video, and audio during video-mediated face-to-face interactions between parents and their preadolescent children. Dyads in which parents and children engaged in conversations about cooperative and conflictive family topics were measured. Each conversation lasted for approximately 5 minutes.

  14. ISL2014BASELINE - An eyetracking dataset from facilitating secondary...

    • data.niaid.nih.gov
    Updated Jan 24, 2020
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Prieto, Luis P. (2020). ISL2014BASELINE - An eyetracking dataset from facilitating secondary geometry lessons [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_16551
    Explore at:
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Sharma, Kshitij
    Prieto, Luis P.
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This dataset contains eye-tracking data from a single subject (a researcher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using laptops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).

    This dataset has been used in several scientific works, such as the ECTEL 2015 (http://ectel2015.httc.de/) conference paper "Studying Teacher Orchestration Load in Technology-Enhanced Classrooms: A Mixed-method Approach and Case Study", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg (the analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/ectel2015-orchestration-school)

  15. THÖR - eye-tracking

    • data.niaid.nih.gov
    Updated Jan 24, 2020
    Cite
    Kaj O. Arras (2020). THÖR - eye-tracking [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_3408199
    Explore at:
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Tomasz Piotr Kucner
    Ravi Teja Chadalavada
    Andrey Rudenko
    Achim J. Lilienthal
    Chittaranjan Srinivas Swaminathan
    Kaj O. Arras
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    THÖR is a dataset with human motion trajectory and eye gaze data collected in an indoor environment, with accurate ground truth for position, head orientation, gaze direction, social grouping, and goals. THÖR contains sensor data collected by a 3D lidar sensor and involves a mobile robot navigating the space. In comparison to other datasets, it has a larger variety in human motion behaviour, is less noisy, and contains annotations at higher frequencies.

    THÖR eye-tracking comprises the data of one participant (Helmet number 9) from the Tobii Glasses included in this dataset. The entire data from the experiment was captured in two recordings, "Recording011" and "Recording012".

    The folder "RawData" consists of data exported from the Tobii Pro Lab software using the "Tobii-IVT Attention filter" (velocity threshold parameter set to 100 degrees/second), which is a recommended method for dynamic situations. The recommendation was given by the equipment manufacturer Tobii Pro and by other researchers. For further information, please refer to https://www.tobiipro.com/siteassets/tobii-pro/user-manuals/Tobii-Pro-Lab-User-Manual/?v=1.86 (Appendix B, page 85).

    The recording start times are as follows: Recording011: 13:34:37.267; Recording012: 14:36:17.730.

    1. "Synchronized_Qualisys_Tobii.mat" file, which consists of all the synchronised data between Qualisys data and Tobii eye-tracker data using timestamps matching. The columns (headers) in this mat file respectively represent - 'timestamp', 'Pos_X', 'Pos_Y', 'Pos_Z', 'Head_R', 'Head_P', 'Head_Y', 'GazepointX', 'GazepointY', 'Gazepoint3D_X', 'Gazepoint3D_Y', 'Gazepoint3D_Z', 'Gazedirectionleft_X', 'Gazedirectionleft_Y', 'Gazedirectionleft_Z', 'Gazedirectionright_X', 'Gazedirectionright_Y', 'Gazedirectionright_Z', 'Pupilpositionleft_X', 'Pupilpositionleft_Y', 'Pupilpositionleft_Z', 'Pupilpositionright_X', 'Pupilpositionright_Y', 'Pupilpositionright_Z', 'Pupildiameterleft', 'Pupildiameterright', 'Gazeeventduration', 'Eyemovementtype_index', 'Fixationpoint_X', 'Fixationpoint_Y', 'Gyro_X', 'Gyro_Y', 'Gyro_Z', 'Accelerometer_X', 'Accelerometer_Y', 'Accelerometer_Z'.

    2. A Matlab script, "Synchronizing_Qualisys_Tobii.m", used for matching the timestamps of Qualisys and Tobii and generating the data file "Synchronized_Qualisys_Tobii.mat"; it can be found in the git repository.

    3. All the associated data required for running the matlab script.

    Please note that 'Timestamp', 'Pos_X', 'Pos_Y', 'Pos_Z', 'Head_R', 'Head_P', 'Head_Y', 'GazepointX', and 'GazepointY' represent the timestamps (matched to Tobii timestamps using nearest-neighbor search), position, and head orientation from the Qualisys data, and the rest of the data is from the Tobii eye-tracker. For more information regarding the eye-tracker data, please refer to https://www.tobiipro.com/siteassets/tobii-pro/user-manuals/Tobii-Pro-Lab-User-Manual/?v=1.86 (Section 8.7.2.1, page 68).
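    In Python, the synchronized table could be read roughly as follows (a minimal sketch; the variable name inside the .mat file is an assumption, so inspect the keys first):

    ```python
    from scipy.io import loadmat

    mat = loadmat("Synchronized_Qualisys_Tobii.mat")

    # Skip Matlab housekeeping keys; the data variable's actual name is an assumption.
    key = next(k for k in mat if not k.startswith("__"))
    data = mat[key]

    # Column order as listed above: timestamp, position, head orientation, then gaze data.
    timestamp = data[:, 0]
    pos_x, pos_y, pos_z = data[:, 1], data[:, 2], data[:, 3]
    head_r, head_p, head_yaw = data[:, 4], data[:, 5], data[:, 6]
    ```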

  16. MPIIDPEye: Privacy-Aware Eye Tracking Using Differential Privacy

    • darus.uni-stuttgart.de
    Updated Oct 28, 2022
    Cite
    Andreas Bulling (2022). MPIIDPEye: Privacy-Aware Eye Tracking Using Differential Privacy [Dataset]. http://doi.org/10.18419/DARUS-3235
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Oct 28, 2022
    Dataset provided by
    DaRUS
    Authors
    Andreas Bulling
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Dataset funded by
    JST CREST research grant, Japan
    German Federal Ministry of Education and Research (BMBF) for the Center for IT-Security, Privacy and Accountability (CISPA)
    DFG
    Description

    We designed a privacy-aware VR interface that uses differential privacy, which we evaluate on a new 20-participant dataset for two privacy-sensitive tasks. The data consists of eye gaze recorded as participants read different types of documents. The dataset consists of a .zip file with two folders (Eye_Tracking_Data and Eye_Movement_Features), a .csv file with the ground truth annotation (Ground_Truth.csv), and a Readme.txt file. In each folder there are two files per participant (P) for each recording (R = document class). These two files contain the recorded eye tracking data and the corresponding eye movement features, saved as .npy and .csv files. The data scheme of the eye tracking data and eye movement features is given in the Readme.txt file.
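    A minimal loading sketch, assuming hypothetical per-recording file names (the real naming scheme is given in the dataset's Readme.txt):

    ```python
    import numpy as np
    import pandas as pd

    # Hypothetical file names; each recording exists as both .npy and .csv.
    gaze = np.load("Eye_Tracking_Data/P01_R1.npy")
    features = pd.read_csv("Eye_Movement_Features/P01_R1.csv")

    # Ground-truth annotation for the privacy-sensitive tasks.
    ground_truth = pd.read_csv("Ground_Truth.csv")
    print(gaze.shape, features.shape)
    ```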

  17. Eye Tracking System Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated Apr 29, 2025
    Cite
    Data Insights Market (2025). Eye Tracking System Report [Dataset]. https://www.datainsightsmarket.com/reports/eye-tracking-system-492994
    Explore at:
    Available download formats: doc, ppt, pdf
    Dataset updated
    Apr 29, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The global eye tracking system market, valued at $658 million in 2025, is projected to experience robust growth, driven by a compound annual growth rate (CAGR) of 9.8% from 2025 to 2033. This expansion is fueled by several key factors. The increasing adoption of virtual reality (VR) and augmented reality (AR) technologies across gaming, entertainment, and training applications necessitates precise and accurate eye tracking capabilities. Furthermore, the automotive industry's focus on advanced driver-assistance systems (ADAS) and autonomous driving is significantly boosting demand for reliable eye tracking systems to monitor driver alertness and attentiveness. The healthcare sector also presents a significant growth opportunity, with applications in ophthalmology, neurology, and rehabilitation leveraging eye tracking for diagnostics and treatment monitoring. Technological advancements leading to smaller, more affordable, and more accurate eye tracking devices are further accelerating market penetration. Specific applications like user experience (UX) research and market research also contribute to market growth.

    The market segmentation reveals a dynamic landscape. Eye tracking software holds a significant share, driven by its cost-effectiveness and ease of integration with existing systems. However, the hardware segment (eye trackers) is expected to witness substantial growth due to increasing demand for high-precision applications in medical and automotive sectors. Regionally, North America and Europe currently dominate the market due to early adoption and strong technological infrastructure. However, rapidly developing economies in Asia-Pacific are emerging as key growth regions, fueled by rising disposable incomes, technological advancements, and government initiatives to promote technological innovation. Competitive rivalry among established players like Tobii Pro, Smart Eye, and SR Research, alongside emerging innovative companies, keeps the market dynamic and competitive. Continued research and development efforts focusing on improving accuracy, reducing costs, and expanding applications will be crucial for future market growth and penetration into newer segments.

  18. Behavioral and Eye-tracking Data for Adaptive Circuit Dynamics Across Human...

    • figshare.com
    zip
    Updated May 30, 2023
    Cite
    Peter Murphy; Niklas Wilming; Diana Carolina Hernandez Bocanegra; Genis Prat Ortega; Tobias Donner (2023). Behavioral and Eye-tracking Data for Adaptive Circuit Dynamics Across Human Cortex During Evidence Accumulation in Changing Environments [Dataset]. http://doi.org/10.6084/m9.figshare.14035935.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    May 30, 2023
    Dataset provided by
    Figshare: http://figshare.com/
    Authors
    Peter Murphy; Niklas Wilming; Diana Carolina Hernandez Bocanegra; Genis Prat Ortega; Tobias Donner
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains behavioural and eye-tracking data for: Murphy PR, Wilming N, Hernandez Bocanegra DC, Prat Ortega G & Donner TH (2021). Adaptive circuit dynamics across human cortex during evidence accumulation in changing environments. Nature Neuroscience. Online ahead of print.

    Each .zip file contains all data for a single participant and is organized as follows: data from each experimental session are contained in their own folder (S1, S2, etc.); each session folder in turn contains separate Sample_seqs, Behaviour and Eyetracking subfolders.

    The Sample_seqs folder contains Matlab .mat files (labelled ID_SESS_BLOCK.mat, where ID is the participant ID, SESS is the experimental session and BLOCK is the block number within that session) with information about the trial-specific stimulus sequences presented to the participant. The variables in each of these files are:

    gen – structure containing the generative statistics of the task

    stim – structure containing details about the physical presentation of the stimuli (see task script on Donnerlab Github for explanation of these)

    timing – structure containing details about the timing of stimulus presentation (see task script on Donnerlab Github for explanation of these)

    pshort – proportion of trials with stimulus sequences that were shorter than the full sequence length

    stimIn – trials*samples matrix of stimulus locations (in polar angle with horizontal midline = 0 degrees; NaN marks trial sequences that were shorter than the full sequence length)

    distseqs – trials*samples matrix of which generative distribution was used to draw each sample location

    pswitch – trials*samples matrix of binary flags marking when a switch in generative distribution occurred

    The Behaviour folder contains Matlab .mat files (same naming scheme as above) with information about the behaviour produced by the participant on each trial of the task. The main variable in each file is a matrix called Behav for which each row is a trial and columns are the following:

    column 1 – the generative distribution used to draw the final sample location on each trial (and thus, the correct response)

    column 2 – the response given by the participant

    column 3 – the accuracy of the participant’s response

    column 4 – response time relative to Go cue

    column 5 – trial onset according to psychtoolbox clock

    column 6 – number of times participant broke fixation during trial, according to online detection algorithm

    Each .mat file also contains a trials*samples matrix (tRefresh) of the timings of monitor flips corresponding to the onsets of each sample (and made relative to trial onset), as provided by psychtoolbox.

    The Eyetracking folder contains both raw Eyelink 1000 (SR Research) .edf files and their conversions to .asc text files using the manufacturer's edf2asc utility (same naming scheme as above). For stimulus and response trigger information, see the task scripts on the Donnerlab Github.

    .zip file names ending in _2.zip correspond to the four participants from Experiment 2 of the paper, for whom sample-onset asynchrony (SOA) was manipulated across two conditions (0.2 vs 0.6 s). All other participants are from Experiment 1, where SOA was fixed at 0.4 s.

    For example code for analyzing behaviour, fitting behavioural models, and analyzing pupil data, see https://github.com/DonnerLab/2021_Murphy_Adaptive-Circuit-Dynamics-Across-Human-Cortex.
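    For orientation, a sketch of pulling one block's behaviour apart in Python (the ID/session/block in the file name are placeholders; Behav and tRefresh are the variable names documented above):

    ```python
    from scipy.io import loadmat

    mat = loadmat("01_S1_1.mat")  # hypothetical ID_SESS_BLOCK file name
    behav = mat["Behav"]          # trials x 6 matrix, columns as documented above

    correct_dist = behav[:, 0]  # generative distribution of the final sample
    response     = behav[:, 1]  # participant's response
    accuracy     = behav[:, 2]  # response accuracy
    rt           = behav[:, 3]  # response time relative to the Go cue

    t_refresh = mat["tRefresh"]  # trials x samples monitor-flip times
    ```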

  19. Eye Tracking System Market Forecast by Remote and Wearable Eye Tracking...

    • futuremarketinsights.com
    html, pdf
    Updated Apr 22, 2024
    Cite
    Future Market Insights (2024). Eye Tracking System Market Forecast by Remote and Wearable Eye Tracking Systems for 2024 to 2034 [Dataset]. https://www.futuremarketinsights.com/reports/eye-tracking-systems-market
    Explore at:
    Available download formats: html, pdf
    Dataset updated
    Apr 22, 2024
    Dataset authored and provided by
    Future Market Insights
    License

    https://www.futuremarketinsights.com/privacy-policy

    Time period covered
    2024 - 2034
    Area covered
    Worldwide
    Description

    The eye tracking system market is envisioned to reach a value of US$ 1.90 billion in 2024 and register an incredible CAGR of 26.40% from 2024 to 2034. The market is foreseen to surpass US$ 19.76 billion by 2034. The emergence of vision capture technology services in retail, research, automotive, healthcare, and consumer electronics has immensely propelled the eye tracking system industry.

    | Attributes | Details |
    | --- | --- |
    | Market Value for 2024 | US$ 1.90 billion |
    | Market Value for 2034 | US$ 19.76 billion |
    | Market Forecast CAGR for 2024 to 2034 | 26.40% |
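    The 2034 figure is consistent with compounding the 2024 value at the stated CAGR, as a quick check shows:

    ```python
    # Compound the 2024 market value at the stated CAGR over ten years.
    value_2024, cagr, years = 1.90, 0.2640, 10
    print(round(value_2024 * (1 + cagr) ** years, 2))  # ~19.78 (US$ billion) vs. the stated 19.76
    ```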

    2019 to 2023 Historical Analysis vs. 2024 to 2034 Market Forecast Projection

    | Attributes | Details |
    | --- | --- |
    | Market Historical CAGR for 2019 to 2023 | 24.20% |

    Category-wise Insights

    | Attributes | Details |
    | --- | --- |
    | Top System Orientation | Wearable Eye Tracking Systems |
    | Market share in 2024 | 44.2% |

    | Attributes | Details |
    | --- | --- |
    | Top Sampling Rate | 61 to 120 Hz |
    | Market share in 2024 | 28.3% |

    Country-wise Insights

    | Countries | CAGR from 2024 to 2034 |
    | --- | --- |
    | United States | 23.20% |
    | Germany | 21.80% |
    | China | 26.90% |
    | Japan | 21.10% |
    | Australia | 29.90% |

  20. MoMoView - Naturalistic Viewing Dataset. Developmental Comparison of Eye...

    • zenodo.org
    Updated Feb 17, 2025
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Iryna Schommartz (2025). MoMoView - Naturalistic Viewing Dataset. Developmental Comparison of Eye Gaze Reinstatement using Eyetracking. [Dataset]. http://doi.org/10.5281/zenodo.14883045
    Explore at:
    Dataset updated
    Feb 17, 2025
    Dataset provided by
    Zenodo: http://zenodo.org/
    Authors
    Iryna Schommartz
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    ARENA Deliverable:

    • P2_WP1_03 [M20] Pilot Data collection - part 1 MoMoView

    In the MoMoView project we investigate individual differences during free viewing from a developmental perspective. We investigate how pattern completion develops over time and age, and whether eye gaze reinstatement patterns for remembered information are more precise than those for forgotten information. Additionally, we investigate how individual differences in eye gaze behaviour relate to subsequent memory for central and peripheral details.

    This data contains eye fixations from children aged 5-12, young adults aged 20-30, and older adults aged 65-80.
