MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
## Overview
Eye Tracking is a dataset for object detection tasks - it contains Pupil annotations for 301 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [MIT license](https://opensource.org/licenses/MIT).
From scientific research to commercial applications, eye tracking is an important tool across many domains. Despite its range of applications, eye tracking has yet to become a pervasive technology. We believe that we can put the power of eye tracking in everyone's palm by building eye tracking software that works on commodity hardware such as mobile phones and tablets, without the need for additional sensors or devices. We tackle this problem by introducing GazeCapture, the first large-scale dataset for eye tracking, containing data from over 1,450 people consisting of almost 2.5M frames. Using GazeCapture, we train iTracker, a convolutional neural network for eye tracking, which achieves a significant reduction in error over previous approaches while running in real time (10-15 fps) on a modern mobile device. Our model achieves a prediction error of 1.7 cm and 2.5 cm without calibration on mobile phones and tablets, respectively. With calibration, this is reduced to 1.3 cm and 2.1 cm. Further, we demonstrate that the features learned by iTracker generalize well to other datasets, achieving state-of-the-art results.
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This dataset contains eye-tracking data from a set of 16 subjects, playing a series of short games of Tetris (for up to 5 minutes each), in different conditions (e.g., collaborative vs competitive).
This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset in that paper are available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Abstract: This study aims to publish an eye-tracking dataset developed for the purpose of autism diagnosis. Eye-tracking methods are used intensively in that context, since abnormalities of eye gaze are widely recognised as a hallmark of autism. As such, it is believed that the dataset can allow for developing useful applications or discovering interesting insights. In addition, machine learning is a potential application for developing diagnostic models that can help detect autism at an early stage of development.
Dataset Description: The dataset is distributed over 25 CSV-formatted files. Each file represents the output of an eye-tracking experiment; note that a single experiment usually included multiple participants. The participant ID is clearly provided in each record in the 'Participant' column, which can be used to identify the class of the participant (i.e., Typically Developing or ASD). Furthermore, a set of metadata files is included. The main metadata file, Participants.csv, describes the key characteristics of the participants (e.g. gender, age, CARS). Every participant was also assigned a unique ID.
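As a minimal sketch (not part of the dataset documentation), the 25 experiment files could be combined with Participants.csv as below. The folder name and the class column name in the metadata file are assumptions; only the 'Participant' column and the file names come from the description above.

```python
from pathlib import Path
import pandas as pd

data_dir = Path("eye_tracking_dataset")   # hypothetical folder holding the 25 CSV files
meta = pd.read_csv(data_dir / "Participants.csv")

frames = []
for csv_path in sorted(data_dir.glob("*.csv")):
    if csv_path.name == "Participants.csv":
        continue                          # skip the metadata file itself
    df = pd.read_csv(csv_path)
    df["SourceFile"] = csv_path.name      # remember which experiment file each record came from
    frames.append(df)

records = pd.concat(frames, ignore_index=True)
# 'Class' (Typically Developing vs ASD) is an assumed column name in Participants.csv
records = records.merge(meta, on="Participant", how="left")
print(records.groupby("Class")["Participant"].nunique())
```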
Dataset Citation: Cilia, F., Carette, R., Elbattah, M., Guérin, J., & Dequen, G. (2022). Eye-Tracking Dataset to Support the Research on Autism Spectrum Disorder. In Proceedings of the IJCAI–ECAI Workshop on Scarce Data in Artificial Intelligence for Healthcare (SDAIH).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Datasets described in the manuscript 'Empathy Modulates the Temporal Structure of Social Attention'.

Dataset1.txt. Column names:
1. X coordinate
2. Y coordinate
3. Timestamp (ms)
4. Participant
5. Trial
6. Whether the stimulus is intact or scrambled (1 = intact, 2 = scrambled)
7. Whether gaze is in the social AOI (boolean)
8. Whether gaze is in the nonsocial AOI (boolean)
9. Presence of trackloss (boolean)
10. The observer's EQ score

Dataset2.txt. Column names:
1. X coordinate
2. Y coordinate
3. Side of the social stimulus
4. Timestamp (ms)
5. Participant
6. Trial
7. Whether gaze is in the left AOI (boolean)
8. Whether gaze is in the right AOI (boolean)
9. Whether the stimulus is intact or scrambled
10. The AOI that gaze is directed to (see next 2 columns)
11. Whether gaze is in the social AOI (boolean)
12. Whether gaze is in the nonsocial AOI (boolean)
13. Presence of trackloss (boolean)
14. The observer's EQ score
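A minimal loading sketch is shown below, under stated assumptions: the files are plain text without a header row and whitespace-delimited (neither is specified above), and the short column names are only illustrative labels for the columns listed for Dataset1.txt.

```python
import pandas as pd

cols1 = ["x", "y", "timestamp_ms", "participant", "trial",
         "intact_or_scrambled", "in_social_aoi", "in_nonsocial_aoi",
         "trackloss", "eq_score"]

# Assumed format: whitespace-delimited, no header row
d1 = pd.read_csv("Dataset1.txt", sep=r"\s+", header=None, names=cols1)

# Example use: proportion of samples in the social AOI per participant,
# excluding samples flagged as trackloss
valid = d1[d1["trackloss"] == 0]
print(valid.groupby("participant")["in_social_aoi"].mean())
```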
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This dataset contains eye-tracking data from two subjects (an expert and a novice teacher), facilitating three collaborative learning lessons (two for the expert, one for the novice) in a classroom with laptops and a projector, with real master-level students. These sessions were recorded during a course on the topic of digital education and learning analytics at [EPFL](http://epfl.ch).
This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This dataset contains eye-tracking data from a single subject (a researcher), facilitating three collaborative learning lessons in a multi-tabletop classroom, with real 10-12 year old students. These sessions were recorded during an "open doors day" at the CHILI Lab.
This dataset has been used in several scientific works, such as the CSCL 2015 conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
## Overview
Ps6 Eyetracking is a dataset for object detection tasks - it contains Object In Hospital Room annotations for 826 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Eye tracker as described in the paper
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
IMPORTANT NOTE: One of the files in this dataset is incorrect, see this dataset's erratum at https://zenodo.org/record/203958
This dataset contains eye-tracking data from a single subject (an experienced teacher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using tangible paper tabletops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).
This dataset has been used in several scientific works, such as the submitted journal paper "Orchestration Load Indicators and Patterns: In-the-wild Studies Using Mobile Eye-tracking", by Luis P. Prieto, Kshitij Sharma, Lukasz Kidzinski & Pierre Dillenbourg (the analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/paper-IEEETLT-orchestrationload)
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
File #1 (CSV format): real.csv
A table of 8 columns by 1,807,194 rows. All data about subjects were fully anonymized.
Columns: ID, TrialSequence, TrialID, Time, PupilDiaX, PupilDiaY, GazePosX, GazePosY. The columns contain the user ID, stimulus sequence, stimulus ID, timestamp, pupil diameter X, pupil diameter Y, gaze position X and gaze position Y. The measurement was performed with an HTC VIVE Pro Eye headset.
Researchers: Veslava Osinska, Adam Szalach, Dominik Piotrowski, Tomasz Gross
Time: 12.2024-02.2025
Description: The dataset contains the results of eye-tracking studies of visual perception of a set of real-style images in VR.
Keywords: eye tracking, images, visual perception, headset
Sharing and access information: The data is available under a CC0 license and was made available on June 30, 2025.

File #2 (CSV format): modern.csv
A table of 8 columns by 1,588,084 rows, with the same columns, instrumentation, researchers, timeframe, and licensing as File #1. The stimuli are a set of modern images of various styles in VR.

File #3 (CSV format): graphics.csv
A table of 8 columns by 1,318,860 rows, with the same columns, instrumentation, researchers, timeframe, and licensing as File #1. The stimuli are a set of graphics-style images in VR.
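As a minimal illustration (pandas and the presence of a header row in the CSVs are assumptions; the eight column names come from the description above), one of the files could be loaded and summarised as follows:

```python
import pandas as pd

# Columns: ID, TrialSequence, TrialID, Time, PupilDiaX, PupilDiaY, GazePosX, GazePosY
real = pd.read_csv("real.csv")

# Example use: mean pupil diameter per stimulus, averaging the X and Y diameters
real["PupilDiaMean"] = (real["PupilDiaX"] + real["PupilDiaY"]) / 2
print(real.groupby("TrialID")["PupilDiaMean"].mean().head())
```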
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
We present eSEE-d, the emotional State Estimation based on Eye-tracking database. Eye movements of 48 participants were recorded as they watched 10 emotion-evoking videos, each followed by a neutral video. Participants rated five emotions (tenderness, anger, disgust, sadness, neutral) on a scale from 0 to 10, later translated into emotional arousal and valence levels. Furthermore, each participant filled in 3 self-assessment questionnaires. An extensive analysis of the participants' self-assessment questionnaire scores as well as their ratings during the experiments is presented. Moreover, eye and gaze features were extracted from the low-level recorded eye metrics, and their correlations with the participants' ratings are investigated. Finally, analysis and results are presented for machine learning approaches to the classification of various arousal and valence levels based solely on eye and gaze features. The dataset is made publicly available, and we encourage other researchers to use it for testing new methods and analytic pipelines for the estimation of an individual's affective state.
TO USE THIS DATASET PLEASE CITE:
Skaramagkas, V.; Ktistakis, E.; Manousos, D.; Kazantzaki, E.; Tachos, N.S.; Tripoliti, E.; Fotiadis, D.I.; Tsiknakis, M. eSEE-d: Emotional State Estimation Based on Eye-Tracking Dataset. Brain Sci. 2023, 13, 589. https://doi.org/10.3390/brainsci13040589
A dual eye-tracking set-up was used that is capable of concurrently recording eye movements, frontal video, and audio during video-mediated face-to-face interactions between parents and their preadolescent children. Dyads in which parents and children engaged in conversations about cooperative and conflictive family topics were measured. Each conversation lasted for approximately 5 minutes.
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This dataset contains eye-tracking data from a single subject (a researcher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using laptops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).
This dataset has been used in several scientific works, such as the ECTEL 2015 (http://ectel2015.httc.de/) conference paper "Studying Teacher Orchestration Load in Technology-Enhanced Classrooms: A Mixed-method Approach and Case Study", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg (the analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/ectel2015-orchestration-school)
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
THÖR is a dataset with human motion trajectory and eye gaze data collected in an indoor environment, with accurate ground truth for position, head orientation, gaze direction, social grouping and goals. THÖR contains sensor data collected by a 3D lidar sensor and involves a mobile robot navigating the space. In comparison to other datasets, ours has a larger variety in human motion behaviour, is less noisy, and contains annotations at higher frequencies.
THÖR eye-tracking: data of the participant (helmet number 9) from the Tobii Glasses is included in this dataset. All data from the experiment were recorded in two recordings, "Recording011" and "Recording012".
The folder "RawData" consists of exported data from Tobii Pro Lab software using a filter called "Tobii-IVT Attention filter" (velocity threshold parameter set to 100 degrees/second), which is a recommended method for dynamic situations. The recommendation was given by the equipment manufactures Tobii Pro and from other researchers. For further information, please refer to https://www.tobiipro.com/siteassets/tobii-pro/user-manuals/Tobii-Pro-Lab-User-Manual/?v=1.86 (Appendix-B, page 85).
The recording start times are as follows: Tobii Recording011 start time: 13:34:37.267; Tobii Recording012 start time: 14:36:17.730.
"Synchronized_Qualisys_Tobii.mat" is a file containing all of the Qualisys and Tobii eye-tracker data, synchronised by matching timestamps. The columns (headers) in this .mat file respectively represent: 'timestamp', 'Pos_X', 'Pos_Y', 'Pos_Z', 'Head_R', 'Head_P', 'Head_Y', 'GazepointX', 'GazepointY', 'Gazepoint3D_X', 'Gazepoint3D_Y', 'Gazepoint3D_Z', 'Gazedirectionleft_X', 'Gazedirectionleft_Y', 'Gazedirectionleft_Z', 'Gazedirectionright_X', 'Gazedirectionright_Y', 'Gazedirectionright_Z', 'Pupilpositionleft_X', 'Pupilpositionleft_Y', 'Pupilpositionleft_Z', 'Pupilpositionright_X', 'Pupilpositionright_Y', 'Pupilpositionright_Z', 'Pupildiameterleft', 'Pupildiameterright', 'Gazeeventduration', 'Eyemovementtype_index', 'Fixationpoint_X', 'Fixationpoint_Y', 'Gyro_X', 'Gyro_Y', 'Gyro_Z', 'Accelerometer_X', 'Accelerometer_Y', 'Accelerometer_Z'.
A MATLAB script, "Synchronizing_Qualisys_Tobii.m", matches the timestamps of Qualisys and Tobii and generates the data file "Synchronized_Qualisys_Tobii.mat"; it can be found in the git repository.
All the associated data required for running the MATLAB script is also included.
Please note that 'Timestamp', 'Pos_X', 'Pos_Y', 'Pos_Z', 'Head_R', 'Head_P', 'Head_Y', 'GazepointX', 'GazepointY' represent the timestamps (matched to Tobii timestamps using nearest-neighbour search), position and head orientation from the Qualisys data; the rest of the data is from the Tobii eye-tracker. For more information regarding the eye-tracker data, please refer to https://www.tobiipro.com/siteassets/tobii-pro/user-manuals/Tobii-Pro-Lab-User-Manual/?v=1.86 (Section 8.7.2.1, page 68).
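A minimal sketch for reading the synchronised file from Python rather than MATLAB is shown below; it assumes the .mat file stores one numeric matrix whose columns follow the header order listed above, and since the variable name inside the file is not stated here, it is discovered generically.

```python
from scipy.io import loadmat

mat = loadmat("Synchronized_Qualisys_Tobii.mat")
# Grab the first non-metadata variable; the actual variable name is not documented here.
data = next(v for k, v in mat.items() if not k.startswith("__"))

timestamps = data[:, 0]   # column 1: 'timestamp'
gaze_xy = data[:, 7:9]    # columns 8-9: 'GazepointX', 'GazepointY'
print(timestamps.shape, gaze_xy.shape)
```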
Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
License information was derived automatically
We designed a privacy-aware VR interface that uses differential privacy, which we evaluate on a new 20-participant dataset for two privacy-sensitive tasks. The data consists of eye gaze as participants read different types of documents. The dataset consists of a .zip file with two folders (Eye_Tracking_Data and Eye_Movement_Features), a .csv file with the ground truth annotation (Ground_Truth.csv) and a Readme.txt file. In each folder there are two files per participant (P) for each recording (R = document class). These two files contain the recorded eye tracking data and the corresponding eye movement features, saved as a .npy and a .csv file. The data scheme of the eye tracking data and eye movement features is given in the Readme.txt file.
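As a minimal sketch, the extracted folder layout could be walked as below; the extraction path is hypothetical, and only the folder and file names mentioned above are taken from the description.

```python
from pathlib import Path
import numpy as np
import pandas as pd

root = Path("privacy_aware_vr_dataset")        # hypothetical path of the extracted .zip
ground_truth = pd.read_csv(root / "Ground_Truth.csv")

# Each participant/recording pair has a .npy (and .csv) file with the recorded samples.
for npy_path in sorted((root / "Eye_Tracking_Data").glob("*.npy")):
    samples = np.load(npy_path)
    print(npy_path.name, samples.shape)
```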
https://www.datainsightsmarket.com/privacy-policy
The global eye tracking system market, valued at $658 million in 2025, is projected to experience robust growth, driven by a compound annual growth rate (CAGR) of 9.8% from 2025 to 2033. This expansion is fueled by several key factors. The increasing adoption of virtual reality (VR) and augmented reality (AR) technologies across gaming, entertainment, and training applications necessitates precise and accurate eye tracking capabilities. Furthermore, the automotive industry's focus on advanced driver-assistance systems (ADAS) and autonomous driving is significantly boosting demand for reliable eye tracking systems to monitor driver alertness and attentiveness. The healthcare sector also presents a significant growth opportunity, with applications in ophthalmology, neurology, and rehabilitation leveraging eye tracking for diagnostics and treatment monitoring. Technological advancements leading to smaller, more affordable, and more accurate eye tracking devices are further accelerating market penetration. Specific applications like user experience (UX) research and market research also contribute to market growth.
The market segmentation reveals a dynamic landscape. Eye tracking software holds a significant share, driven by its cost-effectiveness and ease of integration with existing systems. However, the hardware segment (eye trackers) is expected to witness substantial growth due to increasing demand for high-precision applications in the medical and automotive sectors. Regionally, North America and Europe currently dominate the market due to early adoption and strong technological infrastructure. However, rapidly developing economies in Asia-Pacific are emerging as key growth regions, fueled by rising disposable incomes, technological advancements, and government initiatives to promote technological innovation. Competitive rivalry among established players like Tobii Pro, Smart Eye, and SR Research, alongside emerging innovative companies, keeps the market dynamic and competitive. Continued research and development efforts focusing on improving accuracy, reducing costs, and expanding applications will be crucial for future market growth and penetration into newer segments.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains behavioural and eye-tracking data for: Murphy PR, Wilming N, Hernandez Bocanegra DC, Prat Ortega G & Donner TH (2021). Adaptive circuit dynamics across human cortex during evidence accumulation in changing environments. Nature Neuroscience. Online ahead of print.
Each .zip file contains all data for a single participant and is organized as follows: data from each experimental session are contained in their own folder (S1, S2, etc.); each session folder in turn contains separate Sample_seqs, Behaviour and Eyetracking subfolders.
The Sample_seqs folder contains Matlab .mat files (labelled ID_SESS_BLOCK.mat, where ID is the participant ID, SESS is the experimental session and BLOCK is the block number within that session) with information about the trial-specific stimulus sequences presented to the participant. The variables in each of these files are:
gen – structure containing the generative statistics of the task
stim – structure containing details about the physical presentation of the stimuli (see task script on Donnerlab Github for explanation of these)
timing – structure containing details about the timing of stimulus presentation (see task script on Donnerlab Github for explanation of these)
pshort – proportion of trials with stimulus sequences that were shorter than the full sequence length
stimIn – trials*samples matrix of stimulus locations (in polar angle with horizontal midline = 0 degrees; NaN marks trial sequences that were shorter than the full sequence length)
distseqs – trials*samples matrix of which generative distribution was used to draw each sample location
pswitch – trials*samples matrix of binary flags marking when a switch in generative distribution occurred
The Behaviour folder contains Matlab .mat files (same naming scheme as above) with information about the behaviour produced by the participant on each trial of the task. The main variable in each file is a matrix called Behav for which each row is a trial and columns are the following:
column 1 – the generative distribution used to draw the final sample location on each trial (and thus, the correct response)
column 2 – the response given by the participant
column 3 – the accuracy of the participant’s response
column 4 – response time relative to Go cue
column 5 – trial onset according to psychtoolbox clock
column 6 – number of times participant broke fixation during trial, according to online detection algorithm
Each .mat file also contains a trials*samples matrix (tRefresh) of the timings of monitor flips corresponding to the onsets of each sample (and made relative to trial onset), as provided by psychtoolbox.
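As an illustration only (reading the file from Python with scipy rather than MATLAB is a choice, and the example filename is hypothetical; the column meanings come from the list above), a single Behaviour file could be summarised as follows:

```python
from scipy.io import loadmat
import numpy as np

mat = loadmat("01_1_1.mat")              # hypothetical ID_SESS_BLOCK filename
behav = mat["Behav"]                     # one row per trial, columns as listed above

accuracy = np.nanmean(behav[:, 2])       # column 3: accuracy of the response
median_rt = np.nanmedian(behav[:, 3])    # column 4: response time relative to Go cue
print(f"accuracy = {accuracy:.3f}, median RT = {median_rt:.3f} s")
```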
The Eyetracking folder contains both raw Eyelink 1000 (SR Research) .edf files, and their conversions to .asc text files using the manufacturer's edf2asc utility (same naming scheme as above). For stimulus and response trigger information, see task scripts on Donnerlab Github.
.zip file names ending in _2.zip correspond to the four participants from Experiment 2 of the paper, for whom the sample-onset asynchrony (SOA) was manipulated across two conditions (0.2 vs 0.6 s). All other participants are from Experiment 1, where the SOA was fixed at 0.4 s.
For example code for analyzing behaviour, fitting behavioural models, and analyzing pupil data, see https://github.com/DonnerLab/2021_Murphy_Adaptive-Circuit-Dynamics-Across-Human-Cortex.
https://www.futuremarketinsights.com/privacy-policy
The eye tracking system market is envisioned to reach a value of US$ 1.90 billion in 2024 and register an incredible CAGR of 26.40% from 2024 to 2034. The market is foreseen to surpass US$ 19.76 billion by 2034. The emergence of vision capture technology services in retail, research, automotive, healthcare, and consumer electronics has immensely propelled the eye tracking system industry.
Attributes | Details |
---|---|
Market Value for 2024 | US$ 1.90 billion |
Market Value for 2034 | US$ 19.76 billion |
Market Forecast CAGR for 2024 to 2034 | 26.40% |
2019 to 2023 Historical Analysis vs. 2024 to 2034 Market Forecast Projection
Attributes | Details |
---|---|
Market Historical CAGR for 2019 to 2023 | 24.20% |
Category-wise Insights
Attributes | Details |
---|---|
Top System Orientation | Wearable Eye Tracking Systems |
Market share in 2024 | 44.2% |
Attributes | Details |
---|---|
Top Sampling Rate | 61 to 120 Hz |
Market share in 2024 | 28.3% |
Country-wise Insights
Countries | CAGR from 2024 to 2034 |
---|---|
United States | 23.20% |
Germany | 21.80% |
China | 26.90% |
Japan | 21.10% |
Australia | 29.90% |
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
ARENA Deliverable:
In the MoMoView project we investigate individual differences during free viewing from a developmental perspective. We investigate how pattern completion develops over time and age, and whether eye gaze reinstatement patterns for remembered information are more precise than eye gaze patterns for forgotten information. Additionally, we investigate how individual differences in eye gaze behaviour are related to subsequent memory for central and peripheral details.
This data contains eye fixations from children aged 5-12, young adults aged 20-30, and older adults aged 65-80.