MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
## Overview
EyeTracking is a dataset for classification tasks - it contains Eyes annotations for 7,139 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [MIT license](https://opensource.org/licenses/MIT).
This dataset was created by Vivek_anandh
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
File #1
Format: CSV
======================================
real.csv: a table of 8 columns by 1,807,194 rows. All data about subjects were fully anonymized.
Columns: ID, TrialSequence, TrialID, Time, PupilDiaX, PupilDiaY, GazePosX, GazePosY
The columns contain the user ID (#), stimulus sequence, stimulus ID, timestamp, pupil diameter X, pupil diameter Y, gaze position X, and gaze position Y. The measurements were performed with an HTC VIVE Pro Eye headset.
Researchers: Veslava Osinska, Adam Szalach, Dominik Piotrowski, Tomasz Gross
Time: 12.2024-02.2025
Description: The dataset contains the results of eye-tracking studies of visual perception of a set of real-style images in VR.
Keywords: eye tracking, images, visual perception, headset
Sharing and access information: The data is available under a CC0 license and was made available on June 30, 2025.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
File #2
Format: CSV
======================================
modern.csv: a table of 8 columns by 1,588,084 rows. All data about subjects were fully anonymized.
Columns: ID, TrialSequence, TrialID, Time, PupilDiaX, PupilDiaY, GazePosX, GazePosY
The columns contain the user ID (#), stimulus sequence, stimulus ID, timestamp, pupil diameter X, pupil diameter Y, gaze position X, and gaze position Y. The measurements were performed with an HTC VIVE Pro Eye headset.
Researchers: Veslava Osinska, Adam Szalach, Dominik Piotrowski, Tomasz Gross
Time: 12.2024-02.2025
Description: The dataset contains the results of eye-tracking studies of visual perception of a set of modern images in various styles in VR.
Keywords: eye tracking, images, visual perception, headset
Sharing and access information: The data is available under a CC0 license and was made available on June 30, 2025.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
File #3
Format: CSV
======================================
graphics.csv: a table of 8 columns by 1,318,860 rows. All data about subjects were fully anonymized.
Columns: ID, TrialSequence, TrialID, Time, PupilDiaX, PupilDiaY, GazePosX, GazePosY
The columns contain the user ID (#), stimulus sequence, stimulus ID, timestamp, pupil diameter X, pupil diameter Y, gaze position X, and gaze position Y. The measurements were performed with an HTC VIVE Pro Eye headset.
Researchers: Veslava Osinska, Adam Szalach, Dominik Piotrowski, Tomasz Gross
Time: 12.2024-02.2025
Description: The dataset contains the results of eye-tracking studies of visual perception of a set of graphics-style images in VR.
Keywords: eye tracking, images, visual perception, headset
Sharing and access information: The data is available under a CC0 license and was made available on June 30, 2025.
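A minimal pandas sketch for working with these files, using the documented 8-column layout. The sample rows and values here are illustrative stand-ins, not data from the actual files:

```python
import io
import pandas as pd

# Illustrative sample in the documented 8-column layout
# (the real files are real.csv, modern.csv and graphics.csv).
sample = io.StringIO(
    "ID,TrialSequence,TrialID,Time,PupilDiaX,PupilDiaY,GazePosX,GazePosY\n"
    "1,1,10,0.00,3.1,3.0,0.45,0.52\n"
    "1,1,10,0.01,3.2,3.1,0.46,0.53\n"
    "2,1,10,0.00,2.9,2.8,0.40,0.50\n"
)
df = pd.read_csv(sample)

# Average the two pupil-diameter axes, then compute the mean
# pupil diameter per participant and stimulus.
df["PupilDia"] = (df["PupilDiaX"] + df["PupilDiaY"]) / 2
pupil = df.groupby(["ID", "TrialID"])["PupilDia"].mean()
print(pupil)
```

For the real files, replace the `io.StringIO` stand-in with the path to `real.csv`, `modern.csv`, or `graphics.csv`.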
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This dataset contains eye-tracking data from two subjects (an expert and a novice teacher), facilitating three collaborative learning lessons (two for the expert, one for the novice) in a classroom with laptops and a projector, with real master-level students. These sessions were recorded during a course on digital education and learning analytics at [EPFL](http://epfl.ch).
This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset are publicly available at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Datasets described in the manuscript "Empathy Modulates the Temporal Structure of Social Attention".

Dataset1.txt column names:
1. X coordinate
2. Y coordinate
3. Timestamp (ms)
4. Participant
5. Trial
6. Whether the stimulus is intact or scrambled (1 = intact, 2 = scrambled)
7. Whether gaze is in the social AOI (boolean)
8. Whether gaze is in the nonsocial AOI (boolean)
9. Presence of trackloss (boolean)
10. The observer's EQ score

Dataset2.txt column names:
1. X coordinate
2. Y coordinate
3. Side of the social stimulus
4. Timestamp (ms)
5. Participant
6. Trial
7. Whether gaze is in the left AOI (boolean)
8. Whether gaze is in the right AOI (boolean)
9. Whether the stimulus is intact or scrambled
10. The AOI that gaze is directed to (see next two columns)
11. Whether gaze is in the social AOI (boolean)
12. Whether gaze is in the nonsocial AOI (boolean)
13. Presence of trackloss (boolean)
14. The observer's EQ score
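Since the .txt files are described by column position rather than named headers, a loading sketch would assign the names explicitly. This is a sketch only: the column names, the space delimiter, and the sample rows below are assumptions for illustration, not taken from the actual files:

```python
import io
import pandas as pd

# Column names for Dataset1.txt, in the documented order
# (names chosen here for readability; the files themselves are unnamed).
DATASET1_COLS = ["x", "y", "time_ms", "participant", "trial",
                 "scrambled", "in_social_aoi", "in_nonsocial_aoi",
                 "trackloss", "eq_score"]

# Illustrative sample rows in that 10-column layout.
sample = io.StringIO(
    "512 384 16 1 1 1 1 0 0 42\n"
    "520 380 33 1 1 1 0 1 0 42\n"
)
d1 = pd.read_csv(sample, sep=" ", header=None, names=DATASET1_COLS)

# Proportion of non-trackloss samples in the social AOI, per participant.
valid = d1[d1["trackloss"] == 0]
social = valid.groupby("participant")["in_social_aoi"].mean()
print(social)
```

For the real Dataset1.txt, pass its path instead of the `io.StringIO` stand-in and adjust `sep` to the file's actual delimiter.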
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
IMPORTANT NOTE: One of the files in this dataset is incorrect, see this dataset's erratum at https://zenodo.org/record/203958
This dataset contains eye-tracking data from a single subject (an experienced teacher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using tangible paper tabletops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).
This dataset has been used in several scientific works, such as the submitted journal paper "Orchestration Load Indicators and Patterns: In-the-wild Studies Using Mobile Eye-tracking", by Luis P. Prieto, Kshitij Sharma, Lukasz Kidzinski & Pierre Dillenbourg (the analysis and usage of this dataset are publicly available at https://github.com/chili-epfl/paper-IEEETLT-orchestrationload)
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
## Overview
Ps6 Eyetracking is a dataset for object detection tasks - it contains Object In Hospital Room annotations for 826 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
Abstract:
This study aims to publish an eye-tracking dataset developed for the purpose of autism diagnosis. Eye-tracking methods are used intensively in that context, as eye-gaze abnormalities are widely recognized as a hallmark of autism. As such, it is believed that the dataset can enable useful applications and reveal interesting insights. Machine learning in particular is a potential application for developing diagnostic models that can help detect autism at an early stage of development.
Dataset Description:
The dataset is distributed over 25 CSV-formatted files, each representing the output of one eye-tracking experiment. A single experiment usually included multiple participants; the participant ID is provided in each record in the ‘Participant’ column, which can be used to identify the class of participant (i.e., Typically Developing or ASD). Furthermore, a set of metadata files is included. The main metadata file, Participants.csv, describes the key characteristics of the participants (e.g., gender, age, CARS). Every participant was also assigned a unique ID.
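A hedged sketch of how an experiment file might be joined with Participants.csv to attach each record's class label. The column names (`Class`, `Age`, gaze columns) and the in-memory stand-in tables below are illustrative assumptions; check the actual headers in the files:

```python
import pandas as pd

# Illustrative stand-ins for one experiment file and the
# Participants.csv metadata (real column names may differ).
records = pd.DataFrame({
    "Participant": [101, 101, 102],
    "GazeX": [0.4, 0.5, 0.6],
    "GazeY": [0.3, 0.2, 0.7],
})
participants = pd.DataFrame({
    "Participant": [101, 102],
    "Class": ["TD", "ASD"],   # Typically Developing vs. ASD
    "Age": [6, 7],
})

# Attach the participant class to every eye-tracking record
# via a left join on the shared Participant ID.
labelled = records.merge(participants, on="Participant", how="left")
print(labelled[["Participant", "Class"]])
```

With the real files, `records` would come from `pd.read_csv` on one of the 25 experiment files and `participants` from Participants.csv.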
Dataset Citation:
Cilia, F., Carette, R., Elbattah, M., Guérin, J., & Dequen, G. (2022). Eye-Tracking Dataset to Support the Research on Autism Spectrum Disorder. In Proceedings of the IJCAI–ECAI Workshop on Scarce Data in Artificial Intelligence for Healthcare (SDAIH).
Authors:
Federica Cilia; Romuald Carette; Mahmoud Elbattah; Jean-Luc Guérin; Gilles Dequen
## Overview
Eye Tracker Dataset Improved 2 is a dataset for instance segmentation tasks - it contains Test Ww2D annotations for 7,752 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This dataset contains eye-tracking data from a single subject (a researcher), facilitating three collaborative learning lessons in a multi-tabletop classroom, with real 10-12 year old students. These sessions were recorded during an "open doors day" at the [CHILI Lab](http://chili.epfl.ch).
This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset are publicly available at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This dataset contains eye-tracking data from a single subject (a researcher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using laptops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).
This dataset has been used in several scientific works, such as the ECTEL 2015 (http://ectel2015.httc.de/) conference paper "Studying Teacher Orchestration Load in Technology-Enhanced Classrooms: A Mixed-method Approach and Case Study", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg (the analysis and usage of this dataset are publicly available at https://github.com/chili-epfl/ectel2015-orchestration-school)
A dual eye-tracking set-up was used that is capable of concurrently recording eye movements, frontal video, and audio during video-mediated face-to-face interactions between parents and their preadolescent children. Dyads in which parents and children engaged in conversations about cooperative and conflictive family topics were measured. Each conversation lasted for approximately 5 minutes.
We designed a privacy-aware VR interface that uses differential privacy, which we evaluate on a new 20-participant dataset for two privacy-sensitive tasks. The data consists of eye gaze recorded as participants read different types of documents. The dataset is a .zip file with two folders (Eye_Tracking_Data and Eye_Movement_Features), a .csv file with the ground-truth annotation (Ground_Truth.csv), and a Readme.txt file. In each folder there are two files per participant (P) and recording (R = document class): the recorded eye-tracking data and the corresponding eye movement features, saved as .npy and .csv files. The data scheme of the eye-tracking data and eye movement features is given in Readme.txt. The data is only to be used for non-commercial scientific purposes.
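A small sketch of round-tripping one recording in the dataset's .npy format. The file name follows the per-participant/per-recording convention described above but is illustrative, and the column layout here (time, gaze-x, gaze-y) is an assumption; the actual data scheme is documented in Readme.txt:

```python
import os
import tempfile
import numpy as np

# Illustrative stand-in for one recording's eye-tracking array;
# the real layout is specified in the dataset's Readme.txt.
gaze = np.array([[0.00, 0.41, 0.52],
                 [0.01, 0.42, 0.53],
                 [0.02, 0.40, 0.51]])

# Save and reload in the .npy format the dataset uses.
path = os.path.join(tempfile.gettempdir(), "P1_R1_gaze.npy")
np.save(path, gaze)
loaded = np.load(path)
print(loaded.shape)
```

`np.load` on the dataset's actual files works the same way; the matching .csv files can be read with `np.loadtxt` or pandas.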
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
## Overview
Eye Tracking Images is a dataset for object detection tasks - it contains Eye Tracking annotations for 825 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC0 1.0 (Public Domain) license](https://creativecommons.org/publicdomain/zero/1.0/).
The eye tracking market share is expected to increase by USD 322.35 million from 2020 to 2025, and the market’s growth momentum will accelerate at a CAGR of 14.14%.
This eye tracking market research report provides valuable insights on the post-COVID-19 impact on the market, which will help companies evaluate their business approaches. Furthermore, this report extensively covers eye tracking market segmentation by application (research, AR and VR, HCI, training and simulation, and healthcare) and geography (North America, Europe, Asia, and ROW). The eye tracking market report also offers information on several market vendors, including Alphabet Inc., Apple Inc., BIOPAC Systems Inc., Facebook Inc., Gazepoint, Magic Leap Inc., Noldus Information Technology BV, Seeing Machines Ltd., SR Research Ltd., and Tobii AB among others.
What will the Eye Tracking Market Size be During the Forecast Period?
Download Report Sample to Unlock the Eye Tracking Market Size for the Forecast Period and Other Important Statistics
Eye Tracking Market: Key Drivers, Trends, and Challenges
Based on our research output, the market saw a neutral impact on growth during and after the COVID-19 era. New product launches are notably driving eye tracking market growth, although factors such as the presence of intellectual property rights and patents may impede it. Our research analysts have studied the historical data and deduced the key market drivers and the COVID-19 pandemic's impact on the eye tracking industry. The holistic analysis of the drivers will help in deducing end goals and refining marketing strategies to gain a competitive edge.
Key Eye Tracking Market Driver
One of the key factors driving the eye tracking market is new product launches. Vendors are concentrating on creating cutting-edge eye tracking technology for a variety of industries, including AR, VR, healthcare, automotive, aerospace, and research, and are introducing new products to diversify their product lines and boost sales. During the forecast period, the introduction of numerous new products is anticipated to fuel market expansion. For instance, Apple Inc. released several wearable devices in September 2020 that feature camera-based eye tracking systems; the associated patent describes a method and tool for eye tracking using event-camera data, which tracks the user's eyes to determine where they are looking while wearing the head-mounted device. Similarly, in June 2020, Tobii AB launched the Tobii Eye Tracker 5, which is designed and engineered specifically for gaming. The device allows users to control in-game cameras and actions with their head or eye movement, and to track analytics on metrics such as tunnel vision, awareness, and focus for training purposes.
Key Eye Tracking Market Trend
The development of eye tracking for mobile devices is another factor supporting the eye tracking market growth in the forecast period. Numerous vendors in the existing market have succeeded in developing solutions that can integrate eye tracking into mobile devices. Eye tracking in mobile devices is usually achieved by incorporating sensors, infrared light, and the optical camera found above the display screen. Smartphone vendors such as Samsung have incorporated eye tracking that allows the user to scroll through the display screen using their eyes. Such technological advances and integration in commercial devices are expected to increase the scope of HCI applications in the global eye tracking market. Furthermore, technological advances, such as software development for the integration of eye tracking devices into mobiles, are likely to drive the growth of the market. Moreover, the use of VR cardboards is increasing rapidly in the market. They act as an outer shell that blocks ambient light and has a holder for a smartphone that streams the VR content. In addition, mobile devices are also used for experiencing VR content that is usually recorded or developed in the form of 360 degrees videos. The development of eye tracking solutions for mobile devices would prove to be highly favorable for the global eye tracking market as smartphones are being largely used for viewing VR content by using VR cardboards.
Key Eye Tracking Market Challenge
The presence of intellectual property rights and patents will be a major challenge for the eye tracking market during the forecast period. Patents and intellectual property rights restrict new vendors and innovators from exploring eye tracking solutions and applications. They protect the inventions of multinational companies, organizations, and private companies that make commercial gains from them; however, they also prevent knowledge sharing within the market. This leads to a low level of innovation in eye tracking.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
## Overview
Webcam Eye Tracking New is a dataset for object detection tasks - it contains Eyes annotations for 364 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
ARENA Deliverable:
In the MoMoView project we investigate individual differences during free viewing from a developmental perspective. We investigate how pattern completion evolves over time and age, and whether eye-gaze reinstatement patterns of remembered information are more precise than eye-gaze patterns of forgotten information. Additionally, we investigate how individual differences in eye-gaze behaviour are related to subsequent memory for central and peripheral details.
This data contains eye fixations from children aged 5-12, young adults aged 20-30, and older adults aged 65-80.
https://www.futuremarketinsights.com/privacy-policy
The eye tracking system market is envisioned to reach a value of US$ 1.90 billion in 2024 and register an incredible CAGR of 26.40% from 2024 to 2034. The market is foreseen to surpass US$ 19.76 billion by 2034. The emergence of vision capture technology services in retail, research, automotive, healthcare, and consumer electronics has immensely propelled the eye tracking system industry.
| Attributes | Details |
| --- | --- |
| Market Value for 2024 | US$ 1.90 billion |
| Market Value for 2034 | US$ 19.76 billion |
| Market Forecast CAGR for 2024 to 2034 | 26.40% |
2019 to 2023 Historical Analysis vs. 2024 to 2034 Market Forecast Projection
| Attributes | Details |
| --- | --- |
| Market Historical CAGR for 2019 to 2023 | 24.20% |
Category-wise Insights
| Attributes | Details |
| --- | --- |
| Top System Orientation | Wearable Eye Tracking Systems |
| Market share in 2024 | 44.2% |

| Attributes | Details |
| --- | --- |
| Top Sampling Rate | 61 to 120 Hz |
| Market share in 2024 | 28.3% |
Country-wise Insights
| Countries | CAGR from 2024 to 2034 |
| --- | --- |
| United States | 23.20% |
| Germany | 21.80% |
| China | 26.90% |
| Japan | 21.10% |
| Australia | 29.90% |
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This mobile eye-tracking dataset consists of 27 recordings of three participants (all authors) walking through a small art gallery. Participants were instructed to attend individual paintings in specific orders, resulting in five distinct scanpath patterns. The recordings' duration ranges from 50 to 205 seconds. Each recording comprises world video, gaze, fixations, saccades, blinks, and IMU data. Recordings were made with the Pupil Invisible eye-tracking glasses.
https://www.techsciresearch.com/privacy-policy.aspx
Global Eye Tracking Market was valued at USD 404.18 million in 2023 and is anticipated to project robust growth in the forecast period with a CAGR of 21.76% through 2029.
| Attributes | Details |
| --- | --- |
| Pages | 186 |
| Market Size | 2023: USD 571.24 million |
| Forecast Market Size | 2029: USD 2,916.99 million |
| CAGR | 2024-2029: 31.03% |
| Fastest Growing Segment | Optical Tracking |
| Largest Market | North America |
| Key Players | Tobii AB; SR Research Ltd.; iMotions A/S; Gazepoint Research Inc.; EyeTech Digital Systems, Inc.; EyeTracking, Inc.; Mirametrix Inc.; Seeing Machines Ltd.; Smart Eye AB; LC Technology Solutions, Inc. |