License: Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0), https://creativecommons.org/licenses/by-nc-nd/4.0/
License information was derived automatically
julienmercier/mobile-eye-tracking-dataset-v2 dataset hosted on Hugging Face and contributed by the HF Datasets community
License: Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
This dataset contains eye-tracking data from a set of 16 subjects, playing a series of short games of Tetris (for up to 5 minutes each), in different conditions (e.g., collaborative vs competitive).
This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures" by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset in that paper are publicly available at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
License: MIT License, https://opensource.org/licenses/MIT
## Overview
EyeTracking is a dataset for classification tasks - it contains Eyes annotations for 7,139 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [MIT license](https://opensource.org/licenses/MIT).
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Abstract: This study aims to publish an eye-tracking dataset developed for the purpose of autism diagnosis. Eye-tracking methods are used intensively in that context, since abnormalities of eye gaze are widely recognised as a hallmark of autism. As such, it is believed that the dataset can allow for developing useful applications or discovering interesting insights. Machine learning is one potential application, for developing diagnostic models that can help detect autism at an early stage of development.
Dataset Description: The dataset is distributed over 25 CSV-formatted files. Each file represents the output of an eye-tracking experiment, and a single experiment usually included multiple participants. The participant ID is provided in each record in the ‘Participant’ column, which can be used to identify the class of the participant (i.e., Typically Developing or ASD). Furthermore, a set of metadata files is included. The main metadata file, Participants.csv, describes the key characteristics of the participants (e.g., gender, age, CARS). Every participant was also assigned a unique ID.
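As a sketch of how the layout above might be consumed: the snippet below builds a miniature stand-in and joins the records with the metadata. The `Experiment_*.csv` naming and gaze column names are assumptions for illustration; only Participants.csv and the ‘Participant’ column are named in the description.

```python
import tempfile
from pathlib import Path

import pandas as pd

# Miniature stand-in for the layout described above.
# NOTE: "Experiment_*.csv" and "GazeX" are hypothetical names; only
# Participants.csv and the 'Participant' column come from the description.
root = Path(tempfile.mkdtemp())
pd.DataFrame({"Participant": [1, 1], "GazeX": [512.0, 530.5]}).to_csv(
    root / "Experiment_01.csv", index=False
)
pd.DataFrame({"Participant": [2], "GazeX": [410.2]}).to_csv(
    root / "Experiment_02.csv", index=False
)
pd.DataFrame({"Participant": [1, 2], "Class": ["TD", "ASD"]}).to_csv(
    root / "Participants.csv", index=False
)

# Concatenate every experiment file; each record keeps its Participant ID.
records = pd.concat(
    (pd.read_csv(f) for f in sorted(root.glob("Experiment_*.csv"))),
    ignore_index=True,
)

# Join with the metadata file to attach the class label (TD vs ASD) per record.
labelled = records.merge(
    pd.read_csv(root / "Participants.csv"), on="Participant", how="left"
)
```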
Dataset Citation: Cilia, F., Carette, R., Elbattah, M., Guérin, J., & Dequen, G. (2022). Eye-Tracking Dataset to Support the Research on Autism Spectrum Disorder. In Proceedings of the IJCAI–ECAI Workshop on Scarce Data in Artificial Intelligence for Healthcare (SDAIH).
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
## Overview
Eyetracking is a dataset for object detection tasks - it contains Eyes annotations for 4,515 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
License: Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
This dataset contains eye-tracking data from two subjects (an expert teacher and a novice teacher), facilitating three collaborative learning lessons (two for the expert, one for the novice) in a classroom with laptops and a projector, with real master-level students. These sessions were recorded during a course on digital education and learning analytics at EPFL.
This dataset has been used in several scientific works, such as the CSCL 2015 conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
License: Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
This dataset contains eye-tracking data from a single subject (a researcher), facilitating three collaborative learning lessons in a multi-tabletop classroom, with real 10-12 year old students. These sessions were recorded during an "open doors day" at the [CHILI Lab](http://chili.epfl.ch).
This dataset has been used in several scientific works, such as the [CSCL 2015](http://isls.org/cscl2015/) conference paper "The Burden of Facilitating Collaboration: Towards Estimation of Teacher Orchestration Load using Eye-tracking Measures", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg. The analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/cscl2015-eyetracking-orchestration
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Raw eyetracking data
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
## Overview
Ps6 Eyetracking is a dataset for object detection tasks - it contains Object In Hospital Room annotations for 826 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
The raw eyetracking data from experiment 4.
License: Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
IMPORTANT NOTE: One of the files in this dataset is incorrect, see this dataset's erratum at https://zenodo.org/record/203958
This dataset contains eye-tracking data from a single subject (an experienced teacher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using tangible paper tabletops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).
This dataset has been used in several scientific works, such as the submitted journal paper "Orchestration Load Indicators and Patterns: In-the-wild Studies Using Mobile Eye-tracking" by Luis P. Prieto, Kshitij Sharma, Lukasz Kidzinski & Pierre Dillenbourg (the analysis and usage of this dataset is publicly available at https://github.com/chili-epfl/paper-IEEETLT-orchestrationload)
## Overview
Eye Tracker Dataset Improved 2 is a dataset for instance segmentation tasks - it contains Test Ww2D annotations for 7,752 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
License: Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0), https://creativecommons.org/licenses/by-nc-sa/4.0/
We designed a privacy-aware VR interface that uses differential privacy, which we evaluate on a new 20-participant dataset for two privacy-sensitive tasks. The data consists of eye gaze recorded as participants read different types of documents. The dataset consists of a .zip file with two folders (Eye_Tracking_Data and Eye_Movement_Features), a .csv file with the ground-truth annotation (Ground_Truth.csv), and a Readme.txt file. In each folder there are two files per participant (P) for each recording (R = document class): one with the recorded eye-tracking data and one with the corresponding eye-movement features, saved as .npy and .csv files. The data scheme of the eye-tracking data and eye-movement features is given in the Readme.txt file.
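A minimal sketch of writing and reading one such recording back, assuming a participant/recording (`P`/`R`) file-naming pattern; the actual file names and column meanings are documented in the dataset's Readme.txt.

```python
import tempfile
from pathlib import Path

import numpy as np

# Write and re-read one hypothetical recording (participant P01, recording R1).
# The file name and column semantics (e.g. timestamp, gaze x, gaze y) are
# assumptions; the real scheme is given in the dataset's Readme.txt.
root = Path(tempfile.mkdtemp())
gaze = np.array([[0.00, 0.12, -0.33],
                 [0.01, 0.13, -0.31]])
np.save(root / "P01_R1.npy", gaze)

# Each recording loads as a samples-by-features array.
loaded = np.load(root / "P01_R1.npy")
```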
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
This dataset was created by ferdiu
Released under Attribution 4.0 International (CC BY 4.0)
Privacy policy: https://www.futuremarketinsights.com/privacy-policy
The Eye Tracking System Market is estimated to be valued at USD 2.4 billion in 2025 and is projected to reach USD 25.0 billion by 2035, registering a compound annual growth rate (CAGR) of 26.4% over the forecast period.
| Metric | Value |
|---|---|
| Estimated Value (2025E) | USD 2.4 billion |
| Forecast Value (2035F) | USD 25.0 billion |
| Forecast CAGR (2025 to 2035) | 26.4% |
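The headline figures are mutually consistent; a quick check of the compound-growth arithmetic (values taken from the table, rounding aside):

```python
# Verify that a 26.4% CAGR over the ten years 2025 -> 2035 connects the
# USD 2.4 billion estimate to the USD 25.0 billion forecast.
start_usd_bn, end_usd_bn, years = 2.4, 25.0, 10

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end_usd_bn / start_usd_bn) ** (1 / years) - 1  # ~0.264

# Applying the stated 26.4% for ten years recovers roughly USD 25 billion.
projected = start_usd_bn * (1 + 0.264) ** years
```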
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
This study investigated how ageing affects anticipation during comprehension, given that both language experience and cognitive factors modulate prediction during comprehension. We investigated whether older adults can use gender-marked adjectives to predict the target noun during Hindi sentence comprehension. Two visual world paradigm studies (Experiments 1 and 2) were conducted, in which young and older adults listened to sentences containing a target word while looking at a visual display containing the target object and three distractors. The sentences contained adjectives that were highly associated with the target object. We measured anticipatory gaze towards the target object before it was mentioned in the sentence, as an index of adjective-based prediction. The results of both experiments showed that older adults exhibited anticipatory gaze towards the target object soon after hearing the adjective, and this gaze behaviour did not differ from that of young adults.
License: Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
This dataset contains eye-tracking data from a single subject (a researcher), facilitating two geometry lessons in a secondary school classroom, with 11-12 year old students using laptops and a projector. These sessions were recorded in the frame of the MIOCTI project (http://chili.epfl.ch/miocti).
This dataset has been used in several scientific works, such as the ECTEL 2015 (http://ectel2015.httc.de/) conference paper "Studying Teacher Orchestration Load in Technology-Enhanced Classrooms: A Mixed-method Approach and Case Study", by Luis P. Prieto, Kshitij Sharma, Yun Wen & Pierre Dillenbourg (the analysis and usage of this dataset is available publicly at https://github.com/chili-epfl/ectel2015-orchestration-school)
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
## Overview
Webcam Eye Tracking New is a dataset for object detection tasks - it contains Eyes annotations for 364 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
Data usage licenses: https://www.frdr-dfdr.ca/docs/en/depositing_data/#data-usage-licenses
Research on infants' reasoning abilities often relies on looking times, which are longer for surprising and unexpected visual scenes than for unsurprising and expected ones. Few researchers have examined more precise visual scanning patterns in these scenes, so here we recorded 8- to 11-month-olds' gaze with an eye tracker as we presented a sampling event whose outcome was either surprising, neutral, or unsurprising: a red (or yellow) ball was drawn from one of three visible containers populated 0%, 50%, or 100% with identically colored balls. When measuring looking time to the whole scene, infants were insensitive to the likelihood of the sampling event, replicating failures in similar paradigms. Nevertheless, a new analysis of visual scanning showed that infants did spend more time fixating specific areas of interest as a function of the event likelihood. The drawn ball and its associated container attracted more looking than the other containers in the 0% condition, but this pattern was weaker in the 50% condition, and weaker still in the 100% condition. The results suggest that measuring where infants look may be more sensitive than simply measuring how much they look at the whole scene. The advantages of eye-tracking measures over traditional looking measures are discussed.
The Excel file here includes cumulative eye-tracker-coded looking to the various AOIs (areas of interest). Additional information about the spreadsheet's column headers can be found in the accompanying ReadMe (ReadMe.txt).
Confidentiality declaration: Consent procedures were approved by the UCLA North General Institutional Review Board, which included written consent from all infants' parents prior to study participation. Confidentiality of the participants is maintained here because anonymous subject numbers are used to label the data, which cannot be linked to any confidential subject information.
This dataset was originally deposited in the Simon Fraser University institutional repository.
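A sketch of the kind of AOI analysis the abstract describes, measuring where looking went rather than total scene looking time. The column names below are hypothetical stand-ins; the real spreadsheet headers are documented in ReadMe.txt.

```python
import pandas as pd

# Toy stand-in for the AOI spreadsheet; column names and values are
# hypothetical, not the actual headers (those are in ReadMe.txt).
looks = pd.DataFrame({
    "Subject":   [1, 1, 2, 2],
    "Condition": ["0%", "0%", "100%", "100%"],
    "AOI":       ["DrawnBall", "OtherContainers", "DrawnBall", "OtherContainers"],
    "LookingMs": [4200, 1800, 2600, 2400],
})

# Proportion of looking each AOI attracted within a subject-by-condition cell:
# this is the "where they look" measure, as opposed to whole-scene looking time.
totals = looks.groupby(["Subject", "Condition"])["LookingMs"].transform("sum")
looks["Proportion"] = looks["LookingMs"] / totals
```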
Privacy notice: https://www.technavio.com/content/privacy-notice
The eye tracking market share is expected to increase by USD 322.35 million from 2020 to 2025, and the market’s growth momentum will accelerate at a CAGR of 14.14%.
This eye tracking market research report provides valuable insights on the post-COVID-19 impact on the market, which will help companies evaluate their business approaches. Furthermore, this report extensively covers eye tracking market segmentation by application (research, AR and VR, HCI, training and simulation, and healthcare) and geography (North America, Europe, Asia, and ROW). The eye tracking market report also offers information on several market vendors, including Alphabet Inc., Apple Inc., BIOPAC Systems Inc., Facebook Inc., Gazepoint, Magic Leap Inc., Noldus Information Technology BV, Seeing Machines Ltd., SR Research Ltd., and Tobii AB among others.
What will the Eye Tracking Market Size be During the Forecast Period?
Eye Tracking Market: Key Drivers, Trends, and Challenges
Based on our research output, the COVID-19 pandemic had a neutral impact on market growth, both during and after the pandemic. New product launches are notably driving the eye tracking market growth, although factors such as the presence of intellectual property rights and patents may impede it. Our research analysts have studied the historical data and deduced the key market drivers and the COVID-19 pandemic's impact on the eye tracking industry. The holistic analysis of the drivers will help in deducing end goals and refining marketing strategies to gain a competitive edge.
Key Eye Tracking Market Driver
One of the key factors driving the eye tracking market is new product launches. Vendors are concentrating on creating cutting-edge eye tracking technology for a variety of industries, including AR, VR, healthcare, automotive, aerospace, and research. To diversify their product lines and boost sales, they are introducing new products, and the introduction of numerous new products is anticipated to fuel market expansion over the forecast period. For instance, Apple Inc. released several wearable devices in September 2020 that now feature camera-based eye tracking systems; its related patent describes a method and tool for eye tracking using event-camera data, in which the technology tracks the user's eyes to determine where they are looking while wearing the head-mounted device. Similarly, in June 2020, Tobii AB launched its Tobii Eye Tracker 5, which is designed and engineered specifically for gaming. The device allows users to control in-game cameras and actions with their head or eye movement, and to track analytics on metrics such as tunnel vision, awareness, and focus for training purposes.
Key Eye Tracking Market Trend
The development of eye tracking for mobile devices is another factor supporting the eye tracking market growth in the forecast period. Numerous vendors in the existing market have succeeded in developing solutions that can integrate eye tracking into mobile devices. Eye tracking in mobile devices is usually achieved by incorporating sensors, infrared light, and the optical camera found above the display screen. Smartphone vendors such as Samsung have incorporated eye tracking that allows the user to scroll through the display screen using their eyes. Such technological advances and integration in commercial devices are expected to increase the scope of HCI applications in the global eye tracking market. Furthermore, technological advances, such as software development for the integration of eye tracking devices into mobiles, are likely to drive the growth of the market. Moreover, the use of VR cardboards is increasing rapidly in the market. They act as an outer shell that blocks ambient light and has a holder for a smartphone that streams the VR content. In addition, mobile devices are also used for experiencing VR content that is usually recorded or developed in the form of 360 degrees videos. The development of eye tracking solutions for mobile devices would prove to be highly favorable for the global eye tracking market as smartphones are being largely used for viewing VR content by using VR cardboards.
Key Eye Tracking Market Challenge
The presence of intellectual property rights and patents will be a major challenge for the eye tracking market during the forecast period. Patents and intellectual property rights restrict new vendors and innovators from exploring eye tracking solutions and applications. They protect the owners, whether multinational companies, organizations, or private companies, that make commercial gains using their inventions, but they also prevent knowledge sharing within the market, leading to a low level of innovation in eye tracking.