The EyeInfo Dataset is an open-source eye-tracking dataset created by Fabricio Batista Narcizo, a research scientist at the IT University of Copenhagen (ITU) and GN Audio A/S (Jabra), Denmark. The dataset was introduced in the paper "High-Accuracy Gaze Estimation for Interpolation-Based Eye-Tracking Methods" (DOI: 10.3390/vision5030041). It contains high-speed monocular eye-tracking data from an off-the-shelf remote eye tracker with active illumination. Each user's data includes a text file with annotations of eye features, the environment, viewed targets, and facial features. The dataset follows the principles of the General Data Protection Regulation (GDPR).
We built a remote eye tracker from off-the-shelf components to collect the real eye-tracking data. The collected data contain binocular eye information from
The prototype uses a Point Grey Grasshopper3 camera (GS3-U3-41C6NIR-C) with an infrared global-shutter sensor (CMOSIS CMV4000-3E12 NIR), which allows us to collect high-definition images (
We recruited a sample of
For each trial, the participant looked at targets arranged in a
First, we explained the experiment to the participant and obtained her/his signature on the consent form. We then fine-tuned the eye-tracker components (i.e., infrared light sources, screen, eye camera, and chin rest) before running the trial. Each participant completed two trials: the first to collect data from the right eye and the second from the left eye. Finally, we checked the recorded eye-tracking data and interviewed the participant about fatigue or any physical discomfort during the experiment (no participant reported any). On average, the experiment, including both trials, lasted
This project uses Data Version Control (DVC) to manage the dataset. The latest collected and processed eye-tracking data are available on the `main` branch of this GitHub repository.
- An active Anaconda or Miniforge installation added to your `$PATH`
- (Recommended) Visual Studio Code installed
Create a new environment called `eyeinfo`:
conda env create -f environment.yml
Activate the created environment:
conda activate eyeinfo
The EyeInfo Dataset's raw data (videos, text, CSV, and JSON files) are available as a DVC resource on Google Drive. The dataset contains approximately
Use DVC to download the raw EyeInfo Dataset. From the root folder, execute the following command:
dvc pull
This command automatically opens a web browser, where you must sign in with your Google account to allow DVC to download the dataset. You must grant DVC full permission to access the Google Drive resources.
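For reference, a DVC Google Drive remote is declared in the repository's `.dvc/config` file. The fragment below is only a sketch of what such a configuration looks like; the remote name and folder ID are placeholders, not the dataset's actual values (the repository already ships its own configuration, so no change is needed to run `dvc pull`):

```ini
; .dvc/config — illustrative sketch only; the folder ID is a placeholder
[core]
    remote = storage
['remote "storage"']
    url = gdrive://<google-drive-folder-id>
```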
DVC creates three folders: `01_dataset` with the raw data, `02_eye_feature` with the eye features extracted from each collected video, and `03_metadata` with the metadata of each processed video. The folder `01_dataset/0000` contains the
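Once the data are pulled, the JSON metadata files in `03_metadata` can be read with Python's standard `json` module. The snippet below is a minimal sketch: the field names (`participant_id`, `eye`, `fps`, `resolution`) are assumptions for illustration only and may not match the dataset's actual schema.

```python
import json

# Hypothetical per-trial metadata record; the actual field names and
# values in 03_metadata are assumptions for illustration only.
sample = """
{
    "participant_id": "0001",
    "eye": "left",
    "fps": 400,
    "resolution": [2048, 2048]
}
"""

metadata = json.loads(sample)
print(metadata["eye"], metadata["fps"])  # → left 400
```

In practice, replace the inline string with `json.load(open(path))` over the files under `03_metadata` after inspecting their real structure.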
To cite the EyeInfo Dataset, please use the following paper:
@Article{Narcizo2021,
author = {Fabricio Batista Narcizo and Fernando Eust\'{a}quio Dantas dos Santos and Dan Witzner Hansen},
date = {2021-09-15},
title = {High-Accuracy Gaze Estimation for Interpolation-Based Eye-Tracking Methods},
doi = {10.3390/vision5030041},
issn = {2411-5150},
number = {3},
url = {https://dx.doi.org/10.3390/vision5030041},
volume = {5},
abstract = {This study investigates the influence of the eye-camera location associated with the accuracy and precision of interpolation-based eye-tracking methods. Several factors can negatively influence gaze estimation methods when building a commercial or off-the-shelf eye tracker device, including the eye-camera location in uncalibrated setups. Our experiments show that the eye-camera location combined with the non-coplanarity of the eye plane deforms the eye feature distribution when the eye-camera is far from the eye’s optical axis. This paper proposes geometric transformation methods to reshape the eye feature distribution based on the virtual alignment of the eye-camera in the center of the eye’s optical axis. The data analysis uses eye-tracking data from a simulated environment and an experiment with 83 volunteer participants (55 males and 28 females). We evaluate the improvements achieved with the proposed methods using Gaussian analysis, which defines a range for high-accuracy gaze estimation between -0.54$^\circ$ and 0.5$^\circ$. Compared to traditional polynomial-based and homography-based gaze estimation methods, the proposed methods increase the number of gaze estimations in the high-accuracy range.},
article-number = {41},
journal = {Vision},
pubmedid = {34564339},
year = {2021},
}