Supplementary data for the paper 'crowdsourced gazes'
doi:10.4121/13614824.v3
The doi above is for this specific version of this dataset, which is currently the latest. Newer versions may be published in the future.
For a link that will always point to the latest version, please use
doi: 10.4121/13614824
Datacite citation style:
Bazilinskyy, Pavlo; Dodou, Dimitra; de Winter, Joost (2022): Supplementary data for the paper 'crowdsourced gazes'. Version 3. 4TU.ResearchData. dataset. https://doi.org/10.4121/13614824.v3
Other citation styles (APA, Harvard, MLA, Vancouver, Chicago, IEEE) are available at DataCite.
Dataset
In a crowdsourced experiment, the effects of the distance and type of an approaching vehicle, traffic density, and visual clutter on pedestrians' attention distribution were explored. A total of 966 participants viewed 107 images of diverse traffic scenes for durations between 100 and 4000 ms. Participants' eye-gaze data were collected using the TurkEyes method, which involves briefly showing a codechart after each image and asking participants to type the code they saw last. The results indicate that automated vehicles were glanced at more often than manual vehicles, and that measuring eye gaze without an eye tracker is a promising approach.
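The codechart idea behind TurkEyes can be illustrated with a minimal sketch: codes are laid out on a grid over the image, and the code a participant types back is mapped to the pixel position where it was displayed, which approximates the gaze point. All function names, the code strings, and the uniform-grid layout below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of codechart-based gaze decoding (assumed uniform grid layout).
from typing import Dict, List, Optional, Tuple

def build_codechart(codes: List[str], cols: int, rows: int,
                    width: int, height: int) -> Dict[str, Tuple[float, float]]:
    """Map each code to the pixel centre of its grid cell, row by row."""
    chart = {}
    cell_w, cell_h = width / cols, height / rows
    for i, code in enumerate(codes):
        col, row = i % cols, i // cols
        chart[code] = ((col + 0.5) * cell_w, (row + 0.5) * cell_h)
    return chart

def decode_gaze(chart: Dict[str, Tuple[float, float]],
                typed: str) -> Optional[Tuple[float, float]]:
    """Return the approximate gaze point for a typed code; None if the code is invalid."""
    return chart.get(typed.strip().upper())

# Hypothetical 3x2 chart on a 1200x720 stimulus.
chart = build_codechart(["AB1", "CD2", "EF3", "GH4", "JK5", "LM6"],
                        cols=3, rows=2, width=1200, height=720)
print(decode_gaze(chart, "cd2 "))  # → (600.0, 180.0), centre of the second cell
```

Invalid or mistyped codes simply decode to `None`, which is one way such trials could be excluded from analysis.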
history
- 2021-02-23 first online
- 2022-05-03 published, posted
publisher
4TU.ResearchData
funding
- This research is supported by grant 016.Vidi.178.047 (2018–2024; “How should automated vehicles communicate with other road users?”), which is financed by the Netherlands Organisation for Scientific Research (NWO).
organizations
TU Delft, Faculty of Mechanical, Maritime and Materials Engineering (3mE)
DATA
files (1)
- Supplementary material.zip (1,429,425,249 bytes; 1,429,425,249 bytes unzipped), MD5: ef0617b534385cf2196dd653dd4bc2d3