Supplementary data for the paper 'crowdsourced gazes'
The DOI above is for this specific version of the dataset, which is currently the latest. Newer versions may be published in the future. For a link that will always point to the latest version, use the versionless DOI https://doi.org/10.4121/13614824.
Datacite citation style:
Bazilinskyy, Pavlo; Dodou, Dimitra; de Winter, Joost (2022): Supplementary data for the paper 'crowdsourced gazes'. Version 3. 4TU.ResearchData. Dataset. https://doi.org/10.4121/13614824.v3
Other citation styles (APA, Harvard, MLA, Vancouver, Chicago, IEEE) are available at Datacite.
In a crowdsourced experiment, the effects of the distance and type of the approaching vehicle, traffic density, and visual clutter on pedestrians' attention distribution were explored. A total of 966 participants viewed 107 images of diverse traffic scenes for durations between 100 and 4000 ms. Participants' eye-gaze data were collected using the TurkEyes method, which involves briefly showing a codechart after each image and asking participants to type the code they saw last. The results indicate that automated vehicles were glanced at more often than manual vehicles. Measuring eye gaze without an eye tracker is promising.
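The codechart idea can be illustrated with a minimal sketch. This is not the authors' implementation; the grid layout, code format, and function names below are assumptions chosen for illustration. A codechart is a grid of short codes overlaid on the screen; the code a participant reports last approximates where they were looking, so decoding that code back to its grid cell yields an estimated gaze point.

```python
# Hypothetical sketch of codechart-based gaze estimation (names, grid layout,
# and code format are illustrative assumptions, not the TurkEyes implementation).

def build_codechart(cols, rows, width, height):
    """Return a dict mapping each code to the pixel centre of its grid cell."""
    chart = {}
    cell_w, cell_h = width / cols, height / rows
    for r in range(rows):
        for c in range(cols):
            code = f"{chr(ord('A') + r)}{c:02d}"  # e.g. 'A00', 'B07'
            chart[code] = ((c + 0.5) * cell_w, (r + 0.5) * cell_h)
    return chart

def decode_gaze(reported_code, chart):
    """Look up the reported code; returns None if the code was mistyped."""
    return chart.get(reported_code.strip().upper())

# A 10x6 chart on a 1280x720 image; a participant reporting 'b03' is
# estimated to have gazed at the centre of cell (row B, column 3).
chart = build_codechart(cols=10, rows=6, width=1280, height=720)
print(decode_gaze("b03", chart))  # -> (448.0, 180.0)
```

In practice the codechart is shown only briefly after the stimulus image, so the typed code reflects the last fixation location rather than a deliberate reading of the chart.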
- 2021-02-23 first online
- 2022-05-03 published, posted
- This research is supported by grant 016.Vidi.178.047 (2018–2024; “How should automated vehicles communicate with other road users?”), which is financed by the Netherlands Organisation for Scientific Research (NWO).
Organizations: TU Delft, Faculty of Mechanical, Maritime and Materials Engineering (3mE)