Usage stats
Totals for datasets (co-)authored by Pavlo Bazilinskyy
- Downloads: 12,011
- Views: 44,318
- Shares: 7
- Cites: 13
Pavlo Bazilinskyy
Assistant professor
Location: Eindhoven, The Netherlands
Website: Pavlo Bazilinskyy
Biography
Pavlo Bazilinskyy is an assistant professor at TU Eindhoven, focusing on AI-driven interaction between automated vehicles and other road users. He completed his PhD at TU Delft on auditory feedback for automated driving as a Marie Curie Fellow, and subsequently worked there as a postdoc. He was previously head of data research at SD-Insights. Pavlo is treasurer of the Marie Curie Alumni Association (MCAA) and was a director of the Research and Innovation unit of the Erasmus Mundus Association (EMA).
Datasets
- An auditory dataset of passing vehicles recorded with a smartphone
- Supplementary data for the article: Survey on eHMI concepts: The effect of text, color, and perspective
- Supplementary data for the article: When will most cars be able to drive fully automatically? Projections of 18,970 survey respondents
- Supplementary data for the paper: How do pedestrians distribute their visual attention when walking through a parking garage? An eye-tracking study.
- Supplementary data for the paper 'Stopping by looking: A driver-pedestrian interaction study in a coupled simulator using head-mounted displays with eye-tracking'
- Supplementary data for the paper 'Bio-inspired intent communication for automated vehicles'
- Supplementary data for the paper 'External Human-Machine Interfaces: Which of 729 Colors Is Best for Signaling ‘Please (Do not) Cross’?'
- Supplementary data for the paper 'Automated vehicles that communicate implicitly: examining the use of lateral position within the lane'
- Supplementary data for the article: Coupled simulator for research on the interaction between pedestrians and (automated) vehicles
- Supplementary data for the paper 'Get out of the way! Examining eHMIs in critical driver-pedestrian encounters in a coupled simulator'
- Supplementary data for the paper 'Identifying lane changes automatically using the GPS sensors of portable devices'
- Supplementary data for the paper 'crowdsourced gazes'
- Supplementary data for the paper 'Crowdsourced assessment of 227 text-based eHMIs for a crossing scenario'
- Supplementary data for the paper 'Blinded windows and empty driver seats: The effects of automated vehicle characteristics on cyclists’ decision-making'
- Supplementary data for the paper 'Towards the detection of driver–pedestrian eye contact'
- Supplementary data for the paper 'What driving style makes pedestrians think a passing vehicle is driving automatically?'
- Supplementary data for the paper 'The effect of drivers’ eye contact on pedestrians’ perceived safety'
- Supplementary data for the paper: Risk perception - A study using dashcam videos and participants from different world regions.
- Supplementary data for the following paper: Crowdsourced measurement of reaction times to audiovisual stimuli with various degrees of asynchrony.
- Supplementary data for the paper 'Predicting perceived risk of traffic scenes using computer vision'
- Supplementary data for the article: Continuous auditory feedback on the status of adaptive cruise control, lane deviation, and time headway.
- Supplementary data for the paper 'How should external Human-Machine Interfaces behave? Examining the effects of colour, position, message, activation distance, vehicle yielding, and visual distraction among 1,434 participants'
- Supplementary data for the paper 'Exterior sounds for electric and automated vehicles: Loud is effective'
- Supplementary data for the paper 'Putting ChatGPT Vision (GPT-4V) to the test: Risk perception in traffic images'
- Supplementary data for the paper: "Incorporating Multiple Users’ Perspectives in HMI Design for Automated Vehicles: Exploration of a Role-Switching Approach"
- Supplementary data for the paper 'From A to B with Ease: User-Centric Interfaces for Shuttle Buses'
- Supplementary material for Blind Driving papers
- Supplementary material for the paper: "Take-over requests in highly automated driving: A crowdsourcing survey on auditory, vibrotactile, and visual displays"
- Supplementary material for paper "Slideo: Bicycle-to-Vehicle communication to intuitively share intention to turn with automated vehicles"
- Supplementary material for paper: "Exploring the Correlation between Emotions and Uncertainty in Daily Travel"