Supplementary material for the paper 'I see your gesture: A VR-based study of bi-directional communication between pedestrians and automated vehicles'
Automated vehicles (AVs) can detect pedestrians reliably but still have difficulty predicting pedestrians’ intentions from their implicit body language. This study examined the effects of using explicit hand gestures and receptive external human-machine interfaces (eHMIs) in the interaction between pedestrians and AVs. Twenty-six participants interacted with AVs in a virtual environment while wearing a head-mounted display. The participants’ movements in the virtual environment were visualized using a motion-tracking suit. The first independent variable was the participants’ opportunity to use a hand gesture to increase the probability that the AV would stop for them. The second independent variable was the AV’s response “I SEE YOU,” displayed on an eHMI when the vehicle yielded. Accordingly, one-way communication (gesture or eHMI) and two-way communication (gesture and eHMI combined) were investigated. The results showed that the participants decided to use hand gestures in 70% of the trials. Furthermore, the eHMI improved the predictability of the AV’s behavior compared to no eHMI, as inferred from self-reports and hand-use behavior. A post-experiment questionnaire indicated that two-way communication was the most preferred condition and that the eHMI alone was preferred over the gesture alone. The results further indicate limitations of hand gestures regarding false-positive detection and confusion if the AV decides not to yield. It is concluded that bi-directional human-robot communication has considerable potential.
- First online, published, posted: 2021-04-15
- This research was supported by grant 016.Vidi.178.047 (How should automated vehicles communicate with other road users?, 2018–2022), which was financed by the Netherlands Organisation for Scientific Research (NWO).