TY - DATA
T1 - Detecting Perceived Appropriateness of a Robot's Social Positioning Behavior from Non-Verbal Cues: a Dataset.
PY - 2019/12/13
AU - Vroon, Jered
AU - Englebienne, Gwenn
AU - Evers, Vanessa
UR - https://data.4tu.nl/articles/dataset/Detecting_Perceived_Appropriateness_of_a_Robot_s_Social_Positioning_Behavior_from_Non-Verbal_Cues_a_Dataset_/12713627/1
DO - 10.4121/uuid:b76c3a6f-f7d5-418e-874a-d6140853e1fa
KW - Giraff
KW - Head position/orientation
KW - Non-verbal cues
KW - OptiTrack
KW - Perception of a robot's behavior
KW - Social feedback cues
KW - Social robotics
KW - Tracking
KW - Upper body position/orientation
N2 - What if a robot could detect when you think it got too close to you during its approach? This would allow it to correct or compensate for its social ‘mistake’. It would also allow for a responsive approach, in which the robot reactively finds suitable approach behavior during the interaction. We investigated whether it is possible to automatically detect such social feedback cues in the context of a robot approaching a person. We collected a dataset in which our robot repeatedly approached people (n=30) to verbally deliver a message. Approach distance and environmental noise were manipulated, and our participants were tracked (position and orientation of the upper body and head). We evaluated their perception of the robot's behavior through questionnaires and found no single or joint effects of the manipulations, showing that, in this case, personal differences matter more than contextual cues, which highlights the importance of responding to behavioral feedback.
ER -