TY - DATA
T1 - Dataset: Video recordings of human-robot interactions with a Nao robot controlled via the SONAR adaptive control architecture for social norm aware robots
PY - 2024/09/23
AU - Dell'Anna, Davide
AU - Jamshidnejad, Anahita
UR - https://doi.org/10.4121/50c7a19c-fc0e-4ef3-b35a-dd23bf08470d
DO - 10.4121/50c7a19c-fc0e-4ef3-b35a-dd23bf08470d.v1
KW - Human-robot interaction
KW - HRI (Human-robot interaction)
KW - Human engineering
KW - Videos
KW - Video recordings
KW - Sonar
KW - Robots
KW - Autonomous robots
KW - Social robots
KW - Norm aware robots
N2 - # GENERAL INFORMATION

## Title
Video recordings of human-robot interactions with a Nao robot controlled via the SONAR adaptive control architecture for social norm aware robots

### Dataset DOI
[https://doi.org/10.4121/50c7a19c-fc0e-4ef3-b35a-dd23bf08470d](https://doi.org/10.4121/50c7a19c-fc0e-4ef3-b35a-dd23bf08470d)

## Authors
- **Dr. Davide Dell'Anna** ([ORCID](https://orcid.org/0000-0002-1162-8341), d.dellanna@uu.nl, Utrecht University, The Netherlands, [Personal webpage](https://www.davidedellanna.com))
- **Dr. Anahita Jamshidnejad** ([ORCID](https://orcid.org/0000-0001-9151-2607), a.jamshidnejad@tudelft.nl, Delft University of Technology, The Netherlands)

# DESCRIPTION
The dataset contains video recordings of human-robot interactions with a Nao robot controlled via the SONAR adaptive control architecture for social norm aware robots.

The dataset was collected in the context of the "Human-like norm-aware cognitive robots for autonomous interactions with humans" research project, funded by an NWO Open Competition Domain Science - XS 21-3 grant. The dataset was generated through experiments with human participants, conducted to evaluate SONAR, the adaptive control architecture for social norm aware robots developed in the project. The dataset supplements the article ["SONAR: An Adaptive Control Architecture for Social Norm Aware Robots"](https://doi.org/10.1007/s12369-024-01172-8), where detailed information about the experiments and the related methodologies can be found.

In the video recordings, the participants interact with a Nao robot in a casual conversation scenario. Interactions between humans and the robot are entirely autonomous. The participants were instructed to have a conversation with the robot on any topic of their choice for about 10 minutes. During the conversation, they were asked to go through 5 tasks (greeting, role-playing game, discussing a personal issue, paying attention to an object, goodbye). Participants could decide on their own when and how to initiate and terminate the tasks. The robot could interact autonomously with the participants thanks to the SONAR control architecture, which combines several state-of-the-art theories and technologies, including the belief-desire-intention (BDI) model of reasoning and decision making for rational agents, fuzzy logic theory, and large language models.

For every participant, two video recordings are included:
- Video of the interaction with the Nao robot controlled via SONAR (Nao-SONAR)
- Video of the interaction with the Nao robot controlled via a baseline control architecture (Nao-Chatbot)

Participants were encouraged not to say anything personal during the experiments. Therefore, anything said in the video recordings may or may not be true and should be assumed to be a product of fantasy.

These experiments were carried out in compliance with the General Data Protection Regulation (EU GDPR) 2016/679 and the Delft University of Technology guidance for working with personal data from human participants: the project was approved by the Human Research Ethics Committee of the Delft University of Technology. All participants were informed from the beginning, via an informed consent form, that the videos of their interactions with the social robot would be made available upon request after the completion of the project.

## Keywords
Human-robot interaction, HRI (Human-robot interaction), Human engineering, Videos, Video recordings, Sonar, Robots, Autonomous robots, Social robots, Norm aware robots, BDI (Belief-Desire-Intention), Fuzzy logic, Large language model, LLM (large language model)

## Date of data collection
2022-12-01 to 2022-12-31

## Date of dataset publication
2023-11-30

## Funding
This research has been supported by the NWO Open Competition Domain Science - XS project "Human-like norm-aware cognitive robots for autonomous interactions with humans" (OCENW.XS21.3.106), financed by the Netherlands Organisation for Scientific Research (NWO).

# FILE OVERVIEW
The dataset contains two video files per participant. The following file naming convention has been used, where <prid> is a random identifier assigned to a participant (a minimal sketch for pairing these files per participant is given at the end of this description):
- Video of the interaction with the Nao robot controlled via SONAR (Nao-SONAR): <prid>_sonar.mp4
- Video of the interaction with the Nao robot controlled via a baseline control architecture (Nao-Chatbot): <prid>_baseline.mp4

Last updated: 2024-02-15
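## Example: pairing the files per participant
Below is a minimal, illustrative Python sketch (not part of the dataset) showing how the naming convention above can be used to match each Nao-SONAR recording with the corresponding Nao-Chatbot recording. The local folder name `videos/` is an assumption about where the downloaded files are stored.

```python
from pathlib import Path

# Hypothetical local folder into which the dataset videos were downloaded.
video_dir = Path("videos")

# Pair <prid>_sonar.mp4 (Nao-SONAR) with <prid>_baseline.mp4 (Nao-Chatbot).
pairs = {}
for sonar_file in sorted(video_dir.glob("*_sonar.mp4")):
    prid = sonar_file.stem.removesuffix("_sonar")       # participant identifier
    baseline_file = video_dir / f"{prid}_baseline.mp4"
    if baseline_file.exists():
        pairs[prid] = (sonar_file, baseline_file)

# List the matched recordings per participant.
for prid, (sonar_file, baseline_file) in pairs.items():
    print(prid, sonar_file.name, baseline_file.name)
```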
ER -