Raw Data for ConfLab: A Data Collection Concept, Dataset, and Benchmark for Machine Analysis of Free-Standing Social Interactions in the Wild

doi: 10.4121/20017748.v2
The doi above is for this specific version of this dataset, which is currently the latest. Newer versions may be published in the future. For a link that will always point to the latest version, please use
doi: 10.4121/20017748
Datacite citation style:
Raman, Chirag; Vargas Quiros, Jose; Tan, Stephanie; Islam, Ashraful; Gedik, Ekin et al. (2022): Raw Data for ConfLab: A Data Collection Concept, Dataset, and Benchmark for Machine Analysis of Free-Standing Social Interactions in the Wild. Version 2. 4TU.ResearchData. dataset. https://doi.org/10.4121/20017748.v2
Other citation styles (APA, Harvard, MLA, Vancouver, Chicago, IEEE) available at Datacite
Versions:
  version 2 - 2022-10-10 (latest)
  version 1 - 2022-06-20

This file contains raw data for cameras and wearables of the ConfLab dataset. 


contains the overhead video recordings from 9 cameras (cam2-cam10) as MP4 files.

    These cameras cover the whole interaction floor, with camera 2 capturing the
    bottom of the scene layout and camera 10 capturing the top. Note that cam5
    ran out of battery before the other cameras and thus its recordings are cut
    short. However, cam4 and cam6 overlap significantly with cam5, so any
    information needed can be reconstructed from them.

    Note that the annotations are made and provided in 2-minute segments. The
    annotated portions of the video comprise the last 3 min 38 s of the x2xxx.MP4
    files and the first 12 min of the x3xxx.MP4 files for cameras 2, 4, 6, 8,
    and 10, with "x" being a placeholder character in the MP4 file names. If you
    wish to split the video into 2-minute segments as we did, the
    "video-splitting.sh" script is provided.
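For reference, a minimal sketch of such a split using ffmpeg's segment muxer follows. This is not the contents of "video-splitting.sh", only an approximation of the same operation; the function name and output naming scheme are assumptions.

```shell
# Hypothetical sketch of splitting a recording into 2-minute segments,
# approximating what video-splitting.sh does (names are assumptions).
split_video() {
  in="$1"      # input MP4 file
  outdir="$2"  # directory to receive the segments
  mkdir -p "$outdir"
  # -segment_time 120 cuts every 120 s; -c copy avoids re-encoding,
  # so cut points land on the nearest keyframe.
  ffmpeg -i "$in" -c copy -map 0 -f segment -segment_time 120 \
    -reset_timestamps 1 "$outdir/$(basename "${in%.MP4}")_seg%03d.MP4"
}
```

Because of the stream copy, segment boundaries may be off by up to one keyframe interval; re-encode if frame-exact cuts are required.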

./camera-calibration contains the camera intrinsic files obtained from

    https://github.com/idiap/multicamera-calibration. Camera extrinsic parameters
    can be calculated using the existing intrinsic parameters and the
    instructions in the multicamera-calibration repo. Reference coordinates in
    the image are provided by the crosses marked on the floor, which are visible
    in the video recordings. The crosses are 1 m (100 cm) apart.
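Since the crosses form a regular 1 m grid, the world coordinates of the calibration correspondences can be enumerated mechanically. A small sketch (the 3x3 extent and the choice of the floor as the z=0 plane are illustrative assumptions):

```shell
# Print world coordinates (x y z, in cm) for a hypothetical 3x3 patch of
# floor crosses spaced 1 m apart, taking the floor as the z = 0 plane.
for i in 0 1 2; do
  for j in 0 1 2; do
    echo "$((i * 100)) $((j * 100)) 0"
  done
done
```

Each printed point can then be paired with the pixel position of the corresponding cross in a camera image when solving for extrinsics.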


The ./wearables subdirectory includes the IMU, proximity, and audio data from each

    participant at the ConfLab event (48 in total). In the directory numbered
    by participant ID, the following data are included:
        1. raw audio file
        2. proximity (Bluetooth) pings (RSSI) file (raw and csv) and a visualization
        3. tri-axial accelerometer data (raw and csv) and a visualization
        4. tri-axial gyroscope data (raw and csv) and a visualization
        5. tri-axial magnetometer data (raw and csv) and a visualization
        6. game rotation vector (raw and csv), recorded as quaternions
    All files are timestamped.

    The sampling frequencies are:
        - audio: 1250 Hz
        - all other sensors: around 50 Hz. However, the sample rate is not
          fixed, so the timestamps should be used instead.
    For rotation, the game rotation vector's output frequency is limited by the
    actual sampling frequency of the magnetometer. For more information, please
    refer to
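Since the sample rate is not fixed, the effective rate of a given file can be estimated from its timestamps. A sketch follows; the assumption that the csv has a header row and a first column holding a timestamp in milliseconds is hypothetical, so check the actual files before relying on it.

```shell
# Sketch: estimate a file's effective sampling rate in Hz.
# ASSUMPTION: the csv has a header row and its first column is a
# timestamp in milliseconds (verify against the actual dataset files).
estimate_rate_hz() {
  awk -F, 'NR > 1 { t = $1; n++; if (n == 1) t0 = $1 }
           END { if (n > 1) print (n - 1) * 1000 / (t - t0) }' "$1"
}
```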


    Audio files in this folder are in raw binary form. The following can be used
    to convert them to WAV files (1250 Hz):
        ffmpeg -f s16le -ar 1250 -ac 1 -i /path/to/audio/file /path/to/output.wav
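The single-file command above can be wrapped in a loop to convert all participants at once. A sketch, noting that the ".raw" extension and the directory layout used in the glob are assumptions to adjust against the actual file names:

```shell
# Sketch: batch-convert raw s16le audio to WAV for every participant.
# ASSUMPTION: raw audio files end in ".raw" under <root>/<participantID>/;
# adjust the glob to the dataset's actual file names.
raw_to_wav() {
  for f in "$1"/*/*.raw; do
    [ -e "$f" ] || continue   # skip when the glob matches nothing
    ffmpeg -f s16le -ar 1250 -ac 1 -i "$f" "${f%.raw}.wav"
  done
}
```

Usage would be, e.g., `raw_to_wav ./wearables`.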

Synchronization of camera and wearable data

    Raw videos contain timecode information which matches the timestamps of the
    data in the "wearables" folder. The starting timecode of a video can be read
    with:
        ffprobe -hide_banner -show_streams -i /path/to/video
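To align a video with the wearable timestamps, the starting timecode reported by ffprobe can be converted to seconds. A small helper, assuming the non-drop-frame "HH:MM:SS:FF" format and leaving the frame rate as a parameter since it depends on the camera settings:

```shell
# Convert an "HH:MM:SS:FF" (hours:minutes:seconds:frames) timecode to
# seconds. $1 = timecode string, $2 = frames per second of the video.
timecode_to_seconds() {
  echo "$1" | awk -F: -v fps="$2" '{ print $1 * 3600 + $2 * 60 + $3 + $4 / fps }'
}
```

For example, `timecode_to_seconds "01:30:00:00" 30` prints `5400`.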


./sync: contains WAV files for each subject

./sync_files: auxiliary csv files used to sync the audio; these can be used to improve the synchronization.

The code used for syncing the audio can be found here:


  • 2022-06-07 first online
  • 2022-10-10 published, posted
  • NWO 639.022.606.
  • Aspasia grant associated with Vidi grant 639.022.606.
TU Delft, Faculty of Electrical Engineering, Mathematics and Computer Science, Intelligent Systems

DATA - restricted access


Note that you DO NOT need to request access by clicking the 'Request access to files' button below. Instead, fill in the EULA form available at https://doi.org/10.4121/20016194 and email the completed form to SPCLabDatasets-insy@tudelft.nl. When your request is approved, private links to download the data will be emailed to you.
