
Over the past few years, several datasets have been published for intention detection of pedestrians and cyclists in road traffic. The data were recorded in public traffic using a research vehicle, a research intersection equipped with several sensors, and smart devices carried by cyclists. The datasets contain trajectories and poses of pedestrians and cyclists, semantic environment maps, optical flow sequences, motion history images, and data from smart devices. In addition to the datasets listed below, please also refer to the datasets published on Zenodo.

  • Overview of the research intersection at TH Aschaffenburg, viewed from the wide-angle stereo camera images.

VRU TRAJECTORY DATASET

The VRU Trajectory Dataset consists of 1068 pedestrian and 464 cyclist trajectories recorded at an urban intersection using cameras and LiDARs. A detailed description of the intersection can be found in [1]. The pedestrian trajectories were recorded with a wide-angle stereo camera system by tracking the pedestrians' head positions and generating the 3D positions by triangulation. The cyclist trajectories were recorded with LiDARs by tracking the center of gravity of the cyclists. The cameras operate at 50 Hz, the LiDARs at 12.5 Hz. The dataset partly results from the projects DeCoInt² [2], funded by the "German Research Foundation" (DFG), and AFUSS, funded by the "Bundesministerium für Bildung und Forschung" (BMBF). Additionally, our work is supported by "Zentrum Digitalisierung Bayern".
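
As a rough illustration of how a 3D head position can be obtained from a rectified stereo pair by triangulation, consider the following sketch. The focal length, baseline and principal point used here are hypothetical placeholders, not the calibration of the research intersection's cameras.

    # Minimal sketch of stereo triangulation for a rectified camera pair.
    # All camera parameters below are hypothetical placeholders.
    def triangulate(u_left, v_left, u_right, f=1400.0, b=0.5, cx=960.0, cy=600.0):
        """Return the 3D point (x, y, z) in the left camera frame.

        u/v are pixel coordinates, f is the focal length in pixels,
        b is the stereo baseline in meters, (cx, cy) is the principal point.
        """
        d = u_left - u_right      # disparity in pixels
        z = f * b / d             # depth from disparity
        x = (u_left - cx) * z / f
        y = (v_left - cy) * z / f
        return x, y, z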

  • Trajectories of pedestrians (left) and cyclists (right) plotted onto the map of the intersection.

  • DATASET FORMAT

    The complete dataset consists of 1532 files in CSV format, where every file contains one VRU trajectory. Each CSV file consists of four columns (a short loading sketch is given after the note below):

    • ID: measurement IDs
    • timestamp: timestamp in seconds
    • x: position in x-direction in meters
    • y: position in y-direction in meters

    Note: Due to privacy laws, we are not able to publish image data.
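
    A minimal sketch of loading and inspecting one such trajectory file, assuming Python with pandas and NumPy; the file name is a hypothetical placeholder:

    import numpy as np
    import pandas as pd

    # Load one trajectory file (hypothetical file name); columns: ID, timestamp, x, y.
    traj = pd.read_csv("pedestrian_0001.csv")

    # Duration in seconds and approximate path length in meters.
    duration = traj["timestamp"].iloc[-1] - traj["timestamp"].iloc[0]
    path_length = np.hypot(np.diff(traj["x"].to_numpy()),
                           np.diff(traj["y"].to_numpy())).sum()
    print(f"duration: {duration:.1f} s, path length: {path_length:.1f} m")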

  • CITATION

    Please cite us when using this dataset in your research.

    @MISC{VRUDataset,
      author = {},
      title = {{VRU} {T}rajectory {D}ataset},
      note = {\url{https://www.th-ab.de/vru-trajectory-dataset}},
    }

  • DOWNLOAD

    The trajectories can be downloaded as a complete dataset containing pedestrian and cyclist trajectories, or separately:

    Download Complete Dataset

    Download Pedestrians Dataset

    Download Cyclists Dataset

    Download Extended Dataset on Zenodo

EXTENDED CYCLIST TRAJECTORY DATASET

To create the Extended Cyclist Trajectory Dataset, we changed the camera lenses to capture a wider field of view enclosing the bike lane. The dataset currently consists of 1746 cyclist trajectories including motion primitive labels. The motion primitive labels comprise the following classes:

  • wait: starts at the last visible movement of the bicycle wheel and ends at its first visible movement
  • starting movement: the frame of the first visible movement of the cyclist before the end of wait
  • turn left / turn right (tl/tr): start at the first and end at the last visible turning movement of the cyclist
  • hand signal left/right: starts at the first and ends at the last visible frame of the hand signal

  • DATASET FORMAT

    The dataset consists of 1746 files in JSON format, where every file contains one cyclist trajectory.

    A JSON file is structured as follows; a short parsing sketch follows the example:

    {"vru_type": "bike",
      "trajectory2: [{"Timestamp": [LIST OF UTC TIMESTAMPS],
                               "x": [LIST OF X POSISTIONS],
                               "y": [LIST OF Y POSISTIONS],
                               "z": [LIST OF Z POSISTIONS],
                               "x_smoothed": [LIST OF SMOOTHED X POSISTIONS],
                               "y_smoothed": [LIST OF SMOOTHED Y POSISTIONS],
                               "z_smoothed": [LIST OF SMOOTHED Z POSISTIONS]}],
      "motion_primitives": {"mp_labels":
              [{"mp_label": LABEL NAME,
                 "start_time": START UTC TIMESTAMP,
                 "end_time": END UTC TIMESTAMP, ...}, ...]}}

  • CITATION

    Please cite us when using this dataset in your research.

    @MISC{VRUDataset,
      author = {},
      title = {{VRU} {T}rajectory {D}ataset},
      note = {\url{https://www.th-ab.de/vru-trajectory-dataset}},
    }

  • DOWNLOAD

    The data will be available for download here soon.

SMART DEVICE CYCLIST'S STARTING DATASET

The Smart Device Cyclist’s Starting Dataset consists of 84 different bicycle starting scenes from 52 different test subjects recorded at an urban intersection using a Samsung Galaxy S6.

We envision future traffic scenarios in which vulnerable road users (VRUs) such as pedestrians and cyclists, equipped with smart devices, are interconnected with vehicles and intelligent infrastructure. To ensure safe interactions between automated vehicles and VRUs, it is important to predict the intentions of the VRUs as early as possible, such that the automated vehicle can choose an appropriate reaction and avoid dangerous situations. The intention of this dataset is to allow the development of such active VRU protection systems using smart devices, e.g., smartphones. Here, we aim to support the development of methods to detect the cyclist's starting movement. More detailed information on a sample methodology using human activity recognition techniques can be found in [3]. The dataset results from the project DeCoInt² [2], funded by the "German Research Foundation" (DFG), and is joint work of the Kooperative automatisierte Verkehrssysteme (KAV) Lab at the University of Applied Sciences Aschaffenburg, FORWISS at the University of Passau [https://www.forwiss.uni-passau.de/], and the Intelligent Embedded Systems (IES) Lab at the University of Kassel [https://www.ies.uni-kassel.de/].

A detailed description of the intersection at which the data was recorded can be found in [1]. The sensor data was generated by a Samsung Galaxy S6, which was carried inside the left trouser pocket. The display was facing in the direction of travel and the upper edge of the smartphone was facing upwards. The following sensors were recorded and are included in this dataset: accelerometer, linear accelerometer (i.e., gravity-compensated accelerometer), gyroscope, and rotation sensor. For detailed information regarding the sensors, see: [https://developer.android.com/guide/topics/sensors/sensors_motion.html].

The data was labelled with the help of a wide-angle stereo system, which operates at 50 Hz.

The following motion primitives were labelled:

  • no label: periods of time without camera coverage
  • waiting: no longitudinal bicycle movement
  • starting movement: first detectable movement which leads to starting
  • starting: from the first time the rear wheel moves in the driving direction until the end of the scene
  • DATASET FORMAT

    The complete dataset is stored in CSV format. It contains the sensor data and the corresponding labels. The sensor data CSV file consists of 19 columns, split into meta data and payload data. The meta data contains the fields identifying an experiment, i.e., ExperimentID, SceneID, and VRUID. Additionally, the Timestamp field is used to identify the corresponding sensor readings.

    • ExperimentID: meta data, identifies each different experiment
    • SceneID: meta data, identifies the scene within an experiment
    • VRUID: meta data, identifier for the test subject
    • Timestamp: meta data, timestamp starting at 0 for each separate experiment

    An experiment always contains one VRU, but it may consist of several scenes, i.e., several starting movements. In the following, the fields containing the sensor values are described; a short loading sketch follows the list.

    • Accelerometer_x: Acceleration force along the x axis (including gravity), in m/s²
    • Accelerometer_y: Acceleration force along the y axis (including gravity), in m/s²
    • Accelerometer_z: Acceleration force along the z axis (including gravity), in m/s²
    • Linear_Accelerometer_x: Acceleration force along the x axis (excluding gravity), in m/s²
    • Linear_Accelerometer_y: Acceleration force along the y axis (excluding gravity), in m/s²
    • Linear_Accelerometer_z: Acceleration force along the z axis (excluding gravity), in m/s²
    • Gyroscope_x: Rate of rotation around the x axis, in rad/s
    • Gyroscope_y: Rate of rotation around the y axis, in rad/s
    • Gyroscope_z: Rate of rotation around the z axis, in rad/s
    • Rotation_w: Scalar component of the rotation vector
    • Rotation_x: Rotation vector component along the x axis, unitless
    • Rotation_y: Rotation vector component along the y axis, unitless
    • Rotation_z: Rotation vector component along the z axis, unitless
    • target: indicates the label; empty = no label, 0 = waiting, 1 = starting_movement, 2 = starting
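
    A minimal sketch of loading the CSV file and grouping it into individual scenes, assuming Python with pandas; the file name is a hypothetical placeholder:

    import pandas as pd

    # Load the sensor data (hypothetical file name).
    data = pd.read_csv("smart_device_cyclist_starting.csv")

    # Map the target codes to the motion primitive names; unlabelled rows
    # (no camera coverage) become NaN after the numeric conversion.
    data["target"] = pd.to_numeric(data["target"], errors="coerce")
    data["label"] = data["target"].map({0: "waiting", 1: "starting_movement", 2: "starting"})

    # One scene corresponds to one (ExperimentID, SceneID) combination.
    for (exp_id, scene_id), scene in data.groupby(["ExperimentID", "SceneID"]):
        print(exp_id, scene_id, len(scene), "samples,",
              (scene["label"] == "starting_movement").sum(), "starting_movement samples")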


  • CITATION

    Please cite us when using this dataset in your research.

    @Article{BZH+18,
       author = {Bieshaar, M. and Zernetsch, S. and Hubert, A. and Sick, B. and Doll, K.},
       title = {Cooperative Starting Movement Detection of Cyclists Using Convolutional Neural Networks and a Boosted Stacking Ensemble},
       journal = {CoRR},
       year = {2018},
       volume = {abs/1803.03487},
       archiveprefix = {arXiv},
       eprint = {1803.03487}
    }

  • DOWNLOAD

    The data can be downloaded here:

    Download SMART DEVICE CYCLIST'S STARTING DATASET

  • REFERENCES

    [1] M. Goldhammer, E. Strigel, D. Meissner, U. Brunsmann, K. Doll and K. Dietmayer, "Cooperative multi sensor network for traffic safety applications at intersections," 2012 15th International IEEE Conference on Intelligent Transportation Systems, Anchorage, AK, 2012, pp. 1178-1183. doi: 10.1109/ITSC.2012.6338672
    [2] M. Bieshaar, G. Reitberger, S. Zernetsch, B. Sick, E. Fuchs and K. Doll, "Detecting Intentions of Vulnerable Road Users Based on Collective Intelligence," AAET – Automatisiertes und vernetztes Fahren, Braunschweig, 2017, pp. 67-87. Available: www.its-mobility.de/download/AAET/Dokumentation/AAET_2017_Tagungsband_Download.pdf

    [3] M. Bieshaar, S. Zernetsch, A. Hubert, B. Sick and K. Doll, "Cooperative Starting Movement Detection of Cyclists Using Convolutional Neural Networks and a Boosted Stacking Ensemble," CoRR, vol. abs/1803.03487, 2018.