The RADIATE dataset was collected in a variety of weather scenarios to facilitate research on robust and reliable vehicle perception in adverse weather. It includes multiple sensor modalities, from radar and optical images to 3D LiDAR point clouds and GPS.
Dataset Size
The dataset was collected in 7 different scenarios: Sunny (Parked), Sunny/Overcast (Urban), Overcast (Motorway), Night (Motorway), Rain (Suburban), Fog (Suburban) and Snow (Suburban).
Eight different types of objects, i.e., car, van, truck, bus, motorbike, bicycle, pedestrian and group of pedestrians, were annotated on the radar images. The figure below shows the number of individual instances labelled.
The sizes of the scenarios are as follows:
Sensors
- Stereo Camera: An off-the-shelf ZED stereo camera was used. It was set at 672 × 376 image resolution at 15 frames per second for each camera, and was protected by a waterproof housing for extreme weather. Note that the images in the dataset may be severely blurred, hazy or fully blocked due to raindrops, dense fog and/or heavy snow.
- LiDAR: A 32-channel Velodyne HDL-32e LiDAR was set at 10 Hz for 360° coverage. Since the LiDAR signal can be severely attenuated and reflected by intervening fog or snow, the point cloud data may be missing, noisy or incorrect for some sequences in extreme weather.
- Radar: RADIATE adopted the Navtech CTS350-X radar, a scanning radar which provides 360° high-resolution range-azimuth images at 4 Hz. It was set to a maximum operating range of 100 m with 0.175 m range resolution, 1.8° azimuth resolution and 1.8° elevation resolution. Currently, it does not provide Doppler information.
- GPS/IMU: An Advanced Navigation Spatial Dual GPS/IMU was used. It provides horizontal position accuracy of 1.2 m standalone, 0.5 m with SBAS and 0.008 m with RTK. Its full specifications can be found here.
Folder Structure and File Format
- annotations: The annotation is saved as a .json file, where each entry of a list contains `id`, `class_name` and `bboxes`. `id` is the object identification. `class_name` is a string with the class name. `bboxes` contains `position`: `(x, y, width, height)`, where `(x, y)` is the upper-left pixel location of the bounding box of the given width and height, and `rotation`, the angle in degrees measured counter-clockwise.
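As a minimal sketch of reading these annotations in Python (the file path and the exact layout of each `bboxes` entry are assumptions here; the RADIATE SDK contains the authoritative parser):

```python
import json

# Load one sequence's annotation file (the path is illustrative).
with open("annotations/annotations.json") as f:
    objects = json.load(f)

for obj in objects:
    print(obj["id"], obj["class_name"])
    for bbox in obj["bboxes"]:
        # Assumed entry layout: position = (x, y, width, height) with
        # (x, y) the upper-left corner, rotation in degrees (counter-clockwise).
        x, y, w, h = bbox["position"]
        angle = bbox["rotation"]
```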
- GPS_IMU_Twist: A readable .txt file is provided for GPS and IMU. Each line is defined below:
A GPS and IMU example file looks like:
- Navtech_Cartesian: Radar images in Cartesian coordinates are provided as .png at 1152 × 1152 resolution. Nearest-neighbour interpolation was used to convert the radar images from polar to Cartesian. Each pixel represents 0.17361 m × 0.17361 m.
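A pixel index can thus be converted to metric coordinates relative to the radar. The sketch below assumes the radar sits at the image centre, which is a common convention for Cartesian radar renderings but should be verified against the SDK:

```python
PIXEL_SIZE_M = 0.17361  # metres per pixel in the Cartesian radar image
IMG_SIZE = 1152         # image width/height in pixels

def pixel_to_metric(u, v):
    """Convert pixel indices (u, v) to metric (x, y) relative to the radar."""
    # Assumption: radar at the image centre, y pointing up in metric space.
    x = (u - IMG_SIZE / 2) * PIXEL_SIZE_M
    y = (IMG_SIZE / 2 - v) * PIXEL_SIZE_M
    return x, y
```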
- Navtech_Polar: Radar images in polar coordinates are provided as .png at 400 × 576 resolution, where each row represents the range 0 m - 100 m with a resolution of 0.17361 m per pixel, and each column represents 1.1° in angle.
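Likewise, a bin index in the polar image maps to a physical range and azimuth. The resolutions below come from the text above; which image axis indexes range and which indexes azimuth should be confirmed against the SDK:

```python
RANGE_RES_M = 0.17361   # metres per range bin (576 bins span 0-100 m)
AZIMUTH_RES_DEG = 1.1   # degrees per azimuth bin, as stated above

def bin_to_polar(range_bin, azimuth_bin):
    """Map polar-image bin indices to (range in metres, azimuth in degrees)."""
    return range_bin * RANGE_RES_M, azimuth_bin * AZIMUTH_RES_DEG
```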
- velo_lidar: 3D LiDAR point clouds are saved as readable .txt files, where each line represents a 3D point with `x, y, z, intensity, ring`. `(x, y, z)` is the 3D location of the point in the LiDAR frame, intensity [0-255] is the reflectance captured by the sensor, and ring [0-31] indicates which of the 32 channels the point was detected by.
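Loading such a point cloud is straightforward, e.g. with NumPy. A minimal sketch, assuming comma-separated values and an illustrative filename:

```python
import numpy as np

# Each line: x, y, z, intensity, ring (the comma delimiter is an
# assumption; adjust if the files are space-separated).
points = np.loadtxt("velo_lidar/000001.txt", delimiter=",")
xyz = points[:, :3]               # 3D location in the LiDAR frame
intensity = points[:, 3]          # reflectance, 0-255
ring = points[:, 4].astype(int)   # channel index, 0-31
```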
- zed_left/right: Unrectified stereo images are stored as .png files. They have 672 × 376 resolution at 15 Hz. The calibration parameters of the two cameras are provided; see Sensor Calibration for details.
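Because the images are unrectified, they should be undistorted with the provided intrinsics before any metric use. A minimal OpenCV sketch with placeholder calibration values (the real ones come from the calibration described below):

```python
import cv2
import numpy as np

# Placeholder intrinsics for illustration only; substitute the actual
# values provided with the dataset (see Sensor Calibration).
K = np.array([[700.0,   0.0, 336.0],
              [  0.0, 700.0, 188.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # distortion coefficients placeholder

img = cv2.imread("zed_left/000001.png")  # illustrative filename
undistorted = cv2.undistort(img, K, dist)
```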
- Timestamps: Each folder contains a set of sensor-named .txt files which give the timestamp of each collected sensor frame. Since the sensors operate at different frame rates, the arrival time of each sensor frame is simply adopted as its timestamp. Each line is defined as:
where `Frame` is the frame ID which corresponds to the filename, and `Time` is the timestamp in seconds using the UNIX time system.
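A hedged sketch of parsing such a file into a frame-to-time mapping, assuming each line reads `Frame: <id> Time: <seconds>` (confirm against the actual files):

```python
# Build {frame_id: unix_time} from a timestamp file.
timestamps = {}
with open("Timestamps/radar.txt") as f:  # filename is illustrative
    for line in f:
        parts = line.split()
        if len(parts) < 4:
            continue  # skip blank or malformed lines
        timestamps[int(parts[1])] = float(parts[3])
```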
Sensor Calibration
Sensor calibration is required for multi-sensor fusion and cross-sensor correspondence. The stereo camera was calibrated using the Matlab camera calibration toolbox, and its intrinsic parameters and distortion coefficients are given. For extrinsic calibration, the radar sensor is chosen as the origin of the local coordinate frame, as it is the main sensor for RADIATE. The extrinsic parameters for the radar, camera and LiDAR are represented as 6 degree-of-freedom transformations (translation and rotation). Calibration was performed by first explicitly measuring the distances between the sensors, and then fine-tuning by aligning measurements between each pair of sensors. The sensor calibration parameters are provided in the config/default-calib.yaml file in the RADIATE SDK.
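As an illustration, the 6 degree-of-freedom extrinsics can be assembled into 4 × 4 homogeneous transforms. This is a minimal sketch; the YAML key names and the Euler-angle convention are assumptions, so check config/default-calib.yaml in the SDK for the actual schema:

```python
import numpy as np
import yaml

with open("config/default-calib.yaml") as f:
    calib = yaml.safe_load(f)

def make_transform(translation, rotation_deg):
    """Build a 4x4 matrix from a translation (m) and XYZ Euler angles (deg)."""
    rx, ry, rz = np.deg2rad(rotation_deg)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx),  np.cos(rx)]])
    Ry = np.array([[ np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz),  np.cos(rz), 0],
                   [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx  # assumed rotation order
    T[:3, 3] = translation
    return T

# Hypothetical usage; the key names depend on the actual YAML schema:
# T_lidar = make_transform(calib["lidar"]["translation"],
#                          calib["lidar"]["rotation"])
```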
The calculated sensor calibration parameters are given below.
SDK and Pre-Trained Models
We provide a Python Software Development Kit (SDK) for using RADIATE, along with some pre-trained models as quick baselines for radar-based object detection, recognition, etc. Please see this page for more details.