# CVPRW 2025 SLAM Challenge
## About
The goal of this challenge is to leverage the high temporal and spatial resolution of HD event cameras for SLAM and pose estimation applications. This challenge is part of the CVPR 2025 Workshop on Event-based Vision.
## Tracks
- Event (+ IMU): your pose is obtained using a single event camera or a pair of event cameras, with or without an IMU.
- Event + Mono (+ IMU): your pose is obtained using a single event camera or a pair of event cameras fused with monocular global-shutter cameras, with or without an IMU.
## Participation
### Task
Your goal is to estimate the pose of the reference event camera in a set of challenging settings: urban driving, fast UAV flight, and the Spot quadruped robot.
### What should I submit?
You should submit a single zip file containing the estimated poses (position + orientation quaternion) for the following sequences:
| Sequence | Test data | Reference timestamps |
|---|---|---|
| car_urban_day_ucity_big_loop | h5 | txt |
| falcon_outdoor_day_fast_flight_3 | h5 | txt |
| spot_outdoor_day_penno_building_loop | h5 | txt |
For each of these sequences, provide a file `sequence.txt` with the following format, with poses expressed in the camera frame:

```
timestamp tx ty tz qx qy qz qw
```

where:

- `timestamp`: timestamp in seconds. You should provide a pose estimate for each timestamp in the Reference timestamps file.
- `tx ty tz`: position in meters.
- `qx qy qz qw`: orientation quaternion.
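For reference, a single line of such a file might look like this (the values below are purely illustrative, not real poses):

```
1678239005.123456 1.250000 -0.340000 0.875000 0.000000 0.000000 0.707107 0.707107
```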
### Data
This challenge is based on M3ED version 1.2. You can check the version of an HDF5 file with:

```
h5dump -a /version file.h5
```
You can use all sequences available in M3ED for development. We provide ground-truth pose data in the TUM trajectory file format, so you can evaluate directly with evo.
For example, you can use the following sequence to train your algorithm:
| Sequence | Data | Reference timestamps | GT |
|---|---|---|---|
| falcon_indoor_flight_1 | h5 | txt | txt |
### Evaluation
You will be evaluated on the accuracy of your poses using evo, which computes the APE (absolute pose error) of your estimate with respect to the ground truth. The APE values of the three sequences are summed to produce your final score.
You can use evo locally to evaluate the accuracy of your algorithm:
```
export SEQ=falcon_outdoor_day_fast_flight_3
evo_traj tum result/${SEQ}.txt \
    -p --ref=reference/${SEQ}_pose_evo_gt.txt \
    --align --t_max_diff=0.01 --correct_scale \
    --sync --downsample 100
```
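Here `--align --correct_scale` fits a scale-corrected (Umeyama) alignment of your estimate to the reference, and `--sync --t_max_diff=0.01` associates poses whose timestamps differ by at most 10 ms. To compute the APE itself on a development sequence where ground truth is available, you can run `evo_ape` with the same association settings; a minimal sketch, assuming the same directory layout as above:

```
export SEQ=falcon_indoor_flight_1
evo_ape tum reference/${SEQ}_pose_evo_gt.txt result/${SEQ}.txt \
    --align --correct_scale --t_max_diff 0.01
```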
## Timeline
- March 1: Challenge opens for submissions.
- June 2: Challenge ends.
- June 4: Winners announced.
- June 8: Deadline for the top submissions to send their code for manual evaluation, together with their report and posters.
- June 11-12: Posters presented at the CVPR Workshop on Event-based Vision.
- After the workshop: The top submissions are invited to collaborate on a report for the challenge.
## Terms
Participants are not required, but are encouraged, to release their code. Nonetheless, the organizers of the challenge may request a copy of the code and instructions to run it locally in order to validate the submitted results. Failure to provide code to the organizers is a valid reason for disqualification.
We ask participants not to release the results of their submissions, as we may use this dataset for future challenges.