
This is the world’s first SLAM dataset recorded onboard real roller coasters, offering extreme motion dynamics, perceptual challenges, and unique conditions for benchmarking SLAM algorithms under aggressive real-world trajectories.
Key Highlights:
Unprecedented Motion Dynamics – Captures high-acceleration motion with rapid velocity changes, sharp turns, and steep vertical drops, providing a stress test for visual-inertial odometry and SLAM systems.
Ground-Truth-Aided Evaluation – Includes precise track layouts and synchronized vehicle timings for accurate pose validation, supporting quantitative benchmarking under controlled, repeatable conditions.
Multisensor Payload – Features tightly time-synced IMU, monocular/stereo cameras, and optional GPS, enabling advanced sensor fusion research in high-dynamic, low-feature environments.
Perception Under Real-World Stress – Environments include tunnels, rapid lighting changes, and dynamic occlusions (e.g., riders, motion blur), pushing SLAM systems to their robustness limits.
Loop Closures & Structural Repetition – Tracks often revisit similar locations at different orientations and speeds, ideal for evaluating loop closure detection and map consistency.
Cross-Domain Applicability – Insights and techniques developed here transfer to drone flight, autonomous driving, and AR/VR in dynamic or aggressive motion scenarios.
Included Sequences:
- TRON Lightcycle Power Run – Fast, low-light indoor/outdoor transitions with sharp direction changes.
- Seven Dwarfs Mine Train – Moderate-speed sequence with rich structural textures and dynamic ride motion.
Data Sequence Info
Name: tron_2023-12-06
Format: .bag (ROS)
Download:
Device: OAK-D Pro W
- Stereo: 640×400 @ 30 FPS
- IMU: BMI270 @ 100 Hz
- Timestamp offset: ~0.01 ms
- Same boarding/alighting point: No
- Intrinsics/Extrinsics: see `camera_info`, `tf_static`
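To inspect the sequence (topics, message counts, and the `camera_info`/`tf_static` calibration messages), a minimal sketch using the ROS 1 `rosbag` Python API is shown below; the image and IMU topic names in it are assumptions, so check the `rosbag info` output for the actual names.

```python
# Minimal sketch: list the bag's topics, then iterate over (assumed) stereo/IMU topics.
# Topic names below are assumptions; verify with `rosbag info tron_2023-12-06.bag`.
import rosbag

bag = rosbag.Bag('tron_2023-12-06.bag')

# Print what is actually in the bag before relying on any assumed topic names.
for topic, info in bag.get_type_and_topic_info().topics.items():
    print(topic, info.msg_type, info.message_count)

# Example pass over assumed image and IMU topics, keyed by header stamps.
for topic, msg, t in bag.read_messages(topics=['/stereo/left/image_raw', '/imu']):
    stamp = msg.header.stamp.to_sec()
    if topic == '/imu':
        pass  # msg.angular_velocity / msg.linear_acceleration feed a VIO front end
    else:
        pass  # decode image data with cv_bridge if pixel access is needed

bag.close()
```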
IMU Allan Variance:
- Accelerometer noise: 0.01 m/s²/√Hz | Random walk: 0.001 m/s³/√Hz
- Gyroscope noise: 0.001 rad/s/√Hz | Random walk: 0.0001 rad/s²/√Hz
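For reference, these Allan-variance figures map directly onto a Kalibr-style `imu.yaml` noise model; this is a sketch only, and the `rostopic` name is an assumption.

```yaml
# Kalibr-style IMU noise model built from the Allan variance values above.
# The rostopic name is an assumption; check the bag for the actual IMU topic.
rostopic: /imu                       # assumed topic name
update_rate: 100.0                   # Hz (BMI270 rate listed above)

accelerometer_noise_density: 0.01    # m/s^2/sqrt(Hz)
accelerometer_random_walk: 0.001     # m/s^3/sqrt(Hz)
gyroscope_noise_density: 0.001       # rad/s/sqrt(Hz)
gyroscope_random_walk: 0.0001        # rad/s^2/sqrt(Hz)
```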
How to Collect & Contribute
To meet SLAM-quality standards, your dataset should include:
- Calibrated intrinsics and extrinsics
- Time-synced camera and IMU (preferably global shutter)
- ROS bag export with proper metadata
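One way to sanity-check the time-sync requirement before contributing is to compare camera and IMU header stamps in the exported bag; the sketch below uses placeholder topic names and assumes both streams are stamped from a shared clock.

```python
# Rough time-sync check: compare camera and IMU header stamps in a recorded bag.
# Topic names are placeholders; substitute the ones in your bag.
import bisect
import rosbag

CAM_TOPIC = '/camera/image_raw'   # placeholder
IMU_TOPIC = '/imu'                # placeholder

cam_stamps, imu_stamps = [], []
with rosbag.Bag('my_ride.bag') as bag:
    for topic, msg, _ in bag.read_messages(topics=[CAM_TOPIC, IMU_TOPIC]):
        target = cam_stamps if topic == CAM_TOPIC else imu_stamps
        target.append(msg.header.stamp.to_sec())

imu_stamps.sort()

def nearest_gap(stamp):
    # Distance from a camera stamp to the closest IMU stamp.
    j = bisect.bisect_left(imu_stamps, stamp)
    neighbors = imu_stamps[max(j - 1, 0):j + 1]
    return min(abs(stamp - s) for s in neighbors)

if cam_stamps and imu_stamps:
    worst = max(nearest_gap(c) for c in cam_stamps)
    # With a 100 Hz IMU covering the whole ride, the worst gap should not exceed
    # about half the IMU period (~5 ms); larger values hint at separate clocks.
    print('worst camera-to-nearest-IMU gap: %.3f ms' % (1000.0 * worst))
```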
Recommended Devices:
- GoPro (with IMU/GPS) – use tools like `gopro_ros`
- OAK-D / Stereo Cameras – factory-calibrated, compact (a calibration-dump sketch follows after this list)
Note: Most devices need external power and storage (e.g., UMPC or smartphone).
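For OAK-D devices, the factory calibration can usually be dumped over USB with the DepthAI Python API and later converted to `CameraInfo`/`tf_static`; the sketch below assumes the `depthai` package's `readCalibration()` interface and the `CAM_B`/`CAM_C` socket names, so verify them against your installed version.

```python
# Sketch: dump OAK-D factory calibration for later conversion to CameraInfo / tf_static.
# Assumes the DepthAI Python API; call and enum names may differ between versions.
import depthai as dai

with dai.Device() as device:
    calib = device.readCalibration()
    # 3x3 intrinsics for the left mono camera at the recording resolution.
    K_left = calib.getCameraIntrinsics(dai.CameraBoardSocket.CAM_B, 640, 400)
    # Left-to-right stereo extrinsics as a 4x4 transform.
    T_left_right = calib.getCameraExtrinsics(dai.CameraBoardSocket.CAM_B,
                                             dai.CameraBoardSocket.CAM_C)
    print(K_left)
    print(T_left_right)
```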
Mounting Tips:
- Use secure mounts (head, chest, wrist)
- Avoid handheld setups
- Check ride vehicle shape beforehand if possible
Preferred Format:
- ROS bags following REP standards (REP-104, REP-105, REP-145)
- Include all calibration and transform info via `CameraInfo` and `tf`
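As a hedged sketch of what "all calibration and transform info" can look like inside the bag, the snippet below appends a `CameraInfo` message and a static transform with REP-105-style frame names; every number and topic/frame name in it is a placeholder.

```python
# Sketch: append a CameraInfo message and a static transform to an existing bag.
# Frame/topic names follow REP-105 conventions; all numeric values are placeholders.
import rosbag
import rospy
from sensor_msgs.msg import CameraInfo
from geometry_msgs.msg import TransformStamped
from tf2_msgs.msg import TFMessage

t0 = rospy.Time(0)  # stamp used for the static metadata

info = CameraInfo()
info.header.stamp = t0
info.header.frame_id = 'camera_left_optical_frame'   # placeholder frame
info.width, info.height = 640, 400
info.distortion_model = 'plumb_bob'
info.K = [600.0, 0.0, 320.0, 0.0, 600.0, 200.0, 0.0, 0.0, 1.0]  # placeholder intrinsics
info.D = [0.0, 0.0, 0.0, 0.0, 0.0]

tf_msg = TransformStamped()
tf_msg.header.stamp = t0
tf_msg.header.frame_id = 'base_link'                 # REP-105 body frame
tf_msg.child_frame_id = 'camera_left_optical_frame'
tf_msg.transform.rotation.w = 1.0                    # placeholder extrinsics (identity)

with rosbag.Bag('my_ride.bag', 'a') as bag:
    bag.write('/stereo/left/camera_info', info, t0)  # placeholder topic
    bag.write('/tf_static', TFMessage(transforms=[tf_msg]), t0)
```

In practice, recording `camera_info` and `/tf_static` live with `rosbag record` is simpler and keeps the latching behavior of `/tf_static` intact on playback.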
Related Resources
Project Video: https://youtu.be/g6IYMR6LCec?feature=shared
Github: https://github.com/Factor-Robotics/Roller-Coaster-SLAM-Dataset
Related articles from LearnOpenCV:
1. Visual SLAM: https://learnopencv.com/monocular-slam-in-python/
2. LiDAR SLAM: https://learnopencv.com/lidar-slam-with-ros2/
3. MASt3R SLAM: https://learnopencv.com/mast3r-slam-realtime-dense-slam-explained/