Hello,
I'm currently working with the UTBM Robocar Dataset and need some guidance on comparing LiDAR odometry estimates against the GPS RTK ground truth. The LiDAR estimates are expressed in the LiDAR frame, while the GPS RTK data is in global coordinates (likely NED), so a rotation and translation are needed to align them properly, but I'm unsure how to obtain these transformations accurately for a robust trajectory evaluation.
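For concreteness, this is roughly how I bring the RTK fixes into a local metric frame before any comparison; it is only a sketch under my own assumptions (WGS84 latitude/longitude in degrees, projected to UTM with pyproj and re-anchored at the first fix), nothing specific to the dataset's own tooling:

```python
# Sketch only: project WGS84 lat/lon (degrees) to UTM and re-anchor the track at
# the first fix so the RTK positions live in a local metric, ENU-like frame.
# The UTM zone (32N) is my assumption for the Belfort area; adjust if needed.
import numpy as np
from pyproj import Transformer

def rtk_to_local_xy(lat_deg: np.ndarray, lon_deg: np.ndarray) -> np.ndarray:
    to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32632", always_xy=True)
    x, y = to_utm.transform(lon_deg, lat_deg)   # east, north in metres
    xy = np.column_stack([x, y])
    return xy - xy[0]                           # anchor at the first fix
```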
The same question was raised in #23.
Is there an extrinsic transformation that can be applied? I believe this might depend on the initial position of the vehicle in each sequence. Any recommendations or methods for effectively aligning and comparing these trajectories would be greatly appreciated.
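To make the question concrete, the kind of alignment I have in mind is a least-squares rigid fit (Umeyama/Kabsch) between time-matched positions of the odometry and the RTK track. The sketch below is my own, under the assumption that both trajectories are Nx3 position arrays sampled at matching timestamps and in the same metric units; the file names in the usage comment are just placeholders:

```python
# Minimal sketch of a rigid (SE(3)) alignment between two time-matched
# trajectories: find R, t minimizing sum ||R * est_i + t - gt_i||^2.
import numpy as np

def align_umeyama(est_xyz: np.ndarray, gt_xyz: np.ndarray):
    """Return R (3x3) and t (3,) such that R @ est + t ~ gt in a least-squares sense."""
    mu_est = est_xyz.mean(axis=0)
    mu_gt = gt_xyz.mean(axis=0)
    est_c = est_xyz - mu_est
    gt_c = gt_xyz - mu_gt
    # Cross-covariance between the centered point sets, then SVD
    H = est_c.T @ gt_c / est_xyz.shape[0]
    U, _, Vt = np.linalg.svd(H)
    # Reflection handling keeps the result a proper rotation (det(R) = +1)
    S = np.eye(3)
    if np.linalg.det(Vt.T @ U.T) < 0:
        S[2, 2] = -1.0
    R = Vt.T @ S @ U.T
    t = mu_gt - R @ mu_est
    return R, t

# Usage (file names are placeholders, not from the dataset):
# est = np.loadtxt("lidar_odom_xyz.txt")
# gt  = np.loadtxt("gps_rtk_xyz_local.txt")
# R, t = align_umeyama(est, gt)
# aligned = (R @ est.T).T + t
# ate_rmse = np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))
```

If I understand correctly, tools such as evo perform this same alignment (the `--align` option of `evo_ape`), but that only hides the question of where the rotation/translation actually comes from, hence my question about a documented extrinsic or initial pose per sequence.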
Here is the comparison between the estimated trajectory and the ground truth (from magellan_proflex500_driver) without any rotation:
Here is the comparison between the estimated trajectory and the ground truth (from magellan_proflex500_driver) with a rotation estimated by manually trying yaw angles:
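For reference, the yaw above was found with a simple brute-force sweep like the following (the function name and the 0.5° step are mine, just to illustrate the procedure):

```python
# Rough sketch of a yaw-only search: rotate the odometry track about z and keep
# the angle that minimizes the RMSE to the RTK track, after anchoring both
# trajectories at their first position.
import numpy as np

def best_yaw(est_xyz: np.ndarray, gt_xyz: np.ndarray, step_deg: float = 0.5):
    est0 = est_xyz - est_xyz[0]
    gt0 = gt_xyz - gt_xyz[0]
    best = (None, np.inf)
    for yaw in np.arange(0.0, 360.0, step_deg):
        c, s = np.cos(np.radians(yaw)), np.sin(np.radians(yaw))
        Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        rmse = np.sqrt(np.mean(np.sum(((Rz @ est0.T).T - gt0) ** 2, axis=1)))
        if rmse < best[1]:
            best = (yaw, rmse)
    return best  # (yaw in degrees, RMSE in metres)
```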
Thank you!