Lidar in pointcloud localization #25
Hello,

The reference point cloud can be given in a file or a ROS point cloud message. For 3 DoF localization you can specify the file in the respective launch file parameter.

Have a nice day,
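In case it helps future readers, a minimal sketch of pointing drl at a reference map file could look like the launch fragment below. The parameter name `reference_pointcloud_filename`, the node type, and the package/path in the value are assumptions on my part, not taken from the drl docs; check the drl launch and yaml files for the exact names:

```xml
<launch>
  <!-- Sketch only: parameter and node names below are assumed, not verified -->
  <node name="dynamic_robot_localization" pkg="dynamic_robot_localization" type="drl_localization_node" output="screen">
    <!-- Path to the reference point cloud used as the localization map -->
    <param name="reference_pointcloud_filename" type="str" value="$(find my_package)/maps/my_map.pcd"/>
  </node>
</launch>
```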
Thank you very much! So that means I could specify the point cloud filename, and then specify the topic of the live point cloud here:
Hello,

If your map is not static and changes over time, you can update the reference point cloud by sending a msg to the reference point cloud topic. The ambient sensor data is received in the ambient point cloud topic. If you have sensors that publish ros sensor_msgs/LaserScan, you can specify their topics in the laser scan topics parameter; every message received in those topics is used to build the ambient point cloud for localization.

When the system starts, it will try to find the initial pose of the robot using feature matching. After having a valid initial pose, the system will enter tracking mode, using the Iterative Closest Point (ICP) algorithm to align the sensor data with the reference map.
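As a sketch of that wiring (the parameter and topic names here are my assumptions, not verified against the drl configuration files), feeding a 2D lidar into the node and reserving a topic for runtime map updates could look like:

```xml
<launch>
  <!-- Sketch only: parameter/topic names are assumed, not taken from the drl docs -->
  <node name="dynamic_robot_localization" pkg="dynamic_robot_localization" type="drl_localization_node" output="screen">
    <!-- Topics with sensor_msgs/LaserScan data used to build the ambient point cloud -->
    <param name="laser_scan_topics" type="str" value="/scan"/>
    <!-- Publishing a sensor_msgs/PointCloud2 here would replace the reference map at runtime -->
    <param name="reference_pointcloud_topic" type="str" value="/reference_map_update"/>
  </node>
</launch>
```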
Thank you for the detailed response! I have one more question: how do I visualise the estimated localization in rviz?
Hello,

You can add a pose arrow or axis display associated with the localization pose topic. Alternatively, you can show the TF frames.

Have a nice day,
Hi,

I tried the localization node, but I'm not able to visualise the ambient point cloud or the localization pose. What should the fixed frame be?
Hello,

The fixed frame should be the one configured in the argument map_frame_id, which by default is map.

The dynamic_robot_localization_system.launch is configured for 3 DoF (x, y, theta) and expects sensor data from sensor_msgs/LaserScan messages. If you want to perform 6 DoF localization (x, y, z, roll, pitch, yaw), then you need to configure drl for that use case.

Have a nice day,
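To make that concrete, here is a sketch of overriding the map frame when including the 6 DoF system launch file. The argument name map_frame_id comes from the discussion above; the include path and value are assumptions on my part:

```xml
<launch>
  <!-- Set rviz's "Fixed Frame" to the same value passed here -->
  <include file="$(find dynamic_robot_localization)/launch/dynamic_robot_localization_system_6dof.launch">
    <arg name="map_frame_id" value="map"/>
  </include>
</launch>
```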
I still have problems with this configuration. I used the dynamic_robot_localization_system_6dof.launch file as you recommended and changed the configuration for my setup.
Hello,

What kind of sensor do you have? If it is a 2d lidar, then you are likely performing 3 DoF localization (example here) and you should look at dynamic_robot_localization_system.launch. If it is a 3d lidar, then you probably want 6 DoF localization (example here) and you should look at dynamic_robot_localization_system_6dof.launch.

The TF package manages the transformations between coordinate systems, and REP 105 specifies the frame naming conventions for mobile robotics. Namely, drl, when used for mobile robotics, publishes the tf between the map frame and the odom frame.

To follow the conventions, you need to publish the tf that connects the odom frame to the base frame (usually done by your odometry source). If you do not have odometry, then you can set the odom_frame_id to an empty string, which will cause drl to publish the tf between the map frame and the base frame directly. The default naming follows REP 105 (map -> odom -> base_link). For example, if you do not want to follow the conventions and just want to track your lidar in 3d space, you need to set the frame ids accordingly.
But keep in mind that you need to specify the reference point cloud file path and the initial pose of the sensor within that reference point cloud in the dynamic_robot_localization_system_6dof.launch. You can look at ethzasl_kinect_dataset_high_complexity_slow_fly_movement.launch (and analyze the chain of launch files until reaching the drl node) for an example of 6 DoF tracking of a 3d sensor (video here).

Have a nice day,
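Putting the frame setup above together, a sketch for tracking a 3D lidar without odometry might pass arguments like the following. The names map_frame_id and odom_frame_id come from the discussion above; base_link_frame_id and the value "lidar" are assumptions, so compare them against the launch files mentioned:

```xml
<launch>
  <!-- Sketch: base_link_frame_id and the frame values are assumed, not verified -->
  <include file="$(find dynamic_robot_localization)/launch/dynamic_robot_localization_system_6dof.launch">
    <arg name="map_frame_id" value="map"/>
    <!-- Empty odom frame: drl then publishes the map -> base tf directly -->
    <arg name="odom_frame_id" value=""/>
    <!-- Track the lidar frame itself instead of a robot base -->
    <arg name="base_link_frame_id" value="lidar"/>
  </include>
</launch>
```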
Hi,

I would like to perform localization of a lidar which is moving in a space that is also recorded as a PCL point cloud. I checked the big config file and I haven't seen any option for map upload. Where can I find a good launch file for this? Maybe it would be better to use the dynamic_robot_localization_tests package?

Regards