Hi guys! I’m trying to map my workplace, but I’m not managing it: the map rotates together with the robot and fails to build up correctly, as in the figure below.
This is probably the slightly infuriating type of answer, so I apologize in advance, but I’d say the problem lies in your usage of gmapping and Hector SLAM. I haven’t tried Hector myself, but gmapping is pretty old, and there are more efficient and more accurate approaches to SLAM these days.
Our team has been using Iris LaMa SLAM for our internal projects for a few months now, and I think I can say it’s quite good, especially on the Magni. The best part is that it has no parameters to tweak and works out of the box with a single rosrun line.
My suggestion would be to switch to that or something equally modern.
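For reference, a sketch of what that single line can look like (assuming the iris_lama_ros package is installed and your scanner publishes sensor_msgs/LaserScan on /scan; check the package README for the exact node and topic names):

```shell
# Sketch only: assumes the iris_lama_ros package and a /scan topic.
# Online 2D SLAM with default settings; remap the scan topic if yours differs.
rosrun iris_lama_ros slam2d_ros scan:=/scan
```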
We’ve been using gmapping for a long time and we’ve had good results too, but the problem is with our new lidar. We were using this first lidar, which scans a full 360 degrees, and we had good results, but its resolution wasn’t great and it ended up getting lost in the environment.
So we decided to switch to a better-quality lidar and chose this one from Keyence, which is much better, but we haven’t been able to make it map the way the other one did.
Ok. So clearly you are already used to lidar and gmapping. That is good because that part of it is very involved.
The Sick TiM551, which is highly respected, also has a 270-degree scan. It has 1-degree angular resolution, much like the Neato lidar. There are a bunch of launch files and other resources for the TiM551 on this site:
There may be some very good clues there. I am just guessing, but it seems similar, and its laserscan topic may be about the same as yours.
Also this link: sick_tim - ROS Wiki
In the launch files and on this wiki I see min_ang and max_ang values in the ±2.35 range, which is in radians, and so is about what you would want for a 270-degree scan. You may already know about these, but just offering some thoughts.
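To see why ±2.35 is the right ballpark, here’s a quick sanity check: a 270-degree scanner centered on the sensor’s forward axis spans ±135 degrees, which is about ±2.356 radians.

```python
import math

# A 270-degree scanner centered on the sensor's x-axis spans +/-135 degrees.
half_fov_deg = 270.0 / 2.0
half_fov_rad = math.radians(half_fov_deg)

print(round(half_fov_rad, 3))        # ~2.356 rad, matching the ~2.35 values
print(round(math.degrees(2.35), 1))  # a 2.35 rad limit is about 134.6 deg
```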
Good luck. Perhaps somebody will respond with more concrete info.
My best guess is that your lidar might have time synchronization issues, namely that it is putting the wrong timestamps on the laser scan data. This means that when you turn, the laser scan (after being transformed using those timestamps) turns with the robot instead of staying static relative to the walls.
To see this in RViz, set your fixed frame to /odom and visualize the robot’s base_link and the laser scan topic. Then click the “Experimental” checkbox at the bottom of RViz and set Synchronization to Exact. This makes RViz visualize everything with the correct timestamps and transformations (instead of approximating by just using data as it arrives, as it usually does for efficiency). Now when you turn the robot, you will see that the scan doesn’t stay fixed in space; instead it has a “jello-like” lag and turns with the robot.
You might be able to tune or modify the ROS node you are using with your lidar to get better time sync. I know the Hokuyo lidars needed a timing calibration to solve this issue; maybe the Keyence has something similar?
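To make the effect concrete, here’s a toy calculation (my own illustration, not ROS code) of what a stamp offset does while the robot is spinning: TF looks up the robot pose at the scan’s timestamp, so a 100 ms stamp error on a robot turning at 45 deg/s rotates every re-projected scan point by 4.5 degrees, which is exactly the “walls turning with the robot” symptom.

```python
import math

def world_point_from_scan(bearing, rng, robot_yaw_at_stamp):
    # Project a scan return into the world frame using the robot yaw that
    # TF would look up at the scan message's timestamp.
    a = robot_yaw_at_stamp + bearing
    return (rng * math.cos(a), rng * math.sin(a))

omega = math.radians(45)   # robot spinning at 45 deg/s
t_scan = 0.0               # when the laser actually fired
offset = 0.1               # 100 ms timestamp error on the scan message

# A wall point seen dead ahead at 2 m while the robot yaw is 0.
good = world_point_from_scan(0.0, 2.0, omega * t_scan)
bad  = world_point_from_scan(0.0, 2.0, omega * (t_scan + offset))

print(good)  # (2.0, 0.0): wall stays put with a correct stamp
print(bad)   # rotated by omega * offset = 4.5 deg: wall appears to move
```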
We’ve tried this synchronization in RViz and changed some base_link parameters (mostly the yaw ones), and the mapping seems to be going a bit better. Still, we’ve run into another problem: it can now actually map our place, but only if the robot moves extremely slowly. If it’s even a little fast, the map loses itself and starts drawing something completely absurd.
Do you know a way we can map at faster velocities? A parameter or configuration, or something on the Magni?
Well, the odometry should still be fairly accurate even at high speeds, so it’s likely something you need to adjust in gmapping. From other sources it seems gmapping performance is very dependent on CPU speed, which isn’t that high on ARM.
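A few slam_gmapping parameters are worth experimenting with: smaller linearUpdate/angularUpdate make it process scans more often, so scan matching doesn’t have to bridge large motions between updates, and fewer particles offset the extra CPU cost on ARM. These are standard slam_gmapping parameter names, but the values below are only starting points, not something tuned for the Magni:

```shell
# Sketch only: values are guesses to start tuning from.
rosrun gmapping slam_gmapping scan:=/scan \
    _linearUpdate:=0.2 \
    _angularUpdate:=0.2 \
    _temporalUpdate:=0.5 \
    _particles:=20
```

temporalUpdate forces a scan to be processed at least every 0.5 s even without much motion; it is disabled (-1) by default.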