Looks like, depending on the location in the room and the hyper-local conditions, you are getting a standard deviation of somewhere between 0.6 and 3 cm; that's well within expectations. A couple of points here. Without a Kalman filter, I wouldn't be too surprised by the fluctuations you are seeing.
The data comes off at a fairly high publish rate and there is no averaging; each position fix is based on a single camera frame. So if you have 100 observations in your data set (which only takes a few seconds), you will get some observations that are "rare" events, in the sense that they are 1% / 3-sigma events.
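To make the "rare events" point concrete, here is a quick simulation sketch (my own illustration, not anything from the stack; the 2 cm sigma is just an assumed value inside the range reported above):

```python
# Quick sanity check: with unfiltered single-frame fixes, a small
# fraction of samples always lands in the tails of the noise
# distribution. Sigma here is an assumption, not a measured value.
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.02   # assume ~2 cm standard deviation per fix
n = 100        # roughly the data set size mentioned above

fixes = rng.normal(0.0, sigma, n)
beyond_3sigma = int((np.abs(fixes) > 3.0 * sigma).sum())
beyond_1pct = int((np.abs(fixes) > 2.576 * sigma).sum())  # the "1%" level

print(f"{beyond_3sigma} of {n} fixes beyond 3 sigma (expect ~0.3)")
print(f"{beyond_1pct} of {n} fixes beyond the 1% level (expect ~1)")
```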
No data other than the fiducials is being used to determine your location; this is an absolute localization method. So, at least in theory, the error you see while stationary should be the same when you go away and come back, even hours or days later. Your ground-truth measurement will probably be off by a centimetre or two, so the standard deviations will look bigger, but that's ground-truth error, not localization error.
If the fluctuations are a worry, implementing a Kalman filter will eliminate them. It will also tighten the standard deviation, as the Kalman filter will essentially reject outlier events.
A quick-and-dirty method would be to just average the data from several successive frames. We opted not to do that in the stack: you can always pipe the raw data into a Kalman filter, or average it, once you have it, whereas if we only provided averaged information there would be no way to recover the un-averaged data.
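If anyone wants to experiment, here is a minimal sketch of both options applied to raw fixes: a sliding-window average and a scalar constant-position Kalman filter. This is my own illustration, not code from the stack, and the noise variances are placeholders you would tune for your setup:

```python
# Minimal illustration of smoothing raw per-frame position fixes:
# a sliding-window average and a scalar constant-position Kalman filter.
from collections import deque

def moving_average(fixes, window=10):
    """Quick-and-dirty: average the last `window` frames."""
    buf, out = deque(maxlen=window), []
    for z in fixes:
        buf.append(z)
        out.append(sum(buf) / len(buf))
    return out

def kalman_1d(fixes, r=0.02**2, q=1e-6):
    """Scalar Kalman filter assuming a near-constant position.
    r: measurement noise variance (~2 cm sigma assumed here).
    q: process noise variance (how much true motion we allow)."""
    x, p = fixes[0], r          # initialize from the first fix
    out = []
    for z in fixes:
        p += q                  # predict: position drifts slowly
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # correct with the new fix
        p *= (1.0 - k)
        out.append(x)
    return out

raw = [0.012, 0.015, 0.011, 0.060, 0.013, 0.014]  # made-up fixes, metres
print(moving_average(raw, window=3))
print(kalman_1d(raw))
```

In ROS you would more likely wrap this in its own node, or feed the fixes to an existing filter such as the robot_localization package's EKF node, rather than rolling your own.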
Most users expect localization accuracy of around 10 cm without optimization, and that is what we usually quote.
There are a lot of things that can be done to optimize location accuracy. A partial list would include:
Adjusting fitting parameters on ArUco detect to trade off compute time against fitting accuracy (see the sketch after this list)
Increasing the size of the fiducials
Increasing the density of the fiducials
Adjustments to lighting to improve contrast (in general the more light the better)
Adjustments to the gain and shutter speed parameters on the camera to increase the contrast of the image
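To illustrate the first item in that list, here is what the compute-versus-accuracy knob looks like at the plain OpenCV level. Note this uses OpenCV's own ArUco parameter names (classic contrib API, pre-4.7), not the aruco_detect node's parameter interface, and the dictionary choice and values are assumptions:

```python
# Illustration only: trading compute time for corner-fit accuracy in
# OpenCV's ArUco detector (requires the opencv-contrib build).
# Parameter values are guesses to tune for your setup.
import cv2
import numpy as np

image = np.zeros((480, 640), np.uint8)  # placeholder for a camera frame

params = cv2.aruco.DetectorParameters_create()
# Sub-pixel corner refinement: better corner fits, more compute per frame.
params.cornerRefinementMethod = cv2.aruco.CORNER_REFINE_SUBPIX
params.cornerRefinementWinSize = 5         # refinement window, pixels
params.cornerRefinementMaxIterations = 50  # more iterations = slower, finer
params.cornerRefinementMinAccuracy = 0.01  # stop when corners move < this

dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_5X5_100)  # assumed dict
corners, ids, rejected = cv2.aruco.detectMarkers(image, dictionary,
                                                 parameters=params)
print(f"detected {0 if ids is None else len(ids)} markers")
```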
There is one user who is also trying to get higher accuracy, for a docking application, and is already working down this list in their environment. I will obviously feed anything that comes back into the stack if it is generally applicable.
Finally, the improvements that we've made in the stack are likely to allow a little more fluctuation in exchange for a little more robustness of the fiducial field. We think that's the right trade-off, but the old method is just a launch option away.
I was absent from work for the last couple of weeks, but now that I'm back I started implementing the @alexis script and calibrating the camera! I was able to put the robot in an infinite loop quite successfully, with only a minor error in the path! So I was happy with the results, until I put some weight on the robot! That was terrible! The robot loses traction on the front wheels from time to time and goes in a zigzag when it is supposed to go straight ahead! The rotation is terrible as well; sometimes the robot can't even rotate! It just slides while trying to rotate! What's the point if the robot can't handle weight? Am I doing something wrong? I messed with the jerk limits and the angular and linear speeds as well, to no avail! I was testing with +/- 60 kg! In my opinion the robot needs more advanced navigation, like constantly adjusting its course during the trip and not only when it stops between waypoints!
I am pleased you were able to get satisfactory performance unloaded. Now let's talk about loaded performance.
You put 60 kg on? What kind of surface? What speeds?
Most importantly how did you load it on the robot? Can you share a picture?
I would first look at the way in which you've loaded the robot. In general you should load the robot so that the majority of the weight is on the front (big drive) wheels. If you don't, they will likely slip.
The second thing might be to edit the PID parameters for the load that you have on the robot. You've taken the load to 5x the unloaded weight, so the PID parameters may need attention. However, I'd look at the load placement first. PID parameter tuning is something you should do with a lot of caution, so come back here for advice before you start.
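Purely for orientation, this is what a PID loop does with its three gains. It is a generic textbook sketch, not the Magni motor controller's interface, and the gains, rate, and speeds below are placeholders:

```python
# Generic discrete PID controller, illustration only. On the real robot
# you would adjust firmware parameters, not run code like this.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt            # accumulates steady error
        deriv = (err - self.prev_err) / self.dt   # reacts to rate of change
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# A heavier load means more inertia, so the wheels lag the commanded
# velocity; raising the gains fixes the lag, but too much gain causes
# oscillation, which is why tuning deserves caution.
pid = PID(kp=1.0, ki=0.1, kd=0.01, dt=0.02)    # placeholder gains, 50 Hz
effort = pid.step(setpoint=0.5, measured=0.3)  # target vs measured, m/s
print(f"motor effort: {effort:.3f}")
```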
Also, it's possible for you to implement move_base, which is much more sophisticated. move_base is a full spanning-tree-style planner. It has all the bells and whistles you can imagine; however, it also has all the problems you can imagine. So proceed with caution.
Finally, we are hiring a developer to make all the edits you've suggested (and more) and to strengthen the team in this area. We are optimistic we will start to see output on this in a few weeks.
That is interesting feedback. I'm still somewhat disappointed that I cannot really use the solution as-is.
However, I will make the measurements while the Magni is moving around, as promised.
@brunolfm just so you know, I did sit on the Magni while it was looping forever between waypoints. I had my feet on the caster wheels and my bottom about two-thirds of the way along the top plate (so that my body would not obstruct the camera). I weigh about 90 kg…
It's a bit scary. The stops were a bit harsh; most likely some PID tuning, as David suggested, or jerk/acceleration limits to toy with. But it was basically working. I did not dare stay there more than a few minutes, though. As David suggested, I would encourage you to put the load closer to the motorized wheels. However, without moving the camera (which would then require changing the urdf), that might prove challenging. My goal is to get the Magni to stay within a repeatable and acceptable envelope, and then add weight.
@davecrawley can you please provide some guidance on this Kalman filter thing? I'd like to give it a try. Should it be its own ROS node? Would you please recommend an existing node?
@davecrawley OK, I was concentrating the weight on the back, and that was making the robot slide! Now that I've raised the camera to a test height I'm able to put weight on the front! And now the robot follows the right path! Sometimes it makes a small error in calculating the angular direction, but it's acceptable! I will start making a proper support for the camera at an even greater height! I'm happy with this result! I will leave the robot doing an 8-hour run with the @alexis script tonight and check the results in the morning! Sorry, Magni team, for the robot "disfiguration"!
Oh, and one other thing to think about: if you want to run with lots of load for 8 hours or more, consider your battery size. The biggest battery the robot is designed for is the 12350 (35 Ah), but it can accept any pair of 12V lead-acid batteries; the 1270 (7 Ah) and the 1290 (9 Ah) are particularly recommended. However, you'll be at the edge of what those batteries can do trying to operate for 8 hours, particularly with load.
Broadly speaking, in most situations you get about 1 hour of runtime per Ah of battery capacity, so consider something 50% to 2x bigger than the bare minimum, to make sure you have plenty of battery life and you don't run your batteries completely dead (which is bad for the battery). For an 8-hour run that's 8 Ah at minimum, so doubling it would imply getting a 16 Ah battery for this type of thing. For more on batteries see:
@davecrawley right now, even without modifying the urdf, the robot is running fine. Tomorrow I will build a structure on top of the robot in order to transport products from the departments to dispatch! If this works fine, as I expect it will, I expect to buy more robots.
@davecrawley I can say that I'm really happy with the results so far! Unfortunately I was unable to do the timelapse video because right now the robot has very weak, tiny batteries! I've ordered a 12350 (35 Ah) to test it for a full day! Right now, with a payload of almost 30 kg, I'm getting 3 to 4 hours of runtime! And I can tell that the results are quite good, actually! I will now advance to the next phase, which will be building a proper aluminium cabinet on top of the Magni! I'll post results later! The environment tested was a rectangle 9 meters long by 5 meters wide! After some map reading I activated map_read_only! Hope I can test it on the real map soon! That will be 100 meters long by 16 meters wide! That will be the final test! Obviously I will add many checkpoints on the 100-meter run to correct the angular direction!
Are you charging your batteries regularly? If you let those lead-acid batteries run all the way down, you will damage them or shorten their lifespan. The best approach is to get a big battery (like the 12350) and only discharge it halfway before recharging it.
If your robot does a two-shift, 18-hour day, then overnight charging would more or less do that for you with a 12350 battery.
Good job Bruno!
Once the camera was positioned on the wood cabinet, did you update your magni urdf prior to doing the mapping?
How high is your ceiling again?
Can't wait to see the time-lapse in the new 100 x 16 meter location.
It was not necessary to make any changes to the urdf for now… when I move to the final location, maybe it will be necessary! Right now my ceiling is 2.3 meters high! It's too close to the camera on top of the cabinet!
I have made a simple map with the fiducials! All lined up; I don't know if this is the best method, though! But for me it is important to stay this way, because in the final location I'm planning to lay out the field of fiducials in the same manner but at a bigger scale! If I have 100 fiducials in a 9x3 space, at the production stage I will have more than 1,000 fiducials! It will be a big field! I'll post the current field below!
@davecrawley as you can see in the picture above, I've installed the 12350 battery and charged it a little! But when I go to turn on the robot, both lights turn on but the robot doesn't start! It seems that even the Raspberry Pi doesn't turn on! I've connected a screen to the Pi, to no avail! Do I need to make any modification for the robot to accept this battery?
PS: I've replaced the black/red cables with the ones provided by Ubiquity, the ones with the holes in them!
Did you install just one 12350 battery? The robot needs 24V, i.e. two 12350 batteries in series. Use the included yellow cable to connect the batteries in series. This should take up almost all of the volume inside the robot.
You probably only have ~15 volts, which is below the 20V under-voltage lockout.
Of course, keep the batteries fully charged at every opportunity. Lead-acid batteries like to be fully charged most of the time.
And if you do manage to run the battery dry - please tell me how many hours of robot operation it took.
We used 12350 batteries and ran the robot for 2 consecutive days and still could not get close to draining them. We don't really know how long they will last, as we've never managed to run them down.