How to calibrate raspicam v2.1


#1

Following @davecrawley's advice, I tried to calibrate the camera.

My questions:
1. Is the process ok overall? Did I miss anything? Is using 0.050 as the square size ok here? (I have never done this before.)
2. Is the checkerboard quality/size acceptable, or should I order a pro print? (Assuming I order a pro print on foam core, what size should it be, and why?)
3. Where am I supposed to save the calibration data?
4. How do I make sure my custom calibration data is used by the SLAM process instead of the default values?

Details follow.


I printed the 8x6 chessboard on paper with my HP DesignJet 1050C. I did not get the parameters right at first, because it did not print on the whole A0 paper roll: 447 mm x 348 mm (measured between the extremities of the extreme squares).
Then I tried again in landscape using the A0 format, but I had to enable auto-resize, otherwise the last squares would be cut off.
In A0 format I ended up with approximately 97 cm x 75 cm (the measurement is not precise because I had to use a tape measure). At that size my available cardboard sheet wasn't really flat (slightly bent), so I dropped the big format in favor of the 447 x 348 mm print.

Then I taped this chessboard onto a 3 mm MDF sheet that seems flat/straight enough, using spray glue.

With my caliper I measured a square size of ~50 mm (the same on both sides, so it is indeed a square; at least that printer prints correctly). The black density isn't great; maybe there are ink parameters to tune, I don't know.


For the rest, I forgot the laptop and did everything with a Magni on a table, with a desktop screen, a USB keyboard and a mouse connected to the Raspberry Pi.

Then I basically killed all ROS processes and followed the steps in the raspicam_node instructions to build the node.
Then I started, in different lxterminal windows:

roslaunch raspicam_node camerav2_1280x960.launch
There was a warning in the output… maybe all our problems come from the fact that the default camera calibration is not used?
[ WARN] [1538035667.663882307]: Camera calibration file /home/ubuntu/.ros/camera_info/camerav2_1280x960.yaml not found.

rosrun image_transport republish compressed /image raw out:=/raspicam_node/image

I tried to start rqt_image_view => everything was fine, I could see the video feed. I closed it.

Then I started:

rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.050 image:=/raspicam_node/image camera:=/raspicam_node

Is using 50 mm as the square value ok?
==> in the Magni documentation there is --size 8x6 --square 0.74
==> in the ROS documentation there is --size 8x6 --square 0.108, and looking at the guy's hand, 10 cm per square makes sense. The Magni documentation, on the other hand, would mean 74 cm per square, which would result in a gigantic ~7 m wide board that is plain impossible to hold/maneuver/print. (Is there a typo in the Magni documentation?)
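For a sanity check, the board's outer dimensions follow directly from the square size: 8x6 counts interior corners, so the printed board has 9x7 squares. A quick sketch, plain shell arithmetic only, no ROS needed:

```shell
# 8x6 refers to interior corners, so the printed pattern spans 9x7 squares.
# Compute the implied board size for each --square value mentioned above.
for s in 0.050 0.108 0.74; do
  awk -v s="$s" 'BEGIN { printf "--square %s  =>  board %.2f m x %.2f m\n", s, 9*s, 7*s }'
done
```

Note that 0.050 implies a 0.45 m x 0.35 m board, which matches the 447 x 348 mm measurement above, while 0.74 implies a board almost 7 m wide, so it is very likely a typo for 0.074 (7.4 cm).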

I ended up with the following GUI.

I moved the checkerboard around as the ROS documentation suggests, and new lines did indeed appear in the output.
After a while the X/Y/Skew bars became full and green. The Size bar became green, but only half full.

I then hit the green "calibrate" button. The GUI became unresponsive and I thought it had frozen. I went for a coffee, and when I came back there were new lines in the output. The documentation suggested hitting the "commit" button, which I did, but it triggered an exception in the output and a segmentation fault.

It looks like I should copy/paste the output somewhere. Where?
How do I make sure this calibration data is later used by the SLAM process?
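If Commit keeps crashing, one hedged fallback, based on how the stock ROS calibrator behaves rather than anything raspicam-specific: pressing Save in the GUI writes a tarball to /tmp, and the calibration file inside can be copied to the path the startup warning complained about. The file name below is taken from that warning; treat it as an assumption for your setup.

```shell
# SAVE (not COMMIT) in cameracalibrator.py writes /tmp/calibrationdata.tar.gz
# containing the sample images plus the calibration as ost.yaml (older
# versions ship ost.txt instead, which camera_calibration_parsers can convert).
cd /tmp && tar -xzf calibrationdata.tar.gz ost.yaml

# Copy it to the location the startup warning said was missing; you may need
# to edit the camera_name field inside the yaml to match your camera.
mkdir -p ~/.ros/camera_info
cp ost.yaml ~/.ros/camera_info/camerav2_1280x960.yaml
```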

thanks


#2
ubuntu@magni02:~$ rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.050 image:=/raspicam_node/image camera:=/raspicam_node
('Waiting for service', '/raspicam_node/set_camera_info', '...')
OK
libEGL warning: DRI2: failed to authenticate
*** Added sample 1, p_x = 0.498, p_y = 0.181, p_size = 0.215, skew = 0.049
*** Added sample 2, p_x = 0.498, p_y = 0.408, p_size = 0.185, skew = 0.068
*** Added sample 3, p_x = 0.545, p_y = 0.565, p_size = 0.184, skew = 0.065
*** Added sample 4, p_x = 0.422, p_y = 0.630, p_size = 0.181, skew = 0.052
*** Added sample 5, p_x = 0.428, p_y = 0.488, p_size = 0.174, skew = 0.129
*** Added sample 6, p_x = 0.416, p_y = 0.456, p_size = 0.167, skew = 0.278
*** Added sample 7, p_x = 0.407, p_y = 0.606, p_size = 0.165, skew = 0.211
*** Added sample 8, p_x = 0.371, p_y = 0.737, p_size = 0.180, skew = 0.008
*** Added sample 9, p_x = 0.424, p_y = 0.896, p_size = 0.184, skew = 0.058
*** Added sample 10, p_x = 0.404, p_y = 0.786, p_size = 0.191, skew = 0.136
*** Added sample 11, p_x = 0.489, p_y = 0.863, p_size = 0.307, skew = 0.064
*** Added sample 12, p_x = 0.389, p_y = 0.256, p_size = 0.180, skew = 1.000
*** Added sample 13, p_x = 0.502, p_y = 0.710, p_size = 0.250, skew = 0.084
*** Added sample 14, p_x = 0.576, p_y = 0.476, p_size = 0.151, skew = 0.019
*** Added sample 15, p_x = 0.769, p_y = 0.474, p_size = 0.144, skew = 0.040
*** Added sample 16, p_x = 0.814, p_y = 0.495, p_size = 0.152, skew = 0.196
*** Added sample 17, p_x = 0.713, p_y = 0.569, p_size = 0.145, skew = 0.093
*** Added sample 18, p_x = 0.658, p_y = 0.718, p_size = 0.134, skew = 0.129
*** Added sample 19, p_x = 0.699, p_y = 0.844, p_size = 0.161, skew = 0.196
*** Added sample 20, p_x = 0.567, p_y = 0.801, p_size = 0.158, skew = 0.079
*** Added sample 21, p_x = 0.710, p_y = 0.798, p_size = 0.156, skew = 0.008
*** Added sample 22, p_x = 0.835, p_y = 0.784, p_size = 0.146, skew = 0.097
*** Added sample 23, p_x = 0.897, p_y = 0.903, p_size = 0.151, skew = 0.070
*** Added sample 24, p_x = 0.756, p_y = 0.941, p_size = 0.175, skew = 0.127
*** Added sample 25, p_x = 0.351, p_y = 0.567, p_size = 0.134, skew = 0.088
*** Added sample 26, p_x = 0.256, p_y = 0.702, p_size = 0.143, skew = 0.141
*** Added sample 27, p_x = 0.127, p_y = 0.566, p_size = 0.171, skew = 0.114
*** Added sample 28, p_x = 0.123, p_y = 0.433, p_size = 0.173, skew = 0.040
*** Added sample 29, p_x = 0.051, p_y = 0.206, p_size = 0.200, skew = 0.107
*** Added sample 30, p_x = 0.048, p_y = 0.329, p_size = 0.227, skew = 0.002
*** Added sample 31, p_x = 0.106, p_y = 0.379, p_size = 0.212, skew = 0.166
*** Added sample 32, p_x = 0.019, p_y = 0.079, p_size = 0.231, skew = 0.162
*** Added sample 33, p_x = 0.045, p_y = 0.433, p_size = 0.302, skew = 0.022
*** Added sample 34, p_x = 0.133, p_y = 0.545, p_size = 0.321, skew = 0.029
*** Added sample 35, p_x = 0.145, p_y = 0.452, p_size = 0.353, skew = 0.142
*** Added sample 36, p_x = 0.173, p_y = 0.499, p_size = 0.394, skew = 0.227
*** Added sample 37, p_x = 0.176, p_y = 0.633, p_size = 0.423, skew = 0.269
*** Added sample 38, p_x = 0.333, p_y = 0.870, p_size = 0.326, skew = 0.029
*** Added sample 39, p_x = 0.176, p_y = 0.804, p_size = 0.336, skew = 0.016
*** Added sample 40, p_x = 0.173, p_y = 0.361, p_size = 0.338, skew = 0.073
*** Added sample 41, p_x = 0.348, p_y = 0.360, p_size = 0.339, skew = 0.028
*** Added sample 42, p_x = 0.517, p_y = 0.343, p_size = 0.329, skew = 0.002
*** Added sample 43, p_x = 0.647, p_y = 0.377, p_size = 0.331, skew = 0.074
*** Added sample 44, p_x = 0.788, p_y = 0.484, p_size = 0.335, skew = 0.210
*** Added sample 45, p_x = 0.633, p_y = 0.589, p_size = 0.346, skew = 0.068
*** Added sample 46, p_x = 0.475, p_y = 0.710, p_size = 0.355, skew = 0.309
*** Added sample 47, p_x = 0.438, p_y = 0.688, p_size = 0.342, skew = 0.459
*** Added sample 48, p_x = 0.383, p_y = 0.687, p_size = 0.356, skew = 0.641
*** Added sample 49, p_x = 0.321, p_y = 0.680, p_size = 0.375, skew = 0.781
*** Added sample 50, p_x = 0.328, p_y = 0.728, p_size = 0.404, skew = 0.482
*** Added sample 51, p_x = 0.353, p_y = 0.336, p_size = 0.292, skew = 0.627
*** Added sample 52, p_x = 0.387, p_y = 0.328, p_size = 0.249, skew = 0.462
*** Added sample 53, p_x = 0.406, p_y = 0.228, p_size = 0.245, skew = 0.542
*** Added sample 54, p_x = 0.351, p_y = 0.723, p_size = 0.277, skew = 0.121
*** Added sample 55, p_x = 0.382, p_y = 0.654, p_size = 0.268, skew = 0.510
('D = ', [0.13218282142579285, -0.1864506729599675, 0.0012672682642213108, 0.0006182537721607994, 0.0])
('K = ', [1017.8116476801164, 0.0, 646.6491487958014, 0.0, 1017.9241464782855, 476.6221135729139, 0.0, 0.0, 1.0])
('R = ', [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0])
('P = ', [1041.657958984375, 0.0, 647.446254202936, 0.0, 0.0, 1041.740966796875, 477.61519468633196, 0.0, 0.0, 0.0, 1.0, 0.0])
None
# oST version 5.0 parameters


[image]

width
1280

height
960

[narrow_stereo]

camera matrix
1017.811648 0.000000 646.649149
0.000000 1017.924146 476.622114
0.000000 0.000000 1.000000

distortion
0.132183 -0.186451 0.001267 0.000618 0.000000

rectification
1.000000 0.000000 0.000000
0.000000 1.000000 0.000000
0.000000 0.000000 1.000000

projection
1041.657959 0.000000 647.446254 0.000000
0.000000 1041.740967 477.615195 0.000000
0.000000 0.000000 1.000000 0.000000


('D = ', [0.13218282142579285, -0.1864506729599675, 0.0012672682642213108, 0.0006182537721607994, 0.0])
('K = ', [1017.8116476801164, 0.0, 646.6491487958014, 0.0, 1017.9241464782855, 476.6221135729139, 0.0, 0.0, 1.0])
('R = ', [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0])
('P = ', [1041.657958984375, 0.0, 647.446254202936, 0.0, 0.0, 1041.740966796875, 477.61519468633196, 0.0, 0.0, 0.0, 1.0, 0.0])
# oST version 5.0 parameters


[image]

width
1280

height
960

[narrow_stereo]

camera matrix
1017.811648 0.000000 646.649149
0.000000 1017.924146 476.622114
0.000000 0.000000 1.000000

distortion
0.132183 -0.186451 0.001267 0.000618 0.000000

rectification
1.000000 0.000000 0.000000
0.000000 1.000000 0.000000
0.000000 0.000000 1.000000

projection
1041.657959 0.000000 647.446254 0.000000
0.000000 1041.740967 477.615195 0.000000
0.000000 0.000000 1.000000 0.000000


Traceback (most recent call last):
Exception in thread Thread-2 (most likely raised during interpreter shutdown):Exception in thread Thread-5 (most likely raised during interpreter shutdown):  File "/opt/ros/kinetic/lib/python2.7/dist-packages/camera_calibration/camera_calibrator.py", line 248, in on_mouse
    rospy.signal_shutdown('Quit')
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/core.py", line 493, in signal_shutdown
    Exception in thread Thread-6 (most likely raised during interpreter shutdown):

Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
  File "/usr/lib/python2.7/threading.py", line 754, in run

Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/camera_calibration/camera_calibrator.py", line 90, in run
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/camera_calibration/camera_calibrator.py", line 164, in handle_monocular
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/camera_calibration/calibrator.py", line 744, in handle_msg
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/registration.py", line 275, in startt.join(_TIMEOUT_SHUTDOWN_JOIN)
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/camera_calibration/calibrator.py", line 651, in remap
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/registration.py", line 307, in run
  File "/usr/lib/python2.7/threading.py", line 203, in release
  File "/usr/lib/python2.7/threading.py", line 951, in join
    self.__block.wait(delay)
  File "/usr/lib/python2.7/threading.py", line 355, in wait
    remaining = endtime - _time()
TypeError: 'NoneType' object is not callable
<type 'exceptions.AttributeError'>: 'NoneType' object has no attribute 'remap'
<type 'exceptions.TypeError'>: 'NoneType' object is not callableTraceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/camera_calibration/camera_calibrator.py", line 88, in run
<type 'exceptions.TypeError'>: 'NoneType' object is not callable


QObject::~QObject: Timers cannot be stopped from another thread
Segmentation fault
ubuntu@magni02:~$

#3

It seems I need to push my calibration data to /home/ubuntu/catkin_ws/src/raspicam_node/camera_info/camerav2_1280x960.yaml

…?
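One way to check whether a calibration file is actually being picked up, whatever its location: relaunch the camera node and watch for the "calibration file … not found" warning from earlier (launch and topic names as used in this thread):

```shell
# Relaunch the camera node and watch for the calibration warning;
# if it is gone, the yaml was found.
roslaunch raspicam_node camerav2_1280x960.launch 2>&1 | grep -i calib

# The loaded values are also visible on the camera_info topic:
rostopic echo -n 1 /raspicam_node/camera_info
```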


#4

Sorry about the delayed response.

If you are running raspicam_node version of at least 0.3.0 (check apt policy ros-kinetic-raspicam-node), you should just be able to click the Commit button in the calibrator and have it automatically put the calibration in the right place.

Rohan


#5

My questions again:

1. Is the process ok overall? Did I miss anything? Is using 0.050 as the square size ok here? (I have never done this before.)
2. Is the checkerboard quality/size acceptable, or should I order a pro print? (Assuming I order a pro print on foam core, what size should it be, and why?)
3. Where am I supposed to save the calibration data? ==> on this one, I think I'm fine
4. How do I make sure my custom calibration data is used by the SLAM process instead of the default values?


#6
  1. The process is good. The small squares work, but larger squares are better. We use 7.5cm squares.

  2. The print quality is acceptable, as long as the size is correct and the corners are sharp. If you are going to be doing this a lot, it may be worth doing a pro print on a rigid surface like PVC or foam core.

  3. You can check the calibration parameters with rostopic echo /raspicam_node/camera_info, compare them with the ones in /home/ubuntu/catkin_ws/src/raspicam_node/camera_info/camerav2_1280x960.yaml, and make sure that they match.
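A hedged way to script that comparison (rostopic echo supports field access on a topic; the yaml path is the one quoted above and may differ on your setup):

```shell
# Intrinsic matrix as the running node reports it...
rostopic echo -n 1 /raspicam_node/camera_info/K

# ...versus the values stored in the file.
grep -A 5 camera_matrix \
  /home/ubuntu/catkin_ws/src/raspicam_node/camera_info/camerav2_1280x960.yaml
```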


#7

I also want to try this camera calibration, but before I do, I want to back up the original data, and I have no idea which folder has it. Could you show me where it is?


#8

The default calibration file is never overwritten; instead, when you calibrate the camera, it creates a new calibration file. The node looks for this file, and if it doesn't exist, it falls back to the default file.

Rohan


#9

I tried following the instructions:

roslaunch raspicam_node camerav2_1280x960_10fps.launch enable_raw:=true
then,

rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 image:=/raspicam_node/image/compressed camera:=/raspicam_node
The reason I use "raspicam_node/image/compressed" is that "raspicam_node/image" was not published, but "raspicam_node/image/compressed" was.
I checked the image using rviz, and it seemed to be published correctly.

The problem is that the window never shows correctly. I can only see incomplete windows, and there is no response; I cannot start the calibration. I also see "libEGL warning: DRI2: failed to authenticate" on the terminal.
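Before fighting the GUI, it may be worth confirming which image topics actually exist and whether the raw one is flowing: the calibrator subscribes to a raw sensor_msgs/Image, so pointing image:= at the /compressed topic will not work without a republisher. A quick check (topic names as used in this thread):

```shell
# List the raspicam topics that actually exist...
rostopic list | grep raspicam

# ...and confirm the raw image is being published
# (it only is when the node was launched with enable_raw:=true).
rostopic hz /raspicam_node/image
```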


#10

Have you updated your software to the latest using

sudo apt update
sudo apt upgrade


#11

Hi David,

I updated to the latest version, but I still have the same problem. Do you have any idea how to solve this? If you think this problem is caused by the version, could you tell me which version should be installed?



#12

Hi David,

I finally succeeded!! I had forgotten to enable the "raw image" after I updated. So the main factor causing this problem was the version, but I am not sure which node version causes it.


#13

Hi David,

The calibration process seemed to go well, but the image processing result went bad, so I ran some tests. After calibration, the system says a fiducial marker on the wall is 1.8 m or 2 m away from the Magni even though the real position is just 1.5 m away (X position).

The size of the squares on the chessboard matches the number I pass on the terminal (0.108 m). I put the Magni on a flat floor and just started the calibration, moving the board around as the instructions say. The X bar never filled completely, but at least the "calibrate" button became available. Are there any other things I should do before or during calibration? Also, it takes over 15 minutes to finish the calibration, sometimes more.
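The reported error can be quantified quickly; a systematic overestimate of this size often points at the focal length being off, which would fit the bars never filling completely during data collection. A sketch, plain arithmetic only:

```shell
# Real distance 1.5 m, estimated 1.8 m to 2.0 m => systematic scale error.
awk 'BEGIN { printf "scale error: +%.0f%% to +%.0f%%\n", (1.8/1.5-1)*100, (2.0/1.5-1)*100 }'
```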


#14

Also, I have another question. I deleted the bad yaml file, but the calibration data seems to remain. How do I restore the original? The pre-calibration state was better, so I want to restore it.


#15

Calibration is compute intensive. The best answer is to share the image data with a laptop and calibrate on the laptop so that it takes less time; then you need to copy the calibration data back to the robot for it to work well. Also make sure that the calibration shows green on all dimensions. The way you get full green bars on all dimensions is to have a large calibration board and gather a large amount of data from many, many different perspectives: near, far; left, right; top, bottom.
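A sketch of that laptop workflow, under the assumption that both machines run ROS and the laptop can reach the robot's master (the hostname below is an example, not your actual one):

```shell
# On the laptop: point at the robot's ROS master (hostname is an example)...
export ROS_MASTER_URI=http://magni02.local:11311

# ...and run the calibrator there, so the heavy optimization is off the Pi.
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.050 \
  image:=/raspicam_node/image camera:=/raspicam_node

# If you used Save instead of Commit, copy the resulting yaml back to the
# robot yourself (path as discussed earlier in this thread).
scp ~/.ros/camera_info/camerav2_1280x960.yaml \
  ubuntu@magni02.local:~/.ros/camera_info/
```

With Commit, the calibrator calls the camera's set_camera_info service, which runs on the robot, so in that case the file should already land on the robot side.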


#16

Hi David,

I tried, but never got an all-green result, even though I moved around to get many different perspectives.
And the biggest problem is that I cannot get back to the original. My understanding is that the generated yaml file is the calibration data, so I deleted it, but the robot still behaves as if badly calibrated. Is there anything more to do?
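Since getting back to the defaults keeps coming up: based on the paths mentioned in this thread (treat them as assumptions for your setup), a calibration can live in the user directory or inside the package. A hedged cleanup sketch:

```shell
# Remove the user-level calibration so the node falls back to the default.
rm -f ~/.ros/camera_info/camerav2_1280x960.yaml

# If you overwrote the file inside the package itself, restore it from git
# (assuming raspicam_node was built from a git checkout, as in this thread).
cd /home/ubuntu/catkin_ws/src/raspicam_node && git checkout -- camera_info/

# Restart the camera node, then verify which values are actually served:
rostopic echo -n 1 /raspicam_node/camera_info
```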