Kalman Filter with fiducials


#1

In this topic I will try to explain how a Kalman filter can be used to improve localization with the fiducial_slam method.

In the case of the Magni, we have two sources of localization:

  • Directly from the wheel encoders (Odometry)

  • The transform(s) coming from the Fiducials

While the initial map is being built, it is best that at least two fiducials are visible in the camera frame at all times. This way the transforms of the fiducials with respect to the map will not depend on the odometry coming from the encoders; think of fiducial_slam as our own local GPS system.
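To make the fusion idea concrete, here is a minimal 1-D Kalman filter sketch, purely illustrative and much simpler than what robot_localization actually does: the wheel-encoder odometry drives the predict step, and a fiducial position fix drives the update step. All numeric values are made-up example values, not taken from the robot.

```python
def predict(x, p, dx, q):
    """Propagate the position estimate with an odometry delta dx;
    process noise q inflates the variance p (encoders drift)."""
    return x + dx, p + q

def update(x, p, z, r):
    """Correct with a fiducial measurement z having variance r."""
    k = p / (p + r)                        # Kalman gain
    return x + k * (z - x), (1 - k) * p    # blended estimate, shrunk variance

x, p = 0.0, 1.0                            # initial position estimate and variance
x, p = predict(x, p, dx=0.5, q=0.01)       # encoders say we drove ~0.5 m
x, p = update(x, p, z=0.55, r=0.05)        # a fiducial fix says 0.55 m
print(round(x, 3), round(p, 3))            # estimate pulled toward the fiducial
```

Because the fiducial variance `r` is much smaller than the accumulated odometry variance `p`, the gain is close to 1 and the estimate is pulled strongly toward the fiducial fix; this is exactly why the huge covariances published on `/fiducial_pose` (shown below) matter for tuning.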

That being said, let's take a look at the published topics:

  • Wheel encoder
    /ubiquity_motor_controller/odom

    header:
      seq: 765
      stamp:
        secs: 15
        nsecs: 611000000
      frame_id: "odom"
    child_frame_id: "base_link"
    pose:
      pose:
        position:
          x: 0.000243330966893
          y: 1.11175950923e-08
          z: 0.0
        orientation:
          x: 0.0
          y: 0.0
          z: 3.87435834379e-05
          w: 0.999999999249
      covariance: [0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.03]
    twist:
      twist:
        linear:
          x: 3.61561518372e-06
          y: 0.0
          z: 0.0
        angular:
          x: 0.0
          y: 0.0
          z: 5.1088070178e-09
      covariance: [0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.03]

  • /fiducial_pose

    header:
      seq: 1382
      stamp:
        secs: 207
        nsecs: 166000000
      frame_id: "map"
    pose:
      pose:
        position:
          x: -0.475461433694
          y: 2.12627576506
          z: 0.842305579475
        orientation:
          x: -0.0317846697398
          y: 0.0862599626826
          z: -0.088511706501
          w: 0.991823891333
      covariance: [19007371.7965861, 0.0, 0.0, 0.0, 0.0, 0.0, 19007371.7965861, 0.0, 0.0, 0.0, 0.0, 0.0, 19007371.7965861, 0.0, 0.0, 0.0, 0.0, 0.0, 19007371.7965861, 0.0, 0.0, 0.0, 0.0, 0.0, 19007371.7965861, 0.0, 0.0, 0.0, 0.0, 0.0, 19007371.7965861, 0.0, 0.0, 0.0, 0.0, 0.0]

**You may take a look at the repository below. Since I don't have a robot right now, I have made a Gazebo environment with fiducials. It has a Husky with a Kinect sensor mounted.**

https://github.com/chrissunny94/husky

The config file for robot_localization is here,
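As a hedged sketch of what such a config could look like, here is a minimal `ekf_localization_node` (robot_localization package) configuration fusing the two topics shown above. The topic names come from this thread; the boolean fusion tables and all other values are illustrative starting points, not a tuned or tested setup:

```yaml
frequency: 30
two_d_mode: true          # the Magni drives on a plane

# Wheel-encoder odometry: fuse forward velocity and yaw rate
odom0: /ubiquity_motor_controller/odom
# order: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]
odom0_differential: false

# Fiducial pose in the map frame: fuse absolute x, y and yaw
pose0: /fiducial_pose
pose0_config: [true,  true,  false,
               false, false, true,
               false, false, false,
               false, false, false,
               false, false, false]
pose0_differential: false
```

The general idea is to let the encoders provide smooth velocities while the fiducial poses anchor the absolute position, with the covariances on each topic deciding how much each source is trusted.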

I am still not clear on how to systematically tune the parameters to get the optimal result, but I will post on that in the days to come.

Cheers!


#2

Hi

Have you made progress on your attempts to use a kalman filter?
Don’t you think the end result could be improved by using an IMU? (such as this one, which would connect directly to the Pi and is natively compatible with ROS)


https://wiki.ros.org/razor_imu_9dof

Did you manage to get your hands on a real magni?


#3

Yes, I had the Magni with me for a while. I was working on alternatives to an IMU, such as visual odometry.

But I now realize that an IMU + Kalman filter fusion would have done a much better job of obtaining reliable odometry. I regret not having tried an IMU such as the one you suggested while I had the robot (the Razor 9DOF can easily be added to the Magni via the USB ports).

The configuration for IMU + encoders is easily available; check out the GitHub repos of Husky or Jackal. They use the robot_localization package to do it.
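As a hedged sketch (not copied from the actual Husky or Jackal configs), adding an IMU input to a robot_localization EKF config amounts to one more sensor block; the topic name `/imu/data` and the fusion table below are assumptions for illustration:

```yaml
imu0: /imu/data
# order: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az
imu0_config: [false, false, false,
              false, false, true,    # absolute yaw
              false, false, false,
              false, false, true,    # yaw rate (gyro)
              true,  false, false]   # forward acceleration
imu0_differential: false
imu0_remove_gravitational_acceleration: true
```

Fusing the gyro yaw rate alongside the encoder odometry is typically what gives the biggest improvement in heading, and heading error is what dominates odometric drift.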