Volume 43 Issue 1
Jan. 2014
Citation: Li Jianguo, Cui Hutao, Tian Yang. Sensors relative calibration method for landing navigation based on feature matching[J]. Infrared and Laser Engineering, 2014, 43(1): 267-273.

Sensors relative calibration method for landing navigation based on feature matching

  • Received Date: 2013-05-10
  • Revised Date: 2013-06-25
  • Publish Date: 2014-01-25
  • 1. Deep Space Exploration Research Center, Harbin Institute of Technology, Harbin 150080, China
  • 2. Unit 61345 of PLA, Xi'an 710010, China

Abstract: In a vision-aided inertial navigation system, optimal information fusion depends on accurate calibration of the six-degree-of-freedom transformation between the camera and the inertial measurement unit (IMU). To address the optimal fusion of measurement information for autonomous navigation during a soft landing on Mars, a sensor-to-sensor relative pose calibration algorithm based on the extended Kalman filter was proposed. The algorithm accurately calibrates the relative pose of the camera and the IMU while simultaneously estimating the position, velocity and attitude of the spacecraft, and it requires no measurement equipment beyond the landmark features on the Martian surface. Furthermore, high-fidelity sensor models of the wide field-of-view camera and the IMU were developed, accounting for the effects of probe maneuvers and the rotation of Mars. Finally, the validity of the proposed calibration algorithm was demonstrated by mathematical simulation.
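The abstract describes an extended Kalman filter whose state is augmented with the camera-to-IMU relative pose and which is driven by image observations of known landmarks on the Martian surface. The full filter in the paper also estimates the lander's position, velocity and attitude from high-fidelity camera and IMU models; the sketch below is a deliberately reduced illustration of the calibration idea only. It assumes the spacecraft (IMU) pose and the landmark coordinates in a Mars-fixed frame are known, uses a simple pinhole camera with assumed intrinsics, and estimates just the six extrinsic parameters with numerical Jacobians. All symbols and values here (R_wi, p_wi, focal length, noise levels, iteration counts) are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def skew(v):
    """Skew-symmetric matrix so that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])


def exp_so3(phi):
    """Rodrigues formula: rotation vector -> rotation matrix."""
    angle = np.linalg.norm(phi)
    if angle < 1e-12:
        return np.eye(3) + skew(phi)
    k = skew(phi / angle)
    return np.eye(3) + np.sin(angle) * k + (1.0 - np.cos(angle)) * (k @ k)


def project(landmark_w, R_wi, p_wi, R_ic, p_ic, f=1000.0, c=512.0):
    """Pinhole projection of a Mars-fixed landmark into the camera.

    R_wi, p_wi : IMU attitude and position in the Mars-fixed frame (assumed known here).
    R_ic, p_ic : camera rotation and position expressed in the IMU frame (the extrinsics).
    f, c       : assumed focal length and principal point in pixels.
    """
    l_i = R_wi.T @ (landmark_w - p_wi)      # landmark in the IMU frame
    l_c = R_ic.T @ (l_i - p_ic)             # landmark in the camera frame
    return np.array([f * l_c[0] / l_c[2] + c,
                     f * l_c[1] / l_c[2] + c])


def ekf_update(x, P, z, landmark_w, R_wi, p_wi, R_ic_nom, pix_sigma=0.5):
    """One EKF measurement update for the 6-state extrinsic vector.

    x[:3]  : small-angle correction to the nominal rotation R_ic_nom.
    x[3:6] : camera position (lever arm) in the IMU frame.
    """
    def h(state):
        return project(landmark_w, R_wi, p_wi,
                       R_ic_nom @ exp_so3(state[:3]), state[3:6])

    # Numerical Jacobian of the 2-D pixel measurement w.r.t. the 6 states.
    H = np.zeros((2, 6))
    eps = 1e-6
    for j in range(6):
        d = np.zeros(6)
        d[j] = eps
        H[:, j] = (h(x + d) - h(x - d)) / (2.0 * eps)

    R = (pix_sigma ** 2) * np.eye(2)        # pixel noise covariance
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (z - h(x))                  # state correction
    P = (np.eye(6) - K @ H) @ P             # covariance update
    return x, P


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    R_wi = exp_so3(np.array([np.pi, 0.0, 0.0]))       # nadir-pointing attitude (camera boresight toward the surface)
    p_wi = np.array([0.0, 0.0, 500.0])                # 500 m above the landmark plane
    R_ic_true = exp_so3(np.array([0.02, -0.01, 0.03]))
    p_ic_true = np.array([0.5, 0.1, -0.2])

    x, P = np.zeros(6), np.diag([1e-2] * 3 + [1.0] * 3)
    for _ in range(1000):
        lm = np.append(rng.uniform(-400.0, 400.0, 2), 0.0)   # landmark on the surface plane
        z = project(lm, R_wi, p_wi, R_ic_true, p_ic_true) + rng.normal(0.0, 0.5, 2)
        x, P = ekf_update(x, P, z, lm, R_wi, p_wi, np.eye(3))

    print("estimated rotation vector:", x[:3])   # compare with the rotation vector of R_ic_true
    print("estimated lever arm (m)  :", x[3:6])  # compare with p_ic_true
```

In the formulation described in the abstract, these six extrinsic states would be appended to a filter that also carries the position, velocity and attitude states, so that the same landmark measurements refine the navigation solution and the camera-IMU calibration simultaneously.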
