Volume 52 Issue 3
Mar.  2023

Ai Shuangzhe, Duan Fajie, Li Jie, Wu Linghao, Wang Xiaofeng. Long and narrow trajectory measurement system based on centroid matching optimization in close-up scenes[J]. Infrared and Laser Engineering, 2023, 52(3): 20220574. doi: 10.3788/IRLA20220574

Long and narrow trajectory measurement system based on centroid matching optimization in close-up scenes

doi: 10.3788/IRLA20220574
Funds:  Guangdong Provincial Key Program (2020B0404030001); National Key Research and Development Plan Project (2020YFB2010800); National Natural Science Foundations of China under Grant (61905175, 61971307); National Key Laboratory Exploratory Project (Pilt2103); The Fok Ying Tung Education Foundation (171055); Young Elite Scientists Sponsorship Program by CAST (2021QNRC001); National Defense Science and Technology Key Laboratory Fund (614221210304); Outsourced Project of AVIC Sichuan Gas Turbine Research Institute (WDZC-2021-3-4)
  • Received Date: 2022-11-10
  • Rev Recd Date: 2022-12-24
  • Publish Date: 2023-03-25
  •   Objective   Three-dimensional trajectory measurement is a key technology in intelligent monitoring, motion analysis and target tracking, and has been widely applied in transportation, military and other fields. In recent years, with the rapid development of computer vision, imaging equipment and computers have been used in place of human eyes and brains to measure the three-dimensional trajectories of target objects with high accuracy. Monocular vision mostly estimates the depth of the target in the 3D coordinate system from the proportional change of its pixel area, so the depth estimates are strongly affected when the target rotates or deforms. Binocular vision, based on a 3D reconstruction model and the epipolar constraint, offers reliable results and relatively high accuracy for 3D trajectory measurement of flying objects. In binocular 3D trajectory measurement, high-precision matching of the binocular homonymous points is the key to improving accuracy. In the narrow and long close-range measurement scene of aeroengine safety monitoring in particular, the binocular cameras photograph the target from different angles, and especially when the included angle of their optical axes is large, centroid-only positioning and matching yields low trajectory measurement accuracy. To solve these problems, a close-range trajectory measurement system for narrow and long spaces based on centroid matching optimization is developed.   Methods   First, on the basis of centroid-only positioning and matching, epipolar-constraint projection relocates the binocular centroids. Then, a gray cross-correlation method based on distance and method weights is proposed for sub-pixel matching of the binocular centroids. Finally, Kalman filtering corrects the 3D reconstructed motion trajectory: to improve trajectory accuracy, the 3D points that deviate greatly from the ideal trajectory because of unstable centroid positions during centroid extraction are removed. In the laboratory, the narrow and long motion space of a bird before it enters an engine is simulated, a binocular measurement system is built at a close lateral position, and narrow-and-long trajectory measurement experiments are carried out for verification.   Results and Discussions   The measurements of targets with different textures (Fig.11, Fig.12, Fig.13 and Tab.1) show that the depth of the target's imaged texture affects the system's trajectory measurement accuracy to some extent: a more deeply textured target has a richer gray-value distribution, so sub-pixel matching based on gray cross-correlation matches the binocular images better and yields higher accuracy. The repeatability experiments (Fig.14) show that over the full 128 mm range, the system's mean trajectory length measurement error for well-textured objects is 13.14 μm, a length accuracy of about 0.01%, with a small straightness error.   Conclusions   Compared with centroid-only coarse positioning and matching, both the trajectory length accuracy and the straightness improve significantly, realizing high-precision trajectory measurement of flying objects in narrow and long close-range scenes. Based on the error analysis, in actual measurements the imaging clarity of the target's texture should be improved by bettering the illumination and optimizing the optical path design. Future work will optimize the target's texture through image enhancement to improve accuracy for poorly textured targets, and further study high-precision extraction of matching points for non-rigid and rotating targets, so that the system measures the trajectories of different targets in different scenes with better stability.
  • [1] Sun J B, Ji J. Pedestrian abnormal behavior detection using memory enhancement self coding under video surveillance [J]. Infrared and Laser Engineering, 2022, 51(6): 20210680. (in Chinese)
    [2] Liu F. Research on sports human behavior analysis system and key technologies[D]. Xi'an: Xidian University, 2007. (in Chinese)
    [3] Wang J, Zhong Z L, Zhu W D. A near infrared binocular system for optical instrument tracking [J]. Infrared and Laser Engineering, 2022, 51(6): 20210517. (in Chinese)
    [4] Liu X Y. Target tracking and velocity estimation based on binocular stereo vision[D]. Chengdu: Chengdu University of Technology, 2015. (in Chinese)
    [5] Huang G. Real-time badminton tracking with binocular vision system [J]. Journal of Electronic Measurement and Instrumentation, 2021, 35(6): 117-123.
    [6] Wang C, Yu M K, Yang C Y, et al. Study on night vision intelligent detection method of dropped mines [J]. Chinese Optics, 2021, 14(5): 1202-1211. (in Chinese) doi:  10.37188/CO.2020-0214
    [7] Zhang S L, Cui Y, Xing M. Target ranging technology of light field imaging [J]. Chinese Optics, 2020, 13(6): 1332-1342. (in Chinese)
    [8] Sun M C, Li T J, Shi J H, et al. Research on 3D vibration measurement method based on binocular vision [J]. Sensors and Microsystems, 2022, 41(10): 26-29. (in Chinese)
    [9] Yue L Q, Jia X, Miao Y, et al. High precision calibration of internal and external parameters of robot arm binocular vision system [J]. Infrared and Laser Engineering, 2021, 50(9): 20200525. (in Chinese)
    [10] Ying X L, Yao J Y, Zhang X S, et al. Three dimensional measurement system of light source step fringe projection using LD [J]. Optoelectronic Engineering, 2021, 48(11): 43-51. (in Chinese)
    [11] Zhang J, Hu D, Pan C Q. A correlation filter target tracking algorithm combining LK optical flow[C]//Computer Science and Application Engineering, 2019.
    [12] Cui Y P, Ge X W, Fu Q F. Research on measurement method of landing speed of flying target based on binocular vision [J]. Sensors and Microsystems Technology, 2009, 28(8): 37-38, 42. (in Chinese)
    [13] Li L B, Chen C P, Wu Z, et al. Fish swimming 3D trajectory tracking based on binocular vision [J]. Journal of Three Gorges University (Natural Science Edition), 2018, 40(2): 95-99. (in Chinese) doi:  10.13393/j.cnki.issn.1672-948x.2018.02.019
    [14] Wang L H, Li L L, Guo H L. Target based camera calibration method [J]. Beijing Surveying and Mapping, 2021, 35(3): 372-375. (in Chinese) doi:  10.19580/j.cnki.1007-3000.2021.03.019
    [15] Deng Z Q, Wang Y, Zhang B, et al. Research on image segmentation of pitaya fruit based on Otsu algorithm and morphology [J]. Intelligent Computer and Application, 2022, 12(6): 106-109,115. (in Chinese) doi:  10.3969/j.issn.2095-2163.2022.06.019
    [16] Wang Z, Du Z C. A template matching optimization algorithm for sealing strip length detection [J]. Measurement and Control Technology, 2022, 41(7): 75-80. (in Chinese)
    [17] Jiang S P, Xiang W, Liu Y P, et al. Template matching using multi feature co-occurrence matrix [J]. Optics and Precision Engineering, 2021, 29(6): 1459-1467. (in Chinese) doi:  10.37188/OPE.20212906.1459
    [18] Lu R Q, Ma H M. Template matching for multi-scale salient region extraction [J]. Optics and Precision Engineering, 2018, 26(11): 2776-2784. (in Chinese) doi:  10.3788/OPE.20182611.2776
    [19] Li R R, Liu H, Ding Y Y. Block kaczmarz algorithm for solving large overdetermined linear algebraic equations [J]. Journal of Computational Mathematics of Colleges and Universities, 2021, 43(2): 150-160. (in Chinese)

Figures(14)  / Tables(1)


  • 1. State Key Laboratory of Precision Measuring Technology & Instruments, Tianjin University, Tianjin 300072, China
  • 2. AECC Sichuan Gas Turbine Research Establishment, Chengdu 611730, China


    • Three-dimensional trajectory measurement is a key technology in intelligent monitoring [1], motion analysis [2] and target tracking [3-5], and has been widely applied in transportation, military and other fields [6-7]. In recent years, researchers have begun to use imaging equipment and computers in place of human eyes and brains to measure the three-dimensional trajectories of target objects with binocular vision, which offers reliable calculation results and relatively high measurement accuracy.

      In the aviation industry, bird strikes have long been one of the main causes of aeroengine damage, so the long and narrow three-dimensional trajectory of a bird before it enters the engine must be measured with high accuracy at a close-range position, in order to determine the bird's position and protect the engine through emergency measures. To achieve high-precision binocular measurement of 3D coordinates and trajectories, many researchers have proposed methods for different application scenarios. Sun et al. [8] built a feature-point-based binocular high-precision 3D measurement system that measures target pixel displacement through feature detection and matching, achieving about 5% accuracy for a 40 μm displacement with no special requirement on the target contour. Yue et al. [9] measured the 3D coordinates of a robot arm with a high-precision binocular calibration method based on bundle adjustment, combining bundle adjustment with a theodolite to obtain the intrinsic and extrinsic parameters of the binocular system; the 3D positioning accuracy was better than ±0.19 mm. Ying et al. [10] built a binocular 3D measurement system based on LD-source stepped fringe projection, achieving precise phase matching through phase-shifted fringe projection, with about 5% distance accuracy. For 3D trajectory measurement of objects in narrow and long spaces at close range, Zhang et al. [11] proposed a high-precision indoor positioning method for moving targets based on binocular vision, fusing the Horn-Schunck optical flow method, Delaunay triangulation and Otsu threshold segmentation; the 3D positioning accuracy was about 2.5%, but the optical flow computation was time-consuming. Cui et al. [12] proposed a binocular method for measuring the landing trajectory and speed of a flying projectile: staring binocular cameras are placed near the impact area, and least-squares fitting and differentiation of the centroid coordinates at different times yield the flight trajectory and speed, with measurement errors of no less than 2%. Li et al. [13] proposed a binocular method for spatial positioning and continuous tracking of fish: the centroid is computed from target recognition and contour extraction of the swimming fish, and its 3D coordinates are obtained by binocular reconstruction; the trajectory accuracy was about 1% at a relatively high measurement speed. All of these methods can measure 3D trajectories in narrow and long spaces at close range, but none solves the problem that the limited accuracy of binocular centroid positioning and matching leads to mediocre trajectory measurement accuracy.

      In the narrow and long close-range measurement scene of aeroengine safety monitoring, aiming at the low trajectory measurement accuracy of centroid-only positioning and matching, this paper proposes a trajectory measurement system that combines several methods to optimize centroid matching. The narrow and long motion space of a bird before it enters the engine is simulated in the laboratory, a binocular measurement system is built at a close lateral position, and trajectory measurement experiments are carried out in this scene for verification. Because the binocular cameras photograph the target from different angles, and especially when the included angle between their optical axes is large, the matching accuracy of centroids extracted by the centroid method alone is low. Therefore, the binocular centroids located by the basic centroid method are first relocated by epipolar-constraint projection; then a gray cross-correlation method based on distance and method weights performs sub-pixel matching of the binocular centroids; finally, Kalman filtering corrects the 3D reconstructed trajectory, removing 3D points that deviate greatly from the ideal trajectory because of unstable centroid positions during centroid extraction, thereby improving the trajectory measurement accuracy. The system structure and working principle, the measurement experiments and results, and the conclusions are described in detail below.

    • Figure 1 shows the overall design of the 3D trajectory measurement system, which works as follows: establish the binocular-vision 3D measurement model; calibrate the binocular cameras with Zhang's calibration method [14]; acquire images with the binocular synchronous acquisition and transmission system; extract the target object with image processing algorithms; coarsely locate the target centroid with the centroid method; relocate the centroid with the binocular epipolar constraint; perform sub-pixel matching of the binocular centroids based on gray cross-correlation; and compute the 3D coordinates of the target with the binocular 3D measurement model. During measurement, the 3D coordinates of the target in every video frame are output, and finally Kalman filtering corrects the discrete 3D points of the motion trajectory so that it approaches the ideal trajectory, improving 3D measurement accuracy and thus realizing high-precision 3D trajectory measurement of the target. Before target extraction, all images are undistorted with the camera calibration parameters; after undistortion the epipolar line is a straight line, so the accuracy of the binocular epipolar constraint is unaffected.

      Figure 1.  Overall system design scheme

    • Based on the binocular-vision 3D measurement model, the binocular system composed of the left and right cameras can acquire the 3D information of objects in space. To establish the model, the imaging model of the binocular measurement system must first be built, as shown in Fig. 2 and described in detail below.

      Figure 2.  Imaging model of binocular measurement system

      The object point P(x, y, z) in the world coordinate system has coordinates P(xl, yl, zl) in the left camera frame O1-X1Y1Z1 and P(xr, yr, zr) in the right camera frame O2-X2Y2Z2. The left and right cameras photograph P simultaneously; its ideal image point is Pl(Xl, Yl) in the left camera and Pr(Xr, Yr) in the right camera, and the object point P and the origins O1 and O2 of the two camera frames form a triangle. Given the binocular calibration parameters and the matched points in the left and right images, the coordinates P(xl, yl, zl) of the object point in the left camera frame O1-X1Y1Z1 can be computed from the binocular 3D measurement model.

      The imaging of both cameras in the binocular measurement system is simplified to the pinhole model, and the transformation between the left and right camera frames is written Mlr=[R|T], from which the spatial pose relation between the two cameras is obtained, as shown in Eq. (1):

      In the left camera frame O1-X1Y1Z1, the 3D object coordinates are solved as shown in Eq. (2):

      where fl and fr are the physical focal lengths of the left and right cameras, and (Xl,Yl) and (Xr,Yr) are the image pixel coordinates of the corresponding ideal image points of the left and right cameras, i.e. the final binocular matching points output by the secondary positioning and sub-pixel correlation matching described below.
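The 3D point of Eq. (2) can equivalently be obtained by linear triangulation from the two camera projection matrices. The sketch below is illustrative and not the paper's closed-form solution; the matrices K, R and T in the usage example are hypothetical stand-ins for calibrated parameters.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one matched point pair from two
    3x4 projection matrices. Returns the 3D point in the reference
    frame of the projection matrices (here, the left camera frame)."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical intrinsics and a purely translated right camera:
K = np.array([[1200., 0., 640.], [0., 1200., 480.], [0., 0., 1.]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-181.], [0.], [0.]])])
X = np.array([10., 20., 500., 1.])          # known 3D point
h1, h2 = P1 @ X, P2 @ X
uv1, uv2 = h1[:2] / h1[2], h2[:2] / h2[2]   # projected pixel pair
X_rec = triangulate(P1, P2, uv1, uv2)        # recovers (10, 20, 500)
```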

    • When acquiring images, besides imaging the target clearly, the binocular measurement system must also guarantee synchronous acquisition and transmission. If the real-time acquisition and transmission of the two images cannot be synchronized with high accuracy while the target moves quickly, the target may show a large position error between the left and right images, which harms high-precision binocular matching. A binocular synchronous acquisition and transmission system was therefore designed for this measurement system, shown in Fig. 3 and described below; the high-precision synchronization advantage of the trigger matters mainly in high-speed flying-object scenes, and the trigger delay is negligible in the verification experiments of this paper.

      Figure 3.  Schematic diagram of binocular synchronous acquisition and transmission system

      A nanosecond-level synchronous trigger externally triggers the two cameras, which then acquire images synchronously; based on the PTP precision clock synchronization protocol, the acquired images are transmitted synchronously through a network switch to the PC storage unit. This guarantees nanosecond-level synchronization of binocular image acquisition and transmission and helps improve binocular matching accuracy.

    • After image acquisition, the target object must be extracted and its centroid coarsely located. Target extraction consists of background frame differencing, adaptive binarization, denoising and hole filling; coarse centroid positioning applies the centroid formula to the extracted target. The whole process of target extraction and coarse centroid positioning is as follows:

      (1) When the background around the target is complex, it can first be removed by background frame differencing to obtain a grayscale image close to the target alone, and the target is then extracted by adaptive binarization of the whole grayscale image. When the background is simple, the target can be extracted directly by adaptive binarization. Here the OTSU method [15] determines the adaptive threshold, after which the image is binarized.

      (2) Denoise the background of the binarized image. Traverse the whole binary image, find all connected regions with gray value 1, and remove the small ones; this removes most of the background noise.

      (3) Fill the interior of the target's edge contour so that the whole target region has gray value 1, giving the filled binary target image X.

      (4) Denoise the edge contour of the target. Compared with a single conventional filter, erosion and dilation operators clearly improve edge denoising. Their mathematical expression is shown in Eq. (3):

      where $ \oplus $ denotes erosion of image X by the structuring element S, X1 is the eroded image, $ \otimes $ denotes dilation of X by S, and the set X2 is the dilated image. Erosion and dilation denoise the edge region of the target, making its shape and edges clearer.

      (5) Apply erosion followed by dilation to image X to obtain image X2. In the resulting binary image, the white pixel region is the target, with gray value 1, and the black pixel region is the background, with gray value 0. The centroid formula is shown in Eq. (4):

      where (x0,y0) is the centroid and f(x,y) is the gray value (0 or 1) of the binary image X2 at coordinate (x,y) in the image frame.

      From Eq. (4), the coarsely located centroids PL(xL, yL) and PR(xR, yR) of the target in the left and right camera images are obtained.
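Steps (1) and (5) above can be sketched in a few lines of numpy. This is an illustrative reimplementation, not the authors' code: it covers only Otsu thresholding and the centroid of Eq. (4), omitting frame differencing, connected-component denoising and hole filling.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method on an 8-bit grayscale image: choose the threshold
    that maximizes the between-class variance of the histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum = np.cumsum(hist)
    cum_mean = np.cumsum(hist * np.arange(256))
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = cum[t - 1] / total          # background weight
        w1 = 1.0 - w0                    # foreground weight
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum_mean[t - 1] / cum[t - 1]
        mu1 = (cum_mean[-1] - cum_mean[t - 1]) / (total - cum[t - 1])
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def centroid(binary):
    """Coarse centroid of a 0/1 image, Eq. (4): mean of the pixel
    coordinates weighted by the binary gray values."""
    ys, xs = np.nonzero(binary)
    return xs.mean(), ys.mean()

# Toy image: a bright 10x10 square on a dark background.
img = np.zeros((50, 50), dtype=np.uint8)
img[10:20, 30:40] = 200
t = otsu_threshold(img)
binary = (img > t).astype(np.uint8)
x0, y0 = centroid(binary)                # (34.5, 14.5) for this square
```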

      The background difference image and the outputs of steps (1)-(4) of the target extraction process are shown in Fig. 4.

      Figure 4.  Schematic diagram of target extraction output

    • In binocular vision, a geometric epipolar constraint holds between the images of the same object in the left and right cameras; its principle is shown in Fig. 5 and described as follows. I1 is the left camera image and I2 the right camera image; Pl(Xl,Yl) and Pr(Xr,Yr) are the ideal image points of the object point P(x,y,z) of the world frame on the left and right images, with the same meaning as in Fig. 2. O1, O2, Pl, Pr and P lie in the same epipolar plane Q. l1 is the epipolar line on I1 corresponding to the point Pr on I2, and l2 is the epipolar line on I2 corresponding to the point Pl on I1; Pl and Pr lie on l1 and l2 respectively. el and er, the intersections of the baseline with the left and right images I1 and I2, lie on the epipolar lines l1 and l2 respectively and are called the epipoles.

      Figure 5.  Schematic diagram of binocular epipolar constraint

      By the epipolar constraint, the ideal match Pr of the projection point Pl in the left image I1 must lie on the epipolar line l2. This geometric relation can be exploited for secondary positioning of the centroid, as described in detail below.

      First, the binocular epipolar constraint equation must be solved. The projection equations of the two cameras are shown in Eq. (5):

      where Xw is the homogeneous coordinate of the space point P(x,y,z) in the world frame, Xw=(P 1)T; M1 and M2 are the projection matrices of the left and right cameras; for each 3×4 projection matrix Mi (i=1,2), Mi1 is the left 3×3 submatrix of Mi and mi is the fourth column of Mi.

      Let the vector m=m2−M21M11−1m1, and let [m]x be the antisymmetric matrix determined by m. The homogeneous coordinates uL and uR of the ideal matching points Pl(Xl,Yl) and Pr(Xr,Yr) of the left and right images then satisfy the matching relation shown in Eq. (6), with uL=[Xl,Yl,1] and uR=[Xr,Yr,1].

      Then, using the relation between the binocular ideal matching points in Eq. (6) and the coarsely located centroids Pl(xl,yl) and Pr(xr,yr) of the two images obtained by target extraction in Section 1.3, substituting the homogeneous coordinate ul of the left coarse centroid Pl(xl,yl) into Eq. (6) yields a linear equation in ur, which is the epipolar line in the right image I2 corresponding to the left coarse centroid Pl(xl,yl).

      Since the right coarse centroid Pr(xr,yr) does not necessarily lie on this epipolar line, it must be projected onto the line, as shown in Fig. 6, where the epipole er(xer,yer) lies on the right epipolar line l2.

      Figure 6.  Schematic diagram of epipolar projection

      By the definition of the epipole, Eq. (7) gives the coordinates of the epipole er on the right epipolar line.

      Following the standard computation of the projection of an external point onto a line, the projection P2(x2,y2) of the right coarse centroid Pr(xr,yr) onto the epipolar line is expressed as:

      Finally, Eq. (8) gives the projection point P2(x2,y2), which is the secondarily located centroid of the target in the right image under the epipolar constraint.
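The secondary positioning can be sketched as follows, assuming the fundamental-matrix-like mapping of Eq. (6) is available as a 3×3 array F; the projection step is the standard orthogonal projection of a point onto a line that the text refers to.

```python
import numpy as np

def epipolar_line(F, uL):
    """Epipolar line l2 in the right image for a left-image point
    uL = (x, y, 1), as the homogeneous line F @ uL = (a, b, c),
    i.e. a*x + b*y + c = 0."""
    return F @ uL

def project_onto_line(line, p):
    """Orthogonal projection of the pixel p = (x, y) onto the line
    a*x + b*y + c = 0; this is the secondary positioning of the
    right coarse centroid (Eq. (8))."""
    a, b, c = line
    d = (a * p[0] + b * p[1] + c) / (a * a + b * b)
    return np.array([p[0] - a * d, p[1] - b * d])

# Toy check: the line x - y = 0, and a point off the line.
line = np.array([1.0, -1.0, 0.0])
p2 = project_onto_line(line, np.array([2.0, 0.0]))   # -> (1, 1)
```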

    • Correlation template matching of images [16-18] is important for improving binocular matching accuracy. The choice of matching primitive is key to extracting the important image information in correlation matching; common primitives are points, lines and regions. Point-based matching is fast but yields overly sparse image features and thus low matching accuracy; line-based primitives capture the contour information of image features but produce large matching errors when the object contour is unstable, with average speed; region-based primitives give robust image features and rarely mismatch, but are slower. To achieve high-precision matching, this system adopts region-based primitives and narrows the matching range with the undistorted binocular epipolar constraint, which also raises matching speed. The main region-based methods are matching by feature cross-correlation and matching by gray cross-correlation. Feature cross-correlation extracts salient regional features from the original image as primitives, but during flight the target's features may be occluded or visibly deformed, which is unfavorable for feature matching. Gray cross-correlation matches by sliding a gray template over the image; with the reasonable optical path design of this system, the gray texture of the flying object images clearly during motion, and template matching is less affected by occlusion or deformation, so gray cross-correlation matching is the best choice for this measurement system.

      Based on this analysis, and to improve matching accuracy without lowering matching efficiency, this measurement system proposes a gray cross-correlation sub-pixel matching algorithm based on distance and method weights: the gray cross-correlation values are fitted to find a sub-pixel peak, realizing sub-pixel matching of the binocular centroids and improving binocular matching accuracy. The mathematical model and matching process of the gray cross-correlation template matching of this system are described in detail below.

      From the above analysis, the mathematical model of gray cross-correlation matching is established as shown in Eq. (9):

      where S(x1,y1) is the square structuring-element template in the left image; S(i,j) is the pixel region in the right image centered at (i,j) with the same size as S(x1,y1); $ \odot $ is the cross-correlation operator, i.e. the cross-correlation over the overlap of the two pixel regions; SF(i,j) is the gray cross-correlation result at point (i,j) of the right image, i.e. the sum of the products of the corresponding gray values of the two regions; G(i,j) is the Gaussian distance-weight function, the weight given by the distance from a pixel of the right-image search region F to the point P2(x2,y2) on the right epipolar line, larger for smaller distance. The values of σi and σj in the Gaussian set the relative weights of the epipolar constraint and of gray cross-correlation matching in improving binocular centroid matching accuracy: smaller σi and σj give the epipolar constraint more weight, larger σi and σj give gray cross-correlation matching more weight. d(i,j) is the distance-weighted gray cross-correlation result at coordinate (i,j) of the right image, where the origin of the (i,j) frame is the initial center point A of the search region F.

      The matching process is shown in Fig. 7. The structuring-element template centered at the left coarse centroid performs gray cross-correlation over its covered region of the right image, starting from the upper-left corner of the right search region, stepping one pixel at a time from left to right and top to bottom, until the lower-right corner of the search region. The search finds the integer pixel of the right search region F with the maximum gray cross-correlation result, i.e. the center of the right-image region whose gray distribution is closest to S(x1,y1); this pixel is the integer pixel of the right image that best matches the left coarse centroid.

      Figure 7.  Schematic diagram of relative template matching

      Here S is the square structuring-element template of the left image; F is the search region of the right image; P1(x1,y1) is the coarse centroid of the left image, i.e. Pl(xl,yl); P2(x2,y2) is the secondarily located centroid of the right image; A and B are the initial and final center points of the template S as it performs gray cross-correlation matching inside the right search region F; L1 is the (odd) side length in pixels of the square template S of the left image; L2 is the side length in pixels of the square search region F of the right image. Analysis shows that binocular matching accuracy improves well when L1 is close to half the pixel count of the shorter side of the target's bounding rectangle and L2 is slightly smaller than L1.
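A minimal sketch of the weighted search of Eq. (9), assuming an isotropic Gaussian (σi = σj = sigma) and leaving out the paper's exact normalization; coordinates are (row, col) within the search region.

```python
import numpy as np

def weighted_correlation_match(template, search, p2, sigma):
    """Integer-pixel match of an odd-sized square gray template inside
    a search region: gray cross-correlation SF(i, j), weighted by a
    Gaussian G(i, j) of the distance to the epipolar projection p2,
    as in Eq. (9). Returns the (row, col) of the best score d(i, j)."""
    k = template.shape[0] // 2
    best, best_rc = -np.inf, None
    for i in range(k, search.shape[0] - k):
        for j in range(k, search.shape[1] - k):
            win = search[i - k:i + k + 1, j - k:j + k + 1]
            sf = float(np.sum(win * template))            # SF(i, j)
            g = np.exp(-((i - p2[0]) ** 2 + (j - p2[1]) ** 2)
                       / (2.0 * sigma ** 2))              # G(i, j)
            d = g * sf                                    # d(i, j)
            if d > best:
                best, best_rc = d, (i, j)
    return best_rc

# Toy check: plant the template at (7, 7) in an empty search region.
tpl = np.array([[1., 2., 1.], [2., 5., 2.], [1., 2., 1.]])
search = np.zeros((15, 15))
search[6:9, 6:9] = tpl
match = weighted_correlation_match(tpl, search, p2=(7.0, 7.0), sigma=3.0)
```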

      The above correlation template matching based on gray cross-correlation improves the binocular centroid matching accuracy, but its positioning accuracy remains at the pixel level, which is determined by the image resolution itself; from the standpoint of sub-pixel techniques, higher accuracy is attainable. In the sub-pixel neighborhood of the integer pixel with the highest cross-correlation result in the right image, there may exist a position with an even larger correlation result, so a quadratic fit is used to determine the coordinates of this sub-pixel position. The quadratic functions are written as:

      where Fcross(x) and Fzong(y) are the quadratics fitted to the correlation results along the row and the column of the integer pixel with the highest correlation result in the right search region F.

      Substituting all the gray cross-correlation results of that row and column and solving the overdetermined system [19] yields the coefficients of Eq. (10); the maxima are then given by Eq. (11):

      This maximum does not necessarily lie on an integer pixel, but at some sub-pixel position between that integer pixel and its eight neighbors; this realizes sub-pixel centroid matching based on gray cross-correlation.
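The row/column peak fitting can be sketched with a least-squares quadratic; `numpy.polyfit` solves the overdetermined system of Eq. (10) directly, and the peak abscissa follows as in Eq. (11).

```python
import numpy as np

def parabola_peak(values):
    """Fit a quadratic a*x^2 + b*x + c through correlation scores
    along one row or column (least squares over all samples) and
    return the abscissa of its maximum, -b / (2a)."""
    x = np.arange(len(values), dtype=float)
    a, b, _ = np.polyfit(x, values, 2)
    return -b / (2.0 * a)

# Toy check: scores sampled from a parabola peaking at x = 3.3.
scores = [-(x - 3.3) ** 2 for x in range(7)]
peak = parabola_peak(scores)   # sub-pixel location, approx. 3.3
```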

      The coordinate origin of the right search region F is the initial center point A(xA,yA), and the relative position of A and the secondarily located centroid P2(x2,y2) of the right image is given by Eq. (12):

      From the relative position relation of Eq. (12), the sub-pixel centroid coordinates in the right image follow as shown in Eq. (13):

      The matched centroid points in the left and right camera images are then obtained as shown in Eq. (14):

      From the binocular calibration parameters and the binocular matching points of Eq. (14), the 3D coordinates of the target in the left camera frame are obtained from Eq. (2).

    • Following Sections 1.1-1.5, the high-precision 3D coordinates of the target are obtained for every video frame, and outputting them for all frames gives the target's motion trajectory. Because the viewing angle of the moving target changes continuously in the cameras, and the target may be non-rigid or rotating, the extracted centroid position is unstable; the reconstructed 3D points are therefore unstable, and the measured 3D trajectory deviates somewhat from the ideal one, i.e. the trajectory contains some abrupt outlier points.

      To address these problems of the trajectory measurement scene, this paper applies Kalman filtering to correct the motion trajectory so that it contains almost no abrupt points and approaches the true motion trajectory, improving 3D trajectory measurement accuracy. The principle of the Kalman filter correction is described in detail below.

      Kalman filtering comprises two main steps: prediction and correction. Prediction uses the time update equations to build a prior estimate of the current state, propagating the state variables and the error covariance estimate forward in time to construct prior estimates for the next time step. Correction provides feedback: the measurement update equations build an improved posterior estimate of the current state from the prior estimate and the current measurement. This is the predict-correct process, and the corresponding estimator is called the predict-correct algorithm. The time update and measurement update equations of the discrete Kalman filter are given below.

      The time update equations are shown in Eq. (15):

      The measurement (state) update equations are shown in Eq. (16):

      Variables in Eqs. (15) and (16): k denotes the time step, k = 2 ··· N, where N is the total number of steps, equal to the total number of frames of a single camera's video; U is the state transition matrix acting on (XG)k-1, generally initialized as the 3×3 identity; Pk is the estimation error covariance, the covariance matrix between the estimate and the true value, whose diagonal elements are the corresponding variances, and Pk^ denotes its prior estimate; Qc is the process noise covariance, reflecting the variance between two consecutive 3D measurement points and related to the measurement accuracy; H is the observation matrix, generally initialized as the 3×3 identity, mapping the true state space to the observation space; Rc is the measurement noise covariance, reflecting the expected measurement accuracy; Kk is the Kalman gain, chosen to minimize the posterior error covariance Pk; (XG)k is the filtered estimate, here the filtered 3D point coordinate, and (XG)k^ denotes its prior estimate; (ZR)k is the actual measurement, here the measured coordinates of the discrete 3D points.
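A constant-position Kalman filter over the reconstructed 3D points, with U = H = I as the text initializes them, can be sketched as follows; the noise scales q and r are illustrative, not the paper's values.

```python
import numpy as np

def kalman_smooth(measurements, q=1e-4, r=1e-2):
    """Filter a sequence of measured 3D points with the discrete
    Kalman equations (Eqs. (15)-(16)), assuming identity transition
    and observation matrices (U = H = I)."""
    I = np.eye(3)
    Qc, Rc = q * I, r * I
    x = np.asarray(measurements[0], dtype=float)
    P = I.copy()
    out = [x.copy()]
    for z in measurements[1:]:
        # Time update, Eq. (15): prior state and covariance
        x_prior = x                                  # U = I
        P_prior = P + Qc
        # Measurement update, Eq. (16): gain, posterior state/covariance
        K = P_prior @ np.linalg.inv(P_prior + Rc)    # H = I
        x = x_prior + K @ (np.asarray(z, float) - x_prior)
        P = (I - K) @ P_prior
        out.append(x.copy())
    return np.array(out)

# Toy check: constant measurements stay fixed under the filter.
track = kalman_smooth([[1.0, 2.0, 3.0]] * 50)
```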

      With this Kalman filter model, the target's motion trajectory can be corrected and its measurement accuracy improved. Figure 8 shows the discrete 3D point trajectories before and after Kalman filtering together with the ideal one: the filter corrects the 3D points with large deviations, bringing the measured trajectory closer to the ideal trajectory and improving measurement accuracy. The ideal trajectory is derived from the target's physical law of motion and is a known quantity.

      Figure 8.  Kalman filter trajectory diagram

    • To verify that the proposed trajectory measurement system effectively improves trajectory measurement accuracy in narrow and long close-range measurement scenes, verification experiments are needed. The optical path of the binocular measurement experimental system is designed first, as shown in Fig. 9.

      Figure 9.  Optical path schematic diagram of binocular measurement experimental system

      Both cameras are Basler-acA-1440 models; the binocular baseline is 181 mm; the included angle of the optical axes is 19.22°; the trajectory measurement length is 128 mm; the object distance along the Z axis is about 500 mm; the theoretical total depth of field of the system is 73 mm, with 36.5 mm each in front and behind.

      The actual total depth of field computed from the depth-of-field formula, Eq. (17), is 104.5 mm, which satisfies the theoretical total depth-of-field requirement, and the front and rear depths of field also satisfy their requirements.

      where dL is the actual total depth of field of the system, f the lens focal length, F the f-number, h the object distance of the target in the depth direction, c the image distance, and δ the permissible circle-of-confusion diameter. Here f = 12 mm, F = 8, h = 500 mm, c = 12.3 mm, and δ = 3.45 μm.

      Within the depth of field, Eq. (18) gives the resolutions of the binocular measurement system: 0.15 mm in both X and Y, and 0.46 mm in depth along Z.

      where dX, dY and dZ are the resolutions of the binocular measurement system in the X, Y and Z directions, d is the pixel size, hmax the maximum object distance within the measurement range, and D the binocular baseline. Here d = 3.45 μm, hmax ≈ 520 mm, and D = 181 mm.
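With the standard pinhole and stereo-disparity approximations (assumed here; the paper's Eq. (18) is not reproduced verbatim), the quoted figures can be checked numerically. The lateral value matches the paper's 0.15 mm; the depth value comes out near 0.43 mm with hmax = 520 mm, so the paper's 0.46 mm likely uses a slightly larger object distance.

```python
# Lateral and depth resolution of the stereo rig from the pixel size
# d, focal length f, maximum object distance hmax and baseline D
# (parameter values taken from the text).
d = 3.45e-3      # pixel size [mm]
f = 12.0         # focal length [mm]
hmax = 520.0     # maximum object distance [mm]
D = 181.0        # baseline [mm]

dX = dY = hmax * d / f          # footprint of one pixel at hmax, approx. 0.15 mm
dZ = hmax ** 2 * d / (f * D)    # depth change per pixel of disparity, approx. 0.43 mm
```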

      After building the binocular measurement system with the above design parameters, a close-range 3D trajectory length measurement experiment is carried out; the experimental schematic and setup are shown in Fig. 10. The experimental platform mainly comprises the binocular cameras, a spotlight source, a motorized translation stage with an integrated grating scale, and a textured simulated bird. The stage with the integrated grating scale has high displacement accuracy, and its displacement readings serve as the reference values in the verification part of the experiment.

      Figure 10.  Verification experiment of 3D trajectory measurement accuracy

      The experimental procedure is as follows: calibrate the binocular cameras with 30 calibration images; place the textured simulated bird flush on the stage and focus the cameras so that the bird images clearly within the depth of field; adjust the imaging of the bird by setting the camera exposure time and controlling the spotlight illumination; move the bird with single steps of the motorized stage while the binocular cameras photograph it at all discrete positions including the initial one; compute the 3D coordinates of the bird in the left camera frame for every frame through target extraction and 3D reconstruction; correct the whole motion trajectory with Kalman filtering, removing abrupt 3D points so that the trajectory approaches the theoretically ideal, smoother one; finally, compute the distances between the bird's positions in adjacent frames and accumulate them to obtain the trajectory length measured by the binocular system, as shown in Eq. (19):

      where Dist is the trajectory length of the simulated bird measured by the binocular system, ci (i = 1~s) are the discrete 3D points of the Kalman-corrected trajectory, and s is the total number of single steps of the motorized stage. The stage moves the simulated bird λ mm per step for a total of s steps; during the motion the binocular cameras capture 2s + 2 images in total, including the initial position, i.e. s + 1 images per camera.
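Eq. (19) is a plain accumulation of Euclidean distances between consecutive trajectory points, which can be sketched as:

```python
import numpy as np

def trajectory_length(points):
    """Eq. (19): total trajectory length as the sum of Euclidean
    distances between consecutive 3D points c_i."""
    pts = np.asarray(points, dtype=float)
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

# Toy check mirroring the experiment: 32 steps of 4 mm along one axis.
pts = [[4.0 * i, 0.0, 0.0] for i in range(33)]
length = trajectory_length(pts)   # 128.0 mm
```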

      Since the ideal trajectory in this experiment is a straight line, the trajectory accuracy indicators include the straightness error in addition to the length accuracy. The straightness error of the trajectory measures how far its points depart from the ideal straight trajectory; here it is mainly reflected in the distribution error between the raw measured trajectory points and the predicted ideal straight line. The grating-scale stage used in the experiment has very high straightness along its direction of motion, so the ideal trajectory can be regarded as a straight line. By the 3D reconstruction formula, the object's 3D coordinates are referenced to the left camera frame, so the straightness error is evaluated on the projections of the 3D trajectory points along the left camera's optical axis, from which the straightness error of the whole trajectory is judged. The computation is as follows: linearly fit the projections of the measured trajectory points along the left camera's optical axis to an ideal straight line, and compute the fitting error parameters SSE and R-square for the trajectories measured by the traditional centroid-only method and by the proposed system. SSE is the sum of squared errors between the fitted and original data: the closer SSE is to 0, the better the linear model is chosen and fitted and the more successful the data prediction. R-square is determined by the ratio of the sum of squares of the differences between the predicted data and the mean of the original data to the sum of squares of the differences between the original data and their mean: the closer it is to 1, the better the model's variables explain the dependent variable and the better the model fits the data, i.e. the more the data follow the model's distribution.
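The SSE and R-square indicators can be computed from an ordinary least-squares line fit; a minimal sketch (R-square expressed in the equivalent 1 − SSE/SST form):

```python
import numpy as np

def line_fit_errors(t, v):
    """Fit v = a*t + b by least squares and report the straightness
    indicators used in the text: SSE (sum of squared residuals) and
    R-square (1 - SSE / total sum of squares)."""
    t = np.asarray(t, dtype=float)
    v = np.asarray(v, dtype=float)
    a, b = np.polyfit(t, v, 1)
    resid = v - (a * t + b)
    sse = float(np.sum(resid ** 2))
    sst = float(np.sum((v - v.mean()) ** 2))
    return sse, 1.0 - sse / sst

# Toy check: a perfect line fits with SSE near 0 and R-square near 1.
sse, r2 = line_fit_errors(np.arange(10), 2 * np.arange(10) + 1)
```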

      In the experiment, λ = 4 and s = 32; based on the pixel proportion of the simulated bird in the image, the side lengths of the cross-correlation template S and search region F are L1 = 100 and L2 = 90 pixels. The stage equipped with a grating scale serves as the reference and drives the bird; the grating scale accuracy of 1 μm is far better than the trajectory measurement accuracy, so the corrected displacement readings of the grating scale serve as the reference trajectory length in the verification. The camera images and the corresponding 3D trajectory length measurement results of the verification experiment follow.

      Figure 11 shows the images of the deeply textured simulated bird and the target trajectory measured by the proposed system; after the stage moved 128 mm, the measured trajectory length was 127.99 mm. To examine the influence of texture depth on the system's accuracy, λ = 4, s = 32, L1 = 100 and L2 = 90 were kept unchanged while the camera exposure time or the illumination was adjusted to weaken the imaged texture of the simulated bird. Figure 12 shows the lightly textured bird and the measured trajectory; after the stage moved 128 mm, the measured trajectory length was 128.25 mm.

      Figure 11.  Target trajectory map based on the measurement system in this paper

      Figure 12.  Target trajectory map based on the measurement system in this paper

      Keeping λ = 4, s = 32, L1 = 100 and L2 = 90 unchanged, the camera exposure time or the illumination was adjusted further until the imaged texture of the simulated bird almost disappeared. Figure 13 shows the texture-free bird and the measured trajectory; after the stage moved 128 mm, the measured trajectory length was 128.55 mm.

      Figure 13.  Target trajectory map based on the measurement system in this paper

      Within the 128 mm measurement range, Table 1 lists the measured trajectory lengths and straightness error parameters of the system under different texture clarity. With deep texture, the length accuracy is about 0.008%, SSE is 0.132 and R-square is 0.9999; with light texture, the length accuracy is about 0.195%, SSE is 0.427 and R-square is 0.9997; with almost no texture, the length accuracy is about 0.430%, SSE is 10.37 and R-square is 0.9943.

      Imaging texture features    Measurement result/mm    Measurement accuracy    SSE      R-square
      Deep texture                127.99                   0.008%                  0.132    0.9999
      Light texture               128.25                   0.195%                  0.427    0.9997
      No texture                  128.55                   0.430%                  10.37    0.9943

      Table 1.  Measurement results of track length and straightness error parameters

      To verify the stability of the proposed trajectory measurement system, the deeply textured simulated bird of Fig. 11 was selected and five repeated 128 mm trajectory measurement experiments were performed, with data processed on a MATLAB 2016b platform on an Intel(R) Core(TM) i5-4690K CPU at 3.50 GHz with 4.00 GB RAM. Figure 14 shows the trajectory length measurement errors of the five repeated comparison experiments: with centroid-only coarse positioning and matching, the mean length error is 230.5 μm, the length accuracy about 0.18%, the mean SSE about 0.378 and the mean R-square about 0.9998; with the proposed centroid matching optimization, the mean length error is 13.14 μm, the accuracy about 0.01%, the mean SSE about 0.131 and the mean R-square about 0.9999.

      Figure 14.  Measurement error of track length of 5 repeated comparative experiments

      The measurement results for targets with different textures show that the depth of the target's imaged texture affects the trajectory measurement accuracy of the proposed system to some extent: a more deeply textured target has a richer gray-value distribution, so gray cross-correlation sub-pixel matching matches the binocular images better and yields higher measurement accuracy. The repeated comparison experiments show that because the two cameras photograph from different angles in the close-range narrow and long scene, the target regions in the two acquired images differ slightly, so direct centroid positioning and matching has low binocular matching accuracy. On that basis, this paper proposes a gray cross-correlation sub-pixel matching algorithm based on distance and method weights, whose centroid matching optimization significantly improves the trajectory measurement accuracy for well-textured targets and realizes high-precision trajectory measurement of flying objects in narrow and long close-range measurement scenes.

      Based on the above measurement error analysis, in actual measurements the imaging clarity of the target's texture should be improved by bettering the light-source illumination and optimizing the optical path design, so as to improve the trajectory measurement accuracy of the target.

    • In close-range measurement of narrow and long spaces based on binocular vision, because the two cameras photograph the target from different angles, and especially when the included angle of their optical axes is large, centroid-only positioning and matching yields low trajectory measurement accuracy. To address this problem, this paper combined and optimized several methods and developed a narrow-and-long-space close-range trajectory measurement system based on centroid matching optimization. First, the system structure and working principle of the trajectory measurement system were introduced. Then, close-range trajectory measurement experiments with a simulated bird as the target were carried out for verification in the laboratory. Finally, the influence of the target's imaged texture on the system's accuracy was analyzed. The results show that over the full 128 mm range, the mean trajectory length measurement error of the system for a well-textured target is 13.14 μm, the length accuracy is about 0.01%, and the straightness error is small. Compared with centroid-only coarse positioning and matching, both the trajectory length accuracy and the straightness improve significantly, realizing high-precision trajectory measurement of flying objects in narrow and long close-range measurement scenes. Future work will optimize the target's texture through image enhancement to improve the system's accuracy for poorly textured targets, and further study high-precision extraction of matching points for non-rigid and rotating targets, so that the whole system measures the trajectories of different targets in different measurement scenes with better stability.
