Single-pixel imaging illuminates a target object $ f\left( {x,y} \right) $ with a sequence of illumination speckle patterns $ {S_n}\left( {x,y} \right) $; the intensity of the light reflected or transmitted by the object is collected by a single-pixel detector. The detected intensity $ {I_n} $ can be expressed as:$$ {I_n} = \sum\limits_{x,y} {f\left( {x,y} \right){S_n}\left( {x,y} \right)} $$ (1) where
$ n $ is the index of the illumination pattern. Following Ref. [12], the $ \left( {p + q} \right) $-order geometric moment $ {m_{pq}} $ of an object in two-dimensional space is defined as:$$ {m_{pq}} = \sum\limits_{x,y} {{x^p}{y^q}f\left( {x,y} \right)} $$ (2) Comparing Eqs. (1) and (2), when the three illumination patterns
${S_n}\left( {x,y} \right) \left( {n = 1,2,3} \right)$ are chosen as $ {S_1}\left( {x,y} \right) = \left[ {\begin{array}{*{20}{c}} 1&1& \cdots &1 \\ 1&1& \cdots &1 \\ \vdots & \vdots & \vdots & \vdots \\ 1&1& \cdots &1 \end{array}} \right] $ , $ {S_2}\left( {x,y} \right) = \left[ {\begin{array}{*{20}{c}} 1&2& \cdots &N \\ 1&2& \cdots &N \\ \vdots & \vdots & \vdots & \vdots \\ 1&2& \cdots &N \end{array}} \right] $ and $ {S_3}\left( {x,y} \right) = \left[ {\begin{array}{*{20}{c}} N&N& \cdots &N \\ {N - 1}&{N - 1}& \cdots &{N - 1} \\ \vdots & \vdots & \vdots & \vdots \\ 1&1& \cdots &1 \end{array}} \right] $ , the intensities $ {I_n}\left( {n = 1,2,3} \right) $ measured by the single-pixel detector correspond to the zero-order moment $ {m_{00}} $ and the first-order moments $ \left( {{m_{10}},{m_{01}}} \right) $ of the object, respectively. Non-orthogonal illumination patterns are thus constructed from the properties of geometric moments; the speckles generated from the modulation patterns $ {S_{\text{1}}}\left( {x,y} \right) $ , $ {S_{\text{2}}}\left( {x,y} \right) $ and $ {S_{\text{3}}}\left( {x,y} \right) $ are called geometric-moment speckles, as shown in Fig. 1. The centroid $ \left( {{x_c},{y_c}} \right) $ that marks the position of the moving object is computed from the zero- and first-order moment values:$$ {x_c} = {I_2}/{I_1} $$ (3) $$ {y_c} = {I_3}/{I_1} $$ (4) As the above analysis shows, only three geometric-moment patterns are needed to modulate the incident light projected onto the target: the reflected or transmitted intensities collected by the single-pixel detector correspond directly to the zero- and first-order geometric moments, from which the centroid characterizing the position of the moving object is obtained. Because so few patterns are required per frame, the proposed method enables high-frame-rate, real-time localization of a moving object.
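The construction of the three patterns and the centroid recovery of Eqs. (1)-(4) can be sketched in NumPy. This is a minimal illustration under our own assumptions; the function names and the toy object are ours, not from the paper:

```python
import numpy as np

def moment_patterns(n):
    """Build the three geometric-moment illumination patterns for an n x n scene.

    S1 is all ones (yields m00), S2 ramps 1..n along the columns (yields m10),
    and S3 ramps n..1 down the rows (yields m01, measured from the bottom edge),
    matching the matrices in the text."""
    s1 = np.ones((n, n))
    s2 = np.tile(np.arange(1, n + 1), (n, 1))           # every row is 1, 2, ..., n
    s3 = np.tile(np.arange(n, 0, -1)[:, None], (1, n))  # rows are n, n-1, ..., 1
    return s1, s2, s3

def locate(obj):
    """Each detector reading I_n is the inner product of the object with one
    pattern, Eq. (1); Eqs. (3)-(4) then give the centroid (x_c, y_c)."""
    s1, s2, s3 = moment_patterns(obj.shape[0])
    i1 = np.sum(obj * s1)  # m00
    i2 = np.sum(obj * s2)  # m10
    i3 = np.sum(obj * s3)  # m01
    return i2 / i1, i3 / i1

# Toy example: a 2 x 2 bright square in an 8 x 8 scene
obj = np.zeros((8, 8))
obj[2:4, 2:4] = 1.0
xc, yc = locate(obj)  # x_c = 3.5, y_c = 5.5 (y counted from the bottom row)
```

Only three inner products are needed per frame, which is what makes the pattern count, and hence the localization frame rate, independent of the scene resolution.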
To verify the feasibility of the method, numerical simulations were first carried out. In single-pixel imaging, noise degrades the accuracy of the measured results. In a single-pixel imaging system the noise originates mainly from the single-pixel detector and is generally classified as readout noise, background noise, background photon noise, and discrete sampling error [13]. Gaussian noise was therefore added in the simulations to model the detector noise.
The strength of the Gaussian noise is quantified by the peak signal-to-noise ratio (PSNR):
$$ PSNR = 1{\text{0}} \cdot {\log _{10}}\left( {\frac{{MA{X^2}}}{{MSE}}} \right) $$ (5) $$ MSE = \frac{1}{{mn}}\sum\limits_{i = 0}^{m - 1} {\sum\limits_{j = 0}^{n - 1} {{{\left[ {I\left( {i,j} \right) - K\left( {i,j} \right)} \right]}^2}} } $$ (6) where
$ MAX $ is the maximum gray value of the image and $ MSE $ is the mean square error. The centroid estimation error (CEE) [14] is adopted to assess the accuracy of the retrieved centroid; it is the distance between the computed centroid $ \left({x}_{c},{y}_{c}\right) $ and the true centroid $ \left({x}_{0},{y}_{0}\right) $, i.e., the centroid obtained in the noise-free, uniform-background case:$$ CEE = \sqrt {{{\left( {{x_c} - {x_0}} \right)}^2} + {{\left( {{y_c} - {y_0}} \right)}^2}} $$ (7) Figure 2 shows the two simulated scenes. Fig. 2(a) shows the object on a uniform black background, the ideal noise-free case whose centroid serves as the true centroid; Fig. 2(b) shows a complex background; and Fig. 2(c) shows the object placed in the complex background of Fig. 2(b). All three images have a spatial resolution of 256 $ \times $ 256, and simulations were performed for both scenes. The centroid errors at different PSNR levels are shown in Fig. 3, where the green curve is the result for the uniform background and the blue curve is the result for the complex background. As the PSNR increases, the CEE decreases: when the PSNR is below 40 dB the CEE drops markedly with increasing PSNR, while above 40 dB the decrease levels off and the CEE stabilizes within two pixels.
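The noise model and the two metrics of Eqs. (5)-(7) can be reproduced with a short NumPy sketch. The object size, gray range, and noise level below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def psnr(clean, noisy, max_val=255.0):
    """Peak signal-to-noise ratio, Eqs. (5)-(6)."""
    mse = np.mean((clean.astype(float) - noisy.astype(float)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

def cee(xc, yc, x0, y0):
    """Centroid estimation error, Eq. (7): Euclidean distance between the
    computed centroid (xc, yc) and the true centroid (x0, y0)."""
    return np.hypot(xc - x0, yc - y0)

rng = np.random.default_rng(0)
clean = np.zeros((256, 256))
clean[100:120, 100:120] = 255.0                    # bright object, black background
noisy = clean + rng.normal(0.0, 5.0, clean.shape)  # additive Gaussian detector noise
val = psnr(clean, noisy)                           # roughly 34 dB for sigma = 5
```

Sweeping the noise standard deviation and recomputing the centroid from the noisy detector readings at each PSNR level reproduces the kind of CEE-versus-PSNR curve shown in Fig. 3.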
Geometric moment detection with single-pixel for moving object localization (Invited)
Abstract: In response to the demand for fast real-time positioning of a moving object, a single-pixel fast moving-object positioning method using geometric moment detection was proposed. The key to the approach was to locate the moving object by detecting its centroid. According to the properties of geometric moments, three geometric-moment illumination patterns were constructed and projected onto the moving object. A single-pixel detector was used to collect the intensities of the reflected or transmitted light after the moving object interacted with the modulated light. According to the theory of single-pixel imaging, the detected intensity values corresponded to the zero-order and first-order geometric moment values, from which the centroid identifying the object position could be obtained. Combining a digital micromirror device (DMD) with temporal dithering to generate the geometric-moment illumination, the proposed method achieved frame rates of approximately 500 fps and 1 000 fps for positioning a moving object without imaging. The error of the centroid obtained by the proposed method was within 1.63 pixels, and the mean square error was 0.1183 pixels. The proposed method provides a new way to track a fast-moving object with a single-pixel detector.
Key words:
- single-pixel imaging
- centroid
- geometric moment
[1] Wei M S, Xing F, You Z. A real-time detection and positioning method for small and weak targets using a 1D morphology-based approach in 2D images [J]. Light: Science & Applications, 2019, 7(5): 18006.
[2] El-Desouki M, Deen M J, Fang Q Y, et al. CMOS image sensors for high speed applications [J]. Sensors, 2009, 9(1): 430-444. doi: 10.3390/s90100430
[3] Edgar M P, Gibson G M, Padgett M J. Principles and prospects for single-pixel imaging [J]. Nature Photonics, 2019, 13(1): 13-20.
[4] Sun B Q, Edgar M P, Bowman R, et al. 3D computational imaging with single-pixel detectors [J]. Science, 2013, 340(6134): 844-847. doi: 10.1126/science.1234454
[5] Stantchev R I, Yu X, Blu T, et al. Real-time terahertz imaging with a single-pixel detector [J]. Nature Communications, 2020, 11: 2535. doi: 10.1038/s41467-020-16370-x
[6] Magana-Loaiza O S, Howland G A, Malik M, et al. Compressive object tracking using entangled photons [J]. Applied Physics Letters, 2013, 102: 231104.
[7] Sun S, Lin H Z, Xu Y K, et al. Tracking and imaging of moving objects with temporal intensity difference correlation [J]. Optics Express, 2019, 27(20): 27851-27862. doi: 10.1364/OE.27.027851
[8] Ni M Y, Deng H X, He X K, et al. A single-pixel tracking system for microfluidic device monitoring without image processing [J]. Optics and Lasers in Engineering, 2021, 151: 106875.
[9] Shi D F, Yin K X, Huang J, et al. Fast tracking of moving objects using single-pixel imaging [J]. Optics Communications, 2019, 440: 155-162. doi: 10.1016/j.optcom.2019.02.006
[10] Zhang Z B, Ye J Q, Deng Q W, et al. Image-free real-time detection and tracking of fast moving object using a single-pixel detector [J]. Optics Express, 2019, 27(24): 35394-35402. doi: 10.1364/OE.27.035394
[11] Deng Q W, Zhang Z B, Zhong J A. Image-free real-time 3-D tracking of a fast-moving object using dual-pixel detection [J]. Optics Letters, 2020, 45(17): 4734-4737.
[12] Flusser J, Suk T, Zitova B. 2D and 3D Image Analysis by Moments [M]. United Kingdom: John Wiley & Sons, 2016: 47-49.
[13] Ma X Y, Rao C H, Zheng H Q. Error analysis of CCD-based point source centroid computation under the background light [J]. Optics Express, 2009, 17(10): 8525-8532.
[14] Li Z Q, Li X Y. Centroid computation for Shack-Hartmann wavefront sensor in extreme situations based on artificial neural networks [J]. Optics Express, 2018, 26(24): 31675-31693.
[15] Huang J, Shi D F, Yuan K E, et al. Computational-weighted Fourier single-pixel imaging via binary illumination [J]. Optics Express, 2018, 26(13): 16547-16560. doi: 10.1364/OE.26.016547
[16] Baker K L. Iteratively weighted centroiding for Shack-Hartmann wave-front sensors [J]. Optics Express, 2018, 15(8): 5147-5160.
[17] Zha L B, Shi D F, Huang J, et al. Single-pixel tracking of fast-moving object using geometric moment detection [J]. Optics Express, 2021, 29(19): 30327-30336. doi: 10.1364/OE.436348