Figure 5 shows the experimental setup used in this work. The DLP LightCrafter projector has an output resolution of 1280 pixel × 720 pixel and a refresh rate of 60 Hz. The camera is a Hikvision MV-CA013-21UM monochrome USB 3.0 industrial camera (1.3 megapixel, 170 fps) fitted with an 8 mm lens, with a resolution of 1280 pixel × 1024 pixel. The two biprisms have identical parameters and dimensions: the wedge angle $ \gamma $ is $ {45^ \circ } $ and both right-angle sides are 50 mm long. Because of the biprism material, image acquisition of the measured object is highly susceptible to ambient light and to light interaction between the two biprisms. Light shields were therefore applied to the biprism-based virtual binocular system to remove the interference caused by refracted ambient light and prism reflections, and the raw images were corrected with the camera calibration parameters to guarantee image quality. The measurement distance of the 3D reconstruction system is about $ 600\;\mathrm{m}\mathrm{m} $ and the measurement range is about $ 200\;\mathrm{m}\mathrm{m}\times 320\;\mathrm{m}\mathrm{m} $ ; the fringes projected onto the measured object have wavelengths of 16, 18 and 21 pixels.
Images were captured with a hardware trigger to synchronize fringe projection and acquisition. Twelve fringe patterns were projected for three-frequency heterodyne; one captured set is shown in Fig. 6(a). As the virtual binocular imaging model above shows, the captured images of the measured object are left-right mirrored, so after segmentation each sub-image is flipped horizontally. The processed images are shown in Figs. 6(b) and 6(c).
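The projected pattern set can be sketched as follows. The text specifies three fringe wavelengths (16, 18 and 21 pixels) and twelve patterns in total, which implies a four-step phase shift per wavelength; the function below is a minimal sketch under that assumption, matching the projector's 1280 × 720 output (the function name is illustrative).

```python
import numpy as np

def make_fringe_patterns(width=1280, height=720,
                         wavelengths=(16, 18, 21), n_shifts=4):
    """Generate the 3 x 4 = 12 sinusoidal fringe patterns
    (three wavelengths, four phase shifts each)."""
    x = np.arange(width)
    patterns = []
    for lam in wavelengths:               # fringe wavelength in pixels
        for k in range(n_shifts):         # shifts 0, pi/2, pi, 3pi/2
            phase = 2 * np.pi * x / lam + k * 2 * np.pi / n_shifts
            row = 127.5 * (1 + np.cos(phase))     # 8-bit intensity range
            patterns.append(np.tile(row, (height, 1)).astype(np.uint8))
    return patterns

patterns = make_fringe_patterns()   # projected in sync with the hard trigger
```

Each pattern is constant along columns, so one sinusoidal row is tiled over the projector height.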
To obtain the depth information of the measured object, the two virtual cameras must be related to each other. A checkerboard calibration board of size $ {\text{150 mm}} \times {\text{150 mm}} $ with $ 11 \times 8 $ inner corners was used as the calibration target. The board was moved to multiple positions, while ensuring it remained fully within the camera's field of view, and images of it were captured. Zhang's calibration method [27] was applied to obtain the intrinsic and extrinsic parameters of the virtual binocular system. Fifteen segmented calibration-board image pairs were input and distortion correction was performed; the resulting reprojection errors are shown in Fig. 7. The overall mean reprojection error is about 0.04 pixel, and no image set exceeds 0.05 pixel.
The multi-frequency heterodyne method is used to obtain the unwrapped phase of the measured object. Objects of several different shapes were reconstructed with the virtual binocular fringe structured light system to verify the robustness of the proposed method.
Figures 8(a)-8(b) and 8(e)-8(f) show the fringe images of a plate-shaped and an arc-shaped object captured by the left and right virtual cameras. Figures 8(c) and 8(g) show the unwrapped phase maps obtained from one virtual camera after filtering. After epipolar rectification, every candidate matching point in the rectified right unwrapped-phase image lies on the same row as its corresponding point in the rectified left image. The rectified phase images are then stereo-matched with the proposed method to obtain the depth of the measured object and generate point cloud data. The 3D reconstruction results of the plate and arc objects are shown in Figs. 8(d) and 8(h); the point cloud models reproduce the shapes of the measured objects reasonably.
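The phase pipeline can be illustrated on a synthetic camera row. A four-step phase shift is assumed (12 patterns over 3 wavelengths); `beat` and `unwrap_with_reference` are sketch implementations of the standard multi-frequency heterodyne cascade, in which the 16/18/21-pixel wavelengths beat down to equivalent wavelengths of 144, 126 and finally 1008 pixels:

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Four-step phase shifting: wrapped phase in (-pi, pi]."""
    return np.arctan2(i4 - i2, i1 - i3)

def beat(phi_fast, phi_slow):
    """Heterodyne of two wrapped phases (shorter wavelength first):
    wrapped phase of the longer equivalent wavelength."""
    return np.mod(phi_fast - phi_slow, 2 * np.pi)

def unwrap_with_reference(phi, phi_ref, ratio):
    """Unwrap phi with a coarser reference phase; ratio = lam_ref / lam."""
    k = np.round((ratio * phi_ref - phi) / (2 * np.pi))
    return phi + 2 * np.pi * k

# Synthetic noise-free camera row; wavelengths 16, 18, 21 pixels.
x = np.arange(600, dtype=float)
phis = []
for lam in (16.0, 18.0, 21.0):
    true = 2 * np.pi * x / lam
    i = [0.5 + 0.4 * np.cos(true + k * np.pi / 2) for k in range(4)]
    phis.append(wrapped_phase(i[0], i[1], i[2], i[3]))

phi12 = beat(phis[0], phis[1])      # equivalent wavelength 16*18/2 = 144
phi23 = beat(phis[1], phis[2])      # equivalent wavelength 18*21/3 = 126
phi123 = beat(phi23, phi12)         # equivalent wavelength 126*144/18 = 1008
Phi12 = unwrap_with_reference(phi12, phi123, 1008 / 144)
Phi1 = unwrap_with_reference(phis[0], Phi12, 144 / 16)  # unwrapped phase
```

On real data the wrapped phases come from the captured fringe images of each virtual camera, and the same per-pixel cascade yields the unwrapped phase maps used for stereo matching.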
Figure 8. 3D reconstruction process diagram. (a) Fringe image of left camera of plate model; (b) Fringe image of right camera of plate model; (c) Unwrapped phase of plate model; (d) Point cloud image of plate model; (e) Left camera fringe image of arc model; (f) Right camera fringe image of arc model; (g) Unwrapped phase of arc model; (h) Point cloud image of arc model
To verify the feasibility of the virtual binocular scheme, a comparison experiment was set up. Two high-precision matte ceramic standard spheres with nominal diameters of 38.1118 mm (±5 μm) and 38.1122 mm (±5 μm) were measured to validate the accuracy of the method. Under the same experimental conditions, both the virtual binocular system and a real binocular system reconstructed the spheres multiple times; spheres were fitted to the resulting point cloud data to obtain the 3D information of the reconstructed standard spheres. The real binocular system consists of two cameras whose parameters are identical to those of the virtual binocular system. The feasibility of the virtual binocular scheme is verified by comparing the sphere dimensions obtained by the two reconstruction systems against the true dimensions.
Figures 9(a)-9(b) and 9(d)-9(e) show the fringe images captured by the virtual and real binocular systems, respectively. Their 3D reconstruction results are shown in Figs. 9(c) and 9(f); both reconstructed point clouds are arc-shaped, with details matching the spherical surface.
Figure 9. Reconstruction of point cloud with standard sphere. (a) Virtual binocular left fringe image of standard sphere; (b) Virtual binocular right fringe image of standard sphere; (c) Virtual binocular reconstruction point cloud image of standard sphere; (d) Real binocular left fringe image of standard sphere; (e) Real binocular right fringe image of standard sphere; (f) Real binocular reconstruction point cloud image of standard sphere
After removing noise points from the point cloud data, spheres were fitted to obtain the fitted diameters, which were compared with the true diameters of the standard spheres to obtain the measurement error of each method. Part of the fitted-diameter data is listed in Table 1; each row contains the fitted results for both standard spheres. Over repeated comparison experiments, the mean root-mean-square error of the real binocular fringe structured light measurement is about 0.0305 mm, while that of the proposed virtual binocular fringe structured light 3D reconstruction system is about 0.0379 mm.
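The sphere-fitting step can be sketched with a linear least-squares fit: expanding $ \|p-c\|^2=r^2 $ gives the linear system $ \|p\|^2 = 2p\cdot c + (r^2 - \|c\|^2) $ in the unknowns $ c $ and $ r^2-\|c\|^2 $. Below is a minimal sketch on a synthetic spherical cap (a camera only sees part of the sphere); the data here are synthetic, not the paper's measurements.

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit: solve
    ||p||^2 = 2 p . c + (r^2 - ||c||^2) for center c and radius r."""
    p = np.asarray(points, dtype=float)
    A = np.hstack([2 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic cap of a 38.1118 mm standard sphere (partial view,
# as a structured light system would reconstruct it).
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 0.6, 2000)     # polar angles of the visible cap
phi = rng.uniform(0.0, 2 * np.pi, 2000)
r_true = 38.1118 / 2
pts = np.column_stack([r_true * np.sin(theta) * np.cos(phi),
                       r_true * np.sin(theta) * np.sin(phi),
                       r_true * np.cos(theta)])
center, radius = fit_sphere(pts)
print(round(2 * radius, 4))   # 38.1118
```

In practice the fit is run on the denoised point cloud, and robustness can be improved by iterating with outlier rejection.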
| Result | Virtual stereo camera, Ball 1/mm | Virtual stereo camera, Ball 2/mm | Real stereo camera, Ball 1/mm | Real stereo camera, Ball 2/mm |
|---|---|---|---|---|
| 1 | 38.1462 | 38.1631 | 38.1012 | 38.1441 |
| 2 | 38.0310 | 38.0947 | 38.0624 | 38.1193 |
| 3 | 38.1188 | 38.1349 | 38.1049 | 38.1249 |
| 4 | 38.1406 | 38.1560 | 38.0837 | 38.1228 |
| 5 | 38.0946 | 38.1016 | 38.1211 | 38.1792 |

Table 1. Comparison of virtual and real binocular reconstruction results
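As a cross-check, the reported RMSE values can be reproduced from the five tabulated runs, assuming the errors of both spheres are pooled (the diameters and nominal values below are taken from Table 1 and the text):

```python
import numpy as np

# Fitted diameters from Table 1 (mm): rows are runs 1-5,
# columns are Ball 1 and Ball 2.
virtual = np.array([[38.1462, 38.1631], [38.0310, 38.0947],
                    [38.1188, 38.1349], [38.1406, 38.1560],
                    [38.0946, 38.1016]])
real = np.array([[38.1012, 38.1441], [38.0624, 38.1193],
                 [38.1049, 38.1249], [38.0837, 38.1228],
                 [38.1211, 38.1792]])
nominal = np.array([38.1118, 38.1122])   # nominal diameters of the two balls

def rmse(fitted, nominal):
    """RMSE of fitted diameters against the nominal values,
    pooled over both spheres and all runs."""
    return float(np.sqrt(np.mean((fitted - nominal) ** 2)))

print(round(rmse(virtual, nominal), 4))   # 0.0379
print(round(rmse(real, nominal), 4))      # 0.0305
```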
Fringe structured light 3D reconstruction based on virtual binocular
doi: 10.3788/IRLA20210955
- Received Date: 2022-03-10
- Rev Recd Date: 2022-04-25
- Publish Date: 2022-11-30
Key words:
- 3D reconstruction /
- virtual binocular /
- fringe structured light /
- multi-frequency heterodyne /
- biprism
Abstract: To solve problems such as synchronization difficulty and high cost in traditional binocular fringe structured light 3D reconstruction, a virtual binocular fringe structured light 3D reconstruction method based on a single camera is proposed. A virtual binocular fringe structured light 3D reconstruction system with binocular vision is built from a single camera, two biprisms and one projector. Refraction and beam splitting by the biprisms change the path of the light reflected from the surface of the measured object, so that multi-view images are acquired with a single camera. Through the multi-frequency heterodyne method, stereo matching and calibration of the virtual binocular system, the depth information of the measured object is obtained and the point cloud is reconstructed. Experimental results show that the root mean square errors of the proposed method and the real binocular structured light method in measuring the standard spheres are 0.037 9 mm and 0.030 5 mm, respectively. The proposed method can promote the development of binocular fringe structured light technology toward higher speed, lower cost and miniaturization, and can also be extended to color-camera-based fringe structured light 3D reconstruction and projected-speckle structured light 3D reconstruction.