
计算光学成像:何来,何处,何去,何从?

左超 陈钱

引用本文: 左超, 陈钱. 计算光学成像:何来,何处,何去,何从?[J]. 红外与激光工程, 2022, 51(2): 20220110. doi: 10.3788/IRLA20220110
Citation: Zuo Chao, Chen Qian. Computational optical imaging: An overview[J]. Infrared and Laser Engineering, 2022, 51(2): 20220110. doi: 10.3788/IRLA20220110


doi: 10.3788/IRLA20220110
基金项目: 国家自然科学基金(U21B2033);江苏省基础研究计划前沿引领专项(BK20192003);中央高校科研专项资助项目(30920032101)
    作者简介:

    左超,男,教授,博士生导师,博士,主要从事计算光学成像与光信息处理技术的研究 (Email: zuochao@njust.edu.cn; Website: www.scilaboratory.com)

    通讯作者: 陈钱,男,教授,博士生导师,博士,主要从事光电成像与信息处理等方面的研究 (Email: chenqian@njust.edu.cn)。
  • 中图分类号: O438

Computational optical imaging: An overview

  • 摘要: 计算光学成像是一种通过联合优化光学系统和信号处理以实现特定成像功能与特性的新兴研究领域。它并不是光学成像和数字图像处理的简单补充,而是前端(物理域)的光学调控与后端(数字域)信息处理的有机结合,通过对照明、成像系统进行光学编码与数学建模,以计算重构的方式获取图像与信息。这种新型的成像方式将有望突破传统光学成像技术对光学系统以及探测器制造工艺、工作条件、功耗成本等因素的限制,使其在功能(相位、光谱、偏振、光场、相干度、折射率、三维形貌、景深延拓、模糊复原、数字重聚焦,改变观测视角)、性能(空间分辨、时间分辨、光谱分辨、信息维度与探测灵敏度)、可靠性、可维护性等方面获得显著提高。现阶段,计算光学成像已发展为一门集几何光学、信息光学、计算光学、现代信号处理等理论于一体的新兴交叉技术研究领域,成为光学成像领域的国际研究重点和热点,代表了先进光学成像技术的未来发展方向。国内外众多高校与科研院所投身其中,使该领域全面进入了“百花齐放,百家争鸣”的繁荣发展局面。作为本期《红外与激光工程》——南京理工大学专刊“计算光学成像技术”专栏的首篇论文,本文概括性地综述了计算光学成像领域的历史沿革、发展现状、并展望其未来发展方向与所依赖的核心赋能技术,以求抛砖引玉。
  • 图  1  常见的光电成像系统

    Figure  1.  Common optoelectronic imaging systems

    图  2  传统光学成像系统的成像过程

    Figure  2.  Conventional optical imaging process

    图  3  光学成像技术的五方面发展目标

    Figure  3.  Five goals for the development of optical imaging technology

    图  4  传统数字图像处理往往仅作为成像的后处理过程

    Figure  4.  Conventional digital image processing is only a post-processing step in the whole imaging process

    图  5  计算光学成像系统的成像过程

    Figure  5.  Computational optical imaging process

    图  6  2017年修订后的国家自然科学基金委学科代码,其中“计算成像”被列入信息科学部四处下的一个独立子方向(F050109)

    Figure  6.  The revised discipline code of the National Natural Science Foundation of China in 2017. “Computational imaging” has been listed as an independent sub-direction of Information Science (F050109)

    图  7  16世纪用于绘图的暗箱装置

    Figure  7.  Camera obscura box, 16th century

    图  8  尼埃普斯使用的暗箱相机和所拍摄的《牵马少年》

    Figure  8.  The Camera Obscura box used by Joseph Nicephore Niépce and his photo “the man with a horse”

    图  9  尼埃普斯所拍摄的《窗外景色》

    Figure  9.  “View from the Window at Le Gras” taken by Joseph Nicéphore Niépce

    图  10  1838年达盖尔所拍摄的《Boulevard du Temple》

    Figure  10.  “Boulevard du Temple” taken by Louis Daguerre, 1838

    图  11  塔尔博特所拍摄的“冬天里的橡树”(负片与正片)

    Figure  11.  William Henry Fox Talbot – An oak tree in winter (Negative and positive)

    图  12  火棉胶湿摄影术的基本流程

    Figure  12.  Wet-collodion process

    图  13  1878年迈布里奇拍摄的《奔马》连续照片

    Figure  13.  Eadweard Muybridge——The horse in motion, 1878

    图  14  布朗尼相机与莱卡相机

    Figure  14.  Brownie and Leica camera

    图  15  1888年在纽约刊登的柯达相机广告

    Figure  15.  The advertisement of Kodak camera in New York, 1888.

    图  16  托马斯·萨顿拍摄的“苏格兰格纹丝带”

    Figure  16.  Thomas Sutton — Tartan Ribbon

    图  17  柯达K135-20彩色胶卷

    Figure  17.  Kodachrome K135-20 Color Film

    图  18  柯达傻瓜相机“Instamatic”

    Figure  18.  Kodak instamatic camera

    图  19  安培公司推出的首款磁带录像机VR-1000

    Figure  19.  Ampex's first videotape recorder, the VR-1000

    图  20  博伊尔和史密斯发明的首个CCD相机

    Figure  20.  The first CCD camera developed by Boyle and Smith

    图  21  史蒂文·萨森研发出世界上第一部数码相机

    Figure  21.  The first digital camera developed by Steven Sasson

    图  22  索尼推出的世界上首台电磁记录照相机“玛维卡”

    Figure  22.  Mavica camera developed by SONY

    图  23  夏普联合J-Phone推出的全球首款拍照手机J-SH04

    Figure  23.  The world's first camera phone, the J-SH04, developed by Sharp and J-Phone

    图  24  采用了卡尔蔡司(Carl Zeiss)认证镜头的诺基亚N90

    Figure  24.  Nokia N90 mobile phone with Carl Zeiss optics

    图  25  尼康单反相机D1

    Figure  25.  Nikon SLR camera D1

    图  26  乔布斯在Macworld 2007大会上发布的第一代iPhone

    Figure  26.  The first-generation Apple iPhone released by Steve Jobs at Macworld 2007

    图  27  2010年乔布斯推出苹果划时代的产品iPhone 4

    Figure  27.  Apple iPhone 4 released by Steve Jobs in 2010

    图  28  全世界第一台双摄手机LG Optimus 3D

    Figure  28.  The first dual camera mobile phone — LG Optimus 3D

    图  29  苹果在iPhone X上引入的结构光 3D人脸识别

    Figure  29.  Structured light 3D face recognition technique in iPhone X

    图  30  华为P30 Pro及其在50倍的变焦下拍摄的月亮表面的细节清晰可见(虽然有争议说是AI修正合成的结果)

    Figure  30.  Huawei P30 Pro and the moon it captured at 50× zoom with surface details clearly visible (though it has been disputed whether the result is AI-synthesized)

    图  31  胶卷相机(Nikon F80)与数码相机(Nikon D50)的对比

    Figure  31.  Comparison of film camera (Nikon F80) with digital camera (Nikon D50)

    图  32  最早的“计算成像”技术——合成孔径雷达

    Figure  32.  Synthetic Aperture Radar (SAR), the earliest computational imaging technique

    图  33  最早的采用“计算成像”思想设计的光学成像系统——波前编码成像

    Figure  33.  Wave-front coding, the earliest optical imaging system involving the idea of computational imaging

    图  34  3LCD投影仪与DLP投影仪的基本结构

    Figure  34.  Basic configurations of the 3LCD projector and the DLP projector

    图  35  16张不同曝光时间(30~1/1000 s)下拍摄的教堂图像[23]

    Figure  35.  Sixteen photographs of a church taken at 1-stop increments from 30 to 1/1000 second[23]

    图  36  采用两张正交偏振图像进行图像去雾的效果[27]

    Figure  36.  Image dehazing using two images with orthogonal polarization state[27]

    图  37  2005年于麻省理工学院举办的“Computational Photography and Video”研讨会的会议议程[29]

    Figure  37.  Program of Symposium on Computational Photography and Video in MIT, 2005 [29]

    图  38  吴义仁研制的首个光场相机及其商业化产品

    Figure  38.  The first light field camera developed by Ng and its commercialized version——Lytro

    图  39  Rice大学于2006年所设计的单像素相机[40-41]

    Figure  39.  The 1st single-pixel camera developed by Rice University in 2006[40-41]

    图  40  GS相位恢复算法基本原理[54-55]

    Figure  40.  Principle of the GS phase retrieval method[54-55]
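
    为便于理解图40所示GS算法的迭代流程,下面给出一个基于NumPy的极简数值示意(仅为原理草图,并非文献[54-55]的原始实现;物面振幅、真实相位等均为本文构造的假设性测试数据):

```python
import numpy as np

# Gerchberg-Saxton迭代的极简示意(假设性示例,非文献[54-55]的原始实现)
# 已知物面振幅 a_obj 与频谱面振幅 a_fourier,在两个域之间交替投影以恢复相位
rng = np.random.default_rng(0)

N = 64
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
true_phase = np.pi * np.exp(-(X**2 + Y**2) / 0.2)     # 待恢复的相位(测试用)
a_obj = np.ones((N, N))                               # 物面振幅(假设均匀)
a_fourier = np.abs(np.fft.fft2(a_obj * np.exp(1j * true_phase)))  # "测得"的频谱面振幅

g = a_obj * np.exp(1j * 2 * np.pi * rng.random((N, N)))  # 随机初始相位
for _ in range(200):
    G = np.fft.fft2(g)
    G = a_fourier * np.exp(1j * np.angle(G))          # 频谱面:替换振幅、保留相位
    g = np.fft.ifft2(G)
    g = a_obj * np.exp(1j * np.angle(g))              # 物面:替换振幅、保留相位

err = np.linalg.norm(np.abs(np.fft.fft2(g)) - a_fourier) / np.linalg.norm(a_fourier)
print(f"Fourier-amplitude residual: {err:.3e}")       # 残差随迭代单调不增
```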

    图  41  傅里叶变换轮廓术的基本原理[64]

    Figure  41.  Principle of the Fourier transform profilometry method[64]

    图  42  传统全息术基本原理

    Figure  42.  Principle of the conventional holographic imaging technique

    图  43  离轴数字全息重构基本原理

    Figure  43.  Principle of the off-axis digital holographic reconstruction

    图  44  编码孔径成像基本原理与掩模板实物图

    Figure  44.  Principle of coded aperture imaging and a photograph of a coded mask

    图  45  美国光学学会(OSA,现Optica)计算光学传感与成像国际会议议题

    Figure  45.  Topic categories of the Optica (formerly OSA) topic meeting COSI

    图  46  Levoy教授所领导研发的Google Pixel相机多次登顶DXOMark榜单

    Figure  46.  Professor Levoy's Google Pixel camera tops DXOMark several times

    图  47  “计算成像”、“计算光学”、“计算摄影”已经逐渐成为智能手机各大厂商的营销词汇

    Figure  47.  "Computational imaging", "computational optics" and "computational photography" have gradually become marketing terms for smartphone manufacturers

    图  48  Facebook创始人Mark Zuckerberg宣布将Facebook更名为Meta并提出“元宇宙”概念。而三维传感技术有望将物理世界“数字化”,对于元宇宙的基建与落成都有着重大的实际意义

    Figure  48.  Facebook founder Mark Zuckerberg announced the renaming of Facebook as Meta and proposed the concept of "Metaverse". 3D sensing technology promises to "digitize" the physical world and is of great practical significance for the infrastructure and realization of the Metaverse

    图  49  基于“目的与动机”对典型计算光学成像技术所作的分类

    Figure  49.  Classification of typical computational imaging techniques according to their "objectives and motivations"

    图  50  相位成像技术的分类

    Figure  50.  Classification of the phase imaging techniques

    图  51  Zernike相差显微术与微分干涉相差显微术

    Figure  51.  Zernike phase contrast microscopy and Differential Interference Contrast (DIC) microscopy

    图  52  巨型迈克尔逊干涉仪——LIGO引力波探测器

    Figure  52.  Giant Michelson interferometer——LIGO gravitational wave detector

    图  53  Shack-Hartmann波前传感器与四棱锥波前传感器

    Figure  53.  Schematics of Shack-Hartmann and pyramid wavefront sensors

    图  54  迭代法相位恢复技术

    Figure  54.  Schematics of iterative phase retrieval techniques

    图  55  傅里叶叠层成像技术

    Figure  55.  Schematic of Fourier ptychographic microscopy

    图  56  晴天下游泳池底的光波图案。(池中水面的涟漪让阳光发生折射,在池底产生了明暗相间的网络结构)

    Figure  56.  The wave-like pattern at the bottom of a swimming pool in sunlight. (The rippled pool surface refracts the incident sunlight to produce the characteristic pattern)

    图  57  光强传输方程在不同研究领域的应用

    Figure  57.  Applications of TIE in different research fields
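
    作为参考,图57~59所依据的光强传输方程(TIE)在傍轴近似下的常见形式如下(此处为通用表达式,符号约定可能与具体文献略有差异):

```latex
% 光强传输方程(TIE),傍轴近似下的常见写法(通用形式,非逐字引自原文)
-k\,\frac{\partial I(x,y;z)}{\partial z}
  = \nabla_{\perp}\cdot\left[\,I(x,y;z)\,\nabla_{\perp}\phi(x,y)\,\right],
\qquad k=\frac{2\pi}{\lambda}
```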

    图  58  部分相干光场下的广义光强传输方程

    Figure  58.  Generalized transport of intensity equation (GTIE) for partially coherent field

    图  59  光强传输方程对乳腺癌细胞的定量相位三维成像[238]

    Figure  59.  Quantitative phase 3D imaging of a breast cancer cell using TIE[238]

    图  60  基于弱相位近似的差分相衬定量相位成像原理示意图

    Figure  60.  Schematic diagram of the principle of quantitative phase imaging with DPC based on weak phase approximation

    图  61  照明优化策略的对比。(a) 不同照明函数对应的单次相位传递函数与合成相位传递函数;(b) 最优照明下的各向同性定量相位成像结果

    Figure  61.  Comparison of illumination-optimized schemes. (a) PTFs and their synthetic PTFs corresponding to different illumination functions; (b) Isotropic quantitative phase imaging results under optimal illumination

    图  62  差分相衬定量相位成像的成像效率优化方案。(a)彩色复用三波段的差分相衬定量相位成像方案;(b) 基于三波长照明的多模态成像及定量相位成像方案;(c) 单帧差分相衬最优照明成像方案

    Figure  62.  Imaging efficiency optimization schemes of DPC. (a) Triple-wavelength multiplexed illumination scheme; (b) Triple-wavelength illumination scheme for multimodal imaging and DPC; (c) Single-shot optimal illumination scheme of DPC

    图  63  成像光谱分辨率逐渐提升

    Figure  63.  Gradual increase in spectral imaging resolution

    图  64  计算层析成像光谱技术的数据立方投影过程

    Figure  64.  The projection of the data cube in CTIS

    图  65  画幅型层析成像光谱仪原理图

    Figure  65.  Schematic diagram of frame-type computed tomographic imaging spectrometer

    图  66  Wagadarikar设计的单色散元件的编码孔径成像光谱仪及其成像结果[49]

    Figure  66.  Single Disperser CASSI instrument designed by Wagadarikar, and the imaging results[49]

    图  67  傅里叶变换光谱仪示意图

    Figure  67.  Schematic diagram of Fourier transform spectrometer

    图  68  哈达玛变换光谱仪示意图

    Figure  68.  Schematic diagram of Hadamard transform spectrometer

    图  69  同一场景的可见光、长波红外以及偏振成像结果

    Figure  69.  Visible light, long wave infrared and polarization imaging results for the same scene

    图  70  基于旋转偏振片的偏振成像系统

    Figure  70.  Polarization imaging system based on rotating polarizer

    图  71  Farlow等研制的分振幅偏振成像系统[299]

    Figure  71.  Split amplitude polarization imaging system developed by Farlow et al. [299]

    图  72  分孔径偏振成像系统

    Figure  72.  Split aperture polarization imaging system

    图  73  分焦平面偏振成像系统

    Figure  73.  Split focal plane polarization imaging system

    图  74  Luna等设计的多波长双旋转相位片偏振仪系统结构图[303]

    Figure  74.  Structure diagram of multi-wavelength dual-rotating phase plate polarization imaging system designed by Luna et al. [303]

    图  75  大气散射模型与偏振去雾图像前后效果对比

    Figure  75.  Atmospheric scattering model and comparison of images before and after polarization defogging

    图  76  牛心肌样品的三维PS-OCT成像结果。(a)三维整体结构图;(b)局部光轴图;(c)局部延迟图;(d)局部二向衰减图

    Figure  76.  PS-OCT imaging results of bovine myocardial samples. (a) 3D global structure map; (b) Local optical axis diagram; (c) Local retardation map; (d) Local diattenuation map

    图  77  对径压缩圆盘六步相移彩色光弹图像

    Figure  77.  Six-step phase-shifting color photoelastic images of the diametric compression disk

    图  78  典型的光学三维传感技术

    Figure  78.  Representative techniques for 3D optical sensing

    图  79  (a)立体视觉法[386];(b)飞行时间法[387];(c)激光线扫法[388];(d)散焦恢复形状测量法[389]

    Figure  79.  (a) Schematic diagrams of stereo vision[386]; (b) Time-of-flight method[387]; (c) Laser scanning[388];(d) Defocus recovery method[389]

    图  80  条纹投影轮廓术示意图[403]

    Figure  80.  Schematic diagram of fringe projection profilometry [403]

    图  81  孤立物体和不连续表面的包裹相位存在条纹级次歧义[424]

    Figure  81.  Fringe order ambiguity in the wrapped phase of isolated objects and discontinuous surfaces[424]

    图  82  基于立体相位展开的四目实时三维测量系统及其测量结果。(a)笔者课题组提出的四目实时系统[450];(b)该系统获取的动态场景下的实时彩色三维轮廓数据[450];(c)该系统获取的全方位点云数据[456];(d)该系统实现的360°三维面型缺陷检测[457]

    Figure  82.  Quad-camera real-time 3D measurement system based on stereo phase unwrapping and its measurement results. (a) Quad-camera real-time system proposed by our research group[450]; (b) The real-time color 3D data in the dynamic scene obtained by our system[451]; (c) The omnidirectional point cloud data obtained by our system[456]; (d) 360° 3D surface defect detection obtained by our system[457]

    图  83  基于散斑相关法的商业产品。(a) Kinect; (b) PrimeSense; (c) iPhone X

    Figure  83.  Commercial products based on speckle correlation. (a) Kinect; (b) PrimeSense; (c) iPhone X

    图  84  利用深度学习的单帧相位恢复方法流程图以及不同方法的三维重建结果。(a)基于深度学习相位恢复原理[460];(b)不同条纹分析方法(FT、WFT、基于深度学习的方法和12步移相轮廓术)的三维重建比较[460];(c)利用深度学习方法对一台不同转速的电扇进行了测量[462];(d)利用深度学习单帧彩色条纹投影轮廓术对旋转工件的动态三维重建[464];(e)利用深度学习单帧复合条纹投影轮廓术对旋转女孩模型的动态三维重建[465]

    Figure  84.  Flowchart of the single-frame phase retrieval approach using deep learning and the 3D reconstruction results of different approaches. (a) The principle of deep-learning-based phase retrieval method[460]; (b) Comparison of the 3D reconstructions of different fringe analysis approaches (FT, WFT, the deep-learning-based method, and 12-step phase-shifting profilometry) [460]; (c) The measurement results of a desk fan rotating at different speeds using our deep-learning method[462]; (d) The dynamic 3D measurement result of a rotating workpiece by deep-learning-based color FPP method[464]; (e) The dynamic 3D measurement result of a rotating bow girl model by composite fringe projection deep learning profilometry(CDLP)[465]

    图  85  各种基于微透镜阵列的光场相机系统

    Figure  85.  Various light field cameras based on microlens array

    图  86  基于相机阵列的光场采集。(a) 斯坦福光场龙门架[479]; (b) 斯坦福大学的大规模相机阵列[481];(c) 5×5相机阵列实现显微光场采集[482]

    Figure  86.  Light field capture based on camera arrays. (a) Stanford Spherical Gantry[479]; (b) Stanford large camera arrays[481]; (c) Acquiring micro-object images with the 5×5 camera array system[482]

    图  87  基于编码掩膜的计算光场成像。(a)掩膜增强相机光场采集[483];(b)压缩光场采集[484]

    Figure  87.  Computational light field imaging based on coded masks. (a) Mask-enhanced camera light field acquisition[483]; (b) Compressive light field photography[484]

    图  88  基于可编程孔径的光场成像。(a) 可编程孔径光场相机[485];(b) 可编程孔径光场显微镜[251]

    Figure  88.  Light field imaging based on programmable aperture. (a) Programmable aperture light field camera[485]; (b) Programmable aperture microscope[251]

    图  89  光场成像在计算摄像的应用。(a)光场重聚焦[476];(b)合成孔径成像[492]

    Figure  89.  Light field imaging in computational photography. (a) Light field refocusing[476]; (b) Synthetic aperture imaging[492]

    图  90  X 射线断层扫描技术。(a) X-ray 二维图像 与(b)三维CT 的对比及螺旋锥束扫描CT

    Figure  90.  X-ray computed tomography. Comparison of (a) 2D X-ray image and (b) 3D X-ray CT, and spiral cone-beam scanning CT

    图  91  典型的颅脑MRI图像

    Figure  91.  Typical brain MRI images

    图  92  宽场(左)与共聚焦显微镜(右)的光路结构[502]

    Figure  92.  Schematic of widefield (left) and confocal fluorescence microscope (right) optical path structure[502]

    图  93  荧光显微镜拍摄到的细胞三维图像

    Figure  93.  An example of a 3D image of a cell captured by a fluorescence microscope

    图  94  理论计算得到的三维PSF的x-y与x-z平面切片图像。(a) x-y平面切片图像,每一切片上方数字表示该切片沿z轴方向距离点扩散函数中心亮点的距离;(b) x-z平面切片图像,每一切片上方数字表示该切片沿y轴方向距离点扩散函数中心亮点的距离

    Figure  94.  Theoretically calculated x-y and x-z slice images of the three-dimensional PSF. (a) x-y slice images. The number above each slice represents the distance along the z-axis between the slice and the central bright spot of the point spread function; (b) x-z slice images. The number above each slice indicates the distance along the y-axis between the slice and the central bright spot of the point spread function.

    图  95  反卷积三维荧光显微成像的工作流程

    Figure  95.  Workflow of deconvolution three-dimensional fluorescence microscopic imaging

    图  96  光场显微镜模型[516]。(a)传统明场显微镜;(b)光场显微镜[516];(c)基于波动光学的光场显微模型[518];(d)傅里叶光场显微镜模型[519]

    Figure  96.  Model of light field microscope[516]. (a) Traditional bright field microscope; (b) Light field microscopy[516]; (c) Light field microscopy model based on wave optics theory[518]; (d) Fourier light field microscopy[519]

    图  97  光场显微在生物科学中的应用。(a)小鼠头戴MiniLFM[522];(b)使用HR-LFM成像COS-7活细胞中的高尔基源膜泡[523];(c) DAOSLIMIT观测小鼠肝脏中中性粒细胞迁移过程中的迁移体动态[525];(d)共聚焦光场显微镜,观测斑马鱼的捕猎活动以及探测小鼠大脑的神经活动[525]

    Figure  97.  Light field applications in biological science. (a) Mouse with a head-mounted MiniLFM [522]; (b) Imaging Golgi-derived membrane vesicles in living COS-7 cells using HR-LFM [523]; (c) Migrasome dynamics during neutrophil migration in mouse liver with DAOSLIMIT[525]; (d) Confocal light field microscopy, tracking and imaging whole-brain neural activity during larval zebrafish’s prey capture behavior and imaging and tracking of circulating blood cells in awake mouse brain[524]

    图  98  全息衍射层析显微术的代表性工作。(a)瑞士洛桑联邦理工学院Charriere等[532]的旋转物体测量;(b)美国麻省理工学院的Choi等[534]的扫描振镜测量;(c)瑞士洛桑联邦理工学院的Cotte等[537]的楔形棱镜扫描;(d)韩国科学技术院的Park团队[547]的DMD扫描测量

    Figure  98.  Representative work on holographic diffraction tomography microscopy. (a) Rotating object measurements by Charriere et al[532]; (b) Scanning galvanometer measurements by Choi et al[534]; (c) Wedge prism scanning by Cotte et al[537]; (d) DMD scanning measurements by Park's team[547]

    图  99  相位恢复衍射层析显微术的代表性工作。(a)澳大利亚墨尔本大学X衍射成像研究团队的Barty等[539]的显微镜平台旋转物体测量;(b)加州大学洛杉矶分校的Ozcan课题组[548]的无透镜片上层析平台;(c)笔者课题组[239]的基于LED阵列的无透镜平台;(d)笔者课题组[240]的基于LED阵列的显微镜平台

    Figure  99.  Representative work on phase retrieval diffraction tomography microscopy. (a) Microscope platform rotating object measurements by Barty et al[539] from the X-ray diffraction imaging research team at the University of Melbourne, Australia; (b) Lens-free on-chip tomography platform by the Ozcan group at UCLA[548]; (c) Lens-free LED array-based platform by our group[239]; (d) LED array-based microscopy platform of our group[240]

    图  100  光强衍射层析显微术的两种实现方式。(a)基于轴向扫描的“光强传输衍射层析”(TIDT)显微术;(b)基于照明角度扫描的“傅里叶叠层衍射层析”(FPDT)显微术

    Figure  100.  Two implementations of optical intensity diffraction tomography. (a) TIDT microscopy based on axial scanning; (b) FPDT microscopy based on illumination angle scanning

    图  101  光强传输衍射层析术的代表性工作。(a)笔者课题组[238]的基于高数值孔径环形照明的定量相位成像;(b) 西班牙马德里大学Alieva课题组[549]的电控变焦透镜的光强传输衍射层析;(c)笔者课题组[241]的基于环形照明的多孔径光强传输衍射层析

    Figure  101.  Representative work on TIDT. (a) Quantitative phase imaging based on high numerical aperture annular illumination by our group[238]; (b) TIDT with electronically controlled zoom lens by Alieva's group[549] at the University of Madrid, Spain; (c) Multi-aperture transport-of-intensity diffraction tomography based on annular illumination by our group[241]

    图  102  傅里叶叠层衍射层析术的代表性工作。(a)美国加州大学伯克利分校的Waller课题组[550]的基于多层模型的傅里叶叠层三维成像;(b)美国加州理工学院的Yang课题组[185]的傅里叶叠层层析技术(一阶Born近似下不含暗场强度图);(c)笔者课题组[186]的傅里叶叠层衍射层析成像技术(一阶Rytov近似下含暗场强度图)

    Figure  102.  Representative work on FPDT. (a) FPDT 3D imaging based on a multilayer model by Waller's group at UC Berkeley[550]; (b) FPDT without dark field intensity under the first-order Born approximation by Yang's group at Caltech[185]; (c) FPDT with dark field intensity under the first-order Rytov approximation by our group[186]

    图  103  通过干涉测量法进行相干测量。(a)杨氏双缝干涉仪[561];(b)逆波前杨氏干涉仪[562];(c)非冗余孔径阵列[563];(d)自参考干涉法[565];(e)两点干涉仪;(f) Sagnac干涉仪[565-566]

    Figure  103.  Coherent measurement using interferometer. (a) Young’s interferometer[561]; (b) Reversed-wavefront Young interferometer[562]; (c) Non-redundant array[563]; (d) Self-referencing interferometer[565]; (e) Two-point interferometer; (f) Sagnac interferometer[565-566]

    图  104  相空间断层扫描的原理与光路图。(a)原理示意图;(b)相空间断层扫描的光路结构,实验系统采用一对柱状透镜,在轴向z0处测量光强

    Figure  104.  The principle and optical setup of phase-space tomography. (a) Principle of phase space tomography; (b) A pair of cylindrical lenses oriented perpendicularly are used to introduce astigmatism to the measurement. Intensities are measured at planes with axial coordinate z0

    图  105  相空间的直接测量。(a) 基于小孔扫描的相空间直接测量[569];(b)基于微透镜阵列的相空间直接测量[570]

    Figure  105.  The direct measurement of phase space. (a) Direct measurement based on pinhole scanning[569]; (b) Direct measurement based on microlens array[570]

    图  106  两类成像分辨率对最终图像清晰度的影响。(a) 理想高分辨率图像;(b) 对于小视场的制导系统而言,成像系统的分辨率最终由光学分辨率,即成像系统的口径所决定(如图(c)所示),而对于大部分宽视场的搜索/跟踪系统而言,成像系统的分辨率最终由图像分辨率,即探测器的像素尺寸决定(如图(d)所示)

    Figure  106.  The influence of two kinds of imaging resolution on the final image definition. (a) Ideal high-resolution image; (b) For a guidance system with a small field of view, the resolution of the imaging system is ultimately determined by the optical resolution, i.e., the aperture of the imaging system (as shown in (c)), while for most wide-field-of-view search/tracking systems, the resolution is ultimately determined by the image resolution, i.e., the pixel size of the detector (as shown in (d))

    图  107  光学系统口径所限制的衍射分辨极限(艾里斑)。(a) 成像系统的最小可分辨距离(光学角分辨率)与成像系统的孔径成反比;(b)~(d) 两个非相干的点目标在不同间距下所能拍摄到的艾里斑图像

    Figure  107.  Diffraction resolution limit imposed by the aperture of the optical system (Airy disk). (a) The minimum resolvable distance (optical angular resolution) of the imaging system is inversely proportional to the aperture of the imaging system; (b)-(d) Airy disk images of two incoherent point targets at different separations

    图  108  探测器像元大小所限制的奈奎斯特采样极限(马赛克效应)。(a) 像素采样不足(像素尺寸过大)所导致的信息混叠现象;(b) 恰好满足奈奎斯特采样极限时的情况; (c) 一个典型的红外热像仪对于人体目标在不同距离下的成像效果(像元尺寸为38 μm,像素为320×240,50 mm焦距镜头)

    Figure  108.  Nyquist sampling limit imposed by the detector pixel size (mosaic effect). (a) Information aliasing caused by insufficient pixel sampling (excessive pixel size); (b) The case where the Nyquist sampling limit is exactly met; (c) The imaging results of a typical infrared thermal imager for human targets at different distances (pixel size: 38 μm, 320×240 pixels, 50 mm focal length lens)
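
    以图107、图108所标注的参数为例,可按瑞利判据与奈奎斯特采样作如下粗略估算(示意性计算:其中波长、口径、目标距离与人体高度均为本文假设的取值,并非原文数据):

```python
# 示意性估算:核对图107的衍射极限与图108的探测器采样极限(多数参数为假设值,仅说明数量级)

# (1) 瑞利判据:角分辨率与孔径成反比(假设波长0.55 μm、口径50 mm)
wavelength = 0.55e-6          # m,假设的可见光波长
aperture = 50e-3              # m,假设的光学口径
theta = 1.22 * wavelength / aperture
print(f"Diffraction-limited angular resolution: {theta * 1e6:.1f} urad")

# (2) 采样极限:按图108给出的38 μm像元、50 mm焦距计算单像元瞬时视场(IFOV)
pixel = 38e-6                 # m,像元尺寸(来自图108)
focal = 50e-3                 # m,焦距(来自图108)
ifov = pixel / focal          # rad
for dist in (50, 100, 200):   # m,假设的目标距离
    footprint = ifov * dist                  # 单像元在目标处覆盖的尺寸
    pixels_on_person = 1.7 / footprint       # 1.7 m高人体目标所占像素数(粗略)
    print(f"{dist:4d} m: IFOV footprint {footprint * 100:.1f} cm, "
          f"~{pixels_on_person:.0f} pixels across a 1.7 m person")
```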

    图  109  像素超分辨重建的基本原理(逆向病态问题的最优解)

    Figure  109.  Basic principle of pixel super-resolution reconstruction (Optimal solution of inverse ill-posed problem)

    图  110  基于SCRNN的单帧重建算法

    Figure  110.  Single frame reconstruction algorithm based on SCRNN

    图  111  被动亚像素移动超分辨成像基本原理

    Figure  111.  Basic principle of passive subpixel moving super-resolution imaging
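
    图109、图111所述的多帧亚像素超分辨,本质上是把若干经亚像素位移并降采样的低分辨率图像联立为一个病态线性逆问题求解。下面给出一个一维玩具示例作为示意(示意性草图,与文中所引的具体算法无关,位移量、降采样倍率与正则化参数均为假设):

```python
import numpy as np

# 一维玩具示例:由多帧亚像素位移的低分辨率信号重建高分辨率信号(示意性草图)
rng = np.random.default_rng(1)
N, r = 64, 4                       # 高分辨率长度N,降采样倍率r
t = np.linspace(0, 1, N, endpoint=False)
x_true = np.sin(2*np.pi*3*t) + 0.5*np.sin(2*np.pi*7*t)    # 待恢复的高分辨率信号

def forward(shift):
    """构造“循环位移 + 块平均降采样”的观测矩阵(位移以高分辨率像素为单位)"""
    S = np.roll(np.eye(N), shift, axis=1)                  # 循环位移
    D = np.zeros((N // r, N))
    for i in range(N // r):
        D[i, i*r:(i+1)*r] = 1.0 / r                        # r个高分辨率像素平均为1个低分辨率像素
    return D @ S

shifts = [0, 1, 2, 3]                                      # 亚(低分辨率)像素位移,单位为高分辨率像素
A = np.vstack([forward(s) for s in shifts])
y = A @ x_true + 0.01 * rng.standard_normal(A.shape[0])    # 多帧低分辨率观测(含噪声)

# 最小二乘 + Tikhonov正则化求解病态逆问题
lam = 1e-3
x_rec = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```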

    图  112  可控亚像素移动所引起的像素级光强变化

    Figure  112.  Pixel level light intensity change caused by controllable sub-pixel movement

    图  113  微扫描装置。(a)光学折射法;(b)平板旋转法;(c) 压电陶瓷体

    Figure  113.  Micro scanning device. (a) Optical refraction method; (b) Plate rotation method; (c) Piezoelectric ceramics body

    图  114  长春光机所通过采用微扫描成像器件实现亚像素级光强变换以实现图像超分率[589]

    Figure  114.  The Changchun Institute of Optics, Fine Mechanics and Physics (CIOMP) realizes sub-pixel light intensity shifts by using micro-scanning imaging devices to achieve image super-resolution[589]

    图  115  编码孔径超分辨率成像基本思想[594]

    Figure  115.  Basic principle of coded aperture super resolution imaging[594]

    图  116  (a) 研制的可见光波段孔径编码原理样机及成像结果;(b) 研制的红外波段孔径编码原理样机及成像结果

    Figure  116.  (a) Visible coded aperture imaging system and its reconstruction results; (b) Infrared coded aperture imaging system and its reconstruction results

    图  117  合成孔径雷达示意图

    Figure  117.  Schematic diagram of Synthetic aperture radar

    图  118  (a) 美国Aerospace公司研制的基于光纤的激光合成孔径雷达成像原理图;(b) 成像结果对比(右图为衍射受限成像结果,左图为合成孔径后的结果图)

    Figure  118.  (a) Principle diagram of laser synthetic aperture radar imaging based on optical fibers developed by Aerospace Corporation of the United States; (b) Comparison of imaging results (right image is diffraction-limited imaging results, left image is synthetic aperture results)

    图  119  非干涉傅里叶叠层成像合成孔径技术系统原理示意图

    Figure  119.  Schematic of non-interferometric synthetic aperture imaging technology based on Fourier ptychography

    图  120  反射式宏观傅里叶叠层成像系统实物与原理图[600]

    Figure  120.  Reflective Fourier ptychography imaging system and schematic diagram[600]

    图  121  传统非相干合成孔径系统结构。(a)迈克尔逊型干涉仪;(b)中次镜结构;(c)相控阵列结构

    Figure  121.  Conventional incoherent synthetic aperture structure. (a) Michelson interferometer; (b) Common secondary mirror structure; (c) Phased array structure

    图  122  初代SPIDER成像概念系统设计模型。 (a) SPIDER设计模型和分解图;(b)两个物理基线和三个光谱波段的PIC示意图;(c) SPIDER微透镜排列方式;(d)对应排列方式下频谱覆盖

    Figure  122.  Design model of the initial generation of SPIDER imaging conceptual system. (a) Design model and explosive view of SPIDER; (b) PIC schematics of the two physical baselines and three spectral bands; (c) Arrangement of SPIDER lenslets; (d) Corresponding spatial frequency coverage

    图  123  基于FINCH的非相干合成孔径技术[605]

    Figure  123.  Incoherent synthetic aperture based on FINCH[605]

    图  124  STORM的超分辨原理示意图与结果图[608,615]

    Figure  124.  Super-resolution schematic diagram and result diagram of STORM[608,615]

    图  125  STED的超分辨原理示意图和结果图[618]

    Figure  125.  The schematic diagram and results of super-resolution STED [618]

    图  126  SIM的超分辨原理及在不同时刻对动态微管的超分辨重建结果[610]

    Figure  126.  The super-resolution principle of SIM and the super-resolution reconstruction results of dynamic microtubules at different times [610]

    图  127  3D超分辨显微典型实验结果图。(a) 3D SIM[626];(b) 3D STORM[628]

    Figure  127.  3D super-resolution microscopy experimental results. (a) 3D SIM[629]; (b) 3D STORM[628]

    图  128  两种代表性的主动式超快光学成像技术。(a)Nakagawa等人提出的一种基于顺序时间全光映射摄影术的超快成像技术(sequentially time all-optical mapping photography, STAMP)[647]; (b)Kristensson等人提出的一种基于多曝光频率识别算法的超快成像技术(frequency recognition algorithm for multiple exposures, FRAME)[649]

    Figure  128.  Two representative active ultrafast optical imaging techniques. (a) An ultrafast imaging technique based on sequential time all-optical mapping photography (STAMP) proposed by Nakagawa et al.[647]; (b) An ultrafast imaging technique based on frequency recognition algorithm for multiple exposures (FRAME) proposed by Kristensson et al.[649]

    图  129  Gao等人提出的一种单帧压缩超快成像技术(compressed ultrafast photography, CUP)[653]

    Figure  129.  A single-shot compressed ultrafast photography technique (CUP) proposed by Gao et al.[653]

    图  130  基于数字光处理DLP(Digital Light Processing)技术的数字投影仪基本结构及其核心部件DMD

    Figure  130.  Basic structure of a digital projector based on Digital Light Processing (DLP) technology and its core component DMD

    图  131  单个DMD微镜的工作原理

    Figure  131.  Working principle of a single DMD micromirror

    图  132  DMD显示8位灰度图像的二元时间脉冲宽度调制机理

    Figure  132.  Binary time pulse width modulation mechanism for 8-bit grayscale image displayed by DMD

    图  133  跳动兔子心脏的测量结果[671]

    Figure  133.  The measurement result of beating rabbit heart[671]

    图  134  对气枪发射的子弹的三维测量与跟踪[669]。(a)不同时间点的相机图像; (b)相应3D重建结果; (c)枪口区域的3D重建(对应于(b)中所示的方框区域)以及在飞行过程中的三个不同时间点的子弹(7.5 ms,12.6 ms和17.7 ms)的3D重建 (插图显示在17.7 ms处穿过飞行子弹中心的水平(x-z)和垂直(y-z)轮廓);(d)最后时刻(135 ms)场景的3D点云,彩色线显示130 ms长的子弹轨迹(插图为子弹速度-时间的图)

    Figure  134.  3D measurement and tracking of a bullet fired from a toy gun[669]. (a) Representative camera images at different time points; (b) Corresponding color-coded 3D reconstructions; (c) 3D reconstruction of the muzzle region (corresponding to the boxed region shown in (b)) as well as the bullet at three different points of time over the course of flight (7.5 ms, 12.6 ms, and 17.7 ms) (The insets show the horizontal (x-z) and vertical (y-z) profiles crossing the body center of the flying bullet at 17.7 ms); (d) 3D point cloud of the scene at the last moment (135 ms), with the colored line showing the 130 ms long bullet trajectory (The inset plots the bullet velocity as a function of time)

    图  135  阵列投影技术及GOBO投影技术[678-679]。(a)阵列投影仪及用该投影仪搭建的三维测量系统;(b)GOBO投影仪及用该投影仪搭建的三维测量系统

    Figure  135.  Array projection technology and GOBO projection technology[678-679]. (a) Array projector and three-dimensional measuring system set up with the projector; (b) GOBO projector and three-dimensional measuring system set up with the projector

    图  136  对安全气囊弹出过程的3D重建结果[679]

    Figure  136.  3D reconstruction results for the airbag ejection process[679]

    图  137  5D高光谱成像系统、结果及高速热成像系统、结果[680-681]。(a) 5D高光谱成像系统;(b) 高速热成像系统;(c) 5D高光谱成像结果:对柑橘植物的吸水性的测量;(d)高速热成像结果:不同时间对篮球运动员的测量

    Figure  137.  The systems and results of 5D hyperspectral imaging and high speed thermal imaging[680-681]. (a) 5D hyperspectral imaging system; (b) High speed thermal imaging system; (c) 5D hyperspectral imaging results: The measurement of water absorption by a citrus plant; (d) High-speed thermal imaging results: The measurement of a basketball player at different times

    图  138  μDLP对高速转动风扇的三维测量[682],这些场景在训练过程中都不存在。第一行至第三行为通过μDLP获得风扇在1000~5000 r/min相应的3D重建

    Figure  138.  3D measurement of a fan rotating at high speed by µDLP[682]; these scenes are not present in the training process. The first to third rows show the corresponding 3D reconstructions of the fan at 1000~5000 r/min obtained by µDLP

    图  139  像增强器工作原理及成像示意图

    Figure  139.  Working principle and imaging diagram of image intensifier

    图  140  远距离成像情况下,EMCCD成像结果与单光子四种不同算法重建结果对比图

    Figure  140.  EMCCD imaging result is compared with the reconstruction results of four different single photon algorithms in the case of long-distance imaging

    图  141  光子计数成像的原理

    Figure  141.  Principle of the photon counting imaging system

    图  142  不同情况下的回波示意图与重建结果

    Figure  142.  Schematic diagram of echo and reconstruction results under different conditions

    图  143  8.2 km外目标的超分辨成像结果

    Figure  143.  Super-resolution results of target located at 8.2 km

    图  144  远距离单光子激光雷达成像示意图

    Figure  144.  Illustration of long range single photon Lidar imaging

    图  145  首达光子的三维重建结果。(a)~(c) 单光子结果三个方向的逐点的最大似然处理;(d)~(f) 对应反射率估算的结果;(g)~(i) 环境噪声处理;(j)~(l) 结果的3D估计

    Figure  145.  3D reconstruction results of first-photon imaging. (a)-(c) Point-by-point maximum likelihood processing in the three directions of the single-photon result; (d)-(f) Corresponding reflectance estimation results; (g)-(i) Environmental noise processing; (j)-(l) 3D estimation results

    图  146  超过 201.5 km的远程主动成像示意图。在中国乌鲁木齐市附近实施的实验的卫星图像,单光子激光雷达被放置在野外的一个临时实验室。(a) 由配备望远镜的标准天文相机拍摄的山脉可见波段照片,海拔约4500 m;(b) 实验装置示意图;(c) 设置硬件的照片,包括光学系统(左上角和左下角)和电子控制系统(右下角);(d) 临时实验室在海拔 1770 m 处的视图

    Figure  146.  Illustration of the long-range active imaging over 201.5 km. Satellite image of the experiment implemented near the city of Urumqi, China, where the single-photon lidar is placed at a temporary laboratory in the wild. (a) Visible-band photograph of the mountains taken by a standard astronomical camera equipped with a telescope. The elevation is approximately 4500 m; (b) Schematic diagram of the experimental setup; (c) Photograph of the setup hardware, including the optical system (top and bottom left) and the electronic control system (bottom right); (d) View of the temporary laboratory at an altitude of 1770 m

    图  147  201.5 km以上场景重建结果。(a)真实可见光照片;(b) Lindell等人在2018年对SBR ~ 0.04、平均信号PPP ~ 3.58的数据的重建深度结果;(c)重建结果的三维剖面图

    Figure  147.  Reconstruction results of a scene over 201.5 km. (a) Real visible-band photo; (b) The reconstructed depth result by Lindell et al. in 2018 for the data with SBR ~ 0.04 and mean signal PPP ~ 3.58; (c) A 3D profile of the reconstructed result

    图  148  基于深度学习进行极弱光成像的结果。(a)摄像机输出(ISO 8000);(b)摄像机输出(ISO 409600);(c)由原始数据(a)恢复得到的结果[705]

    Figure  148.  Results of extremely weak light imaging based on deep learning. (a) Camera output with ISO 8000; (b) Camera output with ISO 409600; (c) The result recovered from the raw data of (a) [705]

    图  149  提出的单光子三维成像多尺度网络图

    Figure  149.  Diagram of proposed multi-scale network for single-photon 3D imaging with multiple returns

    图  150  三个远程户外场景的重建结果。第一行的高层建筑距离成像系统21.6 km,空间分辨率256×256,信噪比为0.114,每像素1.228光子。第二行距离成像系统1.2 km,空间分辨率176×176,信噪比为0.109,每像素3.957光子。第三行的高塔,距离成像系统3.8 km,空间分辨率512×512,信噪比为0.336,每像素1.371光子。GT表示系统在较长的采集时间内捕获的地面真实深度图

    Figure  150.  The reconstruction results for three long-range outdoor scenes. First row: A tall building located 21.6 km away from the imaging system, with a spatial resolution of 256×256, a signal-to-noise ratio of 0.114, and 1.228 photons per pixel. Second row: A scene located 1.2 km away from our imaging system, with a spatial resolution of 176×176, a signal-to-noise ratio of 0.109, and 3.957 photons per pixel. Third row: A tall tower named Pole, located 3.8 km away from our imaging system, with a spatial resolution of 512×512, a signal-to-noise ratio of 0.336, and 1.371 photons per pixel. GT denotes the ground truth depth maps captured by the system with a long acquisition time

    图  151  对于传统光学系统,视场与分辨率这两个参数互相矛盾,无法同时兼顾。(a) 35 mm单反相机不同焦距下所对应的视场角;(b) 35 mm单反相机不同焦距下所拍摄到的典型图像

    Figure  151.  For traditional optical systems, field of view and resolution are two contradictory parameters that cannot both be satisfied at the same time. (a) The corresponding field of view angle of a 35 mm SLR camera at different focal lengths; (b) Typical images taken by a 35 mm SLR camera at different focal lengths

    图  152  GigaPan全景拍摄系统及拍摄拼接所得的像素全景图

    Figure  152.  GigaPan panoramic shooting system and the panorama obtained by stitching the captured images

    图  153  ARGUS-IS系统及其成像效果。(a) ARGUS-IS 系统外型;(b)系统采用了368个图像传感器和四个主镜头,其中92个传感器为一组,共用一个主镜头。通过巧妙设置传感器的安装位置,使得每组传感器获得的图像错位,互为补充,再通过图像拼接,能够得到较好的整体成像结果;(c)此成像系统在 6000 m高空有效覆盖7.2 km×7.2 km的地面区域

    Figure  153.  ARGUS-IS system and its imaging effect. (a) ARGUS-IS system appearance; (b) The system uses 368 image sensors and four main lenses, of which 92 sensors are a group and share a main lens. By skillfully setting the installation position of sensors, the images obtained by each group of sensors are misaligned and complementary to each other, and then through image mosaic, better overall imaging results can be obtained; (c) The imaging system effectively covers 7.2 km × 7.2 km ground area at an altitude of 6 km

    图  154  多相机拼接系统。(a) Lytro公司所研制的光场采集系统Immerge;(b) 斯坦福半环型相机阵列系统;(c) 斯坦福平面型相机阵列系统;(d) Camatrix环型相机阵列系统;(e) 清华大学鸟笼相机阵列系统

    Figure  154.  Multi-camera stitching systems. (a) Light field acquisition system Immerge developed by Lytro; (b) Stanford semi-ring camera array system; (c) Stanford planar camera array system; (d) Camatrix ring camera array system; (e) Tsinghua University birdcage camera array system

    图  155  (a) 瑞士洛桑联邦理工学院(EPFL)的科研团队设计并研制了仿生复眼成像设备Panoptic;(b) 大视场高分辨率的OMNI-R系统;(c) Nicholas Law研制的艾弗里地基望远系统Evryscope

    Figure  155.  (a) The research team at the Swiss Federal Institute of Technology in Lausanne (EPFL) designed and developed the bionic compound-eye imaging device Panoptic; (b) OMNI-R system with large field of view and high resolution; (c) Evryscope, the ground-based telescope system developed by Nicholas Law

    图  156  多尺度成像系统。(a) AWARE-2结构图;(b) AWARE-10结构图;(c) AWARE-40结构图

    Figure  156.  Multiscale imaging system. (a) AWARE-2 structure diagram; (b) AWARE-10 structure diagram; (c) AWARE-40 structure diagram

    图  157  传统显微镜存在分辨率与视场大小难以同时兼顾的矛盾:低倍镜下视野大,但分辨率低;切换到高倍镜后分辨率虽得以提升,视场却相应的成更高比例的缩减

    Figure  157.  There is a tradeoff between the resolution and FOV in traditional microscopes: The FOV under low-magnification objective is large with the low resolution; for high-magnification objective, the resolution is improved while the FOV is reduced dramatically

    图  158  克服传统显微镜空间带宽积受限四类可能的解决方案。(a) 芯片上无透镜全息显微成像技术;(b) 傅里叶叠层显微成像技术;(c) 合成孔径/合成视场全息显微技术;(d) 基于流式细胞术的显微成像技术

    Figure  158.  Four types of possible solutions to overcome the limited spatial bandwidth area of conventional microscopes. (a) On-chip lens-free holographic microscopy; (b) Fourier ptychography microscopy; (c) Synthetic aperture/FOV holographic microscopy; (d) Flow cytometric microscopy

    图  159  芯片上无透镜全息显微成像的“亚像素”超分辨技术。(a) 通过移动照明实现亚像素微扫描;(b) 笔者课题组所提出的基于倾斜平行平板的主动亚像素微扫描方案

    Figure  159.  Sub-pixel super-resolution technology based on the lens-free holographic microscope. (a) Sub-pixel micro-scanning by moving illumination; (b) Active sub-pixel micro-scanning scheme with inclined parallel plate proposed by our research group

    图  160  相量传播方法对全息成像重构过程中数据的利用提升[750]

    Figure  160.  Propagation phasor approach improves the data efficiency of holographic imaging[750]

    图  161  基于单帧傅里叶叠层显微成像的高通量定量显微成像

    Figure  161.  High throughput quantitative microscopic imaging based on single frame Fourier ptychographic microscopy

    图  162  单像素成像原理示意图[803]

    Figure  162.  Schematic of single-pixel imaging[803]

    图  163  二维傅里叶单像素成像实验装置示意图[815]

    Figure  163.  Experimental set-up of two-dimension Fourier single-pixel imaging[815]

    图  164  二维傅里叶单像素成像实验结果[815],重建图像的分辨率为256×256 pixel

    Figure  164.  Experimental results of two-dimension Fourier single-pixel imaging[815], the pixels of the reconstructed image are 256×256
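
    图162~164所涉及的傅里叶单像素成像,其基本做法是:依次投影四步相移的正弦条纹图案,用单像素探测器获取“桶测量”值,解调出目标图像的各个傅里叶系数,再经逆傅里叶变换重建图像。下面给出一个小尺寸的数值仿真草图(仅为原理示意,并非文献[803,815]的实现;场景与图案参数均为本文假设):

```python
import numpy as np

# 傅里叶单像素成像的四步相移数值仿真(极简示意;图像取得很小以便遍历全部频率)
M = N = 32
yy, xx = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
scene = np.zeros((M, N))
scene[8:24, 10:22] = 1.0                                 # 假设的目标场景(矩形)

spectrum = np.zeros((M, N), dtype=complex)
for u in range(M):
    for v in range(N):
        theta = 2 * np.pi * (u * yy / M + v * xx / N)
        D = []
        for phi in (0, np.pi / 2, np.pi, 3 * np.pi / 2):
            pattern = 0.5 + 0.5 * np.cos(theta + phi)    # 投影的正弦条纹图案
            D.append(np.sum(scene * pattern))            # 单像素探测器的“桶测量”读数
        # 四步相移解调得到该频率处的傅里叶系数
        spectrum[u, v] = (D[0] - D[2]) + 1j * (D[1] - D[3])

recon = np.real(np.fft.ifft2(spectrum))                  # 逆傅里叶变换重建
err = np.linalg.norm(recon - scene) / np.linalg.norm(scene)
print("reconstruction relative error:", err)
```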

    图  165  基于立体视觉的三维单像素成像实验光路[42]

    Figure  165.  Experimental set-up of a stereo vision based 3D single-pixel imaging[42]

    图  166  图像立方法的概述[824]。(a)从场景中背向散射的照明激光脉冲的原始信号;(b)展宽后的信号;(c)利用探测信号得到的一组包含不同深度图像的立方;(d)横截面上的每个位置沿纵向轴的强度分布,包含深度信息;(e)反射率图;(f)深度图可以从图像立方体中解算出来,然后用于重建;(g)场景的三维图像

    Figure  166.  Overview of the image cube method[824]. (a) The illuminating laser pulses back-scattered from a scene are measured as (b) broadened signals; (c) An image cube, containing images at different depths, is obtained using the measured signals; (d) Each transverse location has an intensity distribution along the longitudinal axis, indicating depth information; (e) Reflectivity and (f) a depth map can be estimated from the image cube and then used to reconstruct (g) a 3D image of the scene

    图  167  多维傅里叶单像素成像实验结果[832]。(a)包含目标物体三个模态(空间-三维-彩色)的傅里叶谱,采样率为12%;(b)对重建效果图选中部分区域(红色实线部分);(c)三维彩色重建上视图;(d)三维彩色重建斜视图;(e)三维彩色重建侧视图

    Figure  167.  Experimental results of multi-modality Fourier single-pixel imaging[832]. (a) Fourier transform with spatial, 3D, and color three modality information of target object, where sampling ratio = 12%; (b) Image reconstructed from (a) with partial enlargement; (c)-(e) Top, perspective, and side views of the three-dimensional reconstruction of the object

    图  168  实时太赫兹波段单像素成像实验图[833]

    Figure  168.  Experimental set-up of terahertz imaging with a single-pixel detector[833]

    图  169  (a) 投影式无透镜显微成像实验系统图[833-835];(b) 无透镜光流体显微镜横截面示意图;(c) (b) 中实验装置顶视图。白色圆圈为小孔,浅灰色虚线网格为镀铝二维CMOS图像传感器,蓝线为微流体通道[758,836]

    Figure  169.  (a) Experimental setup[833-835] for the lens-free shadow imaging platform; (b) Cross-sectional scheme of the optofluidic microscope; (c) Top view of the device in (b). The white circles are apertures, the gray dashed grid is the CMOS sensor coated with Al, and the blue lines are the whole microfluidic channel[758,836]

    图  170  无透镜荧光成像原理图。整个成像系统放大率~1,全反射发生在玻璃-空气界面,位于玻璃层底部。为避免检测到散射激发光,在玻璃层下添加了塑料吸收滤光片。(TIR:全反射的缩写;图像是根据参考文献[837-838,841]修改而来)

    Figure  170.  Schematic diagram of the lens-free on-chip fluorescent imaging platform, which has unit magnification. The TIR occurs at the glass-air interface at the bottom facet of the cover glass. To avoid detection of scattered photons, a plastic absorption filter is used behind the faceplate. (TIR: Short for total internal reflection; the image was modified from the references[837-839,841])

    图  171  (a) 无透镜三维层析显微镜实物图[239];(b) (左)用于原位纳米透镜成型和无透镜成像的紧凑装置的实物图和(右)结构原理图[847]

    Figure  171.  (a) Photograph of the lens-free tomography platform[239]; (b) (Left) Photograph and (right) computer graphic diagram of a compact device for in situ nanolens formation and lens-free imaging[847]

    图  172  基于多角度照明的无透镜三维层析成像结果。(a) 马蛔虫子宫切片的折射率重构结果;(b) (a)中红色矩形框中折射率的三维渲染[239];(c) 线虫在z=3μm位置的成像结果;(d1)~(d2) 分别为线虫的前部和后部在y-z平面的成像结果;(e1)~(e2) 分别为沿着(c)中实心箭头和虚线箭头方向的x-z平面的成像结果[548]。(图像是根据参考文献[239,548]修改而来)

    Figure  172.  3D tomographic reconstructions of lens-free on-chip microscope based on multi-angle illumination. (a) The recovered refractive index depth sections of a slice of the uterus of Parascaris equorum; (b) The 3D renderings of the refractive index for the boxed area in (a)[239]; (c) A tomogram for the entire worm corresponding to a plane that is 3 μm above the center of the worm; (d1)-(d2) y-z ortho slices from the anterior and posterior regions of the worm, respectively; (e1)-(e2) x-z ortho slices along the direction of the solid and dashed arrow in (c), respectively[548]

    图  173  无透镜非干涉编码孔径相关全息术。(a) 两颗LEDs和 (b) 两个硬币相距15 mm的重建结果[851]

    Figure  173.  Lens-free interferenceless coded aperture correlation holography (LI-COACH). Reconstruction results of (a) two LEDs and (b) two one-dime coins separated by a distance of 15 mm[851]

    图  174  基于菲涅尔波带片的非相干无透镜摄像。(a) 无镜头相机的实时图像捕获和重建[853];(b) 利用菲涅尔波带片单帧无透镜相机对二值、灰度和彩色图像进行重建[856]

    Figure  174.  Lens-free imaging with FZP and incoherent illumination. (a) Real-time image capturing and reconstruction demonstration of a prototyped lens-free camera[853]; (b) The reconstructions for the binary, grayscale and color images using the FZP single-shot lens-free camera[856]

    图  175  FlatCam结构。(a) 一个二进制编码掩模被放置在距现成数字图像传感器0.5 mm处,对场景进行编码;(b) 传感器测量结果及通过求解计算逆问题重建图像的示例

    Figure  175.  FlatCam architecture. (a) A binary, coded mask is placed 0.5 mm away from an off-the-shelf digital image sensor; (b) An example of sensor measurements and the image reconstructed by solving a computational inverse problem

    图  176  自适应光学系统的成像原理图

    Figure  176.  Imaging principle of the system based on adaptive optics

    图  177  SOR 望远镜对低轨卫星的成像效果[875]。(a) 校正前;(b) 校正后;(c) 校正+图像处理

    Figure  177.  Low orbit satellite imaging by SOR telescope[875]. (a) Uncompensated; (b) Compensated; (c) Compensated + image processing

    图  178  成像和人眼测试所用的自适应光学系统的基本布局

    Figure  178.  Basic layout of an adaptive optics system for imaging and vision testing

    图  179  AO-CSLO拍摄的人眼视网膜分层高分辨力图像。(a) 活体人眼视网膜感光细胞层;(b) 毛细血管层;(c) 神经纤维层图像

    Figure  179.  Layered high resolution images taken by the AO-CSLO system. (a) Layer of human retina photoreceptors in vivo; (b) Layer of blood capillaries; (c) Layer of nerve fibers

    图  180  自适应光学在宽场荧光与共聚焦显微镜中的应用

    Figure  180.  Application of adaptive optics in wide field fluorescence and confocal microscope

    图  181  自适应光学在共聚焦显微镜与多光子显微镜中的应用

    Figure  181.  Application of adaptive optics in confocal microscope and multiphoton microscope

    图  182  自适应光学在宽场荧光显微镜和超分辨率荧光显微镜中的应用。 (a) 微管蛋白染色的 HeLa 细胞在校正之前(左) 和之后(右) 的宽场荧光显微成像[892];(b) 一组标称直径为 121 nm 的荧光微球通过常规、共焦和结构光显微成像[894];(c) 使用 DM 和 SLM 来补偿 STED显微镜中所有三个路径的像差[897]; (d) Atto647N 标记的囊泡谷氨酸转运蛋白在完整果蝇大脑突触中的共聚焦(左) 和 3D STED(右) 图像的比较[895]

    Figure  182.  Application of adaptive optics in wide field fluorescence microscopy and super-resolution fluorescence microscopy. (a) Wide-field fluorescence microscopy of tubulin stained HeLa cells before (left) and after (right) correction[892]; (b) A cluster of fluorescent microspheres of nominal diameter 121 nm, as imaged by conventional, confocal, and structured illumination microscopy[894]; (c) Using a DM and an SLM to compensate the aberrations of all three paths in STED microscopy[897]; (d) Comparison of confocal (left) and 3D STED (right) images of Atto647N labelled vesicular glutamate transporter in synaptic boutons in intact Drosophila brains[895]

    图  183  基于反馈的波前调制原理及实验结果[919]

    Figure  183.  Principle and experimental results of feedback-based wavefront shaping[919]

    图  184  基于散射介质的TM测量原理[920]

    Figure  184.  TM measurement principle based on scattering medium[920]

    图  185  基于光学相位共轭的生物组织散射成像[924]

    Figure  185.  Optical phase conjugation based scattering imaging of biological tissue[924]

    图  186  透过强散射层非入侵式散射成像示意图[18]

    Figure  186.  Schematic of the apparatus for non-invasive imaging through strongly scattering layers

    图  187  基于单帧散斑自相关的透过强散射层成像[923]。(a)实验装置模型; (b)相机原始图像; (c)自相关; (d)通过迭代相位恢复算法重建物体; (e)实验系统; (f)相机原始数据; (g)~(k)第一列为自相关,第二列为重建物体,第三列为真实的物体

    Figure  187.  Single-frame imaging through strongly scattering layers based on speckle autocorrelation[923]. (a) Experimental set-up; (b) Raw camera image; (c) The autocorrelation of the seemingly information-less raw camera image; (d) The object’s image obtained from the autocorrelation by an iterative phase-retrieval algorithm; (e) Photograph of the experiment; (f) Raw camera image; (g)-(k) Left column: calculated autocorrelation of the raw camera image; middle column: object reconstructed from the image autocorrelation; right column: image of the real hidden object

    图  188  基于深度学习进行散射介质成像的网络原理图[927]

    Figure  188.  Network schematic diagram of imaging through scattering medium based on deep learning[927]

    图  189  典型的非视域成像系统示意图

    Figure  189.  Schematic diagram of a typical non-line-of-sight imaging system

    图  190  (a)捕获过程:通过用脉冲激光依次照亮墙上的单个点并用条纹相机记录墙上虚线段的图像来捕获一系列图像;(b)按顺序收集的条纹图像示例。根据校准信号对强度进行标准化。红色对应于最大强度,蓝色对应于最小强度;(c)通过重建算法恢复的隐藏物体的3D形状的2D投影视图

    Figure  190.  (a) The capture process: capture a series of images by sequentially illuminating a single spot on the wall with a pulsed laser and recording an image of the dashed line segment on the wall with a streak camera; (b) An example of streak images sequentially collected. Intensities are normalized against a calibration signal. Red corresponds to the maximum, blue to the minimum intensities; (c) The 2D projected view of the 3D shape of the hidden object, as recovered by the reconstruction algorithm

    图  191  间接光传输的双重摄影[934]。(a)系统实验装置;(b)室内照明下拍摄到的扑克牌及书本视图;(c)投影仪扫描(d)扑克牌上指示点时获取的样本图像

    Figure  191.  Dual photography of indirect light transmission[934]. (a) System experimental device; (b) View of playing cards and books taken under indoor lighting; (c) Sample image obtained when the projector scans the indicated points on the playing cards in (d)

    图  192  基于单像素的“广播式”成像系统[935]

    Figure  192.  Proposed secured single-pixel broadcast imaging system[935]

    图  193  共焦非视域成像

    Figure  193.  Diagram of confocal non-line-of-sight imaging

    图  194  远距离NLOS成像实验。(a) 非视域成像实验的航空示意图;(b)光学仿真结果成像系统的装置图,它由两台同步望远镜组成,分别用于发射和接收;(c)尺寸为2 m×1 m的房间,作为隐藏场景示意图;(d) 非视域成像设置的实际照片;(e)、(f)在A位置拍摄的隐藏场景以及其放大照片,在A位置只能看到可见的墙;(g)隐藏物体的照片,在位于B的房间拍摄

    Figure  194.  Long-range NLOS imaging experiment. (a) An aerial schematic of the NLOS imaging experiment; (b) The optical setup of the NLOS imaging system, which consists of two synchronized telescopes for transmitter and receiver; (c) Schematic of the hidden scene in a room with a dimension size of 2 m×1 m; (d) An actual photograph of the NLOS imaging setup; (e)-(f) Zoomed-out and zoomed-in photographs of the hidden scene taken at location A, where only the visible wall can be seen; (g) Photograph of the hidden object, taken at the room located at B

    图  195  用不同方法比较重建的结果。(a)人体模型隐藏场景的重建结果;(b)字母H隐藏场景的重构结果

    Figure  195.  Comparison of the reconstructed results with different approaches. (a) The reconstructed results for the hidden scene of mannequin; (b) The reconstructed results for the hidden scene of letter H

    图  196  温度跳变约1 ℃所引起的热成像相机非均匀性[945]

    Figure  196.  Nonuniformity of the thermal imaging camera caused by temperature jump of approximately 1 °C[945]

    图  197  基于场景的非均匀性校正效果示意图

    Figure  197.  Scene-based non-uniformity correction results

    图  198  时域高通滤波的非均匀性校正方法

    Figure  198.  Non-uniformity correction method based on a temporal high-pass filter

    图  199  长时间运动场景的期望(均值)图像近似满足恒定统计假设

    Figure  199.  The expected (mean) image of a long-time motion scene approximately satisfies the constant statistical assumption

    图  200  各类统计恒定法非均匀性校正的实验对比图。(a) 未校正图像; (b) 多尺度恒定统计; (c) 全局恒定统计; (d) 局部恒定统计

    Figure  200.  Experimental comparison plots of non-uniformity correction for various types of statistical constancy methods. (a) Uncorrected image; (b) Multiscale constant statistics; (c) Global constant statistics; (d) Local constant statistics

    图  201  基于神经网络模型的非均匀性校正方法

    Figure  201.  Non-uniformity correction method based on neural network

    图  202  全景图积累法示意图

    Figure  202.  Motion compensation average method

    图  203  基于帧间配准的非均匀性校正方法

    Figure  203.  Nonuniformity correction method based on inter-frame registration

    图  204  基于帧间配准的非均匀性校正方法要求精确估计强非均匀性对图像的相对位移

    Figure  204.  The non-uniformity correction method based on inter-frame registration requires accurate estimation of the relative displacement of an image pair in the presence of strong non-uniformity

    图  205  针对红外探测器的非均匀性和动态范围低等问题,南京理工大学针对性发展高性能红外图像信号处理技术,设计了集成有基于场景非均匀性校正与红外图像数字细节增强等核心算法定制化的ASIC芯片,并基于此研制了高性能无挡片热像仪

    Figure  205.  In response to the problems of non-uniformity and low dynamic range of infrared detectors, Nanjing University of Science and Technology has developed high-performance infrared image signal processing technology, designed an ASIC with customized core algorithms based on scene-based non-uniformity correction and digital detail enhancement of infrared images, and developed a high-performance shutterless thermal imaging camera

    图  206  高端光学设备仪器及其核心器件技术是西方军事强国对我国禁运的“卡脖子”技术和产品

    Figure  206.  High-end optical instruments and their core technologies are the "bottle-neck" technologies and products embargoed by the Western military powers to China

    图  207  中华人民共和国主席令(第一〇三号)中明确指出在功能、质量等指标能够满足需求的条件下,鼓励采购国产科研仪器

    Figure  207.  The Decree of the President of the People's Republic of China (No. 103) clearly states that under the condition that the function, quality and other indicators can meet the demand, the procurement of domestic scientific research instruments is encouraged

    表  1  典型35 mm单反相机镜头的空间带宽积

    Table  1.   Spatial bandwidth product of typical 35 mm SLR lens

    Focal length/mm | Field angle (diagonal)/(°) | Typical F# | Equivalent NA | Focal plane resolution (550 nm)/μm | Spatial bandwidth product/MP | Angular resolution/mrad
    8    | 180  | 3.5 | 0.14 | 2.396 | 3.35 | 0.29
    20   | 94.5 | 1.8 | 0.27 | 1.242 | 12.4 | 0.06
    50   | 46.8 | 1.2 | 0.41 | 0.818 | 28.7 | 0.016
    85   | 28.6 | 1.4 | 0.35 | 0.959 | 20.9 | 0.011
    100  | 24.4 | 2.8 | 0.17 | 1.973 | 4.9  | 0.018
    200  | 12.3 | 4   | 0.12 | 2.795 | 2.4  | 0.013
    400  | 6.2  | 5.6 | 0.08 | 4.193 | 1.1  | 0.009
    1000 | 2.5  | 8   | 0.06 | 5.591 | 0.61 | 0.005
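
    表1中的等效NA、焦面分辨率与最后一列的角分辨率可按衍射极限近似直接推算(NA ≈ 1/(2F#),δ ≈ 0.61λ/NA,θ ≈ 1.22λ/D,D = f/F#)。以下为示意性验算脚本,并非原文所用的计算代码:

```python
# 示意性验算(非原文计算代码):按衍射极限近似核对表1中的等效NA、焦面分辨率与角分辨率
wavelength = 0.55   # μm
lenses = [(8, 3.5), (20, 1.8), (50, 1.2), (85, 1.4),
          (100, 2.8), (200, 4.0), (400, 5.6), (1000, 8.0)]   # (焦距/mm, 典型F#),取自表1

for f, fnum in lenses:
    na = 1 / (2 * fnum)                                       # 等效NA
    delta = 0.61 * wavelength / na                            # 焦面分辨率/μm
    aperture_m = f * 1e-3 / fnum                              # 口径D = f/F#,单位m
    theta_mrad = 1.22 * wavelength * 1e-6 / aperture_m * 1e3  # 角分辨率/mrad
    print(f"f={f:5.0f} mm  F/{fnum:3.1f}  NA={na:.2f}  "
          f"resolution={delta:.3f} um  angular={theta_mrad:.3f} mrad")
```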

    表  2  典型的显微物镜的空间带宽积

    Table  2.   Spatial bandwidth product of typical microscopic objectives

    Objectives (Magnification/Numerical aperture/Field number) | Resolution (λ = 532 nm)/nm | SBP/MP
    1.25×/0.04/26.5 | 8113 | 21.5
    2×/0.08/26.5    | 4057 | 33.5
    4×/0.16/26.5    | 2028 | 33.5
    10×/0.3/26.5    | 1082 | 18.9
    20×/0.5/26.5    |  649 | 13.1
    40×/0.75/26.5   |  433 | 7.4
    60×/0.9/26.5    |  361 | 4.7
    100×/1.3/26.5   |  250 | 3.5
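
    表2中的分辨率与空间带宽积同样可按 δ ≈ 0.61λ/NA、视场直径 ≈ 场数/放大倍率、SBP ≈ 视场面积/(δ/2)² 粗略复算。以下同样仅为示意性验算脚本,并非原文所用代码:

```python
import math

# 示意性验算(非原文计算代码):核对表2中显微物镜的分辨率与空间带宽积(SBP)
wavelength = 532e-9       # m
field_number = 26.5e-3    # m,物镜场数(视场光阑直径)
objectives = [(1.25, 0.04), (2, 0.08), (4, 0.16), (10, 0.3),
              (20, 0.5), (40, 0.75), (60, 0.9), (100, 1.3)]   # (放大倍率, NA),取自表2

for mag, na in objectives:
    delta = 0.61 * wavelength / na                   # 分辨率/m
    fov_radius = field_number / (2 * mag)            # 物方视场半径/m
    sbp = math.pi * fov_radius**2 / (delta / 2)**2   # 按奈奎斯特采样估算的可分辨像素数
    print(f"{mag:6.2f}x / NA {na:.2f}:  resolution = {delta*1e9:6.0f} nm,  SBP ~ {sbp/1e6:5.1f} MP")
```
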
  • [1] 佚名. Nimrud lens[Z/OL]. (2018–12–07)[2019–06–17].https://en.wikipedia.org/w/index.php?title=Nimrud_lens&oldid=872400603.
    [2] 佚名. Huawei rewrites the rules of photography with ground-breaking huawei P30 series[EB/OL]. (2019-03-26)[2019–06–17]. https://consumer.huawei.com/en/press/news/2019/huawei-rewrites-the-rules-of-photography-with-the-new-p30-series/.
    [3] 佚名. Computational imaging[Z/OL]. (2019–03–15)[2019–06–17]. https://en.wikipedia.org/w/index.php?title=Computational_imaging&oldid=887843699.
    [4] KUBALA K, DOWSKI E, CATHEY W. Reducing complexity in computational imaging systems [J]. Optics Express, 2003, 11(18): 2102. doi:  10.1364/OE.11.002102
    [5] MAIT J, ATHALE R, van der GRACHT J. Evolutionary paths in imaging and recent trends [J]. Optics Express, 2003, 11(18): 2093-2101. doi:  10.1364/OE.11.002093
    [6] BIMBER O. Guest editor’s introduction: Computational photography-The next big step [J]. Computer, 2006, 39(8): 28-29. doi:  10.1109/MC.2006.261
    [7] RASKAR R. Computational photography[C/OL]//Frontiers in Optics 2009/Laser Science XXV/Fall 2009 OSA Optics & Photonics Technical Digest, 2009: CTuA1. [2019–06–23].https://www.osapublishing.org/abstract.cfm?uri=COSI-2009-CTuA1.
    [8] MAIT J N, EULISS G W, ATHALE R A. Computational imaging [J]. Advances in Optics and Photonics, 2018, 10(2): 409. doi:  10.1364/AOP.10.000409
    [9] BRADY D J. Optical Imaging and Spectroscopy[M]. New Jersey: John Wiley & Sons, 2009.
    [10] Computational imaging: Rethinking how we look at the world[J/OL]. [2019–06–26]. https://www.mitre.org/publications/project-stories/computational-imaging-rethinking-how-we-look-at-the-world.
    [11] CATHEY W T, FRIEDEN B R, RHODES W T, et al. Image gathering and processing for enhanced resolution [J]. JOSA A, 1984, 1(3): 241-250. doi:  10.1364/JOSAA.1.000241
    [12] MATIC R M, GOODMAN J W. Optimal pupil screen design for the estimation of partially coherent images [J]. JOSA A, 1987, 4(12): 2213-2227. doi:  10.1364/JOSAA.4.002213
    [13] MATIC R M, GOODMAN J W. Comparison of optical predetection processing and postdetection linear processing for partially coherent image estimation [J]. Journal of the Optical Society of America A, 1989, 6(2): 213. doi:  10.1364/JOSAA.6.000213
    [14] MATIC R M, GOODMAN J W. Optical preprocessing for increased system throughput [J]. JOSA A, 1989, 6(3): 428-440. doi:  10.1364/JOSAA.6.000428
    [15] VELDKAMP W B. Wireless focal planes “On the road to amacronic sensors” [J]. IEEE Journal of Quantum Electronics, 1993, 29(2): 801-813. doi:  10.1109/3.199331
    [16] DOWSKI E R, CATHEY W T. Extended depth of field through wave-front coding [J]. Applied Optics, 1995, 34(11): 1859-1866. doi:  10.1364/AO.34.001859
    [17] van der GRACHT J, DOWSKI E R Jr, CATHEY W T Jr, et al. Aspheric optical elements for extended depth-of-field imaging[C/OL]//Novel Optical Systems Design and Optimization. International Society for Optics and Photonics, 1995: 279–288. [2019–06–24]. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/2537/0000/Aspheric-optical-elements-for-extended-depth-of-field-imaging/10.1117/12.216392.short.
    [18] van der GRACHT J, DOWSKI E R, TAYLOR M G, et al. Broadband behavior of an optical–digital focus-invariant system [J]. Optics Letters, 1996, 21(13): 919-921. doi:  10.1364/OL.21.000919
    [19] ADELSON E H, BERGEN J R. The Plenoptic Function and the Elements of Early Vision[M]// Landy M, Movshon J A. Computational Models of Visual Processing. Massachusetts: MIT Press, 1991: 3-20.
    [20] LEVOY M, HANRAHAN P. Light field rendering[C/OL]//Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques-SIGGRAPH ’96. New York: ACM Press, 1996: 31–42. [2019–06–24]. http://portal.acm.org/citation.cfm?doid=237170.237199.
    [21] NAYAR S K, NOGUCHI M. Real-time focus range sensor [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1996, 18(12): 13.
    [22] BATLLE J, MOUADDIB E, SALVI J. Recent progress in coded structured light as a technique to solve the correspondence problem: a survey [J]. Pattern Recognition, 1998, 31(7): 963-982. doi:  10.1016/S0031-3203(97)00074-5
    [23] DEBEVEC P E, MALIK J. Recovering high dynamic range radiance maps from photographs[C/OL]//Proceedings of the 24 th Annual Conference on Computer Graphics and Interactive Techniques. New York, USA: ACM Press/Addison-Wesley Publishing Co., 1997: 369–378. [2019–06–24].https://doi.org/10.1145/258734.258884.
    [24] NAYAR S K, MITSUNAGA T. High dynamic range imaging: spatially varying pixel exposures[C]//Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2000.
    [25] NAYAR S K, BRANZOI V, BOULT T E. Programmable imaging using a digital micromirror array[C/OL]//Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. CVPR. Washington, DC, USA: IEEE, 2004: 436–443. [2019–06–24]. http://ieeexplore.ieee.org/document/1315065/.
    [26] MARKS D L, STACK R A, BRADY D J. Three-dimensional coherence imaging in the fresnel domain [J]. Applied Optics, 1999, 38(8): 1332-1342. doi:  10.1364/AO.38.001332
    [27] SCHECHNER Y Y, NARASIMHAN S G, NAYAR S K. Instant dehazing of images using polarization[C/OL]//Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. CVPR. Kauai, HI, USA: IEEE Comput Soc, 2001: I-325-I–332. [2019–06–24]. http://ieeexplore.ieee.org/document/990493/.
    [28] 佚名. OSA Topical Meeting on Integrated Image Gathering and Processing, Albuquerque, New Mexico[R]. Optical Society of America, 2001.
    [29] 佚名. CS 448 - topics in computer graphics: Computational photography[EB/OL]. [2019–06–25]. http://graphics.stanford.edu/courses/cs448-04-spring/.
    [30] 佚名. Symposium on computational photography and video[EB/OL]. [2019–06–25]. http://scpv.csail.mit.edu/.
    [31] RASKAR R, TUMBLIN J. Computational Photography, Imaging and Video[EB/OL]. [2019–06–25]. https://web.media.mit.edu/~raskar/photo/.
    [32] WILBURN B, JOSHI N, VAISH V, et al. High-speed videography using a dense camera array [C]//Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2004.
    [33] LEVOY M, CHEN B, VAISH V, et al. Synthetic aperture confocal imaging[C/OL]//ACM SIGGRAPH 2004 Papers. New York, NY, USA: ACM, 2004: 825–834. [2019–06–24]. http://doi.acm.org/10.1145/1186562.1015806.
    [34] WILBURN B, JOSHI N, VAISH V, et al. High performance imaging using large camera arrays[C/OL]//ACM SIGGRAPH 2005 Papers. New York, NY, USA: ACM, 2005: 765–776. [2019–06–24]. http://doi.acm.org/10.1145/1186822.1073259.
    [35] NAYAR S K, BEN-EZRA M. Motion-based motion deblurring [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(6): 689-698. doi:  10.1109/TPAMI.2004.1
    [36] RASKAR R, TAN K-H, FERIS R, et al. Non-photorealistic camera: Depth edge detection and stylized rendering using multi-flash imaging[C/OL]//ACM SIGGRAPH 2004 Papers. New York, USA: ACM, 2004: 679–688. [2019–06–26]. http://doi.acm.org/10.1145/1186562.1015779.
    [37] NG R, LEVOY M, BRÉDIF M, et al. Light field photography with a hand-held plenoptic camera [J]. Computer Science Technical Report CSTR, 2005, 2(11): 1-11.
    [38] Anonymous. Lytro[Z/OL]. (2019–02–26)[2019–06–26]. https://en.wikipedia.org/w/index.php?title=Lytro&oldid=885240133.
    [39] SEN P, CHEN B, GARG G, et al. Dual photography[C/OL]//ACM SIGGRAPH 2005 Papers. New York, NY, USA: ACM, 2005: 745–755. [2019–06–24]. http://doi.acm.org/10.1145/1186822.1073257.
    [40] TAKHAR D, LASKA J N, WAKIN M B, et al. A new compressive imaging camera architecture using optical-domain compression[C/OL]//BOUMAN C A, MILLER E L, POLLAK I, eds. Computational Imaging IV. International Society for Optics and Photonics, 2006. [2019–06–26]. http://proceedings.spiedigitallibrary.org/proceeding.aspx?articleid=728899.
    [41] DUARTE M F, DAVENPORT M A, TAKHAR D, et al. Single-pixel imaging via compressive sampling [J]. IEEE Signal Processing Magazine, 2008, 25(2): 83-91. doi:  10.1109/MSP.2007.914730
    [42] SUN B, EDGAR M P, BOWMAN R, et al. 3D computational imaging with single-pixel detectors [J]. Science, 2013, 340(6134): 844-847. doi:  10.1126/science.1234454
    [43] LEVOY M, NG R, ADAMS A, et al. Light Field Microscopy[C/OL]//ACM SIGGRAPH 2006 Papers. New York, NY, USA: ACM, 2006: 924–934. [2019–06–24]. http://doi.acm.org/10.1145/1179352.1141976.
    [44] RASKAR R, AGRAWAL A, TUMBLIN J. Coded exposure photography: Motion deblurring using fluttered shutter[C/OL]//ACM SIGGRAPH 2006 Papers. New York, NY, USA: ACM, 2006: 795–804[2017–03–14]. http://doi.acm.org/10.1145/1179352.1141957.
    [45] LEVIN A, FERGUS R, DURAND F, et al. Image and depth from a conventional camera with a coded aperture[C/OL]//ACM SIGGRAPH 2007 Papers. New York, NY, USA: ACM, 2007. [2019–06–26]. http://doi.acm.org/10.1145/1275808.1276464.
    [46] VEERARAGHAVAN A, RASKAR R, AGRAWAL A, et al. Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing[C/OL]//ACM SIGGRAPH 2007 Papers. New York, NY, USA: ACM, 2007. [2017–03–14]. http://doi.acm.org/10.1145/1275808.1276463.
    [47] RASKAR R. Less is more: Coded computational photography[C/OL]//Proceedings of the 8th Asian Conference on Computer Vision - Volume Part I. Berlin, Heidelberg: Springer-Verlag, 2007: 1–12. [2019–06–25]. http://dl.acm.org/citation.cfm?id=1775614.1775616.
    [48] GEHM M E, JOHN R, BRADY D J, et al. Single-shot compressive spectral imaging with a dual-disperser architecture [J]. Optics Express, 2007, 15(21): 14013-14027. doi:  10.1364/OE.15.014013
    [49] WAGADARIKAR A, JOHN R, WILLETT R, et al. Single disperser design for coded aperture snapshot spectral imaging [J]. Applied Optics, 2008, 47(10): B44-B51. doi:  10.1364/AO.47.000B44
    [50] BABCOCK H W. The possibility of compensating astronomical seeing [J]. Publications of the Astronomical Society of the Pacific, 1953, 65(386): 229-236.
    [51] HARDY J W. Active optics: A new technology for the control of light [J]. Proceedings of the IEEE, 1978, 66(6): 651–697.
    [52] FRIED D. Special issue on adaptive optics[J]. JOSA, 1977, 67(3): 47.
    [53] LINNIK V P. On the possibility of reducing the influence of atmospheric seeing on the image quality of stars[C]//European Southern Observatory Conference and Workshop Proceedings. 1994, 48: 535.
    [54] GERCHBERG R, SAXTON W. A practical algorithm for the determination of the phase from image and diffraction plane pictures [J]. Optik (Jena), 1972, 35: 237.
    [55] GERCHBERG R W. Phase determination from image and diffraction plane pictures in the electron microscope [J]. Optik, 1971, 34(3): 275-284.
    [56] FIENUP J R. Phase retrieval algorithms: A comparison [J]. Applied Optics, 1982, 21(15): 2758-2769. doi:  10.1364/AO.21.002758
    [57] TAKASAKI H. Moiré Topography [J]. Applied Optics, 1970, 9(6): 1467-1472. doi:  10.1364/AO.9.001467
    [58] CHIANG F P. Moiré methods for contouring displacement, deflection, slope and curvature[C/OL]. [2019–06–26]. http://adsabs.harvard.edu/abs/1978SPIE..153..113C.
    [59] CREATH K, WYANT J C. Moiré and fringe projection techniques [J]. Optical Shop Testing, 1992, 2: 653-685.
    [60] TAKEDA M, INA H, KOBAYASHI S. Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry [J]. JOSA, 1982, 72(1): 156-160. doi:  10.1364/JOSA.72.000156
    [61] TAKEDA M. Fourier fringe analysis and its application to metrology of extreme physical phenomena: A review [Invited] [J]. Applied Optics, 2013, 52(1): 20. doi:  10.1364/AO.52.000020
    [62] TAKEDA M, MUTOH K. Fourier transform profilometry for the automatic measurement of 3-d object shapes [J]. Applied Optics, 1983, 22(24): 3977-3982. doi:  10.1364/AO.22.003977
    [63] MALACARA D. Optical Shop Testing[M]. New Jersey: John Wiley & Sons, 2007.
    [64] BRUNING J H, HERRIOTT D R, GALLAGHER J, et al. Digital wavefront measuring interferometer for testing optical surfaces and lenses [J]. Applied Optics, 1974, 13(11): 2693-2703. doi:  10.1364/AO.13.002693
    [65] GABOR D. A new microscopic principle[EB/OL]. [2019–06–24]. https://www.nature.com/articles/161777a0.
    [66] LEITH E N, UPATNIEKS J. Reconstructed wavefronts and communication theory [J]. JOSA, 1962, 52(10): 1123-1130. doi:  10.1364/JOSA.52.001123
    [67] GOODMAN J W, LAWRENCE R W. Digital image formation from electronically detected holograms [J]. Applied Physics Letters, 1967, 11(3): 77-79. doi:  10.1063/1.1755043
    [68] KREIS T. Digital holographic interference-phase measurement using the Fourier-transform method [J]. JOSA A, 1986, 3(6): 847-855. doi:  10.1364/JOSAA.3.000847
    [69] NAKADATE S, YATAGAI T, SAITO H. Digital speckle-pattern shearing interferometry [J]. Applied Optics, 1980, 19(24): 4241-4246. doi:  10.1364/AO.19.004241
    [70] NAKADATE S, YATAGAI T, SAITO H. Electronic speckle pattern interferometry using digital image processing techniques [J]. Applied Optics, 1980, 19(11): 1879-1883. doi:  10.1364/AO.19.001879
    [71] SCHNARS U, JÜPTNER W. Direct recording of holograms by a ccd target and numerical reconstruction [J]. Applied Optics, 1994, 33(2): 179-181. doi:  10.1364/AO.33.000179
    [72] CUCHE E, MARQUET P, DEPEURSINGE C. Spatial filtering for zero-order and twin-image elimination in digital off-axis holography [J]. Applied Optics, 2000, 39(23): 4070-4075. doi:  10.1364/AO.39.004070
    [73] TAKEDA M, RU Q-S. Computer-based highly sensitive electron-wave interferometry [J]. Applied Optics, 1985, 24(18): 3068. doi:  10.1364/AO.24.003068
    [74] KLOTZ E, WEISS H. Three-dimensional coded aperture imaging using nonredundant point distributions [J]. Optics Communications, 1974, 11(4): 368-372. doi:  10.1016/0030-4018(74)90238-7
    [75] TIPTON M D, DOWDEY J E, BONTE F J, et al. Coded aperture imaging using on-axis Fresnel zone plates and extended Gamma-ray sources [J]. Radiology, 1974, 112(1): 155-158. doi:  10.1148/112.1.155
    [76] FENIMORE E E. Coded aperture imaging: Predicted performance of uniformly redundant arrays [J]. Applied Optics, 1978, 17(22): 3562-3570. doi:  10.1364/AO.17.003562
    [77] FENIMORE E E, CANNON T M. Coded aperture imaging with uniformly redundant arrays [J]. Applied Optics, 1978, 17(3): 337-347. doi:  10.1364/AO.17.000337
    [78] FENIMORE E E, CANNON T M, MILLER E L. Comparison of Fresnel zone plates and uniformly redundant arrays[C/OL]//Digital Image Processing II. International Society for Optics and Photonics, 1978: 232–236. [2019–06–26]. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/0149/0000/Comparison-Of-Fresnel-Zone-Plates-And-Uniformly-Redundant-Arrays/10.1117/12.956690.short.
    [79] GOTTESMAN S R, FENIMORE E E. New family of binary arrays for coded aperture imaging [J]. Applied Optics, 1989, 28(20): 4344-4352. doi:  10.1364/AO.28.004344
    [80] OSA. Computational Optical Sensing and Imaging[EB/OL]. [2019–06–24]. https://www.osapublishing.org/conference.cfm?meetingid=15.
    [81] Anonymous. IEEE International Conference on Computational Photography (ICCP)[EB/OL]. [2019–06–26]. https://ieeexplore.ieee.org/xpl/conhome/1800125/all-proceedings.
    [82] Anonymous. Conference Detail for Computational Imaging IV[EB/OL]. [2019–06–26]. https://spie.org/SI/conferencedetails/computational-imaging?SSO=1.
    [83] Anonymous. About TCI[EB/OL]. (2019–04–12)[2019–06–26]. https://signalprocessingsociety.org/publications-resources/ieee-transactions-computational-imaging/about-tci.
    [84] Anonymous. To the cinematic and vr community, live long and prosper[EB/OL]. [2019–06–26]. https://web.archive.org/web/20180328000530/http://blog.lytro.com/to-the-cinematic-and-vr-community-live-long-and-prosper/.
    [85] ZHAO Lingling. Technical outline for the construction of Real-scene 3D China issued[EB/OL]. [2022–02–15]. http://www.mnr.gov.cn/dt/ch/202108/t20210826_2678325.html. (in Chinese)
    [86] CAO Liangcai, HE Zehao, LIU Kexuan, et al. Progress and challenges in dynamic holographic 3D display for the metaverse (Invited) [J]. Infrared and Laser Engineering, 2022, 51(1): 20210935. (in Chinese) doi:  10.3788/IRLA20210935
    [87] SULLIVAN B T. Computational photography is ready for its close-up[EB/OL]. [2019–06–26]. https://www.pcmag.com/article/362806/computational-photography-is-ready-for-its-close-up.
    [88] Anonymous. Computational photography will completely revolutionize your smartphone camera - android authority[EB/OL]. [2019–06–26]. http://m.dailyhunt.in/news/india/english/android+authority-epaper-andauth/computational+photography+will+completely+revolutionize+your+smartphone+camera-newsid-94132611.
    [89] COWLEY J M. Diffraction Physics[M]. 3rd ed. Amsterdam: Elsevier Science B V, 1995.
    [90] GOODMAN J W. Introduction to Fourier Optics[M]. Colorado: Roberts and Company Publishers, 2005.
    [91] BORN M, WOLF E, BHATIA A B, et al. Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light[M]. 7th ed. Cambridge: Cambridge University Press, 1999.
    [92] Anonymous. High-speed camera[Z/OL]. (2017–06–20). https://en.wikipedia.org/w/index.php?title=High-speed_camera&oldid=786672531.
    [93] ZERNIKE F. Phase contrast, a new method for the microscopic observation of transparent objects [J]. Physica, 1942, 9(7): 686-698. doi:  10.1016/S0031-8914(42)80035-X
    [94] NOMARSKI G. Differential microinterferometer with polarized waves [J]. J Phys Radium Paris, 1955, 16(9): 9S-11S.
    [95] ABRAMOVICI A, ALTHOUSE W E, DREVER R W, et al. LIGO: The laser interferometer gravitational-wave observatory [J]. Science, 1992, 256(5055): 325-333. doi:  10.1126/science.256.5055.325
    [96] ABBOTT B P, ABBOTT R, ABBOTT T D, et al. Observation of gravitational waves from a binary black hole merger [J]. Physical Review Letters, 2016, 116(6): 061102. doi:  10.1103/PhysRevLett.116.061102
    [97] LØKBERG O J. Electronic speckle pattern interferometry [J]. Physics in Technology, 1980, 11(1): 16. doi:  10.1088/0305-4624/11/1/303
    [98] WANG W-C, HWANG C-H, LIN S-Y. Vibration measurement by the time-averaged electronic speckle pattern interferometry methods [J]. Applied Optics, 1996, 35(22): 4502-4509. doi:  10.1364/AO.35.004502
    [99] POPESCU G, IKEDA T, DASARI R R, et al. Diffraction phase microscopy for quantifying cell structure and dynamics [J]. Optics Letters, 2006, 31(6): 775-777. doi:  10.1364/OL.31.000775
    [100] SCHWARZ C J, KUZNETSOVA Y, BRUECK S R J. Imaging interferometric microscopy [J]. Optics Letters, 2003, 28(16): 1424-1426. doi:  10.1364/OL.28.001424
    [101] KUZNETSOVA Y, NEUMANN A, BRUECK S R J. Imaging interferometric microscopy–approaching the linear systems limits of optical resolution [J]. Optics Express, 2007, 15(11): 6651-6663. doi:  10.1364/OE.15.006651
    [102] SCHNARS U, JUEPTNER W. Digital Holography: Digital Hologram Recording, Numerical Reconstruction, and Related Techniques[M/OL]. Springer Science & Business Media, 2005. [2017–07–04]. https://link.springer.com/book/10.1007/b138284.
    [103] CUCHE E, BEVILACQUA F, DEPEURSINGE C. Digital holography for quantitative phase-contrast imaging [J]. Optics Letters, 1999, 24(5): 291-293. doi:  10.1364/OL.24.000291
    [104] SCHNARS U, JÜPTNER W P O. Digital recording and numerical reconstruction of holograms [J]. Measurement Science and Technology, 2002, 13(9): R85. doi:  10.1088/0957-0233/13/9/201
    [105] CUCHE E, MARQUET P, DEPEURSINGE C. Simultaneous amplitude-contrast and quantitative phase-contrast microscopy by numerical reconstruction of fresnel off-axis holograms [J]. Applied Optics, 1999, 38(34): 6994-7001. doi:  10.1364/AO.38.006994
    [106] KEMPER B, LANGEHANENBERG P, VON BALLY G. Digital holographic microscopy [J]. Optik & Photonik, 2007, 2(2): 41-44. doi:  10.1002/opph.201190249
    [107] KIM M K. Digital Holographic Microscopy[M/OL]//Digital Holographic Microscopy. New York: Springer, 2011: 149–190. [2017–07–04]. https://link.springer.com/chapter/10.1007/978-1-4419-7793-9_11.
    [108] KEMPER B, von BALLY G. Digital holographic microscopy for live cell applications and technical inspection [J]. Applied Optics, 2008, 47(4): A52-A61. doi:  10.1364/AO.47.000A52
    [109] MARQUET P, RAPPAZ B, MAGISTRETTI P J, et al. Digital holographic microscopy: A noninvasive contrast imaging technique allowing quantitative visualization of living cells with subwavelength axial accuracy [J]. Optics Letters, 2005, 30(5): 468-470. doi:  10.1364/OL.30.000468
    [110] HARTMANN J. Bemerkungen über den Bau und die Justierung von Spektrographen [J]. Zeitschrift für Instrumentenkunde, 1900, 20(47): 17-27.
    [111] PLATT B C, SHACK R. History and principles of Shack-Hartmann wavefront sensing [J]. Journal of Refractive Surgery, 2001, 17(5): S573-S577. doi:  10.3928/1081-597X-20010901-13
    [112] SHACK R V, PLATT B. Production and use of a lenticular Hartmann screen [J]. Journal of the Optical Society of America, 1971, 61: 656-661.
    [113] RAGAZZONI R. Pupil plane wavefront sensing with an oscillating prism [J]. Journal of Modern Optics, 1996, 43(2): 289-293. doi:  10.1080/09500349608232742
    [114] ESPOSITO S, RICCARDI A. Pyramid wavefront sensor behavior in partial correction adaptive optic systems [J]. Astronomy & Astrophysics, 2001, 369(2): L9-L12. doi:  10.1051/0004-6361:20010219
    [115] RAGAZZONI R, DIOLAITI E, VERNET E. A pyramid wavefront sensor with no dynamic modulation [J]. Optics Communications, 2002, 208(1): 51-60. doi:  10.1016/S0030-4018(02)01580-8
    [116] NEIL M A A, BOOTH M J, WILSON T. New modal wave-front sensor: A theoretical analysis [J]. JOSA A, 2000, 17(6): 1098-1107. doi:  10.1364/JOSAA.17.001098
    [117] BOOTH M J. Wave front sensor-less adaptive optics: A model-based approach using sphere packings [J]. Optics Express, 2006, 14(4): 1339-1352. doi:  10.1364/OE.14.001339
    [118] SCHÄFER B, MANN K. Determination of beam parameters and coherence properties of laser radiation by use of an extended Hartmann-Shack wave-front sensor [J]. Applied Optics, 2002, 41(15): 2809-2817. doi:  10.1364/AO.41.002809
    [119] SCHÄFER B, LÜBBECKE M, MANN K. Hartmann-Shack wave front measurements for real time determination of laser beam propagation parameters [J]. Review of Scientific Instruments, 2006, 77(5): 053103. doi:  10.1063/1.2198795
    [120] PFUND J, LINDLEIN N, SCHWIDER J, et al. Absolute sphericity measurement: A comparative study of the use of interferometry and a Shack–Hartmann sensor [J]. Optics Letters, 1998, 23(10): 742-744. doi:  10.1364/OL.23.000742
    [121] GREIVENKAMP J E, SMITH D G, GAPPINGER R O, et al. Optical testing using Shack-Hartmann wavefront sensors[C]// Optical Engineering for Sensing and Nanotechnology (ICOSN 2001). SPIE, 2001, 4416: 260-263.
    [122] DAYTON D, GONGLEWSKI J, PIERSON B, et al. Atmospheric structure function measurements with a Shack–Hartmann wave-front sensor [J]. Optics Letters, 1992, 17(24): 1737-1739. doi:  10.1364/OL.17.001737
    [123] RICKLIN J C, DAVIDSON F M. Atmospheric turbulence effects on a partially coherent Gaussian beam: Implications for free-space laser communication [J]. JOSA A, 2002, 19(9): 1794-1802. doi:  10.1364/JOSAA.19.001794
    [124] BOOTH M J. Adaptive optics in microscopy [J]. Philosophical Transactions of the Royal Society of London A:Mathematical, Physical and Engineering Sciences, 2007, 365(1861): 2829-2843. doi:  10.1098/rsta.2007.0013
    [125] CHA J W, BALLESTA J, SO P T C. Shack-Hartmann wavefront-sensor-based adaptive optics system for multiphoton microscopy [J]. Journal of Biomedical Optics, 2010, 15(4): 046022. doi:  10.1117/1.3475954
    [126] LIANG J, GRIMM B, GOELZ S, et al. Objective measurement of wave aberrations of the human eye with the use of a Hartmann–Shack wave-front sensor [J]. JOSA A, 1994, 11(7): 1949-1957. doi:  10.1364/JOSAA.11.001949
    [127] MORENO-BARRIUSO E, NAVARRO R. Laser ray tracing versus Hartmann–Shack sensor for measuring optical aberrations in the human eye [J]. JOSA A, 2000, 17(6): 974-985. doi:  10.1364/JOSAA.17.000974
    [128] KOHNEN T, KOCH D. Cataract and Refractive Surgery[M]. Berlin: Springer, 2006.
    [129] ALLEN L J, OXLEY M P. Phase retrieval from series of images obtained by defocus variation [J]. Optics Communications, 2001, 199: 65-75. doi:  10.1016/S0030-4018(01)01556-5
    [130] BAUSCHKE H H, COMBETTES P L, LUKE D R. Phase retrieval, error reduction algorithm, and fienup variants: A view from convex optimization [J]. JOSA A, 2002, 19(7): 1334-1345. doi:  10.1364/JOSAA.19.001334
    [131] BAUSCHKE H H, COMBETTES P L, LUKE D R. Hybrid projection–reflection method for phase retrieval [J]. JOSA A, 2003, 20(6): 1025-1034. doi:  10.1364/JOSAA.20.001025
    [132] ELSER V. Phase retrieval by iterated projections [J]. JOSA A, 2003, 20(1): 40-55. doi:  10.1364/JOSAA.20.000040
    [133] LUKE D R. Relaxed averaged alternating reflections for diffraction imaging [J]. Inverse Problems, 2005, 21(1): 37-50. doi:  10.1088/0266-5611/21/1/004
    [134] ZUO J M, VARTANYANTS I, GAO M, et al. Atomic resolution imaging of a carbon nanotube from diffraction intensities [J]. Science, 2003, 300(5624): 1419-1421. doi:  10.1126/science.1083887
    [135] EISEBITT S, LÜNING J, SCHLOTTER W F, et al. Lensless imaging of magnetic nanostructures by X-ray spectro-holography [J]. Nature, 2004, 432(7019): 885-888. doi:  10.1038/nature03139
    [136] MARCHESINI S, HE H, CHAPMAN H N, et al. X-ray image reconstruction from a diffraction pattern alone [J]. Physical Review B, 2003, 68(14): 140101. doi:  10.1103/PhysRevB.68.140101
    [137] GONSALVES R A, CHIDLAW R. Wavefront sensing by phase retrieval[C]//Applications of Digital Image Processing III. SPIE, 1979, 207: 32-39.
    [138] GUYON O. Limits of adaptive optics for high-contrast imaging [J]. The Astrophysical Journal, 2005, 629(1): 592. doi:  10.1086/431209
    [139] PEDRINI G, OSTEN W, ZHANG Y. Wave-front reconstruction from a sequence of interferograms recorded at different planes [J]. Optics Letters, 2005, 30(8): 833-835. doi:  10.1364/OL.30.000833
    [140] ZHANG Y, PEDRINI G, OSTEN W, et al. Whole optical wave field reconstruction from double or multi in-line holograms by phase retrieval algorithm [J]. Optics Express, 2003, 11(24): 3234-3241. doi:  10.1364/OE.11.003234
    [141] ANAND A, PEDRINI G, OSTEN W, et al. Wavefront sensing with random amplitude mask and phase retrieval [J]. Optics Letters, 2007, 32(11): 1584-1586. doi:  10.1364/OL.32.001584
    [142] ALMORO P F, PEDRINI G, GUNDU P N, et al. Phase microscopy of technical and biological samples through random phase modulation with a diffuser [J]. Optics Letters, 2010, 35(7): 1028-1030. doi:  10.1364/OL.35.001028
    [143] MUDANYALI O, TSENG D, OH C, et al. Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications [J]. Lab on a Chip, 2010, 10(11): 1417-1428. doi:  10.1039/C000453G
    [144] TSENG D, MUDANYALI O, OZTOPRAK C, et al. Lensfree microscopy on a cellphone [J]. Lab on a Chip, 2010, 10(14): 1787-1792. doi:  10.1039/C003477K
    [145] FAULKNER H M L, RODENBURG J M. Movable aperture lensless transmission microscopy: A novel phase retrieval algorithm [J]. Physical Review Letters, 2004, 93(2): 023903. doi:  10.1103/PhysRevLett.93.023903
    [146] FAULKNER H M L, RODENBURG J M. Error tolerance of an iterative phase retrieval algorithm for moveable illumination microscopy [J]. Ultramicroscopy, 2005, 103(2): 153-164. doi:  10.1016/j.ultramic.2004.11.006
    [147] GUIZAR-SICAIROS M, FIENUP J R. Phase retrieval with transverse translation diversity: A nonlinear optimization approach [J]. Optics Express, 2008, 16(10): 7264-7278. doi:  10.1364/OE.16.007264
    [148] THIBAULT P, DIEROLF M, MENZEL A, et al. High-resolution scanning X-ray diffraction microscopy [J]. Science, 2008, 321(5887): 379-382. doi:  10.1126/science.1158573
    [149] MAIDEN A M, RODENBURG J M. An improved ptychographical phase retrieval algorithm for diffractive imaging [J]. Ultramicroscopy, 2009, 109(10): 1256-1262. doi:  10.1016/j.ultramic.2009.05.012
    [150] THIBAULT P, DIEROLF M, BUNK O, et al. Probe retrieval in ptychographic coherent diffractive imaging [J]. Ultramicroscopy, 2009, 109(4): 338-343. doi:  10.1016/j.ultramic.2008.12.011
    [151] THIBAULT P, GUIZAR-SICAIROS M. Maximum-likelihood refinement for coherent diffractive imaging [J]. New Journal of Physics, 2012, 14(6): 063004. doi:  10.1088/1367-2630/14/6/063004
    [152] MAIDEN A, JOHNSON D, LI P. Further improvements to the ptychographical iterative engine [J]. Optica, 2017, 4(7): 736-745. doi:  10.1364/OPTICA.4.000736
    [153] MAIDEN A M, HUMPHRY M J, SARAHAN M C, et al. An annealing algorithm to correct positioning errors in ptychography [J]. Ultramicroscopy, 2012, 120: 64-72. doi:  10.1016/j.ultramic.2012.06.001
    [154] BECKERS M, SENKBEIL T, GORNIAK T, et al. Drift correction in ptychographic diffractive imaging [J]. Ultramicroscopy, 2013, 126: 44-47. doi:  10.1016/j.ultramic.2012.11.006
    [155] ZHANG F, PETERSON I, VILA-COMAMALA J, et al. Translation position determination in ptychographic coherent diffraction imaging [J]. Optics Express, 2013, 21(11): 13592. doi:  10.1364/OE.21.013592
    [156] THIBAULT P, MENZEL A. Reconstructing state mixtures from diffraction measurements [J]. Nature, 2013, 494(7435): 68-71. doi:  10.1038/nature11806
    [157] BATEY D J, CLAUS D, RODENBURG J M. Information multiplexing in ptychography [J]. Ultramicroscopy, 2014, 138: 13-21. doi:  10.1016/j.ultramic.2013.12.003
    [158] CLARK J N, HUANG X, HARDER R J, et al. Dynamic imaging using ptychography [J]. Physical Review Letters, 2014, 112(11): 113901. doi:  10.1103/PhysRevLett.112.113901
    [159] KARL R, BEVIS C, LOPEZ-RIOS R, et al. Spatial, spectral, and polarization multiplexed ptychography [J]. Optics Express, 2015, 23(23): 30250. doi:  10.1364/OE.23.030250
    [160] MAIDEN A M, HUMPHRY M J, ZHANG F, et al. Superresolution imaging via ptychography [J]. JOSA A, 2011, 28(4): 604-612. doi:  10.1364/JOSAA.28.000604
    [161] HUMPHRY M J, KRAUS B, HURST A C, et al. Ptychographic electron microscopy using high-angle dark-field scattering for sub-nanometre resolution imaging [J]. Nature Communications, 2012, 3: 730. doi:  10.1038/ncomms1733
    [162] STOCKMAR M, CLOETENS P, ZANETTE I, et al. Near-field ptychography: Phase retrieval for inline holography using a structured illumination [J]. Scientific Reports, 2013, 3(1): 1-6. doi:  10.1038/srep01927
    [163] TAKAHASHI Y, SUZUKI A, FURUTAKU S, et al. High-resolution and high-sensitivity phase-contrast imaging by focused hard X-ray ptychography with a spatial filter [J]. Applied Physics Letters, 2013, 102(9): 094102. doi:  10.1063/1.4794063
    [164] MAIDEN A M, HUMPHRY M J, RODENBURG J M. Ptychographic transmission microscopy in three dimensions using a multi-slice approach [J]. JOSA A, 2012, 29(8): 1606-1614. doi:  10.1364/JOSAA.29.001606
    [165] GODDEN T M, SUMAN R, HUMPHRY M J, et al. Ptychographic microscope for three-dimensional imaging [J]. Optics Express, 2014, 22(10): 12513. doi:  10.1364/OE.22.012513
    [166] SUZUKI A, FURUTAKU S, SHIMOMURA K, et al. High-resolution multislice x-ray ptychography of extended thick objects [J]. Physical Review Letters, 2014, 112(5): 053903. doi:  10.1103/PhysRevLett.112.053903
    [167] SHIMOMURA K, SUZUKI A, HIROSE M, et al. Precession X-ray ptychography with multislice approach [J]. Physical Review B, 2015, 91(21): 214114. doi:  10.1103/PhysRevB.91.214114
    [168] THIBAULT P, ELSER V, JACOBSEN C, et al. Reconstruction of a yeast cell from X-ray diffraction data [J]. Acta Crystallographica Section A Foundations of Crystallography, 2006, 62(4): 248-261. doi:  10.1107/S0108767306016515
    [169] RODENBURG J M, HURST A C, CULLIS A G, et al. Hard-X-ray lensless imaging of extended objects [J]. Physical Review Letters, 2007, 98(3): 034801. doi:  10.1103/PhysRevLett.98.034801
    [170] GIEWEKEMEYER K, THIBAULT P, KALBFLEISCH S, et al. Quantitative biological imaging by ptychographic x-ray diffraction microscopy [J]. Proceedings of the National Academy of Sciences, 2010, 107(2): 529-534. doi:  10.1073/pnas.0905846107
    [171] MAIDEN A M, MORRISON G R, KAULICH B, et al. Soft X-ray spectromicroscopy using ptychography with randomly phased illumination [J]. Nature Communications, 2013, 4: 1669. doi:  10.1038/ncomms2640
    [172] RODENBURG J M, HURST A C, CULLIS A G. Transmission microscopy without lenses for objects of unlimited size [J]. Ultramicroscopy, 2007, 107(2-3): 227-231. doi:  10.1016/j.ultramic.2006.07.007
    [173] HUE F, RODENBURG J M, MAIDEN A M, et al. Extended ptychography in the transmission electron microscope: Possibilities and limitations [J]. Ultramicroscopy, 2011, 111(8): 1117-1123. doi:  10.1016/j.ultramic.2011.02.005
    [174] HUE F, RODENBURG J M, MAIDEN A M, et al. Wave-front phase retrieval in transmission electron microscopy via ptychography [J]. Physical Review B, 2010, 82(12): 121415.
    [175] BRADY G R, GUIZAR-SICAIROS M, FIENUP J R. Optical wavefront measurement using phase retrieval with transverse translation diversity [J]. Optics Express, 2009, 17(2): 624-639. doi:  10.1364/OE.17.000624
    [176] MAIDEN A M, RODENBURG J M, HUMPHRY M J. Optical ptychography: A practical implementation with useful resolution [J]. Optics Letters, 2010, 35(15): 2585-2587. doi:  10.1364/OL.35.002585
    [177] MARRISON J, RÄTY L, MARRIOTT P, et al. Ptychography – a label free, high-contrast imaging technique for live cells using quantitative phase information[J/OL]. Scientific Reports, 2013, 3(1). [2017–07–05]. http://www.nature.com/articles/srep02369.
    [178] ZHENG G, HORSTMEYER R, YANG C. Wide-field, high-resolution Fourier ptychographic microscopy [J]. Nature Photonics, 2013, 7(9): 739-745. doi:  10.1038/nphoton.2013.187
    [179] OU X, ZHENG G, YANG C. Embedded pupil function recovery for Fourier ptychographic microscopy [J]. Optics Express, 2014, 22(5): 4960. doi:  10.1364/OE.22.004960
    [180] SUN J, CHEN Q, ZHANG Y, et al. Efficient positional misalignment correction method for Fourier ptychographic microscopy [J]. Biomedical Optics Express, 2016, 7(4): 1336. doi:  10.1364/BOE.7.001336
    [181] YEH L-H, DONG J, ZHONG J, et al. Experimental robustness of Fourier ptychography phase retrieval algorithms [J]. Optics Express, 2015, 23(26): 33214. doi:  10.1364/OE.23.033214
    [182] DONG S, SHIRADKAR R, NANDA P, et al. Spectral multiplexing and coherent-state decomposition in Fourier ptychographic imaging [J]. Biomedical Optics Express, 2014, 5(6): 1757. doi:  10.1364/BOE.5.001757
    [183] TIAN L, LI X, RAMCHANDRAN K, et al. Multiplexed coded illumination for Fourier ptychography with an led array microscope [J]. Biomedical Optics Express, 2014, 5(7): 2376-2389. doi:  10.1364/BOE.5.002376
    [184] SUN J, CHEN Q, ZHANG Y, et al. Sampling criteria for Fourier ptychographic microscopy in object space and frequency space [J]. Optics Express, 2016, 24(14): 15765. doi:  10.1364/OE.24.015765
    [185] HORSTMEYER R, CHUNG J, OU X, et al. Diffraction tomography with Fourier ptychography [J]. Optica, 2016, 3(8): 827-835. doi:  10.1364/OPTICA.3.000827
    [186] ZUO C, SUN J, LI J, et al. Wide-field high-resolution 3D microscopy with Fourier ptychographic diffraction tomography [J]. Optics and Lasers in Engineering, 2020, 128: 106003. doi:  10.1016/j.optlaseng.2020.106003
    [187] HORSTMEYER R, CHEN R Y, OU X, et al. Solving ptychography with a convex relaxation [J]. New Journal of Physics, 2015, 17(5): 053044. doi:  10.1088/1367-2630/17/5/053044
    [188] ZUO C, SUN J, CHEN Q. Adaptive step-size strategy for noise-robust Fourier ptychographic microscopy [J]. Optics Express, 2016, 24(18): 20724. doi:  10.1364/OE.24.020724
    [189] TEAGUE M R. Deterministic phase retrieval: A Green’s function solution [J]. JOSA, 1983, 73(11): 1434-1441. doi:  10.1364/JOSA.73.001434
    [190] STREIBL N. Phase imaging by the transport equation of intensity [J]. Optics Communications, 1984, 49(1): 6-10. doi:  10.1016/0030-4018(84)90079-8
    [191] ICHIKAWA K, LOHMANN A W, TAKEDA M. Phase retrieval based on the irradiance transport equation and the Fourier transform method: experiments [J]. Applied Optics, 1988, 27(16): 3433-3436. doi:  10.1364/AO.27.003433
    [192] RODDIER F, RODDIER C, RODDIER N. Curvature sensing: A new wavefront sensing method[C/OL]. 1988: 203-209. http://dx.doi.org/10.1117/12.948547.
    [193] RODDIER F. Curvature sensing and compensation: A new concept in adaptive optics [J]. Applied Optics, 1988, 27(7): 1223-1225. doi:  10.1364/AO.27.001223
    [194] RODDIER F. Wavefront sensing and the irradiance transport equation [J]. Applied Optics, 1990, 29(10): 1402-1403. doi:  10.1364/AO.29.001402
    [195] RODDIER N A. Algorithms for wavefront reconstruction out of curvature sensing data[C/OL]. http://dx.doi.org/10.1117/12.48799.
    [196] GUREYEV T E, ROBERTS A, NUGENT K A. Partially coherent fields, the transport-of-intensity equation, and phase uniqueness [J]. JOSA A, 1995, 12(9): 1942-1946. doi:  10.1364/JOSAA.12.001942
    [197] GUREYEV T E, NUGENT K A. Phase retrieval with the transport-of-intensity equation. ii. orthogonal series solution for nonuniform illumination [J]. JOSA A, 1996, 13(8): 1670-1682. doi:  10.1364/JOSAA.13.001670
    [198] GUREYEV T E, NUGENT K A. Rapid quantitative phase imaging using the transport of intensity equation [J]. Optics Communications, 1997, 133(1): 339-346. doi:  10.1016/S0030-4018(96)00454-3
    [199] PAGANIN D, NUGENT K A. Noninterferometric phase imaging with partially coherent light [J]. Physical Review Letters, 1998, 80(12): 2586. doi:  10.1103/PhysRevLett.80.2586
    [200] NUGENT K A, GUREYEV T E, COOKSON D F, et al. Quantitative phase imaging using hard x rays [J]. Physical Review Letters, 1996, 77(14): 2961. doi:  10.1103/PhysRevLett.77.2961
    [201] ALLMAN B E, MCMAHON P J, NUGENT K A, et al. Phase radiography with neutrons [J]. Nature, 2000, 408(6809): 158. doi:  10.1038/35041626
    [202] MCMAHON P J, ALLMAN B E, JACOBSON D L, et al. Quantitative phase radiography with polychromatic neutrons [J]. Physical Review Letters, 2003, 91(14): 145502. doi:  10.1103/PhysRevLett.91.145502
    [203] BAJT B, BARTY A, NUGENT K A, et al. Quantitative phase-sensitive imaging in a transmission electron microscope [J]. Ultramicroscopy, 2000, 83(1-2): 67-73. doi:  10.1016/S0304-3991(99)00174-6
    [204] MCMAHON P J, BARONE-NUGENT E D, ALLMAN B E, et al. Quantitative phase-amplitude microscopy ii: Differential interference contrast imaging for biological TEM [J]. Journal of Microscopy, 2002, 206(3): 204-208. doi:  10.1046/j.1365-2818.2002.01026.x
    [205] BELEGGIA M, SCHOFIELD M A, VOLKOV V V, et al. On the transport of intensity technique for phase retrieval [J]. Ultramicroscopy, 2004, 102(1): 37-49. doi:  10.1016/j.ultramic.2004.08.004
    [206] VOLKOV V V, ZHU Y. Lorentz phase microscopy of magnetic materials [J]. Ultramicroscopy, 2004, 98(2): 271-281. doi:  10.1016/j.ultramic.2003.08.026
    [207] MCVITIE S, CUSHLEY M. Quantitative Fresnel Lorentz microscopy and the transport of intensity equation [J]. Ultramicroscopy, 2006, 106(4-5): 423-431. doi:  10.1016/j.ultramic.2005.12.001
    [208] PETERSEN T C, KEAST V J, PAGANIN D M. Quantitative TEM-based phase retrieval of MgO nano-cubes using the transport of intensity equation [J]. Ultramicroscopy, 2008, 108(9): 805-815. doi:  10.1016/j.ultramic.2008.01.001
    [209] BARTY A, NUGENT K A, PAGANIN D, et al. Quantitative optical phase microscopy [J]. Optics Letters, 1998, 23(11): 817-819. doi:  10.1364/OL.23.000817
    [210] BARONE-NUGENT E D, BARTY A, NUGENT K A. Quantitative phase-amplitude microscopy I: Optical microscopy [J]. Journal of Microscopy, 2002, 206(3): 194-203. doi:  10.1046/j.1365-2818.2002.01027.x
    [211] STREIBL N. Three-dimensional imaging by a microscope [J]. JOSA A, 1985, 2(2): 121-127. doi:  10.1364/JOSAA.2.000121
    [212] SHEPPARD C J. Three-dimensional phase imaging with the intensity transport equation [J]. Applied Optics, 2002, 41(28): 5951-5955. doi:  10.1364/AO.41.005951
    [213] WALLER L, TIAN L, BARBASTATHIS G. Transport of intensity phase-amplitude imaging with higher order intensity derivatives [J]. Optics Express, 2010, 18(12): 12552-12561. doi:  10.1364/OE.18.012552
    [214] KOU S S, WALLER L, BARBASTATHIS G, et al. Transport-of-intensity approach to differential interference contrast (TI-DIC) microscopy for quantitative phase imaging [J]. Optics Letters, 2010, 35(3): 447-449. doi:  10.1364/OL.35.000447
    [215] WALLER L, LUO Y, YANG S Y, et al. Transport of intensity phase imaging in a volume holographic microscope [J]. Optics Letters, 2010, 35(17): 2961-2963. doi:  10.1364/OL.35.002961
    [216] WALLER L, KOU S S, SHEPPARD C J R, et al. Phase from chromatic aberrations [J]. Optics Express, 2010, 18(22): 22817-22825. doi:  10.1364/OE.18.022817
    [217] KOU S S, WALLER L, BARBASTATHIS G, et al. Quantitative phase restoration by direct inversion using the optical transfer function [J]. Optics Letters, 2011, 36(14): 2671-2673. doi:  10.1364/OL.36.002671
    [218] ALMORO P F, WALLER L, AGOUR M, et al. Enhanced deterministic phase retrieval using a partially developed speckle field [J]. Optics Letters, 2012, 37(11): 2088-2090. doi:  10.1364/OL.37.002088
    [219] GORTHI S S, SCHONBRUN E. Phase imaging flow cytometry using a focus-stack collecting microscope [J]. Optics Letters, 2012, 37(4): 707-709. doi:  10.1364/OL.37.000707
    [220] WALLER L, TSANG M, PONDA S, et al. Phase and amplitude imaging from noisy images by Kalman filtering [J]. Optics Express, 2011, 19(3): 2805-2815. doi:  10.1364/OE.19.002805
    [221] XUE B, ZHENG S, CUI L, et al. Transport of intensity phase imaging from multiple intensities measured in unequally-spaced planes [J]. Optics Express, 2011, 19(21): 20244-20250. doi:  10.1364/OE.19.020244
    [222] BIE R, YUAN X-H, ZHAO M, et al. Method for estimating the axial intensity derivative in the TIE with higher order intensity derivatives and noise suppression [J]. Optics Express, 2012, 20(7): 8186-8191. doi:  10.1364/OE.20.008186
    [223] ZHENG S, XUE B, XUE W, et al. Transport of intensity phase imaging from multiple noisy intensities measured in unequally-spaced planes [J]. Optics Express, 2012, 20(2): 972-985. doi:  10.1364/OE.20.000972
    [224] MARTINEZ-CARRANZA J, FALAGGIS K, KOZACKI T. Optimum measurement criteria for the axial derivative intensity used in transport of intensity-equation-based solvers [J]. Optics Letters, 2014, 39(2): 182-185. doi:  10.1364/OL.39.000182
    [225] FALAGGIS K, KOZACKI T, KUJAWINSKA M. Optimum plane selection criteria for single-beam phase retrieval techniques based on the contrast transfer function [J]. Optics Letters, 2014, 39(1): 30-33. doi:  10.1364/OL.39.000030
    [226] ZUO C, CHEN Q, ASUNDI A. Light field moment imaging: Comment [J]. Optics Letters, 2014, 39(3): 654. doi:  10.1364/OL.39.000654
    [227] ZUO C, CHEN Q, TIAN L, et al. Transport of intensity phase retrieval and computational imaging for partially coherent fields: The phase space perspective [J]. Optics and Lasers in Engineering, 2015, 71: 20-32. doi:  10.1016/j.optlaseng.2015.03.006
    [228] ZUO C, CHEN Q, ASUNDI A. Boundary-artifact-free phase retrieval with the transport of intensity equation: Fast solution with use of discrete cosine transform [J]. Optics Express, 2014, 22(8): 9220. doi:  10.1364/OE.22.009220
    [229] ZUO C, CHEN Q, LI H, et al. Boundary-artifact-free phase retrieval with the transport of intensity equation II: Applications to microlens characterization [J]. Optics Express, 2014, 22(15): 18310. doi:  10.1364/OE.22.018310
    [230] HUANG L, ZUO C, IDIR M, et al. Phase retrieval with the transport-of-intensity equation in an arbitrarily shaped aperture by iterative discrete cosine transforms [J]. Optics Letters, 2015, 40(9): 1976. doi:  10.1364/OL.40.001976
    [231] ZUO C, CHEN Q, HUANG L, et al. Phase discrepancy analysis and compensation for fast Fourier transform based solution of the transport of intensity equation [J]. Optics Express, 2014, 22(14): 17172. doi:  10.1364/OE.22.017172
    [232] ZUO C, CHEN Q, YU Y, et al. Transport-of-intensity phase imaging using Savitzky-Golay differentiation filter - Theory and applications [J]. Optics Express, 2013, 21(5): 5346-5362. doi:  10.1364/OE.21.005346
    [233] SUN J, ZUO C, CHEN Q. Iterative optimum frequency combination method for high efficiency phase imaging of absorptive objects based on phase transfer function [J]. Optics Express, 2015, 23(21): 28031. doi:  10.1364/OE.23.028031
    [234] ZUO C, SUN J, LI J, et al. High-resolution transport-of-intensity quantitative phase microscopy with annular illumination [J]. Scientific Reports, 2017, 7(1): 7654. doi:  10.1038/s41598-017-06837-1
    [235] LI J, CHEN Q, ZHANG J, et al. Efficient quantitative phase microscopy using programmable annular LED illumination [J]. Biomedical Optics Express, 2017, 8(10): 4687-4705. doi:  10.1364/BOE.8.004687
    [236] LI J, CHEN Q, SUN J, et al. Optimal illumination pattern for transport-of-intensity quantitative phase microscopy [J]. Optics Express, 2018, 26(21): 27599. doi:  10.1364/OE.26.027599
    [237] ZUO C, CHEN Q, QU W, et al. Noninterferometric single-shot quantitative phase microscopy [J]. Optics Letters, 2013, 38(18): 3538. doi:  10.1364/OL.38.003538
    [238] ZUO C, CHEN Q, QU W, et al. High-speed transport-of-intensity phase microscopy with an electrically tunable lens [J]. Optics Express, 2013, 21(20): 24060. doi:  10.1364/OE.21.024060
    [239] ZUO C, SUN J, ZHANG J, et al. Lensless phase microscopy and diffraction tomography with multi-angle and multi-wavelength illuminations using a LED matrix [J]. Optics Express, 2015, 23(11): 14314. doi:  10.1364/OE.23.014314
    [240] LI J, CHEN Q, ZHANG J, et al. Optical diffraction tomography microscopy with transport of intensity equation using a light-emitting diode array [J]. Optics and Lasers in Engineering, 2017, 95: 26-34. doi:  10.1016/j.optlaseng.2017.03.010
    [241] LI J, CHEN Q, SUN J, et al. Three-dimensional tomographic microscopy technique with multi-frequency combination with partially coherent illuminations [J]. Biomedical Optics Express, 2018, 9(6): 2526-2542. doi:  10.1364/BOE.9.002526
    [242] ZUO C, LI J, SUN J, et al. Transport of intensity equation: A tutorial [J]. Optics and Lasers in Engineering, 2020, 135: 106187. doi:  10.1016/j.optlaseng.2020.106187
    [243] HAMILTON D, SHEPPARD C. Differential phase contrast in scanning optical microscopy [J]. Journal of Microscopy, 1984, 133(1): 27. doi:  10.1111/j.1365-2818.1984.tb00460.x
    [244] HAMILTON D K, SHEPPARD C J R, WILSON T. Improved imaging of phase gradients in scanning optical microscopy [J]. Journal of Microscopy, 1984, 135(3): 275. doi:  10.1111/j.1365-2818.1984.tb02533.x
    [245] MEHTA S B, SHEPPARD C J R. Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast [J]. Optics Letters, 2009, 34(13): 1924. doi:  10.1364/OL.34.001924
    [246] TIAN L, WALLER L. Quantitative differential phase contrast imaging in an LED array microscope [J]. Optics Express, 2015, 23(9): 11394. doi:  10.1364/OE.23.011394
    [247] FAN Y, SUN J, CHEN Q, et al. Optimal illumination scheme for isotropic quantitative differential phase contrast microscopy [J]. arXiv preprint, 2019: 1903.10718.
    [248] IGLESIAS I. Pyramid phase microscopy [J]. Optics Letters, 2011, 36(18): 3636. doi:  10.1364/OL.36.003636
    [249] PARTHASARATHY A B, CHU K K, FORD T N, et al. Quantitative phase imaging using a partitioned detection aperture [J]. Optics Letters, 2012, 37(19): 4062. doi:  10.1364/OL.37.004062
    [250] LU H, CHUNG J, OU X, et al. Quantitative phase imaging and complex field reconstruction by pupil modulation differential phase contrast [J]. Optics Express, 2016, 24(22): 25345. doi:  10.1364/OE.24.025345
    [251] ZUO C, SUN J, FENG S, et al. Programmable aperture microscopy: A computational method for multi-modal phase contrast and light field imaging [J]. Optics and Lasers in Engineering, 2016, 80: 24-31. doi:  10.1016/j.optlaseng.2015.12.012
    [252] LIN Y-Z, HUANG K-Y, LUO Y. Quantitative differential phase contrast imaging at high resolution with radially asymmetric illumination [J]. Optics Letters, 2018, 43(12): 2973-2976. doi:  10.1364/OL.43.002973
    [253] FAN Y, SUN J, CHEN Q, et al. Wide-field anti-aliased quantitative differential phase contrast microscopy [J]. Optics Express, 2018, 26(19): 25129. doi:  10.1364/OE.26.025129
    [254] CHEN H-H, LIN Y-Z, LUO Y. Isotropic differential phase contrast microscopy for quantitative phase bio-imaging [J]. Journal of Biophotonics, 2018, 11(8): e201700364. doi:  10.1002/jbio.201700364
    [255] LEE D, RYU S, KIM U, et al. Color-coded LED microscopy for multi-contrast and quantitative phase-gradient imaging [J]. Biomedical Optics Express, 2015, 6(12): 4912. doi:  10.1364/BOE.6.004912
    [256] PHILLIPS Z F, CHEN M, WALLER L. Single-shot quantitative phase microscopy with color-multiplexed differential phase contrast (CDPC) [J]. PLOS ONE, 2017, 12(2): e0171228. doi:  10.1371/journal.pone.0171228
    [257] LEE W, JUNG D, RYU S, et al. Single-exposure quantitative phase imaging in color-coded LED microscopy [J]. Optics Express, 2017, 25(7): 8398. doi:  10.1364/OE.25.008398
    [258] FAN Y, SUN J, CHEN Q, et al. Single-shot isotropic quantitative phase microscopy based on color-multiplexed differential phase contrast [J]. APL Photonics, 2019, 4(12): 121301. doi:  10.1063/1.5124535
    [259] FORD T N, CHU K K, MERTZ J. Phase-gradient microscopy in thick tissue with oblique back-illumination [J]. Nature Methods, 2012, 9(12): 1195-1197. doi:  10.1038/nmeth.2219
    [260] FORD T N, MERTZ J. Video-rate imaging of microcirculation with single-exposure oblique back-illumination microscopy [J]. Journal of Biomedical Optics, 2013, 18(6): 066007. doi:  10.1117/1.JBO.18.6.066007
    [261] JUNG D, CHOI J-H, KIM S, et al. Smartphone-based multi-contrast microscope using color-multiplexed illumination[J/OL]. Scientific Reports, 2017, 7(1): 7564. [2019–06–05]. http://www.nature.com/articles/s41598-017-07703-w.
    [262] ZHENG G, SHEN C, JIANG S, et al. Concept, implementations and applications of Fourier ptychography [J]. Nature Reviews Physics, 2021, 3(3): 207-223. doi:  10.1038/s42254-021-00280-y
    [263] ZUO C, CHEN Q, SUN J, et al. Non-interferometric phase retrieval and quantitative phase microscopy based on transport of intensity equation: A review [J]. Chinese Journal of Lasers, 2016, 43(6): 0609002. (in Chinese) doi:  10.3788/CJL201643.0609002
    [264] SUN J, Zhang Y, CHEN Q, et al. Fourier ptychographic microscopy: Theory, advances, and applications [J]. Acta Optica Sinica, 2016, 36(10): 89-107. (in Chinese)
    [265] FAN Y, CHEN Q, SUN J, et al. Review of the development of differential phase contrast microscopy [J]. Infrared and Laser Engineering, 2019, 48(6): 0603014. (in Chinese) doi:  10.3788/IRLA201948.0603014
    [266] PAN A, ZUO C, YAO B. High-resolution and large field-of-view Fourier ptychographic microscopy and its applications in biomedicine [J]. Reports on Progress in Physics, 2020, 83(9): 096101. doi:  10.1088/1361-6633/aba6f0
    [267] PAN X, LIU C, TAO H, et al. Phase imaging based on ptychography and progress on related key techniques [J]. Acta Optica Sinica, 2020, 40(1): 0111010. (in Chinese) doi:  10.3788/AOS202040.0111010
    [268] DEBSKI W, WALCZYKOWSKI P, KLEWSKI A, et al. Analysis of usage of multispectral video technique for distinguishing objects in real time[C]//20th ISPRS Congress, 2004.
    [269] BACKMAN V, WALLACE M B, PERELMAN L, et al. Detection of preinvasive cancer cells [J]. Nature, 2000, 406(6791): 35. doi:  10.1038/35017638
    [270] YOSHIDA Y, OGUMA H, MORINO I, et al. Mountaintop observation of CO2 absorption spectra using a short wavelength infrared Fourier transform spectrometer [J]. Applied Optics, 2010, 49(1): 71-79. doi:  10.1364/AO.49.000071
    [271] ZHAO Z, DENG L, BAI L, et al. Optimal imaging band selection mechanism of weld pool vision based on spectrum analysis [J]. Optics & Laser Technology, 2019, 110: 145-151.
    [272] KESTER R T, BEDARD N, GAO L S, et al. Real-time snapshot hyperspectral imaging endoscope [J]. Journal of Biomedical Optics, 2011, 16(5): 056005. doi:  10.1117/1.3574756
    [273] SPERLING B A, HOANG J, KIMES W A, et al. Time-resolved surface infrared spectroscopy during atomic layer deposition [J]. Applied Spectroscopy, 2013, 67(9): 1003-1012. doi:  10.1366/13-06995
    [274] GOETZ A F, VANE G, SOLOMON J E, et al. Imaging spectrometry for earth remote sensing [J]. Science, 1985, 228(4704): 1147-1153. doi:  10.1126/science.228.4704.1147
    [275] OKAMOTO T, YAMAGUCHI I. Simultaneous acquisition of spectral image information [J]. Optics Letters, 1991, 16(16): 1277-1279. doi:  10.1364/OL.16.001277
    [276] OKAMOTO T, TAKAHASHI A, YAMAGUCHI I. Simultaneous acquisition of spectral and spatial intensity distribution [J]. Applied Spectroscopy, 1993, 47(8): 1198-1202. doi:  10.1366/0003702934067810
    [277] DESCOUR M, DERENIAK E. Computed-tomography imaging spectrometer: experimental calibration and reconstruction results [J]. Applied Optics, 1995, 34(22): 4817-4826. doi:  10.1364/AO.34.004817
    [278] CIMINO P, NEESE F, BARONE V. Computational Spectroscopy: Methods, Experiments and Applications[M]. Weinheim: Wiley-VCH, 2010.
    [279] WEI R Y, ZHOU J S, JING J J, et al. Developments and trends of the computed tomography imaging spectrometers [J]. Spectroscopy and Spectral Analysis, 2010, 30(10): 2866-2873.
    [280] MOONEY J M, VICKERS V E, AN M, et al. High-throughput hyperspectral infrared camera [J]. JOSA A, 1997, 14(11): 2951-2961. doi:  10.1364/JOSAA.14.002951
    [281] FANG J, ZHAO D, JIANG Y. New method in imaging spectrometry[C]//Color Science and Imaging Technologies. International Society for Optics and Photonics, 2002: 56–63.
    [282] HAGEN N, DERENIAK E L. Analysis of computed tomographic imaging spectrometers. I. Spatial and spectral resolution [J]. Applied Optics, 2008, 47(28): F85-F95. doi:  10.1364/AO.47.000F85
    [283] CANDES E, TAO T. Decoding by linear programming [J]. arXiv preprint, 2005: math/0502327.
    [284] CANDES E, ROMBERG J, TAO T. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information [J]. arXiv preprint, 2004: math/0409186.
    [285] BRADY D J, GEHM M E. Compressive imaging spectrometers using coded apertures[C/OL]//Visual Information Processing XV. International Society for Optics and Photonics, 2006: 62460A. [2019–06–28]. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/6246/62460A/Compressive-imaging-spectrometers-using-coded-apertures/10.1117/12.667605.short.
    [286] KITTLE D, CHOI K, WAGADARIKAR A, et al. Multiframe image estimation for coded aperture snapshot spectral imagers [J]. Applied Optics, 2010, 49(36): 6824-6833. doi:  10.1364/AO.49.006824
    [287] MA X, YUAN X, FU C, et al. LED-based compressive spectral-temporal imaging [J]. Optics Express, 2021, 29(7): 10698-10715. doi:  10.1364/OE.419888
    [288] CAO X, YUE T, LIN X, et al. Computational snapshot multispectral cameras: Toward dynamic capture of the spectral world [J]. IEEE Signal Processing Magazine, 2016, 33(5): 95-108. doi:  10.1109/MSP.2016.2582378
    [289] CAO X, DU H, TONG X, et al. A prism-mask system for multispectral video acquisition [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2011, 33(12): 2423-2435. doi:  10.1109/TPAMI.2011.80
    [290] COURTIAL J, PATTERSON B, HARVEY A, et al. Design of a static Fourier-transform spectrometer with increased field of view [J]. Applied Optics, 1996, 35(34): 6698-6702. doi:  10.1364/AO.35.006698
    [291] ZHANG W, SONG H, HE X, et al. Deeply learned broadband encoding stochastic hyperspectral imaging [J]. Light: Science & Applications, 2021, 10(1): 108. doi:  10.1038/s41377-021-00545-2
    [292] DECKER J A, HARWIT M. Experimental operation of a Hadamard spectrometer [J]. Applied Optics, 1969, 8(12): 2552. doi:  10.1364/AO.8.002552
    [293] DECKER J A. Experimental realization of the multiplex advantage with a Hadamard-transform spectrometer [J]. Applied Optics, 1971, 10(3): 510. doi:  10.1364/AO.10.000510
    [294] YUE J, HAN J, ZHANG Y, et al. Denoising analysis of Hadamard transform spectrometry [J]. Optics Letters, 2014, 39(13): 3744-3747. doi:  10.1364/OL.39.003744
    [295] YUE J, HAN J, LI L, et al. Denoising analysis of spatial pixel multiplex coded spectrometer with Hadamard H-matrix [J]. Optics Communications, 2018, 407: 355-360. doi:  10.1016/j.optcom.2017.09.072
    [296] ZHAO Z, BAI L, HAN J, et al. High-SNR snapshot multiplex spectrometer with sub-Hadamard-s matrix coding [J]. Optics Communications, 2019, 453: 124322. doi:  10.1016/j.optcom.2019.124322
    [297] CHI M, WU Y, QIAN F, et al. Signal-to-noise ratio enhancement of a Hadamard transform spectrometer using a two-dimensional slit-array [J]. Applied Optics, 2017, 56(25): 7188-7193. doi:  10.1364/AO.56.007188
    [298] BAI L, WANG X, HAN J, et al. Development review of new spectral measurement technology [J]. Infrared and Laser Engineering, 2019, 48(6): 0603001. (in Chinese) doi:  10.3788/IRLA201948.0603001
    [299] FARLOW C A, CHENAULT D B, PEZZANITI J L, et al. Imaging polarimeter development and applications[C/OL]//Polarization Analysis and Measurement IV. International Society for Optics and Photonics, 2002: 118–125. [2019–06–05]. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/4481/0000/Imaging-polarimeter-development-and-applications/10.1117/12.452880.short.
    [300] PEZZANITI J L, CHENAULT D B. A division of aperture MWIR imaging polarimeter[C/OL]//Polarization Science and Remote Sensing II. International Society for Optics and Photonics, 2005: 58880V. [2019–06–28]. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/5888/58880V/A-division-of-aperture-MWIR-imaging-polarimeter/10.1117/12.623543.short.
    [301] NORDIN G P, MEIER J T, DEGUZMAN P C, et al. Diffractive optical element for Stokes vector measurement with a focal plane array[C]//Polarization: Measurement, Analysis, and Remote Sensing II. SPIE, 1999, 3754: 169-177.
    [302] BICKEL W S, BAILEY W M. Stokes vectors, Mueller matrices, and polarized scattered light [J]. American Journal of Physics, 1985, 53(5): 468-478. doi:  10.1119/1.14202
    [303] ESPINOSA-LUNA R. Scattering by rough surfaces in a conical configuration: Experimental Mueller matrix [J]. Optics Letters, 2002, 27(17): 1510-1512. doi:  10.1364/OL.27.001510
    [304] TYO J S, ROWE M P, PUGH E N, et al. Target detection in optically scattering media by polarization-difference imaging [J]. Applied Optics, 1996, 35(11): 1855-1870. doi:  10.1364/AO.35.001855
    [305] LIANG J, ZHANG W, REN L, et al. Polarimetric dehazing method for visibility improvement based on visible and infrared image fusion [J]. Applied Optics, 2016, 55(29): 8221-8226. doi:  10.1364/AO.55.008221
    [306] SCHECHNER Y Y, KARPEL N. Recovery of underwater visibility and structure by polarization analysis [J]. IEEE Journal of Oceanic Engineering, 2005, 30(3): 570-587. doi:  10.1109/JOE.2005.850871
    [307] MUDGE J, VIRGEN M. Real time polarimetric dehazing [J]. Applied Optics, 2013, 52(9): 1932-1938. doi:  10.1364/AO.52.001932
    [308] ZHANG W, LIANG J, JU H, et al. A robust haze-removal scheme in polarimetric dehazing imaging based on automatic identification of sky region [J]. Optics & Laser Technology, 2016, 86: 145-151. doi:  10.1016/j.optlastec.2016.07.015
    [309] WANG H, WANG H, HU H, et al. Automatic underwater polarization imaging without background region or any prior [J]. Optics Express, 2021, 29(20): 31283-31295. doi:  10.1364/OE.434398
    [310] WANG H, HU H, JIANG J, et al. Polarization differential imaging in turbid water via Mueller matrix and illumination modulation [J]. Optics Communications, 2021, 499: 127274. doi:  10.1016/j.optcom.2021.127274
    [311] HU H, QI P, LI X, et al. Underwater imaging enhancement based on a polarization filter and histogram attenuation prior [J]. Journal of Physics D: Applied Physics, 2021, 54(17): 175102. doi:  10.1088/1361-6463/abdc93
    [312] LIU F, LIU F, LIU F, et al. Depolarization index from Mueller matrix descatters imaging in turbid water [J]. Chinese Optics Letters, 2022, 20(2): 022601. doi:  10.3788/COL202220.022601
    [313] LIANG J, JU H, ZHANG W, et al. Review of optical polarimetric dehazing technique[J/OL]. 2017, 37(4): 0400001. [2022–02–14]. http://ir.opt.ac.cn/handle/181661/28922.
    [314] LIANG J, REN L, JU H, et al. Polarimetric dehazing method for dense haze removal based on distribution analysis of angle of polarization [J]. Optics Express, 2015, 23(20): 26146-26157. doi:  10.1364/OE.23.026146
    [315] HU H, LI X, LIU T. Recent advances in underwater image restoration technique based on polarimetric imaging [J]. Infrared and Laser Engineering, 2019, 48(6): 0603006. (in Chinese) doi:  10.3788/IRLA201948.0603006
    [316] CROSBY F J. Stokes vector component versus elementary factor performance in a target detection algorithm[C]//Polarization: Measurement, Analysis, and Remote Sensing VI. SPIE, 2004, 5432: 1-11.
    [317] CAVANAUGH D B, CASTLE K R, DAVENPORT W. Anomaly detection using the hyperspectral polarimetric imaging testbed[C]//Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XII. SPIE, 2006, 6233: 625-637.
    [318] EGAN W G, DUGGIN M J. Synthesis of optical polarization signatures of military aircraft[C]//Polarization Analysis and Measurement IV. International Society for Optics and Photonics, 2002, 4481: 188-194.
    [319] EGAN W G, LIU Q. Polarized MODTRAN 3.7 applied to characterization of ocean color in the presence of aerosols[C]//Polarization Analysis and Measurement IV. International Society for Optics and Photonics, 2002, 4481: 228-241.
    [320] GOLDSTEIN D H. Polarimetric characterization of federal standard paints[C]//Polarization Analysis, Measurement, and Remote Sensing III. SPIE, 2000, 4133: 112-123.
    [321] LE HORS L, HARTEMANN P, DOLFI D, et al. Phenomenological model of paints for multispectral polarimetric imaging[C]//Targets and Backgrounds VII: Characterization and Representation. SPIE, 2001, 4370: 94-105.
    [322] FORSSELL G, HEDBORG-KARLSSON E. Measurements of polarization properties of camouflaged objects and of the denial of surfaces covered with cenospheres[C]//Targets and Backgrounds IX: Characterization and Representation. International Society for Optics and Photonics, 2003, 5075: 246-258.
    [323] ARON Y, GRONAU Y. Polarization in the MWIR: A method to improve target acquisition[C]//Infrared Technology and Applications XXXI. SPIE, 2005, 5783: 653-661.
    [324] CREMER F, DE JONG W, SCHUTTE K. Infrared polarization measurements and modelling applied to surface laid anti-personnel landmines [J]. Optical Engineering, 2002, 41(5): 1021-1032. doi:  10.1117/1.1467362
    [325] KOSHIKAWA K, SHIRAI Y. A model-based recognition of glossy objects using their polarimetrical properties [J]. Advanced Robotics, 1987, 2(2): 137-147. doi:  10.1163/156855387X00129
    [326] WOLFF L B, BOULT T E. Constraining object features using a polarization reflectance model [J]. Phys Based Vis Princ Pract Radiom, 1993, 1: 167.
    [327] MIYAZAKI D, KAGESAWA M, IKEUCHI K. Transparent surface modeling from a pair of polarization images [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(1): 73-82. doi:  10.1109/TPAMI.2004.1261080
    [328] DUNCAN D D, HAHN D V, THOMAS M E. Physics-based polarimetric BRDF models[C]//Optical Diagnostic Methods for Inorganic Materials III. SPIE, 2003, 5192: 129-140.
    [329] YANG P, WEI H, KATTAWAR G W, et al. Sensitivity of the backscattering Mueller matrix to particle shape and thermodynamic phase [J]. Applied Optics, 2003, 42(21): 4389-4395. doi:  10.1364/AO.42.004389
    [330] ANDREOU A G, KALAYJIAN Z K. Polarization imaging: principles and integrated polarimeters [J]. IEEE Sensors Journal, 2002, 2(6): 566-576. doi:  10.1109/JSEN.2003.807946
    [331] BICKEL W S, DAVIDSON J F, HUFFMAN D R, et al. Application of polarization effects in light scattering: A new biophysical tool [J]. Proceedings of the National Academy of Sciences, 1976, 73(2): 486-490. doi:  10.1073/pnas.73.2.486
    [332] JACQUES S L, ROMAN J R, LEE K. Imaging superficial tissues with polarized light [J]. Lasers in Surgery and Medicine, 2000, 26(2): 119-129. doi:  10.1002/(SICI)1096-9101(2000)26:2<119::AID-LSM3>3.0.CO;2-Y
    [333] Jacques S L, Samatham R, Isenhath S, et al. Polarized light camera to guide surgical excision of skin cancers[C]//Photonic Therapeutics and Diagnostics IV. SPIE, 2008, 6842: 102-108.
    [334] Oldenbourg R, Mei G. New polarized light microscope with precision universal compensator [J]. Journal of Microscopy, 1995, 180(2): 140-147. doi:  10.1111/j.1365-2818.1995.tb03669.x
    [335] OLDENBOURG R. New views on polarization microscopy[C]//European Cells and Materials, 2001.
    [336] Itoh M, Yamanari M, Yasuno Y, et al. Polarization characteristics of multiple backscattering in human blood cell suspensions [J]. Optical and Quantum Electronics, 2005, 37(13): 1277-1285.
    [337] Xia J, Weaver A, Gerrard D E, et al. Monitoring sarcomere structure changes in whole muscle using diffuse light reflectance [J]. Journal of Biomedical Optics, 2006, 11(4): 040504. doi:  10.1117/1.2234278
    [338] ANTONELLI M-R, PIERANGELO A, NOVIKOVA T, et al. Mueller matrix imaging of human colon tissue for cancer diagnostics: How monte carlo modeling can help in the interpretation of experimental data [J]. Optics Express, 2010, 18(10): 10200-10208. doi:  10.1364/OE.18.010200
    [339] PIERANGELO A, BENALI A, ANTONELLI M-R, et al. Ex-vivo characterization of human colon cancer by Mueller polarimetric imaging [J]. Optics Express, 2011, 19(2): 1582-1593. doi:  10.1364/OE.19.001582
    [340] Pierangelo A, Manhas S, Benali A, et al. Ex vivo photometric and polarimetric multilayer characterization of human healthy colon by multispectral Mueller imaging [J]. Journal of Biomedical Optics, 2012, 17(6): 066009. doi:  10.1117/1.JBO.17.6.066009
    [341] CHUNG J, JUNG W, HAMMER-WILSON M J, et al. Use of polar decomposition for the diagnosis of oral precancer [J]. Applied Optics, 2007, 46(15): 3038-3045. doi:  10.1364/AO.46.003038
    [342] Wood M F, Ghosh N, Moriyama E H, et al. Proof-of-principle demonstration of a Mueller matrix decomposition method for polarized light tissue characterization in vivo [J]. Journal of Biomedical Optics, 2009, 14(1): 014029. doi:  10.1117/1.3065545
    [343] SHUKLA P, PRADHAN A. Mueller decomposition images for cervical tissue: Potential for discriminating normal and dysplastic states [J]. Optics Express, 2009, 17(3): 1600-1609. doi:  10.1364/OE.17.001600
    [344] PIERANGELO A, NAZAC A, BENALI A, et al. Polarimetric imaging of uterine cervix: A case study [J]. Optics Express, 2013, 21(12): 14120-14130. doi:  10.1364/OE.21.014120
    [345] Wang W, Lim L G, Srivastava S, et al. Roles of linear and circular polarization properties and effect of wavelength choice on differentiation between ex vivo normal and cancerous gastric samples [J]. Journal of Biomedical Optics, 2014, 19(4): 046020. doi:  10.1117/1.JBO.19.4.046020
    [346] QI J, YE M, SINGH M, et al. Narrow band 3 × 3 Mueller polarimetric endoscopy [J]. Biomedical Optics Express, 2013, 4(11): 2433-2449. doi:  10.1364/BOE.4.002433
    [347] QI J, BARRIÈRE C, WOOD T C, et al. Polarized multispectral imaging in a rigid endoscope based on elastic light scattering spectroscopy [J]. Biomedical Optics Express, 2012, 3(9): 2087-2099. doi:  10.1364/BOE.3.002087
    [348] HE C, HE H, CHANG J, et al. Polarisation optics for biomedical and clinical applications: A review [J]. Light: Science & Applications, 2021, 10(1): 194. doi:  10.1038/s41377-021-00639-x
    [349] QI J, HE H, LIN J, et al. Assessment of tissue polarimetric properties using Stokes polarimetric imaging with circularly polarized illumination [J]. Journal of Biophotonics, 2018, 11(4): e201700139. doi:  10.1002/jbio.201700139
    [350] SUN M, HE H, ZENG N, et al. Characterizing the microstructures of biological tissues using Mueller matrix and transformed polarization parameters [J]. Biomedical Optics Express, 2014, 5(12): 4223-4234. doi:  10.1364/BOE.5.004223
    [351] HEE M R, HUANG D, SWANSON E A, et al. Polarization-sensitive low-coherence reflectometer for birefringence characterization and ranging [J]. JOSA B, 1992, 9(6): 903-908. doi:  10.1364/JOSAB.9.000903
    [352] FAN C, YAO G. Imaging myocardial fiber orientation using polarization sensitive optical coherence tomography [J]. Biomedical Optics Express, 2013, 4(3): 460-465. doi:  10.1364/BOE.4.000460
    [353] Brewster D. On the communication of the structure of doubly refracting crystals to glass, muriate of soda, fluor spar, and other substances, by mechanical compression and dilatation [J]. Philosophical Transactions of the Royal Society of London, 1816, 106: 156-178.
    [354] Hecker F W, Morche B. Computer-aided measurement of relative retardations in plane photoelasticity[M]//Experimental stress analysis. Dordrecht: Springer, 1986: 535-542.
    [355] Sarma A, Pillai S A, Subramanian G, et al. Computerized image processing for whole-field determination of isoclinics and isochromatics [J]. Experimental Mechanics, 1992, 32(1): 24-29. doi:  10.1007/BF02317980
    [356] Kihara T. Automatic whole-field measurement of principal stress directions using three wavelengths[C]//Proc 10th Int Conf on Experimental Mechanics, Lisbon, 1994: 95-99.
    [357] Mangal S K, Ramesh K. Use of multiple loads to extract continuous isoclinic fringes by phase shifting technique [J]. Strain, 1999, 35(1): 15-17. doi:  10.1111/j.1475-1305.1999.tb01114.x
    [358] Almeida Magalhaes C, Americo Almeida Magalhaes Jr P. New numerical methods for the photoelastic technique with high accuracy [J]. Journal of Applied Physics, 2012, 112(8): 083111. doi:  10.1063/1.4761979
    [359] ALMEIDA MAGALHÃES C, SMITH NETO P, ALMEIDA MAGALHÃES JÚNIOR P A, et al. Separation of isochromatics and isoclinics phasemaps for the photoelastic technique with use phase shifting and a large number of high precision images [J]. Metrology and Measurement Systems, 2013, 20(1): 127-138. doi:  10.2478/mms-2013-0012
    [360] AJOVALASIT A, PETRUCCI G. Analisi automatica delle frange fotoelastiche in luce bianca[C]//Proceedings of the XVIII AIAS Conference, 1990.
    [361] CARAZO-ALVAREZ J, HAAKE S J, PATTERSON E A. Completely automated photoelastic fringe analysis [J]. Optics and Lasers in Engineering, 1994, 21(3): 133-149. doi:  10.1016/0143-8166(94)90067-1
    [362] Ajovalasit A, Barone S, Petrucci G. Towards RGB photoelasticity: Full-field automated photoelasticity in white light [J]. Experimental Mechanics, 1995, 35(3): 193-200. doi:  10.1007/BF02319657
    [363] YONEYAMA S, TAKASHI M. A new method for photoelastic fringe analysis from a single image using elliptically polarized white light [J]. Optics and Lasers in Engineering, 1998, 30(5): 441-459. doi:  10.1016/S0143-8166(98)00037-2
    [364] YONEYAMA S, SHIMIZU M, GOTOH J, et al. Photoelastic analysis with a single tricolor image [J]. Optics and Lasers in Engineering, 1998, 29(6): 423-435. doi:  10.1016/S0143-8166(97)00107-3
    [365] NURSE A D. Automated photoelasticity: Weighted least-squares determination of field stresses [J]. Optics and Lasers in Engineering, 1999, 31(5): 353-370. doi:  10.1016/S0143-8166(99)00033-0
    [366] QUIROGA J A, GARCÍA-BOTELLA Á, GÓMEZ-PEDRERO J A. Improved method for isochromatic demodulation by RGB calibration [J]. Applied Optics, 2002, 41(17): 3461-3468. doi:  10.1364/AO.41.003461
    [367] Cline R A, Westerveld W B, Risley J S. A new method for measuring the retardation of a photoelastic modulator using single photon counting techniques [J]. Review of Scientific Instruments, 1993, 64(5): 1169-1174. doi:  10.1063/1.1144113
    [368] ZENG A, LI F, ZHU L, et al. Simultaneous measurement of retardance and fast axis angle of a quarter-wave plate using one photoelastic modulator [J]. Applied Optics, 2011, 50(22): 4347-4352. doi:  10.1364/AO.50.004347
    [369] WOODHAM R J. Photometric method for determining surface orientation from multiple images [J]. Optical Engineering, 1980, 19(1): 191139.
    [370] CHRISTENSEN P H, SHAPIRO L G. Three-dimensional shape from color photometric stereo [J]. International Journal of Computer Vision, 1994, 13(2): 213-227. doi:  10.1007/BF01427152
    [371] DERESIEWICZ H, SKALAK R. On uniqueness in dynamic poroelasticity [J]. Bulletin of the Seismological Society of America, 1963, 53(4): 783-788. doi:  10.1785/BSSA0530040783
    [372] COLEMAN JR E N, JAIN R. Obtaining 3-dimensional shape of textured and specular surfaces using four-source photometry [J]. Computer Graphics and Image Processing, 1982, 18(4): 309-328. doi:  10.1016/0146-664X(82)90001-6
    [373] PARK J-S, TOU J T. Highlight separation and surface orientations for 3-D specular objects[C]//Proceedings of the 10th International Conference on Pattern Recognition. IEEE, 1990: 331–335.
    [374] IKEUCHI K. Determining surface orientations of specular surfaces by using the photometric stereo method [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1981, 3(6): 661-669.
    [375] WU T-P, TANG C-K. Dense photometric stereo using a mirror sphere and graph cut[C]//2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05). IEEE, 2005: 140–147.
    [376] MOZEROV M G, VAN DE WEIJER J. Accurate stereo matching by two-step energy minimization [J]. IEEE Transactions on Image Processing, 2015, 24(3): 1153-1163. doi:  10.1109/TIP.2015.2395820
    [377] GEIGER A, ROSER M, URTASUN R. Efficient large-scale stereo matching[C]//Asian Conference on Computer Vision. Springer, 2010: 25–38.
    [378] TAN X, SUN C, WANG D, et al. Soft cost aggregation with multi-resolution fusion[C]//European Conference on Computer Vision. Springer, 2014: 17–32.
    [379] YANG Q, YANG R, DAVIS J, et al. Spatial-depth super resolution for range images[C]//2007 IEEE Conference on Computer Vision and Pattern Recognition. IEEE, 2007: 1–8.
    [380] YOON K-J, KWEON I S. Adaptive support-weight approach for correspondence search [J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2006, 28(4): 650-656.
    [381] HOSNI A, RHEMANN C, BLEYER M, et al. Fast cost-volume filtering for visual correspondence and beyond [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 35(2): 504-511.
    [382] YANG Q, WANG L, YANG R, et al. Stereo matching with color-weighted correlation, hierarchical belief propagation, and occlusion handling [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008, 31(3): 492-504.
    [383] KLAUS A, SORMANN M, KARNER K. Segment-based stereo matching using belief propagation and a self-adapting dissimilarity measure[C]//18th International Conference on Pattern Recognition (ICPR’06). IEEE, 2006: 15–18.
    [384] BERTOZZI M, BROGGI A. GOLD: A parallel real-time stereo vision system for generic obstacle and lane detection [J]. IEEE Transactions on Image Processing, 1998, 7(1): 62-81. doi:  10.1109/83.650851
    [385] LOOP C, ZHANG Z. Computing rectifying homographies for stereo vision[C]//Proceedings. 1999 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Cat. No PR00149), 1999: 125–131.
    [386] GEHRIG S K, EBERLI F, MEYER T. A real-time low-power stereo vision engine using semi-global matching[C]//International Conference on Computer Vision Systems. Springer, 2009: 134–143.
    [387] SHIM H, LEE S. Performance evaluation of time-of-flight and structured light depth sensors in radiometric/geometric variations [J]. Optical Engineering, 2012, 51(9): 094401.
    [388] YU L, ZHANG D, YU B, et al. Research of 3D laser scanning measurement system for mining [J]. Metal Mine, 2012(10): 101-103, 107. (in Chinese)
    [389] KOU L, ZHANG L, ZHANG K, et al. A multi-focus image fusion method via region mosaicking on Laplacian pyramids [J]. PloS ONE, 2018, 13(5): e0191085. doi:  10.1371/journal.pone.0191085
    [390] DORRINGTON A A, KELLY C B D, MCCLURE S H, et al. Advantages of 3D time-of-flight range imaging cameras in machine vision applications[C]//The 16th Electronics New Zealand Conference (ENZCon), 2009: 95–99.
    [391] GANAPATHI V, PLAGEMANN C, KOLLER D, et al. Real time motion capture using a single time-of-flight camera[C/OL]//2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. San Francisco, CA, USA: IEEE, 2010: 755–762. [2019–06–04]. http://ieeexplore.ieee.org/document/5540141/.
    [392] HSU S, ACHARYA S, RAFII A, et al. Performance of a Time-of-Flight Range Camera for Intelligent Vehicle Safety Applications[M]//Advanced Microsystems for Automotive Applications. Berlin: Springer, 2006: 205–219.
    [393] HAHNE U, ALEXA M. Depth imaging by combining time-of-flight and on-demand stereo[C]//Workshop on Dynamic 3D Imaging. Springer, 2009: 70–83.
    [394] SCHUON S, THEOBALT C, DAVIS J, et al. High-quality scanning using time-of-flight depth superresolution[C]//2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops. IEEE, 2008: 1–7.
    [395] CUI Y, SCHUON S, THRUN S, et al. Algorithms for 3D shape scanning with a depth camera [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 35(5): 1039-1050.
    [396] ZHANG Z, ZHANG J. Solutions and core techniques of city modeling [J]. World Sci-Tech R&D, 2003(3): 23-29. (in Chinese) doi:  10.16507/j.issn.1006-6055.2003.03.006
    [397] WANG J, LI C, LIN Z. Discussion on the development of urban 3D data acquisition techniques [J]. Science of Surveying and Mapping, 2004, 29(4): 71-73, 86. (in Chinese) doi:  10.3771/j.issn.1009-2307.2004.04.023
    [398] GAO Z. The research of terrestrial laser scanning data processing and modeling[D]. Xi’an: Chang'an University, 2010. (in Chinese)
    [399] FANG W. Research on automatic texture mapping of terrestrial laser scanning data combining photogrammetry techniques[D]. Wuhan: Wuhan University, 2014. (in Chinese)
    [400] NAYAR S K, WATANABE M, NOGUCHI M. Real-time focus range sensor [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1996, 18(12): 1186-1198. doi:  10.1109/34.546256
    [401] WATANABE M, NAYAR S K. Rational filters for passive depth from defocus [J]. International Journal of Computer Vision, 1998, 27(3): 203-225. doi:  10.1023/A:1007905828438
    [402] GENG J. Structured-light 3D surface imaging: a tutorial [J]. Advances in Optics and Photonics, 2011, 3(2): 128-160. doi:  10.1364/AOP.3.000128
    [403] ZUO C, FENG S, HUANG L, et al. Phase shifting algorithms for fringe projection profilometry: A review [J]. Optics and Lasers in Engineering, 2018, 109: 23-59. doi:  10.1016/j.optlaseng.2018.04.019
    [404] GORTHI S S, RASTOGI P. Fringe projection techniques: Whither we are? [J]. Optics & Lasers in Engineering, 2010, 48(2): 133-140.
    [405] REICH C, RITTER R, THESING J. 3-D shape measurement of complex objects by combining photogrammetry and fringe projection [J]. Optical Engineering, 2000, 39(1): 224-232. doi:  10.1117/1.602356
    [406] HUANG P S, ZHANG C, CHIANG F-P. High-speed 3-D shape measurement based on digital fringe projection [J]. Optical Engineering, 2003, 42(1): 163-169. doi:  10.1117/1.1525272
    [407] PAN B, KEMAO Q, HUANG L, et al. Phase error analysis and compensation for nonsinusoidal waveforms in phase-shifting digital fringe projection profilometry [J]. Optics Letters, 2009, 34(4): 416-418. doi:  10.1364/OL.34.000416
    [408] QUAN C, HE X, WANG C, et al. Shape measurement of small objects using LCD fringe projection with phase shifting [J]. Optics Communications, 2001, 189(1-3): 21-29. doi:  10.1016/S0030-4018(01)01038-0
    [409] ZHANG Z, TOWERS C E, TOWERS D P. Time efficient color fringe projection system for 3D shape and color using optimum 3-frequency selection [J]. Optics Express, 2006, 14(14): 6444-6455. doi:  10.1364/OE.14.006444
    [410] WANG Z, NGUYEN D A, BARNES J C. Some practical considerations in fringe projection profilometry [J]. Optics & Lasers in Engineering, 2010, 48(2): 218-225.
    [411] PAN J, HUANG P S, CHIANG F-P. Color-coded binary fringe projection technique for 3-D shape measurement [J]. Optical Engineering, 2005, 44(2): 023606. doi:  10.1117/1.1840973
    [412] KÜHMSTEDT P, MUNCKELT C, HEINZE M, et al. 3D shape measurement with phase correlation based fringe projection[C]//Optical Measurement Systems for Industrial Inspection V. International Society for Optics and Photonics, 2007: 66160B.
    [413] ZHANG Z. Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques [J]. Optics and Lasers in Engineering, 2012, 50(8): 1097-1106. doi:  10.1016/j.optlaseng.2012.01.007
    [414] LIU H C, HALIOUA M, SRINIVASAN V. Automated phase-measuring profilometry of 3-D diffuse objects [J]. Applied Optics, 1984, 23(18): 3105. doi:  10.1364/AO.23.003105
    [415] SU X, CHEN W. Fourier transform profilometry: A review [J]. Optics and lasers in Engineering, 2001, 35(5): 263-284. doi:  10.1016/S0143-8166(01)00023-9
    [416] SU X, ZHANG Q. Dynamic 3-D shape measurement method: A review [J]. Optics & Lasers in Engineering, 2010, 48(2): 191-204.
    [417] KEMAO Q. Two-dimensional windowed Fourier transform for fringe pattern analysis: Principles, applications and implementations [J]. Optics and Lasers in Engineering, 2007, 45(2): 304-317. doi:  10.1016/j.optlaseng.2005.10.012
    [418] KEMAO Q. Windowed Fourier transform for fringe pattern analysis [J]. Applied Optics, 2004, 43(13): 2695-2702. doi:  10.1364/AO.43.002695
    [419] ZHONG J, WENG J. Spatial carrier-fringe pattern analysis by means of wavelet transform: wavelet transform profilometry [J]. Applied Optics, 2004, 43(26): 4993-4998. doi:  10.1364/AO.43.004993
    [420] SU X Y, BALLY G V, VUKICEVIC D. Phase-stepping grating profilometry: utilization of intensity modulation analysis in complex objects evaluation [J]. Optics Communications, 1993, 98(1-3): 141-150. doi:  10.1016/0030-4018(93)90773-X
    [421] LI J, HASSEBROOK L G, GUAN C. Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity [J]. Journal of the Optical Society of America A: Optics, Image Science, and Vision, 2003, 20(1): 106-115.
    [422] ZHANG S. Recent progresses on real-time 3D shape measurement using digital fringe projection techniques [J]. Optics & Lasers in Engineering, 2010, 48(2): 149-158.
    [423] VAN DER JEUGHT S, DIRCKX J J. Real-time structured light profilometry: A review [J]. Optics and Lasers in Engineering, 2016, 87: 18-31. doi:  10.1016/j.optlaseng.2016.01.011
    [424] ZUO C, HUANG L, ZHANG M, et al. Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review [J]. Optics & Lasers in Engineering, 2016, 85: 84-103.
    [425] SU X, CHEN W. Reliability-guided phase unwrapping algorithm: A review [J]. Optics and Lasers in Engineering, 2004, 42(3): 245-261. doi:  10.1016/j.optlaseng.2003.11.002
    [426] GUTMANN B, WEBER H. Phase unwrapping with the branch-cut method: Role of phase-field direction [J]. Applied Optics, 2000, 39(26): 4802-4816. doi:  10.1364/AO.39.004802
    [427] ZAPPA E, BUSCA G. Comparison of eight unwrapping algorithms applied to Fourier-transform profilometry [J]. Optics and Lasers in Engineering, 2008, 46(2): 106-116. doi:  10.1016/j.optlaseng.2007.09.002
    [428] GHIGLIA D C, ROMERO L A. Minimum Lp-norm two-dimensional phase unwrapping [J]. JOSA A, 1996, 13(10): 1999-2013. doi:  10.1364/JOSAA.13.001999
    [429] TROUVE E, NICOLAS J-M, MAITRE H. Improving phase unwrapping techniques by the use of local frequency estimates [J]. IEEE Transactions on Geoscience and Remote Sensing, 1998, 36(6): 1963-1972. doi:  10.1109/36.729368
    [430] ZEBKER H A, LU Y. Phase unwrapping algorithms for radar interferometry: residue-cut, least-squares, and synthesis algorithms [J]. JOSA A, 1998, 15(3): 586-598. doi:  10.1364/JOSAA.15.000586
    [431] HUNTLEY J M, SALDNER H. Temporal phase-unwrapping algorithm for automated interferogram analysis [J]. Applied Optics, 1993, 32(17): 3047-3052. doi:  10.1364/AO.32.003047
    [432] GUSHOV V, SOLODKIN Y N. Automatic processing of fringe patterns in integer interferometers [J]. Optics and Lasers in Engineering, 1991, 14(4-5): 311-324. doi:  10.1016/0143-8166(91)90055-X
    [433] SANSONI G, CAROCCI M, RODELLA R. Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors [J]. Applied Optics, 1999, 38(31): 6565-6573. doi:  10.1364/AO.38.006565
    [434] ZHAO H, CHEN W, TAN Y. Phase-unwrapping algorithm for the measurement of three-dimensional object shapes [J]. Applied Optics, 1994, 33(20): 4497-4500. doi:  10.1364/AO.33.004497
    [435] CHENG Y-Y, WYANT J C. Two-wavelength phase shifting interferometry [J]. Applied Optics, 1984, 23(24): 4539-4543. doi:  10.1364/AO.23.004539
    [436] CREATH K, CHENG Y-Y, WYANT J C. Contouring aspheric surfaces using two-wavelength phase-shifting interferometry [J]. Optica Acta:International Journal of Optics, 1985, 32(12): 1455-1464. doi:  10.1080/713821689
    [437] BURKE J, BOTHE T, OSTEN W, et al. Reverse engineering by fringe projection[C]//Interferometry XI: Applications. International Society for Optics and Photonics, 2002: 312–325.
    [438] DING Y, XI J, YU Y, et al. Recovering the absolute phase maps of two fringe patterns with selected frequencies [J]. Optics Letters, 2011, 36(13): 2518-2520. doi:  10.1364/OL.36.002518
    [439] FALAGGIS K, TOWERS D P, TOWERS C E. Algebraic solution for phase unwrapping problems in multiwavelength interferometry [J]. Applied Optics, 2014, 53(17): 3737-3747. doi:  10.1364/AO.53.003737
    [440] PETKOVIĆ T, PRIBANIĆ T, DJONLIĆ M. Temporal phase unwrapping using orthographic projection [J]. Optics and Lasers in Engineering, 2017, 90: 34-47. doi:  10.1016/j.optlaseng.2016.09.006
    [441] XING S, GUO H. Temporal phase unwrapping for fringe projection profilometry aided by recursion of Chebyshev polynomials [J]. Applied Optics, 2017, 56(6): 1591-1602. doi:  10.1364/AO.56.001591
    [442] SANSONI G, CORINI S, LAZZARI S, et al. Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications [J]. Applied Optics, 1997, 36(19): 4463-4472. doi:  10.1364/AO.36.004463
    [443] LI Z, SHI Y, WANG C, et al. Accurate calibration method for a structured light system [J]. Optical Engineering, 2008, 47(5): 053604. doi:  10.1117/1.2931517
    [444] SALDNER H O, HUNTLEY J M. Temporal phase unwrapping: Application to surface profiling of discontinuous objects [J]. Applied Optics, 1997, 36(13): 2770-2775. doi:  10.1364/AO.36.002770
    [445] MARTINEZ-CELORIO R A, DAVILA A, KAUFMANN G H, et al. Extension of the displacement measurement range for electronic speckle-shearing pattern interferometry using carrier fringes and a temporal-phase-unwrapping method [J]. Optical Engineering, 2000, 39(3): 751-758. doi:  10.1117/1.602423
    [446] HUANG L, ASUNDI A K. Phase invalidity identification framework with the temporal phase unwrapping method [J]. Measurement Science and Technology, 2011, 22(3): 035304. doi:  10.1088/0957-0233/22/3/035304
    [447] TIAN J, PENG X, ZHAO X. A generalized temporal phase unwrapping algorithm for three-dimensional profilometry [J]. Optics and Lasers in Engineering, 2008, 46(4): 336-342. doi:  10.1016/j.optlaseng.2007.11.002
    [448] PEDRINI G, ALEXEENKO I, OSTEN W, et al. Temporal phase unwrapping of digital hologram sequences [J]. Applied Optics, 2003, 42(29): 5846-5854. doi:  10.1364/AO.42.005846
    [449] WEISE T, LEIBE B, VAN GOOL L. Fast 3D scanning with automatic motion compensation[C]//2007 IEEE Conference on Computer Vision and Pattern Recognition (CVPR'07), 2007.
    [450] QIAN J, TAO T, FENG S, et al. Motion-artifact-free dynamic 3D shape measurement with hybrid Fourier-transform phase-shifting profilometry [J]. Optics Express, 2019, 27(3): 2713. doi:  10.1364/OE.27.002713
    [451] TAO T, CHEN Q, FENG S, et al. High-speed real-time 3D shape measurement based on adaptive depth constraint [J]. Optics Express, 2018, 26(17): 22440. doi:  10.1364/OE.26.022440
    [452] TAO T, CHEN Q, FENG S, et al. High-precision real-time 3D shape measurement based on a quad-camera system [J]. Journal of Optics, 2018, 20(1): 014009. doi:  10.1088/2040-8986/aa9e0f
    [453] TAO T, CHEN Q, DA J, et al. Real-time 3-D shape measurement with composite phase-shifting fringes and multi-view system [J]. Optics Express, 2016, 24(18): 20253-20269. doi:  10.1364/OE.24.020253
    [454] BRÄUER-BURCHARDT C, MUNKELT C, HEINZE M, et al. Using geometric constraints to solve the point correspondence problem in fringe projection based 3D measuring systems[C]//International Conference on Image Analysis and Processing, 2011.
    [455] LI Z, ZHONG K, LI Y F, et al. Multiview phase shifting: A full-resolution and high-speed 3D measurement framework for arbitrary shape dynamic objects [J]. Optics Letters, 2013, 38(9): 1389-1391. doi:  10.1364/OL.38.001389
    [456] QIAN J, FENG S, TAO T, et al. High-resolution real-time 360° 3D model reconstruction of a handheld object with fringe projection profilometry [J]. Optics Letters, 2019, 44(23): 5751. doi:  10.1364/OL.44.005751
    [457] QIAN J, FENG S, XU M, et al. High-resolution real-time 360° 3D surface defect inspection with fringe projection profilometry [J]. Optics and Lasers in Engineering, 2021, 137: 106382. doi:  10.1016/j.optlaseng.2020.106382
    [458] ZHOU P, GOODSON K E. Subpixel displacement and deformation gradient measurement using digital image/speckle correlation [J]. Optical Engineering, 2001, 40(8): 1613-1621. doi:  10.1117/1.1387992
    [459] ZHANG J, JIN G, MA S, et al. Application of an improved subpixel registration algorithm on digital speckle correlation measurement [J]. Optics & Laser Technology, 2003, 35(7): 533-542.
    [460] FENG S, CHEN Q, GU G, et al. Fringe pattern analysis using deep learning [J]. Advanced Photonics, 2019, 1(2): 025001. doi:  10.1117/1.AP.1.2.025001
    [461] FENG S, ZUO C, HU Y, et al. Deep-learning-based fringe-pattern analysis with uncertainty estimation [J]. Optica, 2021, 8(12): 1507-1510. doi:  10.1364/OPTICA.434311
    [462] FENG S, ZUO C, YIN W, et al. Micro deep learning profilometry for high-speed 3D surface imaging[J/OL]. Optics and Lasers in Engineering, 2019, 121: 416–427. [2019–12–20]. https://linkinghub.elsevier.com/retrieve/pii/S0143816619302015.
    [463] QIAN J, FENG S, TAO T, et al. Deep-learning-enabled geometric constraints and phase unwrapping for single-shot absolute 3D shape measurement[J/OL]. APL Photonics, 2020, 5(4): 046105. [2020–06–23]. http://aip.scitation.org/doi/10.1063/5.0003217. DOI: 10.1063/5.0003217.
    [464] QIAN J, FENG S, LI Y, et al. Single-shot absolute 3D shape measurement with deep-learning-based color fringe projection profilometry [J]. Optics Letters, 2020, 45(7): 1842-1845. doi:  10.1364/OL.388994
    [465] LI Y, QIAN J, FENG S, et al. Composite fringe projection deep learning profilometry for single-shot absolute 3D shape measurement[J/OL]. Optics Express, 2022, 30(3): 3424. [2022–02–13]. https://opg.optica.org/abstract.cfm?URI=oe-30-3-3424.
    [466] VAN DER JEUGHT S, DIRCKX J J J. Deep neural networks for single shot structured light profilometry[J/OL]. Optics Express, 2019, 27(12): 17091. [2020–07–19]. https://www.osapublishing.org/abstract.cfm?URI=oe-27-12-17091.
    [467] NGUYEN H, WANG Y, WANG Z. Single-shot 3D shape reconstruction using structured light and deep convolutional neural networks[J/OL]. Sensors, 2020, 20(13): 3718. [2020–07–08]. https://www.mdpi.com/1424-8220/20/13/3718.
    [468] ZHENG Y, WANG S, LI Q, et al. Fringe projection profilometry by conducting deep learning from its digital twin[J/OL]. Optics Express, 2020, 28(24): 36568. [2021–05–08]. https://www.osapublishing.org/abstract.cfm?URI=oe-28-24-36568.
    [469] YIN W, HU Y, FENG S, et al. Single-shot 3D shape measurement using an end-to-end stereo matching network for speckle projection profilometry [J]. Optics Express, 2021, 29(9): 13388-13407. doi:  10.1364/OE.418881
    [470] QIAO G, HUANG Y, SONG Y, et al. A single-shot phase retrieval method for phase measuring deflectometry based on deep learning [J]. Optics Communications, 2020, 476: 126303. doi:  10.1016/j.optcom.2020.126303
    [471] ZHOU W, SONG Y, QU X, et al. Fourier transform profilometry based on convolution neural network[C/OL]//HAN S, YOSHIZAWA T, ZHANG S. Optical Metrology and Inspection for Industrial Applications V. Beijing, China: SPIE, 2018: 62. [2020–03–15]. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/10819/2500884/Fourier-transform-profilometry-based-on-convolution-neural-network/10.1117/12.2500884.full.
    [472] YANG T, ZHANG Z, LI H, et al. Single-shot phase extraction for fringe projection profilometry using deep convolutional generative adversarial network [J]. Measurement Science and Technology, 2020, 32(1): 015007. doi:  10.1088/1361-6501/aba5c5
    [473] YAN K, YU Y, HUANG C, et al. Fringe pattern denoising based on deep learning[J/OL]. Optics Communications, 2019, 437: 148–152. [2021–11–29]. https://linkinghub.elsevier.com/retrieve/pii/S0030401818311076.
    [474] GERSHUN A. The light field [J]. Journal of Mathematics and Physics, 1939, 18(1-4): 51-151. doi:  10.1002/sapm193918151
    [475] ADELSON E H, WANG J Y A. Single lens stereo with a plenoptic camera [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992, 14(2): 99-106. doi:  10.1109/34.121783
    [476] NG R, LEVOY M, BREDIF M, et al. Light field photography with a hand-held plenoptic camera[R]. California: Stanford University, 2005.
    [477] PERWASS C, WIETZKE L. Single lens 3D-camera with extended depth-of-field[C/OL]. [2019–06–04]. http://proceedings.spiedigitallibrary.org/proceeding.aspx?doi=10.1117/12.909882.
    [478] HEINZE C, SPYROPOULOS S, HUSSMANN S, et al. Automated robust metric calibration algorithm for multifocus plenoptic cameras [J]. IEEE Transactions on Instrumentation and Measurement, 2016, 65(5): 1197-1205. doi:  10.1109/TIM.2015.2507412
    [479] LEVOY M, HANRAHAN P. Light field rendering[C/OL]//Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques - SIGGRAPH ’96. Not Known: ACM Press, 1996: 31–42. [2019–05–10]. http://portal.acm.org/citation.cfm?doid=237170.237199.
    [480] YANG J C, EVERETT M, BUEHLER C, et al. A real-time distributed light field camera [J]. Rendering Techniques, 2002: 77-86.
    [481] WILBURN B, JOSHI N, VAISH V, et al. High performance imaging using large camera arrays[C/OL]//ACM SIGGRAPH 2005 Papers. New York, NY, USA: ACM, 2005: 765–776. [2019–06–05]. http://doi.acm.org/10.1145/1186822.1073259.
    [482] LIN X, WU J, ZHENG G, et al. Camera array based light field microscopy [J]. Biomedical Optics Express, 2015, 6(9): 3179-3189. doi:  10.1364/BOE.6.003179
    [483] VEERARAGHAVAN A, RASKAR R, AGRAWAL A, et al. Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing [J]. ACM Trans Graph, 2007, 26(3): 69. doi:  10.1145/1276377.1276463
    [484] MARWAH K, WETZSTEIN G, BANDO Y, et al. Compressive light field photography using overcomplete dictionaries and optimized projections [J]. ACM Transactions on Graphics, 2013, 32(4): 1-12. doi:  10.1145/2461912.2461914
    [485] LIANG C-K, LIN T-H, WONG B-Y, et al. Programmable aperture photography: Multiplexed light field acquisition [M]//ACM SIGGRAPH 2008 papers. 2008: 1-10.
    [486] ORTH A, CROZIER K B. Light field moment imaging [J]. Optics Letters, 2013, 38(15): 2666. doi:  10.1364/OL.38.002666
    [487] PARK J-H, LEE S-K, JO N-Y, et al. Light ray field capture using focal plane sweeping and its optical reconstruction using 3D displays [J]. Optics Express, 2014, 22(21): 25444. doi:  10.1364/OE.22.025444
    [488] CHEN N, REN Z, LI D, et al. Analysis of the noise in backprojection light field acquisition and its optimization [J]. Applied Optics, 2017, 56(13): F20. doi:  10.1364/AO.56.000F20
    [489] LU C-H, MUENZEL S, FLEISCHER J W. High-resolution light-field imaging via phase space retrieval [J]. Applied Optics, 2019, 58(5): A142. doi:  10.1364/AO.58.00A142
    [490] CHEN N, ZUO C. 3D imaging technology based on depth measurement [J]. Infrared and Laser Engineering, 2019, 48(6): 0603013. (in Chinese) doi:  10.3788/IRLA201948.0603013
    [491] XIAO X, WANG Z, SUN C, et al. Research on focusing and ranging method based on light field camera technology [J]. Acta Photonica Sinica, 2008, 37(12): 2539. (in Chinese)
    [492] VAISH V, GARG G, TALVALA E, et al. Synthetic aperture focusing using a shear-warp factorization of the viewing transform[C/OL]//2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05). San Diego, CA, USA: IEEE, 2005: 129. [2019–06–04]. http://ieeexplore.ieee.org/document/1565441/.
    [493] Godfrey Hounsfield[Z/OL]. (2019–04–15)[2019–06–28].https://en.wikipedia.org/w/index.php?title=Godfrey_Hounsfield&oldid=892611195.
    [494] RADON J. On the determination of functions from their integral values along certain manifolds [J]. IEEE Transactions on Medical Imaging, 1986, 5(4): 170-176. doi:  10.1109/TMI.1986.4307775
    [495] Allan MacLeod Cormack[Z/OL]. (2019–06–04)[2019–06–29]. https://en.wikipedia.org/w/index.php?title=Allan_MacLeod_Cormack&oldid=900263100.
    [496] Isidor Isaac Rabi - Wikipedia[EB/OL]. [2019–06–29]. https://en.wikipedia.org/wiki/Isidor_Isaac_Rabi.
    [497] Nuclear magnetic resonance - Wikipedia[EB/OL]. [2019–06–29]. https://en.wikipedia.org/wiki/Nuclear_magnetic_resonance.
    [498] WEBB R H. Confocal optical microscopy [J]. Reports on Progress in Physics, 1996, 59(3): 427. doi:  10.1088/0034-4885/59/3/003
    [499] DIASPRO A. Confocal and Two-Photon Microscopy: Foundations, Applications and Advances[M/OL]. New York: Wiley-Liss, 2002. http://adsabs.harvard.edu/abs/2001ctmf.book.....D.
    [500] ZIPFEL W R, WILLIAMS R M, WEBB W W. Nonlinear magic: Multiphoton microscopy in the biosciences [J]. Nature Biotechnology, 2003, 21(11): 1369-1377. doi:  10.1038/nbt899
    [501] OLARTE O E, ANDILLA J, GUALDA E J, et al. Light-sheet microscopy: A tutorial [J]. Advances in Optics and Photonics, 2018, 10(1): 111. doi:  10.1364/AOP.10.000111
    [502] SARDER P, NEHORAI A. Deconvolution methods for 3-D fluorescence microscopy images [J]. IEEE Signal Processing Magazine, 2006, 23(3): 32-45. doi:  10.1109/MSP.2006.1628876
    [503] WEINSTEIN M, CASTLEMAN K R. Reconstructing 3-D specimens from 2-D section images[C/OL]//Quantitative Imagery in the Biomedical Sciences I. International Society for Optics and Photonics, 1971: 131–138. [2019–06–29].https://www.spiedigitallibrary.org/conference-proceedings-of-spie/0026/0000/Reconstructing-3-D-Specimens-From-2-D-Section-Images/10.1117/12.975337.short.
    [504] AGARD D A. Optical sectioning microscopy: Cellular architecture in three dimensions [J]. Annual Review of Biophysics and Bioengineering, 1984, 13: 191-219. doi:  10.1146/annurev.bb.13.060184.001203
    [505] GIBSON S F, LANNI F. Experimental test of an analytical model of aberration in an oil-immersion objective lens used in three-dimensional light microscopy [J]. JOSA A, 1992, 9(1): 154-166. doi:  10.1364/JOSAA.9.000154
    [506] MONCK J R, OBERHAUSER A F, KEATING T J, et al. Thin-section ratiometric Ca2+ images obtained by optical sectioning of fura-2 loaded mast cells [J]. The Journal of Cell Biology, 1992, 116(3): 745-759. doi:  10.1083/jcb.116.3.745
    [507] MCNALLY J G, KARPOVA T, COOPER J, et al. Three-dimensional imaging by deconvolution microscopy [J]. Methods, 1999, 19(3): 373-385. doi:  10.1006/meth.1999.0873
    [508] PREZA C, MILLER M I, THOMAS L J, et al. Regularized linear method for reconstruction of three-dimensional microscopic objects from optical sections [J]. Journal of the Optical Society of America A, Optics and Image Science, 1992, 9(2): 219-228. doi:  10.1364/JOSAA.9.000219
    [509] DEY N, BLANC-FERAUD L, ZIMMER C, et al. Richardson–Lucy algorithm with total variation regularization for 3D confocal microscope deconvolution [J]. Microscopy Research and Technique, 2006, 69(4): 260-266. doi:  10.1002/jemt.20294
    [510] VAN KEMPEN G M P, VAN VLIET L J. Background estimation in nonlinear image restoration [J]. Journal of the Optical Society of America A, 2000, 17(3): 425. doi:  10.1364/JOSAA.17.000425
    [511] REMMELE S, SEELAND M, HESSER J. Fluorescence microscopy deconvolution based on Bregman iteration and Richardson-Lucy algorithm with TV regularization[M/OL]//Bildverarbeitung für die Medizin. Berlin: Springer, 2008: 72–76. [2018–07–20]. https://link.springer.com/chapter/10.1007/978-3-540-78640-5_15.
    [512] HOLMES T J. Maximum-likelihood image restoration adapted for noncoherent optical imaging [J]. Journal of the Optical Society of America A, 1988, 5(5): 666. doi:  10.1364/JOSAA.5.000666
    [513] HOLMES T J. Blind deconvolution of quantum-limited incoherent imagery: Maximum-likelihood approach [J]. Journal of the Optical Society of America A, 1992, 9(7): 1052. doi:  10.1364/JOSAA.9.001052
    [514] HOLMES T J, O’CONNOR N J. Blind deconvolution of 3D transmitted light brightfield micrographs [J]. Journal of Microscopy, 2000, 200(2): 114-127. doi:  10.1046/j.1365-2818.2000.00751.x
    [515] MARKHAM J, CONCHELLO J-A. Parametric blind deconvolution: A Robust method for the simultaneous estimation of image and blur [J]. JOSA A, 1999, 16(10): 2377-2391. doi:  10.1364/JOSAA.16.002377
    [516] LEVOY M, NG R, ADAMS A, et al. Light Field Microscopy[C/OL]//ACM SIGGRAPH 2006 Papers. New York, USA: ACM, 2006: 924–934. [2019–06–05]. http://doi.acm.org/10.1145/1179352.1141976.
    [517] LEVOY M, ZHANG Z, MCDOWALL I. Recording and controlling the 4D light field in a microscope using microlens arrays [J]. Journal of Microscopy, 2009, 235(2): 144-162. doi:  10.1111/j.1365-2818.2009.03195.x
    [518] BROXTON M, GROSENICK L, YANG S, et al. Wave optics theory and 3-D deconvolution for the light field microscope [J]. Optics Express, 2013, 21(21): 25418. doi:  10.1364/OE.21.025418
    [519] GUO C, LIU W, HUA X, et al. Fourier light-field microscopy [J]. Optics Express, 2019, 27(18): 25573. doi:  10.1364/OE.27.025573
    [520] PREVEDEL R, YOON Y-G, HOFFMANN M, et al. Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy [J]. Nature Methods, 2014, 11(7): 727-730. doi:  10.1038/nmeth.2964
    [521] PÉGARD N C, LIU H-Y, ANTIPA N, et al. Compressive light-field microscopy for 3D neural activity recording [J]. Optica, 2016, 3(5): 517. doi:  10.1364/OPTICA.3.000517
    [522] SKOCEK O, NÖBAUER T, WEILGUNY L, et al. High-speed volumetric imaging of neuronal activity in freely moving rodents [J]. Nature Methods, 2018, 15(6): 429-432. doi:  10.1038/s41592-018-0008-0
    [523] LI H, GUO C, KIM-HOLZAPFEL D, et al. Fast, volumetric live-cell imaging using high-resolution light-field microscopy [J]. Biomedical Optics Express, 2019, 10(1): 29. doi:  10.1364/BOE.10.000029
    [524] ZHANG Z, BAI L, CONG L, et al. Imaging volumetric dynamics at high speed in mouse and zebrafish brain with confocal light field microscopy [J]. Nature Biotechnology, 2021, 39(1): 74-83. doi:  10.1038/s41587-020-0628-7
    [525] WU J, LU Z, JIANG D, et al. Iterative tomography with digital adaptive optics permits hour-long intravital observation of 3D subcellular dynamics at millisecond scale [J]. Cell, 2021, 184(12): 3318-3332. doi:  10.1016/j.cell.2021.04.029
    [526] WOLF E. Three-dimensional structure determination of semi-transparent objects from holographic data [J]. Optics Communications, 1969, 1(4): 153-156. doi:  10.1016/0030-4018(69)90052-2
    [527] KAK A C, SLANEY M. Principles of Computerized Tomographic Imaging[M/OL]. Philadelphia: SIAM, 2001. [2017–09–25]. http://epubs.siam.org/doi/pdf/10.1137/1.9780898719277.fm.
    [528] HAEBERLÉ O, BELKEBIR K, GIOVANINNI H, et al. Tomographic diffractive microscopy: Basics, techniques and perspectives [J]. Journal of Modern Optics, 2010, 57(9): 686-699. doi:  10.1080/09500340.2010.493622
    [529] RAPPAZ B, MARQUET P, CUCHE E, et al. Measurement of the integral refractive index and dynamic cell morphometry of living cells with digital holographic microscopy [J]. Optics Express, 2005, 13(23): 9361-9373. doi:  10.1364/OPEX.13.009361
    [530] LAUER V. New approach to optical diffraction tomography yielding a vector equation of diffraction tomography and a novel tomographic microscope [J]. Journal of Microscopy, 2002, 205(2): 165-176. doi:  10.1046/j.0022-2720.2001.00980.x
    [531] CHOI W. Tomographic phase microscopy and its biological applications[J/OL]. 3D Research, 2012, 3(4): 1-11. [2017–11–20]. http://link.springer.com/10.1007/3DRes.04(2012)5.
    [532] CHARRIÈRE F, MARIAN A, MONTFORT F, et al. Cell refractive index tomography by digital holographic microscopy [J]. Optics Letters, 2006, 31(2): 178. doi:  10.1364/OL.31.000178
    [533] CHARRIÈRE F, PAVILLON N, COLOMB T, et al. Living specimen tomography by digital holographic microscopy: Morphometry of testate amoeba [J]. Optics Express, 2006, 14(16): 7005. doi:  10.1364/OE.14.007005
    [534] CHOI W, FANG-YEN C, BADIZADEGAN K, et al. Tomographic phase microscopy [J]. Nature Methods, 2007, 4(9): 717-719. doi:  10.1038/nmeth1078
    [535] SUNG Y, CHOI W, FANG-YEN C, et al. Optical diffraction tomography for high resolution live cell imaging [J]. Optics Express, 2009, 17(1): 266-277. doi:  10.1364/OE.17.000266
    [536] KIM K, YOON H, DIEZ-SILVA M, et al. High-resolution three-dimensional imaging of red blood cells parasitized by Plasmodium falciparum and in situ hemozoin crystals using optical diffraction tomography [J]. Journal of Biomedical Optics, 2013, 19(1): 011005. doi:  10.1117/1.JBO.19.1.011005
    [537] COTTE Y, TOY F, JOURDAIN P, et al. Marker-free phase nanoscopy [J]. Nature Photonics, 2013, 7(2): 113-117. doi:  10.1038/nphoton.2012.329
    [538] DEVANEY A J. A filtered backpropagation algorithm for diffraction tomography [J]. Ultrasonic Imaging, 1982, 4(4): 336-350. doi:  10.1016/0161-7346(82)90017-7
    [539] BARTY A, NUGENT K A, ROBERTS A, et al. Quantitative phase tomography [J]. Optics Communications, 2000, 175(4): 329-336. doi:  10.1016/S0030-4018(99)00726-9
    [540] DEVANEY A J. Inverse-scattering theory within the Rytov approximation [J]. Optics Letters, 1981, 6(8): 374-376. doi:  10.1364/OL.6.000374
    [541] CHEN B, STAMNES J J. Validity of diffraction tomography based on the first Born and the first Rytov approximations [J]. Applied Optics, 1998, 37(14): 2996-3006. doi:  10.1364/AO.37.002996
    [542] DEBAILLEUL M, GEORGES V, SIMON B, et al. High-resolution three-dimensional tomographic diffractive microscopy of transparent inorganic and biological samples [J]. Optics Letters, 2009, 34(1): 79-81. doi:  10.1364/OL.34.000079
    [543] FANG-YEN C, CHOI W, SUNG Y, et al. Video-rate tomographic phase microscopy [J]. Journal of Biomedical Optics, 2011, 16(1): 011005. doi:  10.1117/1.3522506
    [544] KIM K, KIM K S, PARK H, et al. Real-time visualization of 3-D dynamic microscopic objects using optical diffraction tomography [J]. Optics Express, 2013, 21(26): 32269. doi:  10.1364/OE.21.032269
    [545] JIN D, ZHOU R, YAQOOB Z, et al. Dynamic spatial filtering using a digital micromirror device for high-speed optical diffraction tomography [J]. Optics Express, 2018, 26(1): 428. doi:  10.1364/OE.26.000428
    [546] KIM T, ZHOU R, MIR M, et al. White-light diffraction tomography of unlabelled live cells [J]. Nature Photonics, 2014, 8(3): 256-263. doi:  10.1038/nphoton.2013.350
    [547] LEE K, KIM K, KIM G, et al. Time-multiplexed structured illumination using a DMD for optical diffraction tomography [J]. Optics Letters, 2017, 42(5): 999. doi:  10.1364/OL.42.000999
    [548] ISIKMAN S O, BISHARA W, MAVANDADI S, et al. Lens-free optical tomographic microscope with a large imaging volume on a chip [J]. Proceedings of the National Academy of Sciences, 2011, 108(18): 7296-7301. doi:  10.1073/pnas.1015638108
    [549] SOTO J M, RODRIGO J A, ALIEVA T. Label-free quantitative 3D tomographic imaging for partially coherent light microscopy [J]. Optics Express, 2017, 25(14): 15699-15712. doi:  10.1364/OE.25.015699
    [550] TIAN L, WALLER L. 3D intensity and phase imaging from light field measurements in an led array microscope [J]. Optica, 2015, 2(2): 104. doi:  10.1364/OPTICA.2.000104
    [551] ZHANG R, CAI Z, SUN J, et al. Calculation of coherent field and its application in optical imaging [J]. Laser & Optoelectronics Progress, 2021, 58(18): 1811025. (in Chinese)
    [552] WOLF E. New theory of partial coherence in the space–frequency domain. Part I: Spectra and cross spectra of steady-state sources [J]. JOSA, 1982, 72(3): 343-351. doi:  10.1364/JOSA.72.000343
    [553] WIGNER E. On the quantum correction for thermodynamic equilibrium [J]. Physical Review, 1932, 40(5): 749-759. doi:  10.1103/PhysRev.40.749
    [554] DOLIN L S. Beam description of weakly-inhomogeneous wave fields [J]. Izv Vyssh Uchebn Zaved Radiofiz, 1964, 7: 559-563.
    [555] WALTHER A. Radiometry and coherence [J]. JOSA, 1968, 58(9): 1256-1259. doi:  10.1364/JOSA.58.001256
    [556] WALTHER A. Radiometry and coherence [J]. JOSA, 1973, 63(12): 1622-1623. doi:  10.1364/JOSA.63.001622
    [557] BASTIAANS M J. The Wigner distribution function applied to optical signals and systems [J]. Optics Communications, 1978, 25(1): 26-30. doi:  10.1016/0030-4018(78)90080-9
    [558] BASTIAANS M J. Wigner distribution function and its application to first-order optics [J]. JOSA, 1979, 69(12): 1710-1716. doi:  10.1364/JOSA.69.001710
    [559] BASTIAANS M J. Application of the Wigner distribution function to partially coherent light [J]. JOSA A, 1986, 3(8): 1227-1238. doi:  10.1364/JOSAA.3.001227
    [560] TESTORF M, HENNELLY B, OJEDA-CASTANEDA J. Phase-Space Optics: Fundamentals and Applications[M]. New York: McGraw-Hill Education, 2009.
    [561] ZERNIKE F. The concept of degree of coherence and its application to optical problems [J]. Physica, 1938, 5(8): 785-795. doi:  10.1016/S0031-8914(38)80203-2
    [562] SANTARSIERO M, BORGHI R. Measuring spatial coherence by using a reversed-wavefront Young interferometer [J]. Optics Letters, 2006, 31(7): 861. doi:  10.1364/OL.31.000861
    [563] GONZÁLEZ A I, MEJÍA Y. Nonredundant array of apertures to measure the spatial coherence in two dimensions with only one interferogram [J]. JOSA A, 2011, 28(6): 1107-1113. doi:  10.1364/JOSAA.28.001107
    [564] NAIK D N, PEDRINI G, TAKEDA M, et al. Spectrally resolved incoherent holography: 3D spatial and spectral imaging using a Mach–Zehnder radial-shearing interferometer [J]. Optics Letters, 2014, 39(7): 1857. doi:  10.1364/OL.39.001857
    [565] IACONIS C, WALMSLEY I A. Direct measurement of the two-point field correlation function [J]. Optics Letters, 1996, 21(21): 1783-1785. doi:  10.1364/OL.21.001783
    [566] NAIK D N, PEDRINI G, OSTEN W. Recording of incoherent-object hologram as complex spatial coherence function using Sagnac radial shearing interferometer and a Pockels cell [J]. Optics Express, 2013, 21(4): 3990-3995. doi:  10.1364/OE.21.003990
    [567] RAYMER M G, BECK M, MCALISTER D. Complex wave-field reconstruction using phase-space tomography [J]. Physical Review Letters, 1994, 72(8): 1137-1140. doi:  10.1103/PhysRevLett.72.1137
    [568] MCALISTER D F, BECK M, CLARKE L, et al. Optical phase retrieval by phase-space tomography and fractional-order Fourier transforms [J]. Optics Letters, 1995, 20(10): 1181-1183. doi:  10.1364/OL.20.001181
    [569] WALLER L, SITU G, FLEISCHER J W. Phase-space measurement and coherence synthesis of optical beams [J]. Nature Photonics, 2012, 6(7): 474-479. doi:  10.1038/nphoton.2012.144
    [570] TIAN L, ZHANG Z, PETRUCCELLI J C, et al. Wigner function measurement using a lenslet array [J]. Optics Express, 2013, 21(9): 10511-10525. doi:  10.1364/OE.21.010511
    [571] GLASNER D, BAGON S, IRANI M. Super-resolution from a single image[C/OL]//2009 IEEE 12th International Conference on Computer Vision. Kyoto: IEEE, 2009: 349–356. [2019–06–05]. http://ieeexplore.ieee.org/document/5459271/.
    [572] HUANG J-B, SINGH A, AHUJA N. Single image super-resolution from transformed self-exemplars[C/OL]//2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Boston, MA, USA: IEEE, 2015: 5197–5206. [2019–06–05]. http://ieeexplore.ieee.org/document/7299156/.
    [573] KIM K I, KWON Y. Single-image super-resolution using sparse regression and natural image prior [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 32(6): 1127-1133. doi:  10.1109/TPAMI.2010.25
    [574] WANG D, FU T, BI G, et al. Long-distance sub-diffraction high-resolution imaging using sparse sampling [J]. Sensors, 2020, 20(11): 3116. doi:  10.3390/s20113116
    [575] XIANG M, PAN A, ZHAO Y, et al. Coherent synthetic aperture imaging for visible remote sensing via reflective Fourier ptychography [J]. Optics Letters, 2021, 46(1): 29-32. doi:  10.1364/OL.409258
    [576] BIONDI F. Recovery of partially corrupted SAR images by super-resolution based on spectrum extrapolation [J]. IEEE Geoscience and Remote Sensing Letters, 2016, 14(2): 139-143.
    [577] BHATTACHARJEE S, SUNDARESHAN M K. Mathematical extrapolation of image spectrum for constraint-set design and set-theoretic superresolution [J]. JOSA A, 2003, 20(8): 1516-1527. doi:  10.1364/JOSAA.20.001516
    [578] ELAD M, DATSENKO D. Example-based regularization deployed to super-resolution reconstruction of a single image [J]. The Computer Journal, 2009, 52(1): 15-30.
    [579] BEVILACQUA M, ROUMY A, GUILLEMOT C, et al. Single-image super-resolution via linear mapping of interpolated self-examples [J]. IEEE Transactions on Image Processing, 2014, 23(12): 5334-5347. doi:  10.1109/TIP.2014.2364116
    [580] DONG C, LOY C C, HE K, et al. Image super-resolution using deep convolutional networks [J]. IEEE Transactions on Pattern Analysis and Machine Iintelligence, 2015, 38(2): 295-307.
    [581] ZOU Y, ZHANG L, LIU C, et al. Super-resolution reconstruction of infrared images based on a convolutional neural network with skip connections [J]. Optics and Lasers in Engineering, 2021, 146: 106717. doi:  10.1016/j.optlaseng.2021.106717
    [582] WANG B, ZOU Y, ZHANG L, et al. Low-light-level image super-resolution reconstruction based on a multi-scale features extraction network [J]. Photonics, 2021, 8(8): 321.
    [583] VANDEWALLE P, SÜSSTRUNK S, VETTERLI M. A frequency domain approach to registration of aliased images with application to super-resolution[J/OL]. EURASIP Journal on Advances in Signal Processing, 2006(1): 71459. [2019–06–05].https://asp-eurasipjournals.springeropen.com/articles/10.1155/ASP/2006/71459.
    [584] NGUYEN N, MILANFAR P. A wavelet-based interpolation-restoration method for superresolution (wavelet superresolution) [J]. Circuits Systems and Signal Processing, 2000, 19(4): 321-338. doi:  10.1007/BF01200891
    [585] PARK S C, PARK M K, KANG M G. Super-resolution image reconstruction: A technical overview [J]. IEEE Signal Processing Magazine, 2003, 20(3): 21-36. doi:  10.1109/MSP.2003.1203207
    [586] WANG Z, LIU D, YANG J, et al. Deep networks for image super-resolution with sparse prior[C/OL]//2015 IEEE International Conference on Computer Vision (ICCV). Santiago, Chile: IEEE, 2015: 370–378. [2019–06–05]. http://ieeexplore.ieee.org/document/7410407/.
    [587] IRANI M, PELEG S. Improving resolution by image registration [J]. CVGIP:Graphical Models and Image Processing, 1991, 53(3): 231-239. doi:  10.1016/1049-9652(91)90045-L
    [588] CHEN J, LI Y, CAO L. Research on region selection super resolution restoration algorithm based on infrared micro-scanning optical imaging model [J]. Scientific Reports, 2021, 11(1): 1-8. doi:  10.1038/s41598-020-79139-8
    [589] ZHANG X, HUANG W, XU M, et al. Super-resolution imaging for infrared micro-scanning optical system [J]. Optics Express, 2019, 27(5): 7719-7737. doi:  10.1364/OE.27.007719
    [590] DAI S, LIU J, XIANG H, et al. Super-resolution reconstruction of images based on uncontrollable microscanning and genetic algorithm [J]. Optoelectronics Letters, 2014, 10(4): 313-316. doi:  10.1007/s11801-014-4067-x
    [591] HUSZKA G, GIJS M A. Turning a normal microscope into a super-resolution instrument using a scanning microlens array [J]. Scientific Reports, 2018, 8: 601.
    [592] GUNTURK B K, ALTUNBASAK Y, MERSEREAU R M. Super-resolution reconstruction of compressed video using transform-domain statistics [J]. IEEE Transactions on Image Processing, 2004, 13(1): 33-43. doi:  10.1109/TIP.2003.819221
    [593] CABANSKI W A, BREITER R, MAUK K-H, et al. Miniaturized high-performance staring thermal imaging system[C]//Infrared Detectors and Focal Plane Arrays VI. International Society for Optics and Photonics, 2000: 208–219.
    [594] WANG B, ZUO C, SUN J, et al. A computational super-resolution technique based on coded aperture imaging[C/OL]//PETRUCCELLI J C, TIAN L, PREZA C. Computational Imaging V. United States: SPIE, 2020: 25. [2020–10–13]. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/11396/2560579/A-computational-super-resolution-technique-based-on-coded-aperture-imaging/10.1117/12.2560579.full.
    [595] LUCKE R L, RICKARD L J, BASHKANSKY M, et al. Synthetic aperture ladar (SAL): Fundamental theory, design equations for a satellite system, and laboratory demonstration[R/OL]. Fort Belvoir, VA: Defense Technical Information Center, 2002. [2019–06–05]. http://www.dtic.mil/docs/citations/ADA409859.
    [596] BASHKANSKY M, LUCKE R L, FUNK E, et al. Two-dimensional synthetic aperture imaging in the optical domain [J]. Optics Letters, 2002, 27(22): 1983. doi:  10.1364/OL.27.001983
    [597] BECK S M, BUCK J R, BUELL W F, et al. Synthetic-aperture imaging laser radar: Laboratory demonstration and signal processing [J]. Applied Optics, 2005, 44(35): 7621. doi:  10.1364/AO.44.007621
    [598] GARCÍA J, MICÓ V, GARCÍA-MARTÍNEZ P, et al. Synthetic aperture superresolution by structured light projection[C/OL]//AIP Conference Proceedings. Toledo (Spain): AIP, 2006: 136–145. [2019–06–05]. http://aip.scitation.org/doi/abs/10.1063/1.2361214.
    [599] GARCÍA J, ZALEVSKY Z, FIXLER D. Synthetic aperture superresolution by speckle pattern projection [J]. Optics Express, 2005, 13(16): 6073. doi:  10.1364/OPEX.13.006073
    [600] HOLLOWAY J, WU Y, SHARMA M K, et al. SAVI: Synthetic apertures for long-range, subdiffraction-limited visible imaging using Fourier ptychography [J]. Science Advances, 2017, 3(4): e1602564. doi:  10.1126/sciadv.1602564
    [601] KOCH B. Status and future of laser scanning, synthetic aperture radar and hyperspectral remote sensing data for forest biomass assessment [J]. ISPRS Journal of Photogrammetry and Remote Sensing, 2010, 65(6): 581-590. doi:  10.1016/j.isprsjprs.2010.09.001
    [602] HOLLOWAY J, ASIF M S, SHARMA M K, et al. Toward long distance, sub-diffraction imaging using coherent camera arrays[J/OL]. ArXiv: 1510.08470 [Physics], 2015. [2019–12–18]. http://arxiv.org/abs/1510.08470.
    [603] KENDRICK R L, DUNCAN A, OGDEN C, et al. Segmented planar imaging detector for EO reconnaissance[C]//Imaging and Applied Optics. OSA, 2013: CM4C.1.
    [604] KENDRICK R L, DUNCAN A, OGDEN C, et al. Flat-panel space-based space surveillance sensor[C]//Advanced Maui Optical and Space Surveillance Technologies Conference, 2013.
    [605] KATZ B, ROSEN J. Super-resolution in incoherent optical imaging using synthetic aperture with fresnel elements [J]. Optics Express, 2010, 18(2): 962-972. doi:  10.1364/OE.18.000962
    [606] ABBE E. Beiträge zur Theorie des Mikroskops und der mikroskopischen Wahrnehmung [J]. Archiv für mikroskopische Anatomie, 1873, 9(1): 413-418.
    [607] BETZIG E, PATTERSON G H, SOUGRAT R, et al. Imaging intracellular fluorescent proteins at nanometer resolution [J]. Science, 2006, 313(5793): 1642-1645. doi:  10.1126/science.1127344
    [608] RUST M J, BATES M, ZHUANG X. Sub-diffraction-limit imaging by stochastic optical reconstruction microscopy (STORM) [J]. Nature Methods, 2006, 3(10): 793. doi:  10.1038/nmeth929
    [609] HELL S W, WICHMANN J. Breaking the diffraction resolution limit by stimulated emission: stimulated-emission-depletion fluorescence microscopy [J]. Optics Letters, 1994, 19(11): 780-782.
    [610] GUSTAFSSON M G. Surpassing the lateral resolution limit by a factor of two using structured illumination microscopy [J]. Journal of Microscopy, 2000, 198(2): 82-87. doi:  10.1046/j.1365-2818.2000.00710.x
    [611] HESS S T, GIRIRAJAN T P, MASON M D. Ultra-high resolution imaging by fluorescence photoactivation localization microscopy [J]. Biophysical Journal, 2006, 91(11): 4258-4272. doi:  10.1529/biophysj.106.091116
    [612] XIA P, LIU X, WU B, et al. Superresolution imaging reveals structural features of EB1 in microtubule plus-end tracking [J]. Molecular Biology of the Cell, 2014, 25(25): 4166-4173. doi:  10.1091/mbc.e14-06-1133
    [613] NICKERSON A, HUANG T, LIN L-J, et al. Photoactivated localization microscopy with bimolecular fluorescence complementation (BiFC-PALM) for nanoscale imaging of protein-protein interactions in cells [J]. PloS One, 2014, 9(6): e100589. doi:  10.1371/journal.pone.0100589
    [614] LIU Z, XING D, SU Q P, et al. Super-resolution imaging and tracking of protein–protein interactions in sub-diffraction cellular space [J]. Nature Communications, 2014, 5: 4443. doi:  10.1038/ncomms5443
    [615] XU K, ZHONG G, ZHUANG X. Actin, spectrin, and associated proteins form a periodic cytoskeletal structure in axons [J]. Science, 2013, 339(6118): 452-456. doi:  10.1126/science.1232251
    [616] KLAR T A, JAKOBS S, DYBA M, et al. Fluorescence microscopy with diffraction resolution barrier broken by stimulated emission [J]. Proceedings of the National Academy of Sciences, 2000, 97(15): 8206-8210. doi:  10.1073/pnas.97.15.8206
    [617] WILLIG K I, RIZZOLI S O, WESTPHAL V, et al. STED microscopy reveals that synaptotagmin remains clustered after synaptic vesicle exocytosis [J]. Nature, 2006, 440(7086): 935. doi:  10.1038/nature04592
    [618] HELL S W. Far-field optical nanoscopy [J]. Science, 2007, 316(5828): 1153-1158. doi:  10.1126/science.1137395
    [619] KNER P, CHHUN B B, GRIFFIS E R, et al. Super-resolution video microscopy of live cells by structured illumination [J]. Nature Methods, 2009, 6(5): 339. doi:  10.1038/nmeth.1324
    [620] HUANG X, FAN J, LI L, et al. Fast, long-term, super-resolution imaging with hessian structured illumination Microscopy [J]. Nature Biotechnology, 2018, 36(5): 451-459. doi:  10.1038/nbt.4115
    [621] EGGELING C, WILLIG K I, SAHL S J, et al. Lens-based fluorescence nanoscopy [J]. Quarterly Reviews of Biophysics, 2015, 48(2): 178-243. doi:  10.1017/S0033583514000146
    [622] GUSTAFSSON M G. Nonlinear structured-illumination microscopy: Wide-field fluorescence imaging with theoretically unlimited resolution [J]. Proceedings of the National Academy of Sciences, 2005, 102(37): 13081-13086. doi:  10.1073/pnas.0406877102
    [623] WILDANGER D, MEDDA R, KASTRUP L, et al. A compact STED microscope providing 3D nanoscale resolution [J]. Journal of Microscopy, 2009, 236(1): 35-43. doi:  10.1111/j.1365-2818.2009.03188.x
    [624] SCHMIDT R, WURM C A, JAKOBS S, et al. Spherical nanosized focal spot unravels the interior of cells [J]. Nature Methods, 2008, 5(6): 539. doi:  10.1038/nmeth.1214
    [625] BERNING S, WILLIG K I, STEFFENS H, et al. Nanoscopy in a living mouse brain [J]. Science, 2012, 335(6068): 551-551. doi:  10.1126/science.1215369
    [626] SCHERMELLEH L, CARLTON P M, HAASE S, et al. Subdiffraction multicolor imaging of the nuclear periphery with 3D structured illumination microscopy[J/OL]. Science, 2008, 320(5881): 1332–1336. [2019–06–05]. http://www.sciencemag.org/cgi/doi/10.1126/science.1156947.
    [627] ZHIJIAN L, JINGZE L, YAQIONG W, et al. Principle and recent progress of several super-resolution fluorescence microscopy techniques [J]. Progress in Biochemistry and Biophysics, 2009, 36(12): 1626-1634.
    [628] HUANG B, WANG W, BATES M, et al. Three-dimensional super-resolution imaging by stochastic optical reconstruction microscopy [J]. Science, 2008, 319(5864): 810-813. doi:  10.1126/science.1153529
    [629] PRAKASH K, DIEDERICH B, HEINTZMANN R, et al. Super-resolution microscopy: a brief history and new avenues [J]. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2022, 380(2220): 20210110. doi:  10.1098/rsta.2021.0110
    [630] CHMYROV A, KELLER J, GROTJOHANN T, et al. Nanoscopy with more than 100,000 'doughnuts' [J]. Nature Methods, 2013, 10(8): 737. doi:  10.1038/nmeth.2556
    [631] SHARONOV A, HOCHSTRASSER R M. Wide-field subdiffraction imaging by accumulated binding of diffusing probes [J]. Proceedings of the National Academy of Sciences, 2006, 103(50): 18911-18916. doi:  10.1073/pnas.0609643104
    [632] BALZAROTTI F, EILERS Y, GWOSCH K C, et al. Nanometer resolution imaging and tracking of fluorescent molecules with minimal photon fluxes [J]. Science, 2017, 355(6325): 606-612. doi:  10.1126/science.aak9913
    [633] DERTINGER T, COLYER R, IYER G, et al. Fast, background-free, 3D super-resolution optical fluctuation imaging (SOFI) [J]. Proceedings of the National Academy of Sciences, 2009, 106(52): 22287-22292. doi:  10.1073/pnas.0907866106
    [634] WANG Z, GUO W, LI L, et al. Optical virtual imaging at 50 nm lateral resolution with a white-light nanoscope [J]. Nature Communications, 2011, 2: 218. doi:  10.1038/ncomms1211
    [635] SIGAL Y M, ZHOU R, ZHUANG X. Visualizing and discovering cellular structures with super-resolution microscopy [J]. Science, 2018, 361(6405): 880-887. doi:  10.1126/science.aau1044
    [636] XIANG H, QING Y, CUIFANG K, et al. Optical super-resolution imaging based on frequency shift [J]. Acta Optica Sinica, 2021, 41(1): 0111001. (in Chinese) doi:  10.3788/AOS202141.0111001
    [637] ZHANGHAO K, CHEN X, LIU W, et al. Super-resolution imaging of fluorescent dipoles via polarized structured illumination microscopy [J]. Nature Communications, 2019, 10(1): 4694. doi:  10.1038/s41467-019-12681-w
    [638] LIANG L, YAN W, QIN X, et al. Designing sub-2 nm organosilica nanohybrids for far-field super-resolution imaging [J]. Angewandte Chemie, 2020, 132(2): 756-761.
    [639] ZHAO W, ZHAO S, LI L, et al. Sparse deconvolution improves the resolution of live-cell super-resolution fluorescence microscopy[J/OL]. Nature Biotechnology, 2021. https://doi.org/10.1038/s41587-021-01092-2.
    [640] DAN D, LEI M, YAO B, et al. DMD-based LED-illumination super-resolution and optical sectioning microscopy [J]. Scientific Reports, 2013, 3(1): 1116. doi:  10.1038/srep01116
    [641] HELL S W, SAHL S J, BATES M, et al. The 2015 super-resolution microscopy roadmap[J]. Journal of Physics D: Applied Physics, 2015, 48(44): 443001.
    [642] CLEGG B. The Man Who Stopped Time: The Illuminating Story of Eadweard Muybridge – Pioneer Photographer, Father of the Motion Picture, Murderer[M]. Washington, DC: Joseph Henry Press, 2007, 7: 1, 106-108.
    [643] FURUTA M, NISHIKAWA Y, INOUE T, et al. A high-speed, high-sensitivity digital CMOS image sensor with a global shutter and 12-bit column-parallel cyclic A/D converters [J]. IEEE Journal of Solid-State Circuits, 2007, 42(4): 766-774. doi:  10.1109/JSSC.2007.891655
    [644] WANG X, YAN L, SI J, et al. High-frame-rate observation of single femtosecond laser pulse propagation in fused silica using an echelon and optical polarigraphy technique [J]. Applied Optics, 2014, 53(36): 8395-8399. doi:  10.1364/AO.53.008395
    [645] KAKUE T, TOSA K, YUASA J, et al. Digital light-in-flight recording by holography by use of a femtosecond pulsed laser [J]. IEEE Journal of Selected Topics in Quantum Electronics, 2011, 18(1): 479-485.
    [646] LI Z, ZGADZAJ R, WANG X, et al. Single-shot tomographic movies of evolving light-velocity objects [J]. Nature Communications, 2014, 5(1): 1-12.
    [647] NAKAGAWA K, IWASAKI A, OISHI Y, et al. Sequentially timed all-optical mapping photography (STAMP) [J]. Nature Photonics, 2014, 8(9): 695-700. doi:  10.1038/nphoton.2014.163
    [648] YUE Q-Y, CHENG Z-J, HAN L, et al. One-shot time-resolved holographic polarization microscopy for imaging laser-induced ultrafast phenomena [J]. Optics Express, 2017, 25(13): 14182-14191. doi:  10.1364/OE.25.014182
    [649] EHN A, BOOD J, LI Z, et al. FRAME: Femtosecond videography for atomic and molecular dynamics[J]. Light: Science & Applications, 2017, 6(9): e17045–e17045.
    [650] FERMANN M E, GALVANAUSKAS A, SUCHA G. Ultrafast Lasers: Technology and Applications[M]. Boca Raton: CRC Press, 2002.
    [651] WEINER A M. Ultrafast optical pulse shaping: A tutorial review [J]. Optics Communications, 2011, 284(15): 3669-3692. doi:  10.1016/j.optcom.2011.03.084
    [652] VELTEN A, WU D, JARABO A, et al. Femto-photography: Capturing and visualizing the propagation of light [J]. ACM Transactions on Graphics (ToG), 2013, 32(4): 1-8.
    [653] GAO L, LIANG J, LI C, et al. Single-shot compressed ultrafast photography at one hundred billion frames per second [J]. Nature, 2014, 516(7529): 74-77. doi:  10.1038/nature14005
    [654] LIANG J, ZHU L, WANG L V. Single-shot real-time femtosecond imaging of temporal focusing[J]. Light: Science & Applications, 2018, 7(1): 1–10.
    [655] QI D, CAO F, XU S, et al. 100-trillion-frame-per-second single-shot compressed ultrafast photography via molecular alignment [J]. Physical Review Applied, 2021, 15(2): 024051. doi:  10.1103/PhysRevApplied.15.024051
    [656] LEI S, ZHANG S. Flexible 3-D shape measurement using projector defocusing [J]. Optics Letters, 2009, 34(20): 3080-3082. doi:  10.1364/OL.34.003080
    [657] AYUBI G A, AYUBI J A, DI MARTINO J M, et al. Pulse-width modulation in defocused three-dimensional fringe projection [J]. Optics Letters, 2010, 35(21): 3682-3684. doi:  10.1364/OL.35.003682
    [658] ZUO C, CHEN Q, FENG S, et al. Optimized pulse width modulation pattern strategy for three-dimensional profilometry with projector defocusing [J]. Applied Optics, 2012, 51(19): 4477-4490. doi:  10.1364/AO.51.004477
    [659] ZUO C, CHEN Q, GU G, et al. High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection [J]. Optics and Lasers in Engineering, 2013, 51(8): 953-960. doi:  10.1016/j.optlaseng.2013.02.012
    [660] WANG Y, ZHANG S. Superfast multifrequency phase-shifting technique with optimal pulse width modulation [J]. Optics Express, 2011, 19(6): 5149-5155. doi:  10.1364/OE.19.005149
    [661] WANG Y, ZHANG S. Three-dimensional shape measurement with binary dithered patterns [J]. Applied Optics, 2012, 51(27): 6631-6636. doi:  10.1364/AO.51.006631
    [662] DAI J, ZHANG S. Phase-optimized dithering technique for high-quality 3D shape measurement [J]. Optics and Lasers in Engineering, 2013, 51(6): 790-795. doi:  10.1016/j.optlaseng.2013.02.003
    [663] DAI J, LI B, ZHANG S. High-quality fringe pattern generation using binary pattern optimization through symmetry and periodicity [J]. Optics and Lasers in Engineering, 2014, 52: 195-200. doi:  10.1016/j.optlaseng.2013.06.010
    [664] SUN J, ZUO C, FENG S, et al. Improved intensity-optimized dithering technique for 3D shape measurement [J]. Optics and Lasers in Engineering, 2015, 66: 158-164. doi:  10.1016/j.optlaseng.2014.09.008
    [665] DAI J, LI B, ZHANG S. Intensity-optimized dithering technique for three-dimensional shape measurement with projector defocusing [J]. Optics and Lasers in Engineering, 2014, 53: 79-85. doi:  10.1016/j.optlaseng.2013.08.015
    [666] LI B, WANG Y, DAI J, et al. Some recent advances on superfast 3D shape measurement with digital binary defocusing techniques [J]. Optics and Lasers in Engineering, 2014, 54: 236-246. doi:  10.1016/j.optlaseng.2013.07.010
    [667] ZHANG S, VAN DER WEIDE D, OLIVER J. Superfast phase-shifting method for 3-D shape measurement [J]. Optics Express, 2010, 18(9): 9684. doi:  10.1364/OE.18.009684
    [668] GONG Y, ZHANG S. Ultrafast 3-D shape measurement with an off-the-shelf DLP projector [J]. Optics Express, 2010, 18(19): 19743-19754. doi:  10.1364/OE.18.019743
    [669] ZUO C, TAO T, FENG S, et al. Micro Fourier transform profilometry (μFTP): 3D shape measurement at 10,000 frames per second [J]. Optics and Lasers in Engineering, 2018, 102: 70-91. doi:  10.1016/j.optlaseng.2017.10.013
    [670] YIN W, ZUO C, FENG S, et al. High-speed three-dimensional shape measurement using geometry-constraint-based number-theoretical phase unwrapping [J]. Optics and Lasers in Engineering, 2019, 115: 21-31. doi:  10.1016/j.optlaseng.2018.11.006
    [671] LAUGHNER J I, ZHANG S, LI H, et al. Mapping cardiac surface mechanics with structured light imaging [J]. American Journal of Physiology-Heart and Circulatory Physiology, 2012, 303(6): H712-H720. doi:  10.1152/ajpheart.00269.2012
    [672] ZHANG Q, SU X, CAO Y, et al. Optical 3-D shape and deformation measurement of rotating blades using stroboscopic structured illumination [J]. Optical Engineering, 2005, 44(11): 113601. doi:  10.1117/1.2127927
    [673] SCHAFFER M, GROSSE M, HARENDT B, et al. High-speed optical 3-D measurements for shape representation [J]. Optics and Photonics News, 2011, 22(12): 49-49. doi:  10.1364/OPN.22.12.000049
    [674] SCHAFFER M, GROSSE M, HARENDT B, et al. High-speed three-dimensional shape measurements of objects with laser speckles and acousto-optical deflection [J]. Optics Letters, 2011, 36(16): 3097-3099. doi:  10.1364/OL.36.003097
    [675] SCHAFFER M, GROSSE M, HARENDT B, et al. Statistical patterns: an approach for high-speed and high-accuracy shape measurements [J]. Optical Engineering, 2014, 53(11): 112205.
    [676] GROSSE M, SCHAFFER M, HARENDT B, et al. Fast data acquisition for three-dimensional shape measurement using fixed-pattern projection and temporal coding [J]. Optical Engineering, 2011, 50(10): 100503. doi:  10.1117/1.3646100
    [677] FUJIGAKI M, SAKAGUCHI T, MURATA Y. Development of a compact 3D shape measurement unit using the light-source-stepping method [J]. Optics and Lasers in Engineering, 2016, 85: 9-17. doi:  10.1016/j.optlaseng.2016.04.016
    [678] HEIST S, MANN A, KÜHMSTEDT P, et al. Array projection of aperiodic sinusoidal fringes for high-speed three-dimensional shape measurement [J]. Optical Engineering, 2014, 53(11): 112208. doi:  10.1117/1.OE.53.11.112208
    [679] HEIST S, LUTZKE P, SCHMIDT I, et al. High-speed three-dimensional shape measurement using GOBO projection [J]. Optics and Lasers in Engineering, 2016, 87: 90-96. doi:  10.1016/j.optlaseng.2016.02.017
    [680] HEIST S. 5D hyperspectral imaging: Fast and accurate measurement of surface shape and spectral characteristics using structured light [J]. Optics Express, 2018, 26(18): 23366-23379. doi:  10.1364/OE.26.023366
    [681] LANDMANN M, HEIST S, DIETRICH P, et al. High-speed 3D thermography [J]. Optics and Lasers in Engineering, 2019, 121: 448-455. doi:  10.1016/j.optlaseng.2019.05.009
    [682] FENG S, ZUO C, YIN W, et al. Micro deep learning profilometry for high-speed 3D surface imaging [J]. Optics and Lasers in Engineering, 2019, 121: 416-427. doi:  10.1016/j.optlaseng.2019.04.020
    [683] YOU L, YANG X, HE Y, et al. Jitter analysis of a superconducting nanowire single photon detector [J]. AIP Advances, 2013, 3(7): 072135. doi:  10.1063/1.4817581
    [684] RAGHURAM A, PEDIREDLA A, NARASIMHAN S G, et al. STORM: Super-resolving transients by oveRsampled measurements[C/OL]//2019 IEEE International Conference on Computational Photography (ICCP). Tokyo, Japan: IEEE, 2019: 1–11. [2022–01–27]. https://ieeexplore.ieee.org/document/8747334/.
    [685] WANG Z, MIKI S, FUJIWARA M. Superconducting nanowire single-photon detectors for quantum information and communications [J]. IEEE Journal of Selected Topics in Quantum Electronics, 2009, 15(6): 1741-1747. doi:  10.1109/JSTQE.2009.2034616
    [686] LINGDONG K, QINGYUAN Z, XUECOU T, et al. Progress and applications of superconducting nanowire delay-line single-photon imagers [J]. Laser & Optoelectronics Progress, 2021, 58(10): 1011002. (in Chinese) doi:  10.3788/LOP202158.1011002
    [687] BEITONG C, QIAN D, XIUMIN X, et al. The progress of single-photon photodetectors[J/OL]. Laser Technology. [2022-02-18]. http://kns.cnki.net/kcms/detail/51.1125.TN.20210927.2354.004.html.
    [688] TOBIN R, HALIMI A, MCCARTHY A, et al. Long-range depth profiling of camouflaged targets using single-photon detection [J]. Optical Engineering, 2017, 57(3): 031303. doi:  10.1117/1.OE.57.3.031303
    [689] LI Z-P, YE J-T, HUANG X, et al. Single-photon imaging over 200 km [J]. Optica, 2021, 8(3): 344. doi:  10.1364/OPTICA.408657
    [690] CHEN S, HALIMI A, REN X, et al. Learning non-local spatial correlations to restore sparse 3D single-photon data [J]. IEEE Transactions on Image Processing, 2019, 29: 3119-3131.
    [691] HUA K, LIU B, CHEN Z, et al. Efficient and noise robust photon-counting imaging with first signal photon unit method[J]. Photonics, 2021, 8(6): 229.
    [692] LI Z-P, HUANG X, JIANG P-Y, et al. Super-resolution single-photon imaging at 8.2 kilometers [J]. Optics Express, 2020, 28(3): 4076-4087. doi:  10.1364/OE.383456
    [693] XUE R, KANG Y, ZHANG T, et al. Sub-pixel scanning high-resolution panoramic 3D imaging based on a SPAD array [J]. IEEE Photonics Journal, 2021, 13(4): 1-6. doi:  10.1109/JPHOT.2021.3103817
    [694] MACCARONE A, MATTIOLI DELLA ROCCA F, MCCARTHY A, et al. Three-dimensional imaging of stationary and moving targets in turbid underwater environments using a single-photon detector array [J]. Optics Express, 2019, 27(20): 28437. doi:  10.1364/OE.27.028437
    [695] LIU Y X, FAN Q, LI X Y, et al. Realization of silicon single-photon detector with ultra-low dark count rate [J]. Acta Optica Sinica, 2020, 40(10): 1004001. (in Chinese)
    [696] LI Z P. Long range single-photon three-dimensional imaging[D/OL]. Hefei: University of Science and Technology of China, 2020. https://kns.cnki.net/KCMS/detail/detail.aspx?dbcode=CDFD&dbname=CDFDLAST2021&filename=1020088480.nh&v=.
    [697] MARINO R, STEPHENS T, HATCH R, et al. A compact 3D imaging laser radar system using Geiger-mode APD arrays: System and measurements[C]//Proceedings of SPIE, 2003, 5086: 501581.
    [698] AULL B. Geiger-mode avalanche photodiode arrays integrated to all-digital CMOS circuits [J]. Sensors, 2016, 16: 495. doi:  10.3390/s16040495
    [699] MARINO R, DAVIS W. Jigsaw: A foliage-penetrating 3D imaging laser radar system[EB/OL]. 2004. [2022–02–13]. https://www.semanticscholar.org/paper/Jigsaw-%3A-A-Foliage-Penetrating-3-D-Imaging-Laser-Marino-Davis/dd5821a64eb27b04259c0fb4da93f3b7601f70b1.
    [700] BULLER G S, WALLACE A M. Ranging and three-dimensional imaging using time-correlated single-photon counting and point-by-point acquisition [J]. IEEE Journal of Selected Topics in Quantum Electronics, 2007, 13(4): 1006-1015. doi:  10.1109/JSTQE.2007.902850
    [701] MCCARTHY A, KRICHEL N J, GEMMELL N R, et al. Kilometer-range, high resolution depth imaging via 1560 nm wavelength single-photon detection [J]. Optics Express, 2013, 21(7): 8904-8915. doi:  10.1364/OE.21.008904
    [702] KIRMANI A, VENKATRAMAN D, SHIN D, et al. First-photon imaging [J]. Science, 2014, 343(6166): 58-61. doi:  10.1126/science.1246775
    [703] ALTMANN Y, REN X, MCCARTHY A, et al. Lidar waveform-based analysis of depth images constructed using sparse single-photon data [J]. IEEE Transactions on Image Processing, 2016, 25(5): 1935-1946. doi:  10.1109/TIP.2016.2526784
    [704] LI Z-P, HUANG X, CAO Y, et al. Single-photon computational 3D imaging at 45 km [J]. Photonics Research, 2020, 8(9): 1532-1540. doi:  10.1364/PRJ.390091
    [705] CHEN C, CHEN Q, XU J, et al. Learning to see in the dark [C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018: 3291-3300.
    [706] LINDELL D B, O’TOOLE M, WETZSTEIN G. Single-photon 3D imaging with deep sensor fusion [J]. ACM Transactions on Graphics, 2018, 37(4): 1-12. doi:  10.1145/3197517.3201316
    [707] PENG J, XIONG Z, HUANG X, et al. Photon-efficient 3D imaging with a non-local neural network[C]//European Conference on Computer Vision, ECCV, 2020: 225-241.
    [708] TAN H, PENG J, XIONG Z, et al. Deep learning based single-photon 3D imaging with multiple returns [C]//2020 International Conference on 3D Vision (3DV), 2020: 1196-1205.
    [709] ZHAO X, JIANG X, HAN A, et al. Photon-efficient 3D reconstruction employing an edge enhancement method [J]. Optics Express, 2022, 30(2): 1555-1569. doi:  10.1364/OE.446369
    [710] ARGUS-IS. ARGUS-IS[Z/OL].(2020–07–15)[2021–03–08]. https://en.wikipedia.org/w/index.php?title=ARGUS-IS&oldid=967762056.
    [711] WILBURN B, JOSHI N, VAISH V, et al. High-speed videography using a dense camera array[C]//Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR, 2004.
    [712] WILBURN B, JOSHI N, VAISH V, et al. High performance imaging using large camera arrays [J]. ACM Transactions on Graphics, 2005, 24(3): 765-776. doi:  10.1145/1073204.1073259
    [713] PERRIN S. A 360 degree camera that sees in 3D (w/ Video)[EB/OL]. (2010)[2021–03–08]. https://phys.org/news/2010-12-degree-camera-3d-video.html.
    [714] COGAL O, AKIN A, SEYID K, et al. A new omni-directional multi-camera system for high resolution surveillance[C]//Mobile Multimedia/Image Processing, Security, and Applications 2014. International Society for Optics and Photonics, 2014, 9120: 91200N.
    [715] LAW N M, FORS O, RATZLOFF J, et al. The Evryscope: Design and performance of the first full-sky gigapixel-scale telescope[C]//Ground-Based and Airborne Telescopes VI. International Society for Optics and Photonics, 2016, 9906: 99061M.
    [716] LAW N M, FORS O, RATZLOFF J, et al. Evryscope science: Exploring the potential of all-sky gigapixel-scale telescopes [J]. Publications of the Astronomical Society of the Pacific, 2015, 127(949): 234. doi:  10.1086/680521
    [717] SHI LIFANG. Research and experiments on artificial compound eye imaging system with large field of view [D]. Chengdu: University of Electronic Science and Technology of China, 2014. (in Chinese)
    [718] SHI LIFANG, CAO AXIU, LIU YUELIAN, et al. Design and experiment of artificial compound eye with large view field [J]. Opto-Electronic Engineering, 2013, 40: 27-33. (in Chinese)
    [719] HONGXIN Z, ZHENWU L, FENGYOU L. The research progress of artificial compound eye [J]. Journal of Changchun University of Science and Technology, 2006(2): 4-7.
    [720] ZHANG HONGXIN, LU ZHENWU, LIU HUA. Novel method to simulate and analyze superposition compound eyes [J]. Optics and Precision Engineering, 2008, 16(10): 1847. (in Chinese)
    [721] HONGXIN Z, ZHENWU L, FENGYOU L, et al. The building and analysis of the superposition compound eye’s optical model [J]. Acta Photonica Sinica, 2007, 36(6): 1106. (in Chinese)
    [722] HONGXIN Z, ZHENWU L, FENGYOU L, et al. Simulation and analysis of the apposition compound eye based on the ZEMAX software [J]. Optical Technique, 2006, 32(S1): 124-126+129.
    [723] CAO ZHAOLOU, ZHAN ZHENXIAN, WANG KEYI. Structural design of spherical compound eye lens for moving object detection [J]. Infrared and Laser Engineering, 2011, 40(1): 70-73. (in Chinese)
    [724] GUO F. Design on novel compound eye device for target positioning and research on the key technology[D]. Hefei: University of Science and Technology of China, 2012. (in Chinese)
    [725] GUO FANG, WANG KEYI, YAN PEIZHENG, et al. Calibration of compound eye system for target positioning with large field of view [J]. Optics and Precision Engineering, 2012, 20(5): 913-920. (in Chinese)
    [726] KEYI W, HAO Z, ZHAOLOU C, et al. Calibration and detection of compound eye model [J]. Optics and Precision Engineering, 2010, 18(8): 1807-1813. (in Chinese)
    [727] Aqueti. Aqueti (China) official website[EB/OL]. [2021–03–07]. http://www.aqueti.cn/.
    [728] BRADY D J, HAGEN N. Multiscale lens design [J]. Optics Express, 2009, 17(13): 10659-10674. doi:  10.1364/OE.17.010659
    [729] TREMBLAY E J, MARKS D L, BRADY D J, et al. Design and scaling of monocentric multiscale imagers [J]. Applied Optics, 2012, 51(20): 4691-4702. doi:  10.1364/AO.51.004691
    [730] MARKS D L, BRADY D J. Close-up imaging using microcamera arrays for focal plane synthesis [J]. Optical Engineering, 2011, 50(3): 033205. doi:  10.1117/1.3554389
    [731] MARKS D L, TREMBLAY E J, FORD J E, et al. Microcamera aperture scale in monocentric gigapixel cameras [J]. Applied Optics, 2011, 50(30): 5824-5833. doi:  10.1364/AO.50.005824
    [732] MARKS D L, BRADY D J. Gigagon: A monocentric lens design imaging 40 gigapixels[C]//Imaging Systems. Optical Society of America, 2010: ITuC2.
    [733] SON H S, MARKS D L, HAHN J, et al. Design of a spherical focal surface using close-packed relay optics [J]. Optics Express, 2011, 19(17): 16132-16138. doi:  10.1364/OE.19.016132
    [734] SON H S, MARKS D L, TREMBLAY E, et al. A multiscale, wide field, gigapixel camera[C]//Computational Optical Sensing And Imaging. Optical Society of America, 2011: JTuE2.
    [735] BRADY D J, GEHM M E, STACK R A, et al. Multiscale gigapixel photography [J]. Nature, 2012, 486(7403): 386-389. doi:  10.1038/nature11150
    [736] MARKS D L, LLULL P R, PHILLIPS Z, et al. Characterization of the AWARE 10 two-gigapixel wide-field-of-view visible imager [J]. Applied Optics, 2014, 53(13): C54-C63. doi:  10.1364/AO.53.000C54
    [737] LLULL P, BANGE L, PHILLIPS Z, et al. Characterization of the AWARE 40 wide-field-of-view visible imager [J]. Optica, 2015, 2(12): 1086-1089. doi:  10.1364/OPTICA.2.001086
    [738] WU J, XIONG B, LIN X, et al. Snapshot hyperspectral volumetric microscopy [J]. Scientific Reports, 2016, 6(1): 1-10. doi:  10.1038/s41598-016-0001-8
    [739] JUNKAI Q, FENG Z, GANG Y, et al. A new super-large field of view and small distortion optical system [J]. Spacecraft Recovery & Remote Sensing, 2013, 34(2): 30-35.
    [740] AQI Y. Optical design of three-line array airborne mapping camera[D]. Xi'an: University of Chinese Academy of Sciences (Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences), 2015. (in Chinese)
    [741] YISI W. Research on wide FOV high resolution earth observation system based on multi-scale stitching imaging[D]. Hangzhou: Zhejiang University, 2016. (in Chinese)
    [742] XIAOPENG S, FEI L, WEI L, et al. Latest progress in computational imaging technology and application [J]. Laser & Optoelectronics Progress, 2020, 57(2): 020001. (in Chinese)
    [743] FEI L, YAZHEN W, PINGLI H, et al. Design of multi-scale wide area high-resolution computational imaging system based on concentric spherical lens [J]. Acta Physica Sinica, 2019, 68(8): 084201. (in Chinese) doi:  10.7498/aps.68.20182229
    [744] GARCIA-SUCERQUIA J, XU W, JERICHO S K, et al. Digital in-line holographic microscopy [J]. Applied Optics, 2006, 45(5): 836-850. doi:  10.1364/AO.45.000836
    [745] GARCIA-SUCERQUIA J, XU W, JERICHO M H, et al. Immersion digital in-line holographic microscopy [J]. Optics Letters, 2006, 31(9): 1211-1213. doi:  10.1364/OL.31.001211
    [746] KANKA M, RIESENBERG R, KREUZER H J. Reconstruction of high-resolution holographic microscopic images [J]. Optics Letters, 2009, 34(8): 1162-1164. doi:  10.1364/OL.34.001162
    [747] KANKA M, RIESENBERG R, PETRUCK P, et al. High resolution (NA=0.8) in lensless in-line holographic microscopy with glass sample carriers [J]. Optics Letters, 2011, 36(18): 3651-3653. doi:  10.1364/OL.36.003651
    [748] BISHARA W, SU T-W, COSKUN A F, et al. Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution [J]. Optics Express, 2010, 18(11): 11181-11191. doi:  10.1364/OE.18.011181
    [749] HAHN J, LIM S, CHOI K, et al. Video-rate compressive holographic microscopic tomography [J]. Optics Express, 2011, 19(8): 7289-7298. doi:  10.1364/OE.19.007289
    [750] LUO W, ZHANG Y, GÖRÖCS Z, et al. Propagation phasor approach for holographic image reconstruction [J]. Scientific Reports, 2016, 6: 22738. doi:  10.1038/srep22738
    [751] XIONG Z, MELZER J E, GARAN J, et al. Optimized sensing of sparse and small targets using lens-free holographic microscopy [J]. Optics Express, 2018, 26(20): 25676. doi:  10.1364/OE.26.025676
    [752] AGBANA T E, GONG H, AMOAH A S, et al. Aliasing, coherence, and resolution in a lensless holographic microscope [J]. Optics Letters, 2017, 42(12): 2271-2274. doi:  10.1364/OL.42.002271
    [753] ZHANG W, CAO L, JIN G, et al. Full field-of-view digital lens-free holography for weak-scattering objects based on grating modulation [J]. Applied Optics, 2018, 57(1): A164. doi:  10.1364/AO.57.00A164
    [754] ALLIER C, MOREL S, VINCENT R, et al. Imaging of dense cell cultures by multiwavelength lens-free video microscopy: Cell cultures by lens-free microscopy [J]. Cytometry Part A, 2017, 91(5): 433-442. doi:  10.1002/cyto.a.23079
    [755] SERABYN E, LIEWER K, WALLACE J K. Resolution optimization of an off-axis lensless digital holographic microscope [J]. Applied Optics, 2018, 57(1): A172. doi:  10.1364/AO.57.00A172
    [756] FENG S, WU J. Resolution enhancement method for lensless in-line holographic microscope with spatially-extended light source [J]. Optics Express, 2017, 25(20): 24735. doi:  10.1364/OE.25.024735
    [757] FENG S, WANG M, WU J. Lensless in-line holographic microscope with Talbot grating illumination [J]. Optics Letters, 2016, 41(14): 3157. doi:  10.1364/OL.41.003157
    [758] CUI X, LEE L M, HENG X, et al. Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging [J]. Proceedings of the National Academy of Sciences, 2008, 105(31): 10670-10675. doi:  10.1073/pnas.0804612105
    [759] XU W, JERICHO M H, MEINERTZHAGEN I A, et al. Digital in-line holography for biological applications [J]. Proceedings of the National Academy of Sciences, 2001, 98(20): 11301-11305. doi:  10.1073/pnas.191361398
    [760] SU T, SEO S, ERLINGER A, et al. Towards wireless health: Lensless on-chip cytometry [J]. Optics and Photonics News, 2008, 19(12): 24-24. doi:  10.1364/OPN.19.12.000024
    [761] ISIKMAN S, SEO S, SENCAN I, et al. Lensfree cell holography on a chip: From holographic cell signatures to microscopic reconstruction[C]//2009 IEEE LEOS Annual Meeting Conference Proceedings, 2009.
    [762] ZHENG G, LEE S A, ANTEBI Y, et al. The ePetri dish, an on-chip cell imaging platform based on subpixel perspective sweeping microscopy (SPSM) [J]. Proceedings of the National Academy of Sciences, 2011, 108(41): 16889-16894. doi:  10.1073/pnas.1110681108
    [763] GREENBAUM A, LUO W, SU T-W, et al. Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy [J]. Nature Methods, 2012, 9(9): 889-895. doi:  10.1038/nmeth.2114
    [764] GREENBAUM A, OZCAN A. Maskless imaging of dense samples using pixel super-resolution based multi-height lensfree on-chip microscopy [J]. Optics Express, 2012, 20(3): 3129-3143. doi:  10.1364/OE.20.003129
    [765] BISHARA W, SIKORA U, MUDANYALI O, et al. Holographic pixel super-resolution in portable lensless on-chip microscopy using a fiber-optic array [J]. Lab on A Chip, 2011, 11(7): 1276-1279. doi:  10.1039/C0LC00684J
    [766] HARDIE R C, BARNARD K J, BOGNAR J G, et al. High-resolution image reconstruction from a sequence of rotated and translated frames and its application to an infrared imaging system [J]. Optical Engineering, 1998, 37(1): 247-261. doi:  10.1117/1.601623
    [767] ELAD M, HEL-OR Y. A fast super-resolution reconstruction algorithm for pure translational motion and common space-invariant blur [J]. IEEE Transactions on Image Processing, 2001, 10(8): 1187-1193. doi:  10.1109/83.935034
    [768] GREENBAUM A, FEIZI A, AKBARI N, et al. Wide-field computational color imaging using pixel super-resolved on-chip microscopy [J]. Optics Express, 2013, 21(10): 12469-12483. doi:  10.1364/OE.21.012469
    [769] GREENBAUM A, SIKORA U, OZCAN A. Field-portable wide-field microscopy of dense samples using multi-height pixel super-resolution based lensfree imaging [J]. Lab on A Chip, 2012, 12(7): 1242-1245. doi:  10.1039/C2LC21072J
    [770] ZHENG G, AH LEE S, YANG S, et al. Sub-pixel resolving optofluidic microscope for on-chip cell imaging [J]. Lab on A Chip, 2010, 10(22): 3125-3129. doi:  10.1039/C0LC00213E
    [771] LUO W, GREENBAUM A, ZHANG Y, et al. Synthetic aperture-based on-chip microscopy [J]. Light: Science & Applications, 2015, 4(3): e261. doi:  10.1038/lsa.2015.34
    [772] BAO P, ZHANG F, PEDRINI G, et al. Phase retrieval using multiple illumination wavelengths [J]. Optics Letters, 2008, 33(4): 309-311. doi:  10.1364/OL.33.000309
    [773] BAO P, SITU G, PEDRINI G, et al. Lensless phase microscopy using phase retrieval with multiple illumination wavelengths [J]. Applied Optics, 2012, 51(22): 5486-5494. doi:  10.1364/AO.51.005486
    [774] NOOM D W E, FLAES D E B, LABORDUS E, et al. High-speed multi-wavelength Fresnel diffraction imaging [J]. Optics Express, 2014, 22(25): 30504-30511. doi:  10.1364/OE.22.030504
    [775] SANZ M, PICAZO-BUENO J A, GARCÍA J, et al. Improved quantitative phase imaging in lensless microscopy by single-shot multi-wavelength illumination using a fast convergence algorithm [J]. Optics Express, 2015, 23(16): 21352-21365. doi:  10.1364/OE.23.021352
    [776] FIENUP J R. Reconstruction of an object from the modulus of its Fourier transform [J]. Optics Letters, 1978, 3(1): 27-29. doi:  10.1364/OL.3.000027
    [777] LUO W, ZHANG Y, FEIZI A, et al. Pixel super-resolution using wavelength scanning [J]. Light: Science & Applications, 2016, 5(4): e16060. doi:  10.1038/lsa.2016.60
    [778] ZHANG J, SUN J, CHEN Q, et al. Adaptive pixel-super-resolved lensfree in-line digital holography for wide-field on-chip microscopy [J]. Scientific Reports, 2017, 7(1): 11777. doi:  10.1038/s41598-017-11715-x
    [779] RIVENSON Y, ZHANG Y, GÜNAYDIN H, et al. Phase recovery and holographic image reconstruction using deep learning in neural networks [J]. Light: Science & Applications, 2018, 7(2): 17141. doi:  10.1038/lsa.2017.141
    [780] RIVENSON Y, CEYLAN KOYDEMIR H, WANG H, et al. Deep learning enhanced mobile-phone microscopy [J]. ACS Photonics, 2018, 5: 2354-2364. doi:  10.1021/acsphotonics.8b00146
    [781] JIALIN Z, QIAN C, XIANGYU Z, et al. Lens-free on-chip microscopy: Theory, advances, and applications [J]. Infrared and Laser Engineering, 2019, 48(6): 0603009. (in Chinese) doi:  10.3788/IRLA201948.0603009
    [782] MICO V, ZALEVSKY Z, GARCÍA-MARTÍNEZ P, et al. Synthetic aperture superresolution with multiple off-axis holograms [J]. JOSA A, 2006, 23(12): 3162-3170. doi:  10.1364/JOSAA.23.003162
    [783] MICO V, ZALEVSKY Z, GARCÍA-MARTÍNEZ P, et al. Superresolved imaging in digital holography by superposition of tilted wavefronts [J]. Applied Optics, 2006, 45(5): 822-828. doi:  10.1364/AO.45.000822
    [784] GRANERO L, MICÓ V, ZALEVSKY Z, et al. Superresolution imaging method using phase-shifting digital lensless Fourier holography [J]. Optics Express, 2009, 17(17): 15008-15022. doi:  10.1364/OE.17.015008
    [785] MICÓ V, FERREIRA C, GARCÍA J. Surpassing digital holography limits by lensless object scanning holography [J]. Optics Express, 2012, 20(9): 9382-9395. doi:  10.1364/OE.20.009382
    [786] MICO V, ZALEVSKY Z, GARCÍA J. Common-path phase-shifting digital holographic microscopy: A way to quantitative phase imaging and superresolution [J]. Optics Communications, 2008, 281(17): 4273-4281. doi:  10.1016/j.optcom.2008.04.079
    [787] MICÓ V, GARCÍA J. Common-path phase-shifting lensless holographic microscopy [J]. Optics Letters, 2010, 35(23): 3919-3921. doi:  10.1364/OL.35.003919
    [788] MICÓ V, ZALEVSKY Z, GARCIA J. Superresolved common-path phase-shifting digital inline holographic microscopy using a spatial light modulator [J]. Optics Letters, 2012, 37(23): 4988-4990. doi:  10.1364/OL.37.004988
    [789] MICÓ V, ZALEVSKY Z. Superresolved digital in-line holographic microscopy for high-resolution lensless biological imaging [J]. Journal of Biomedical Optics, 2010, 15(4): 046027. doi:  10.1117/1.3481142
    [790] PICAZO-BUENO J Á, ZALEVSKY Z, GARCÍA J, et al. Superresolved spatially multiplexed interferometric microscopy [J]. Optics Letters, 2017, 42(5): 927-930. doi:  10.1364/OL.42.000927
    [791] MICO V, FERREIRA C, ZALEVSKY Z, et al. Spatially-multiplexed interferometric microscopy (SMIM): Converting a standard microscope into a holographic one [J]. Optics Express, 2014, 22(12): 14929-14943. doi:  10.1364/OE.22.014929
    [792] GAO P, PEDRINI G, OSTEN W. Structured illumination for resolution enhancement and autofocusing in digital holographic microscopy [J]. Optics Letters, 2013, 38(8): 1328. doi:  10.1364/OL.38.001328
    [793] CHOWDHURY S, ELDRIDGE W J, WAX A, et al. Structured illumination multimodal 3D-resolved quantitative phase and fluorescence sub-diffraction microscopy [J]. Biomedical Optics Express, 2017, 8(5): 2496. doi:  10.1364/BOE.8.002496
    [794] GABAI H, SHAKED N T. Dual-channel low-coherence interferometry and its application to quantitative phase imaging of fingerprints [J]. Optics Express, 2012, 20(24): 26906. doi:  10.1364/OE.20.026906
    [795] GIRSHOVITZ P, SHAKED N T. Doubling the field of view in off-axis low-coherence interferometric imaging [J]. Light: Science & Applications, 2014, 3(3): e151. doi:  10.1038/lsa.2014.32
    [796] FRENKLACH I, GIRSHOVITZ P, SHAKED N T. Off-axis interferometric phase microscopy with tripled imaging area [J]. Optics Letters, 2014, 39(6): 1525. doi:  10.1364/OL.39.001525
    [797] BIAN L, SUO J, SITU G, et al. Content adaptive illumination for Fourier ptychography [J]. Optics Letters, 2014, 39(23): 6648-6651. doi:  10.1364/OL.39.006648
    [798] HE X, LIU C, ZHU J. Single-shot Fourier ptychography based on diffractive beam splitting [J]. Optics Letters, 2018, 43(2): 214. doi:  10.1364/OL.43.000214
    [799] LEE B, HONG J, YOO D, et al. Single-shot phase retrieval via Fourier ptychographic microscopy [J]. Optica, 2018, 5(8): 976-983. doi:  10.1364/OPTICA.5.000976
    [800] TIAN L, LIU Z, YEH L-H, et al. Computational illumination for high-speed in vitro Fourier ptychographic microscopy [J]. Optica, 2015, 2(10): 904. doi:  10.1364/OPTICA.2.000904
    [801] SUN J, ZUO C, ZHANG J, et al. High-speed Fourier ptychographic microscopy based on programmable annular illuminations[J/OL]. Scientific Reports, (2018–09–11). http://www.nature.com/articles/s41598-018-25797-8.
    [802] SUN J, CHEN Q, ZHANG J, et al. Single-shot quantitative phase microscopy based on color-multiplexed Fourier ptychography [J]. Optics Letters, 2018, 43(14): 3365. doi:  10.1364/OL.43.003365
    [803] QIU Z, ZHANG Z, ZHONG J, et al. Comprehensive comparison of single-pixel imaging methods [J]. Optics and Lasers in Engineering, 2020, 134: 106301. doi:  10.1016/j.optlaseng.2020.106301
    [804] NIPKOW P. Optical disk [P]. German patent, 1884, 30: 15.
    [805] BAIRD J L. Apparatus for transmitting views or images to a distance: US, US1699270A[P]. 1929-01-15.
    [806] PITTMAN T B, SHIH Y H, STREKALOV D V, et al. Optical imaging by means of two-photon quantum entanglement [J]. Physical Review A, 1995, 52(5): R3429-R3432. doi:  10.1103/PhysRevA.52.R3429
    [807] BENNINK R S, BENTLEY S J, BOYD R W. “Two-Photon” coincidence imaging with a classical source [J]. Physical Review Letters, 2002, 89(11): 113601. doi:  10.1103/PhysRevLett.89.113601
    [808] GATTI A, BRAMBILLA E, BACHE M, et al. Ghost imaging with thermal light: Comparing entanglement and classical correlation [J]. Physical Review Letters, 2004, 93(9): 093602. doi:  10.1103/PhysRevLett.93.093602
    [809] CAI Y, ZHU S-Y. Ghost imaging with incoherent and partially coherent light radiation [J]. Physical Review E, 2005, 71(5): 056607. doi:  10.1103/PhysRevE.71.056607
    [810] BROMBERG Y, KATZ O, SILBERBERG Y. Ghost imaging with a single detector [J]. Physical Review A, 2009, 79(5): 053840. doi:  10.1103/PhysRevA.79.053840
    [811] HAN S, YU H, SHEN X, et al. A review of ghost imaging via sparsity constraints [J]. Applied Sciences, 2018, 8(8): 1379. doi:  10.3390/app8081379
    [812] FERRI F, MAGATTI D, LUGIATO L, et al. Differential ghost imaging [J]. Physical Review Letters, 2010, 104(25): 253603. doi:  10.1103/PhysRevLett.104.253603
    [813] SUN B, WELSH S S, EDGAR M P, et al. Normalized ghost imaging [J]. Optics Express, 2012, 20(15): 16892-16901. doi:  10.1364/OE.20.016892
    [814] VASILE T, DAMIAN V, COLTUC D, et al. Single pixel sensing for THz laser beam profiler based on Hadamard Transform [J]. Optics & Laser Technology, 2016, 79: 173-178.
    [815] ZHANG Z, MA X, ZHONG J. Single-pixel imaging by means of Fourier spectrum acquisition [J]. Nature Communications, 2015, 6(1): 1-6.
    [816] LIU B-L, YANG Z-H, LIU X, et al. Coloured computational imaging with single-pixel detectors based on a 2D discrete cosine transform [J]. Journal of Modern Optics, 2017, 64(3): 259-264. doi:  10.1080/09500340.2016.1229507
    [817] MCCARTHY A, COLLINS R J, KRICHEL N J, et al. Long-range time-of-flight scanning sensor based on high-speed time-correlated single-photon counting [J]. Applied Optics, 2009, 48(32): 6241-6251. doi:  10.1364/AO.48.006241
    [818] VELTEN A, WILLWACHER T, GUPTA O, et al. Recovering three-dimensional shape around a corner using ultrafast time-of-flight imaging [J]. Nature Communications, 2012, 3: 745. doi:  10.1038/ncomms1747
    [819] KEPPEL E. Approximating complex surfaces by triangulation of contour lines [J]. IBM Journal of Research and Development, 1975, 19(1): 2-11. doi:  10.1147/rd.191.0002
    [820] BOYDE A. Stereoscopic images in confocal (tandem scanning) microscopy [J]. Science, 1985, 230(4731): 1270-1272. doi:  10.1126/science.4071051
    [821] ZHANG Z, ZHONG J. Three-dimensional single-pixel imaging with far fewer measurements than effective image pixels [J]. Optics Letters, 2016, 41(11): 2497-2500. doi:  10.1364/OL.41.002497
    [822] ZHANG Y, EDGAR M P, SUN B, et al. 3D single-pixel video [J]. Journal of Optics, 2016, 18(3): 035203. doi:  10.1088/2040-8978/18/3/035203
    [823] SALVADOR-BALAGUER E, LATORRE-CARMONA P, CHABERT C, et al. Low-cost single-pixel 3D imaging by using an LED array [J]. Optics Express, 2018, 26(12): 15623-15631. doi:  10.1364/OE.26.015623
    [824] SUN M-J, EDGAR M P, GIBSON G M, et al. Single-pixel three-dimensional imaging with time-based depth resolution [J]. Nature Communications, 2016, 7: 12010. doi:  10.1038/ncomms12010
    [825] HOWLAND G A, DIXON P B, HOWELL J C. Photon-counting compressive sensing laser radar for 3D imaging [J]. Applied Optics, 2011, 50(31): 5917-5920. doi:  10.1364/AO.50.005917
    [826] ZHAO C, GONG W, CHEN M, et al. Ghost imaging lidar via sparsity constraints [J]. Applied Physics Letters, 2012, 101(14): 141123. doi:  10.1063/1.4757874
    [827] HOWLAND G A, LUM D J, WARE M R, et al. Photon counting compressive depth mapping [J]. Optics Express, 2013, 21(20): 23822-23837. doi:  10.1364/OE.21.023822
    [828] CHEN M, LI E, GONG W, et al. Ghost imaging lidar via sparsity constraints in real atmosphere [J]. Optics and Photonics Journal, 2013, 3(2): 83. doi:  10.4236/opj.2013.32B021
    [829] YU H, LI E, GONG W, et al. Structured image reconstruction for three-dimensional ghost imaging lidar [J]. Optics Express, 2015, 23(11): 14541-14551. doi:  10.1364/OE.23.014541
    [830] GONG W, ZHAO C, YU H, et al. Three-dimensional ghost imaging lidar via sparsity constraint [J]. Scientific Reports, 2016, 6: 26133. doi:  10.1038/srep26133
    [831] QIU Z, ZHANG Z, ZHONG J. Efficient full-color single-pixel imaging based on the human vision property—“Giving in to the Blues” [J]. Optics Letters, 2020, 45(11): 3046-3049. doi:  10.1364/OL.389525
    [832] ZHANG Z, LIU S, PENG J, et al. Simultaneous spatial, spectral, and 3D compressive imaging via efficient Fourier single-pixel measurements [J]. Optica, 2018, 5(3): 315. doi:  10.1364/OPTICA.5.000315
    [833] STANTCHEV R I, YU X, BLU T, et al. Real-time terahertz imaging with a single-pixel detector[J/OL]. Nature Communications, 2020, 11(1): 2535. https://doi.org/10.1038/s41467-020-16370-x.
    [834] PUSHKARSKY I, LIU Y, WEAVER W, et al. Automated single-cell motility analysis on a chip using lensfree microscopy [J]. Scientific Reports, 2014, 4: 4717. doi:  10.1038/srep04717
    [835] KESAVAN S V, GARCIA F P N Y, MENNETEAU M, et al. Real-time label-free detection of dividing cells by means of lensfree video-microscopy [J]. Journal of Biomedical Optics, 2014, 19(3): 036004.
    [836] LEE L M, CUI X, YANG C. The application of on-chip optofluidic microscopy for imaging Giardia lamblia trophozoites and cysts [J]. Biomedical Microdevices, 2009, 11(5): 951. doi:  10.1007/s10544-009-9312-x
    [837] COSKUN A F, SENCAN I, SU T-W, et al. Lensless wide-field fluorescent imaging on a chip using compressive decoding of sparse objects [J]. Optics Express, 2010, 18(10): 10510-10523. doi:  10.1364/OE.18.010510
    [838] COSKUN A F, SU T-W, OZCAN A. Wide field-of-view lens-free fluorescent imaging on a chip [J]. Lab on A Chip, 2010, 10(7): 824-827. doi:  10.1039/B926561A
    [839] SHANMUGAM A, SALTHOUSE C D. Lensless fluorescence imaging with height calculation [J]. Journal of Biomedical Optics, 2014, 19(1): 016002.
    [840] OZCAN A, MCLEOD E. Lensless imaging and sensing [J]. Annual Review of Biomedical Engineering, 2016, 18(1): 77-102. doi:  10.1146/annurev-bioeng-092515-010849
    [841] COSKUN A F, SENCAN I, SU T-W, et al. Wide-field lensless fluorescent microscopy using a tapered fiber-optic faceplate on a chip [J]. Analyst, 2011, 136(17): 3512-3518. doi:  10.1039/C0AN00926A
    [842] KHADEMHOSSEINIEH B, SENCAN I, BIENER G, et al. Lensfree on-chip imaging using nanostructured surfaces [J]. Applied Physics Letters, 2010, 96(17): 171106. doi:  10.1063/1.3405719
    [843] KHADEMHOSSEINIEH B, BIENER G, SENCAN I, et al. Lensfree color imaging on a nanostructured chip using compressive decoding [J]. Applied Physics Letters, 2010, 97(21): 211112. doi:  10.1063/1.3521410
    [844] LEE S A, OU X, LEE J E, et al. Chip-scale fluorescence microscope based on a silo-filter complementary metal-oxide semiconductor image sensor [J]. Optics Letters, 2013, 38(11): 1817-1819. doi:  10.1364/OL.38.001817
    [845] HAN C, PANG S, BOWER D V, et al. Wide field-of-view on-chip Talbot fluorescence microscopy for longitudinal cell culture monitoring from within the incubator [J]. Analytical Chemistry, 2013, 85(4): 2356-2360. doi:  10.1021/ac303356v
    [846] COSKUN A F, SENCAN I, SU T-W, et al. Lensfree fluorescent on-chip imaging of transgenic Caenorhabditis elegans over an ultra-wide field-of-view [J]. PLoS ONE, 2011, 6(1): e15955. doi:  10.1371/journal.pone.0015955
    [847] MCLEOD E, DINCER T U, VELI M, et al. High-throughput and label-free single nanoparticle sizing based on time-resolved on-chip microscopy [J]. ACS Nano, 2015, 9(3): 3265-3273. doi:  10.1021/acsnano.5b00388
    [848] MENG H, HUSSAIN F. In-line recording and off-axis viewing technique for holographic particle velocimetry [J]. Applied Optics, 1995, 34(11): 1827-1840. doi:  10.1364/AO.34.001827
    [849] ISIKMAN S O, BISHARA W, OZCAN A. Partially coherent lensfree tomographic microscopy [J]. Applied Optics, 2011, 50(34): H253-H264. doi:  10.1364/AO.50.00H253
    [850] SU T-W, ISIKMAN S O, BISHARA W, et al. Multi-angle lensless digital holography for depth resolved imaging on a chip [J]. Optics Express, 2010, 18(9): 9690-9711. doi:  10.1364/OE.18.009690
    [851] KUMAR M, VIJAYAKUMAR A, ROSEN J. Incoherent digital holograms acquired by interferenceless coded aperture correlation holography system without refractive lenses [J]. Scientific Reports, 2017, 7(1): 11555. doi:  10.1038/s41598-017-11731-x
    [852] MERTZ L, YOUNG N O. Fresnel transformations of images [J]. SPIE Milestone Series Ms, 1996, 128: 44-49.
    [853] SHIMANO T, NAKAMURA Y, TAJIMA K, et al. Lensless light-field imaging with Fresnel zone aperture: quasi-coherent coding [J]. Applied Optics, 2018, 57(11): 2841-2850. doi:  10.1364/AO.57.002841
    [854] TAJIMA K, SHIMANO T, NAKAMURA Y, et al. Lensless light-field imaging with multi-phased Fresnel zone aperture[C/OL]//2017 IEEE International Conference on Computational Photography (ICCP). Stanford, CA, USA: IEEE, 2017: 1–7. [2021–06–23]. http://ieeexplore.ieee.org/document/7951485/.
    [855] SAO M, NAKAMURA Y, TAJIMA K, et al. Lensless close-up imaging with Fresnel zone aperture [J]. Japanese Journal of Applied Physics, 2018, 57(9S1): 09SB05. doi:  10.7567/JJAP.57.09SB05
    [856] WU J, ZHANG H, ZHANG W, et al. Single-shot lensless imaging with Fresnel zone aperture and incoherent illumination [J]. Light: Science & Applications, 2020, 9(1): 53.
    [857] ASIF M S, AYREMLOU A, SANKARANARAYANAN A, et al. FlatCam: Thin, lensless cameras using coded aperture and computation [J]. IEEE Transactions on Computational Imaging, 2017, 3(3): 384-397. doi:  10.1109/TCI.2016.2593662
    [858] BOOMINATHAN V, ADAMS J K, ROBINSON J T, et al. PhlatCam: designed phase-mask based thin lensless camera [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, 42(7): 1618-1629. doi:  10.1109/TPAMI.2020.2987489
    [859] SCHILLING G. Catching gamma-ray bursts on the Wing [J]. Sky and Telescope, 2004, 107(3): 32-42.
    [860] GREENWOOD D P. Bandwidth specification for adaptive optics systems [J]. JOSA, 1977, 67(3): 390-393. doi:  10.1364/JOSA.67.000390
    [861] FRIED D L. Limiting resolution looking down through the atmosphere [J]. JOSA, 1966, 56(10): 1380-1384. doi:  10.1364/JOSA.56.001380
    [862] FRIED D L. Anisoplanatism in adaptive optics [J]. JOSA, 1982, 72(1): 52-61. doi:  10.1364/JOSA.72.000052
    [863] MURPHY D V. Atmospheric-turbulence compensation experiments using cooperative beacons [J]. Lincoln Laboratory Journal, 1992, 5: 25-44.
    [864] MURPHY D V, PRIMMERMAN C A, ZOLLARS B G, et al. Experimental demonstration of atmospheric compensation using multiple synthetic beacons [J]. Optics Letters, 1991, 16(22): 1797-1799. doi:  10.1364/OL.16.001797
    [865] FUGATE R Q, FRIED D, AMEER G, et al. Measurement of atmospheric wavefront distortion using scattered light from a laser guide-star [J]. Nature, 1991, 353(6340): 144. doi:  10.1038/353144a0
    [866] PRIMMERMAN C, FOUCHE D. Thermal-blooming compensation: experimental observations using a deformable-mirror system [J]. Applied Optics, 1976, 15(4): 990-995. doi:  10.1364/AO.15.000990
    [867] FOY R, LABEYRIE A. Feasibility of adaptive telescope with laser probe [J]. Astronomy and Astrophysics, 1985, 152: L29-L31.
    [868] HUMPHREYS R, BRADLEY L, HERRMANN J. Sodium-layer synthetic beacons for adaptive optics [J]. The Lincoln Laboratory Journal, 1992, 5(1): 45-66.
    [869] HARDY J W, LEFEBVRE J E, KOLIOPOULOS C. Real-time atmospheric compensation [J]. JOSA, 1977, 67(3): 360-369. doi:  10.1364/JOSA.67.000360
    [870] HARDY J W. Adaptive optics for astronomical telescopes [J]. Physics Today, 2000, 53(4): 69-69. doi:  10.1063/1.2405463
    [871] ELLERBROEK B, BRITTON M, DEKANY R, et al. Adaptive optics for the thirty meter telescope[C]//Astronomical Adaptive Optics Systems and Applications II. International Society for Optics and Photonics, 2005: 590304.
    [872] VERNIN J, MUÑOZ-TUÑÓN C, SARAZIN M, et al. European extremely large telescope site characterization I: Overview [J]. Publications of the Astronomical Society of the Pacific, 2011, 123(909): 1334. doi:  10.1086/662995
    [873] KERN P, MERKLE F, GAFFARD J P, et al. Prototype of an adaptive optical system for astronomical observation[C]//Real-Time Image Processing: Concepts and Technologies, 1988: 9–16.
    [874] ROUSSET G, FONTANELLA J, KERN P, et al. First diffraction-limited astronomical images with adaptive optics [J]. Astronomy and Astrophysics, 1990, 230: L29-L32.
    [875] FUGATE R Q. The Starfire optical range 3.5-m adaptive optical telescope[C]//Large Ground-based Telescopes, 2003: 934–944.
    [876] ACTON D S, DUNN R B. Solar imaging at national solar observatory using a segmented adaptive optics system[C]//Active and Adaptive Optical Components and Systems II, 1993: 348–353.
    [877] ROORDA A. Adaptive optics for studying visual function: A comprehensive review [J]. Journal of Vision, 2011, 11(5): 6. doi:  10.1167/11.5.6
    [878] LIANG J, WILLIAMS D R, MILLER D T. Supernormal vision and high-resolution retinal imaging through adaptive optics [J]. JOSA A, 1997, 14(11): 2884-2892. doi:  10.1364/JOSAA.14.002884
    [879] ROORDA A, WILLIAMS D R. The arrangement of the three cone classes in the living human eye [J]. Nature, 1999, 397(6719): 520. doi:  10.1038/17383
    [880] XUEJUN L N Z Y R, YIYUN L X W C H, WENHAN J. A small adaptive optical imaging system for cells of living human retina [J]. Acta Optica Sinica, 2004, 24(9): 1153-1158. (in Chinese)
    [881] SHI G, DAI Y, WANG L, et al. Adaptive optics optical coherence tomography for retina imaging [J]. Chinese Optics Letters, 2008, 6(6): 424-425. doi:  10.3788/COL20080606.0424
    [882] LU J, LI H, HE Y, et al. Superresolution in adaptive optics confocal scanning laser ophthalmoscope [J]. Journal of Physics, 2011, 60(3): 266-275. (in Chinese)
    [883] GU M. Principles of Three Dimensional Imaging in Confocal Microscopes[M]. Singapore: World Scientific, 1996.
    [884] PAWLEY J. Handbook of Biological Confocal Microscopy[M]. Berlin: Springer Science & Business Media, 2010.
    [885] WILSON T, et al. Confocal Microscopy[M]. London: Academic Press, 1990.
    [886] BOOTH M J, NEIL M A, JUŠKAITIS R, et al. Adaptive aberration correction in a confocal microscope [J]. Proceedings of the National Academy of Sciences, 2002, 99(9): 5788-5792. doi:  10.1073/pnas.082544799
    [887] TAO X, AZUCENA O, FU M, et al. Adaptive optics microscopy with direct wavefront sensing using fluorescent protein guide stars [J]. Optics Letters, 2011, 36(17): 3389. doi:  10.1364/OL.36.003389
    [888] ALBERT O, SHERMAN L, MOUROU G, et al. Smart microscope: An adaptive optics learning system for aberration correction in multiphoton confocal microscopy [J]. Optics Letters, 2000, 25(1): 52. doi:  10.1364/OL.25.000052
    [889] CHA J W, BALLESTA J. Shack-hartmann wavefront-sensor-based adaptive optics system for multiphoton microscopy [J]. Journal of Biomedical Optics, 2010, 15: 10.
    [890] POTSAID B, BELLOUARD Y, WEN J T. Adaptive scanning optical microscope (ASOM): A multidisciplinary optical microscope design for large field of view and high resolution imaging [J]. Optics Express, 2005, 13(17): 6504-6518. doi:  10.1364/OPEX.13.006504
    [891] WARBER M, MAIER S, HAIST T, et al. Combination of scene-based and stochastic measurement for wide-field aberration correction in microscopic imaging [J]. Applied Optics, 2010, 49(28): 5474. doi:  10.1364/AO.49.005474
    [892] VERMEULEN P, MURO E, PONS T, et al. Adaptive optics for fluorescence wide-field microscopy using spectrally independent guide star and markers [J]. Journal of Biomedical Optics, 2011, 16(7): 076019. doi:  10.1117/1.3603847
    [893] HELL S W, WICHMANN J. Breaking the diffraction resolution limit by stimulated emission: Stimulated-emission-depletion fluorescence microscopy [J]. Optics Letters, 1994, 19(11): 780. doi:  10.1364/OL.19.000780
    [894] DÉBARRE D, BOTCHERBY E J, BOOTH M J, et al. Adaptive optics for structured illumination microscopy [J]. Optics Express, 2008, 16(13): 9290. doi:  10.1364/OE.16.009290
    [895] PATTON B R, BURKE D, OWALD D, et al. Three-dimensional STED microscopy of aberrating tissue using dual adaptive optics [J]. Optics Express, 2016, 24(8): 8862-8876. doi:  10.1364/OE.24.008862
    [896] GOULD T J, KROMANN E B, BURKE D, et al. Auto-aligning stimulated emission depletion microscope using adaptive optics [J]. Optics Letters, 2013, 38(11): 1860. doi:  10.1364/OL.38.001860
    [897] GOULD T J, BURKE D, BEWERSDORF J, et al. Adaptive optics enables 3D STED microscopy in aberrating specimens [J]. Optics Express, 2012, 20(19): 20998. doi:  10.1364/OE.20.020998
    [898] NING Y, JIANG W, LING N, et al. Response function calculation and sensitivity comparison analysis of various bimorph deformable mirrors [J]. Optics Express, 2007, 15(19): 12030-12038. doi:  10.1364/OE.15.012030
    [899] ROOMS F, CAMET S, CHARTON J, et al. A new deformable mirror and experimental setup for free-space optical communication[C]//Free-Space Laser Communication Technologies XXI, 2009: 71990O.
    [900] BIFANO T G, PERREAULT J A, BIERDEN P A. Micromachined deformable mirror for optical wavefront compensation[C]//High-Resolution Wavefront Control: Methods, Devices, and Applications II, 2000: 7–15.
    [901] LOVE G D. Wave-front correction and production of Zernike modes with a liquid-crystal spatial light modulator [J]. Applied Optics, 1997, 36(7): 1517-1524. doi:  10.1364/AO.36.001517
    [902] CAI D, YAO J, JIANG W. Performance of liquid-crystal spatial light modulator using for wave-front correction [J]. Acta Optica Sinica, 2009, 29(2): 285-291. (in Chinese) doi:  10.3788/AOS20092902.0285
    [903] GUO Y, ZHANG A, FAN X, et al. First on-sky demonstration of the piezoelectric adaptive secondary mirror [J]. Optics Letters, 2016, 41(24): 5712-5715. doi:  10.1364/OL.41.005712
    [904] VORONTSOV M, CARHART G, RICKLIN J. Adaptive phase-distortion correction based on parallel gradient-descent optimization [J]. Optics Letters, 1997, 22(12): 907-909. doi:  10.1364/OL.22.000907
    [905] YANG H, LI X, JIANG W. High resolution imaging of phase-distorted extended object using SPGD algorithm and deformable mirror[C]//Optical Design and Testing III, 2007: 683411.
    [906] WANG J, BAI F, NING Y, et al. Wavefront response matrix for closed-loop adaptive optics system based on non-modulation pyramid wavefront sensor [J]. Optics Communications, 2012, 285(12): 2814-2820. doi:  10.1016/j.optcom.2012.02.026
    [907] WANG S, WEI K, ZHENG W, et al. First light on an adaptive optics system using a non-modulation pyramid wavefront sensor for a 1.8 m telescope [J]. Chinese Optics Letters, 2016, 14(10): 100101. doi:  10.3788/COL201614.100101
    [908] DONG J, BI R, HO J-H, et al. Diffuse correlation spectroscopy with a fast Fourier transform-based software autocorrelator [J]. Journal of Biomedical Optics, 2012, 17(9): 097004.
    [909] GIBSON A, HEBDEN J, ARRIDGE S R. Recent advances in diffuse optical imaging [J]. Physics in Medicine & Biology, 2005, 50(4): R1.
    [910] BI R, DONG J, LEE K. Multi-channel deep tissue flowmetry based on temporal diffuse speckle contrast analysis [J]. Optics Express, 2013, 21(19): 22854-22861. doi:  10.1364/OE.21.022854
    [911] VARMA H M, VALDES C P, KRISTOFFERSEN A K, et al. Speckle contrast optical tomography: A new method for deep tissue three-dimensional tomography of blood flow [J]. Biomedical Optics Express, 2014, 5(4): 1275-1289. doi:  10.1364/BOE.5.001275
    [912] WANG L V, HU S. Photoacoustic tomography: In vivo imaging from organelles to organs [J]. Science, 2012, 335(6075): 1458-1462. doi:  10.1126/science.1216210
    [913] BAYER E, SCHAACK G. Two-photon absorption of CaF2: Eu2+ [J]. Physica Status Solidi (B), 1970, 41(2): 827-835. doi:  10.1002/pssb.19700410239
    [914] DENK W, STRICKLER J H, WEBB W W. Two-photon laser scanning fluorescence microscopy [J]. Science, 1990, 248(4951): 73-76. doi:  10.1126/science.2321027
    [915] FERCHER A, MENGEDOHT K, WERNER W. Eye-length measurement by interferometry with partially coherent light [J]. Optics Letters, 1988, 13(3): 186-188. doi:  10.1364/OL.13.000186
    [916] HUANG D, SWANSON E A, LIN C P, et al. Optical coherence tomography [J]. Science, 1991, 254(5035): 1178-1181. doi:  10.1126/science.1957169
    [917] VELLEKOOP I M, MOSK A P. Focusing coherent light through opaque strongly scattering media [J]. Optics Letters, 2007, 32(16): 2309. doi:  10.1364/OL.32.002309
    [918] POPOFF S M, LEROSEY G, CARMINATI R, et al. Measuring the transmission matrix in optics: An approach to the study and control of light propagation in disordered media [J]. Physical Review Letters, 2010, 104(10): 100601. doi:  10.1103/PhysRevLett.104.100601
    [919] VELLEKOOP I M. Feedback-based wavefront shaping [J]. Optics Express, 2015, 23(9): 12189-12206. doi:  10.1364/OE.23.012189
    [920] POPOFF S, LEROSEY G, FINK M, et al. Image transmission through an opaque material [J]. Nature Communications, 2010, 1: 81. doi:  10.1038/ncomms1078
    [921] CUI M. Parallel wavefront optimization method for focusing light through random scattering media [J]. Optics Letters, 2011, 36(6): 870-872. doi:  10.1364/OL.36.000870
    [922] LEITH E N, UPATNIEKS J. Holographic imagery through diffusing media [J]. JOSA, 1966, 56(4): 523-523. doi:  10.1364/JOSA.56.000523
    [923] KATZ O, HEIDMANN P, FINK M, et al. Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations [J]. Nature Photonics, 2014, 8(10): 784-790. doi:  10.1038/nphoton.2014.189
    [924] YAQOOB Z, PSALTIS D, FELD M S, et al. Optical phase conjugation for turbidity suppression in biological samples [J]. Nature Photonics, 2008, 2(2): 110. doi:  10.1038/nphoton.2007.297
    [925] BERTOLOTTI J, VAN PUTTEN E G, BLUM C, et al. Non-invasive imaging through opaque scattering layers [J]. Nature, 2012, 491(7423): 232. doi:  10.1038/nature11578
    [926] YANG W, LI G, SITU G. Imaging through scattering media with the auxiliary of a known reference object [J]. Scientific Reports, 2018, 8(1): 9614. doi:  10.1038/s41598-018-27754-x
    [927] LYU M, WANG H, LI G, et al. Deep speckle correlation: A deep learning approach toward scalable imaging through scattering media [J]. Optica, 2018, 5(10): 1181-1190. doi:  10.1364/OPTICA.5.001181
    [928] LYU M, WANG H, LI G, et al. Learning-based lensless imaging through optically thick scattering media [J]. Advanced Photonics, 2019, 1(3): 10.
    [929] VELTEN A, RASKAR R, WU D, et al. Femto-photography: Capturing and visualizing the propagation of light [J]. ACM Transactions on Graphics, 2013, 32(4): 1-8. doi:  10.1145/2461912.2461928
    [930] MIKAMI H, GAO L, GODA K. Ultrafast optical imaging technology: Principles and applications of emerging methods [J]. Nanophotonics, 2016, 5(4): 497-509. doi:  10.1515/nanoph-2016-0026
    [931] ZHU L, CHEN Y, LIANG J, et al. Space- and intensity-constrained reconstruction for compressed ultrafast photography [J]. Optica, 2016, 3(7): 694-697. doi:  10.1364/OPTICA.3.000694
    [932] LAURENZIS M, VELTEN A. Nonline-of-sight laser gated viewing of scattered photons [J]. Optical Engineering, 2014, 53(2): 023102. doi:  10.1117/1.OE.53.2.023102
    [933] REPASI E, LUTZMANN P, STEINVALL O, et al. Advanced short-wavelength infrared range-gated imaging for ground applications in monostatic and bistatic configurations [J]. Applied Optics, 2009, 48(31): 5956-5969. doi:  10.1364/AO.48.005956
    [934] SEN P, CHEN B, GARG G, et al. Dual photography [J]. ACM Transactions on Graphics (TOG), 2005, 24(3): 745-755. doi:  10.1145/1073204.1073257
    [935] ZHANG Z, JIAO S, YAO M, et al. Secured single-pixel broadcast imaging [J]. Optics Express, 2018, 26(11): 14578-14591. doi:  10.1364/OE.26.014578
    [936] BUTTAFAVA M, ZEMAN J, TOSI A, et al. Non-line-of-sight imaging using a time-gated single photon avalanche diode [J]. Optics Express, 2015, 23(16): 20997-21011. doi:  10.1364/OE.23.020997
    [937] JIN C, XIE J, ZHANG S, et al. Reconstruction of multiple non-line-of-sight objects using back projection based on ellipsoid mode decomposition [J]. Optics Express, 2018, 26(16): 20089-20101. doi:  10.1364/OE.26.020089
    [938] O’TOOLE M, LINDELL D B, WETZSTEIN G. Confocal non-line-of-sight imaging based on the light-cone transform [J]. Nature, 2018, 555(7696): 338-341. doi:  10.1038/nature25489
    [939] WU C, LIU J, HUANG X, et al. Non–line-of-sight imaging over 1.43 km [J]. Proceedings of the National Academy of Sciences, 2021, 118(10): e2024468118. doi:  10.1073/pnas.2024468118
    [940] SCRIBNER D A, KRUER M R, KILLIANY J M. Infrared focal plane array technology [J]. Proceedings of the IEEE, 1991, 79(1): 66-85. doi:  10.1109/5.64383
    [941] MILTON A F, BARONE F R, KRUER M R. Influence of nonuniformity on infrared focal plane array performance [J]. Optical Engineering, 1985, 24(5): 245855. doi:  10.1117/12.7973588
    [942] PERRY D L, DERENIAK E L. Linear theory of nonuniformity correction in infrared staring sensors [J]. Optical Engineering, 1993, 32(8): 1854-1860. doi:  10.1117/12.145601
    [943] SCHULZ M, CALDWELL L. Nonuniformity correction and correctability of infrared focal plane arrays [J]. Infrared Physics & Technology, 1995, 36(4): 763-777. doi:  10.1016/1350-4495(94)00002-3
    [944] SCRIBNER D A, SARKADY K A, CAULFIELD J T, et al. Nonuniformity correction for staring IR focal plane arrays using scene-based techniques[C/OL]//Infrared Detectors and Focal Plane Arrays. International Society for Optics and Photonics, 1990: 224–233.[2019–06–09].https://www.spiedigitallibrary.org/conference-proceedings-of-spie/1308/0000/Nonuniformity-correction-for-staring-IR-focal-plane-arrays-using-scene/10.1117/12.21730.short.
    [945] DUGDALE S J. A practitioner’s guide to thermal infrared remote sensing of rivers and streams: Recent advances, precautions and considerations [J]. Wiley Interdisciplinary Reviews:Water, 2016, 3(2): 251-268. doi:  10.1002/wat2.1135
    [946] SCRIBNER D A, SARKADY K A, KRUER M R, et al. Adaptive nonuniformity correction for IR focal-plane arrays using neural networks[C/OL]//Infrared Sensors: Detectors, Electronics, and Signal Processing. International Society for Optics and Photonics, 1991: 100–109.[2019–06–09].https://www.spiedigitallibrary.org/conference-proceedings-of-spie/1541/0000/Adaptive-nonuniformity-correction-for-IR-focal-plane-arrays-using-neural/10.1117/12.49324.short.
    [947] HARRIS J G, CHIANG Y-M. Nonuniformity correction using the constant-statistics constraint: Analog and digital implementations[C/OL]//Infrared Technology and Applications XXIII. International Society for Optics and Photonics, 1997: 895–905.[2019–06–09].https://www.spiedigitallibrary.org/conference-proceedings-of-spie/3061/0000/Nonuniformity-correction-using-the-constant-statistics-constraint--analog-and/10.1117/12.280308.short.
    [948] HARRIS J G, CHIANG Y-M. Minimizing the ghosting artifact in scene-based nonuniformity correction[C/OL]//Infrared Imaging Systems: Design, Analysis, Modeling, and Testing IX. International Society for Optics and Photonics, 1998: 106–113. [2019–06–09].https://www.spiedigitallibrary.org/conference-proceedings-of-spie/3377/0000/Minimizing-the-ghosting-artifact-in-scene-based-nonuniformity-correction/10.1117/12.319364.short.
    [949] HARRIS J G, CHIANG Y-M. Nonuniformity correction of infrared image sequences using the constant-statistics constraint [J]. IEEE Transactions on Image Processing, 1999, 8(8): 1148-1151. doi:  10.1109/83.777098
    [950] HAYAT M M, TORRES S N, ARMSTRONG E, et al. Statistical algorithm for nonuniformity correction in focal-plane arrays [J]. Applied Optics, 1999, 38(5): 772-780. doi:  10.1364/AO.38.000772
    [951] TORRES S N, HAYAT M M. Kalman filtering for adaptive nonuniformity correction in infrared focal-plane arrays [J]. JOSA A, 2003, 20(3): 470-480. doi:  10.1364/JOSAA.20.000470
    [952] TORRES S N, PEZOA J E, HAYAT M M. Scene-based nonuniformity correction for focal plane arrays by the method of the inverse covariance form [J]. Applied Optics, 2003, 42(29): 5872-5881. doi:  10.1364/AO.42.005872
    [953] TORRES S N, VERA E M, REEVES R A, et al. Adaptive scene-based nonuniformity correction method for infrared-focal plane arrays[C/OL]//Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XIV. International Society for Optics and Photonics, 2003: 130–139. [2019–06–09].https://www.spiedigitallibrary.org/conference-proceedings-of-spie/5076/0000/Adaptive-scene-based-nonuniformity-correction-method-for-infrared-focal-plane/10.1117/12.487217.short.
    [954] VERA E, TORRES S. Fast adaptive nonuniformity correction for infrared focal-plane array detectors [J]. EURASIP Journal on Advances in Signal Processing, 2005, 2005(13): 560759. doi:  10.1155/ASP.2005.1994
    [955] PEZOA J E, HAYAT M M, TORRES S N, et al. Multimodel Kalman filtering for adaptive nonuniformity correction in infrared sensors [J]. JOSA A, 2006, 23(6): 1282-1291. doi:  10.1364/JOSAA.23.001282
    [956] HARDIE R C, HAYAT M M, ARMSTRONG E, et al. Scene-based nonuniformity correction with video sequences and registration [J]. Applied Optics, 2000, 39(8): 1241-1250. doi:  10.1364/AO.39.001241
    [957] RATLIFF B M, HAYAT M M, HARDIE R C. An algebraic algorithm for nonuniformity correction in focal-plane arrays [J]. JOSA A, 2002, 19(9): 1737-1747. doi:  10.1364/JOSAA.19.001737
    [958] RATLIFF B M, HAYAT M M, TYO J S. Radiometrically accurate scene-based nonuniformity correction for array sensors [J]. JOSA A, 2003, 20(10): 1890-1899. doi:  10.1364/JOSAA.20.001890
    [959] ZUO C, CHEN Q, GU G, et al. Scene-based nonuniformity correction algorithm based on interframe registration [J]. JOSA A, 2011, 28(6): 1164-1176. doi:  10.1364/JOSAA.28.001164
    [960] ZUO C, CHEN Q, GU G, et al. Improved interframe registration based nonuniformity correction for focal plane arrays [J]. Infrared Physics & Technology, 2012, 55(4): 263-269. doi:  10.1016/j.infrared.2012.04.002
    [961] ZUO C, ZHANG Y, CHEN Q, et al. A two-frame approach for scene-based nonuniformity correction in array sensors [J]. Infrared Physics & Technology, 2013, 60: 190-196. doi:  10.1016/j.infrared.2013.05.001
    [962] BLACK W T, TYO J S. Feedback-integrated scene cancellation scene-based nonuniformity correction algorithm [J]. Journal of Electronic Imaging, 2014, 23(2): 023005. doi:  10.1117/1.JEI.23.2.023005
    [963] TORRES S N, VERA E M, REEVES R A, et al. Scene-based non-uniformity correction method using constant range: Performance and analysis[C]//Proceedings of the 6th SCI, IX: 224–229.
    [964] ZHANG T, SHI Y. Edge-directed adaptive nonuniformity correction for staring infrared focal plane arrays [J]. Optical Engineering, 2006, 45(1): 016402. doi:  10.1117/1.2158404
    [965] ROSSI A, DIANI M, CORSINI G. Temporal statistics de-ghosting for adaptive non-uniformity correction in infrared focal plane arrays [J]. Electronics Letters, 2010, 46(5): 348-349. doi:  10.1049/el.2010.3559
    [966] QIAN W, CHEN Q, GU G. Space low-pass and temporal high-pass nonuniformity correction algorithm [J]. Optical Review, 2010, 17(1): 24-29. doi:  10.1007/s10043-010-0005-8
    [967] ZUO C, CHEN Q, GU G, et al. New temporal high-pass filter nonuniformity correction based on bilateral filter [J]. Optical Review, 2011, 18(2): 197-202. doi:  10.1007/s10043-011-0042-y
    [968] ZHANG C, ZHAO W. Scene-based nonuniformity correction using local constant statistics [J]. JOSA A, 2008, 25(6): 1444-1453. doi:  10.1364/JOSAA.25.001444
    [969] ZUO C, CHEN Q, GU G, et al. Scene-based nonuniformity correction method using multiscale constant statistics [J]. Optical Engineering, 2011, 50(8): 087006. doi:  10.1117/1.3610978
    [970] ROSSI A, DIANI M, CORSINI G. Bilateral filter-based adaptive nonuniformity correction for infrared focal-plane array systems [J]. Optical Engineering, 2010, 49(5): 057003. doi:  10.1117/1.3425660
    [971] VERA E, MEZA P, TORRES S. Total variation approach for adaptive nonuniformity correction in focal-plane arrays [J]. Optics Letters, 2011, 36(2): 172-174. doi:  10.1364/OL.36.000172
    [972] RATLIFF B M, HAYAT M M, TYO J S. Generalized algebraic scene-based nonuniformity correction algorithm [J]. JOSA A, 2005, 22(2): 239-249. doi:  10.1364/JOSAA.22.000239
    [973] ZUO C, CHEN Q, GU G, et al. Registration method for infrared images under conditions of fixed-pattern noise [J]. Optics Communications, 2012, 285(9): 2293-2302. doi:  10.1016/j.optcom.2012.01.019
    [974] LIU N, XIE J. Interframe phase-correlated registration scene-based nonuniformity correction technology [J]. Infrared Physics & Technology, 2015, 69: 198-205. doi:  10.1016/j.infrared.2015.01.004
    [975] BOUTEMEDJET A, DENG C, ZHAO B. Robust approach for nonuniformity correction in infrared focal plane array [J]. Sensors, 2016, 16(11): 1890. doi:  10.3390/s16111890
    [976] ANTIPA N, KUO G, HECKEL R, et al. DiffuserCam: Lensless single-exposure 3D imaging [J]. Optica, 2018, 5(1): 1-9. doi:  10.1364/OPTICA.5.000001
    [977] BARBASTATHIS G, OZCAN A, SITU G. On the use of deep learning for computational imaging [J]. Optica, 2019, 6(8): 921. doi:  10.1364/OPTICA.6.000921
    [978] ZUO C, FENG S, ZHANG X, et al. Deep learning based computational imaging: Status, challenges, and future [J]. Acta Optica Sinica, 2020, 40(1): 0111003. (in Chinese) doi:  10.3788/AOS202040.0111003
    [979] WANG F, WANG H, BIAN Y. Application of deep learning in computational imaging [J]. Acta Optica Sinica, 2020, 40(1): 14. (in Chinese)
    [980] KHORASANINEJAD M, CHEN W T, DEVLIN R C, et al. Metalenses at visible wavelengths: Diffraction-limited focusing and subwavelength resolution imaging [J]. Science, 2016, 352(6290): 1190-1194. doi:  10.1126/science.aaf6644
    [981] LALANNE P, CHAVEL P. Metalenses at visible wavelengths: Past, present, perspectives [J]. Laser & Photonics Reviews, 2017, 11(3): 1600295. doi:  10.1002/lpor.201600295
    [982] CHEN W T, ZHU A Y, SANJEEV V, et al. A broadband achromatic metalens for focusing and imaging in the visible [J]. Nature Nanotechnology, 2018, 13(3): 220. doi:  10.1038/s41565-017-0034-6
    [983] WANG S, WU P C, SU V-C, et al. A broadband achromatic metalens in the visible [J]. Nature Nanotechnology, 2018, 13(3): 227. doi:  10.1038/s41565-017-0052-4
    [984] ZHANG L, MEI S, HUANG K, et al. Advances in full control of electromagnetic waves with metasurfaces [J]. Advanced Optical Materials, 2016, 4(6): 818-833. doi:  10.1002/adom.201500690
    [985] HUANG K, QIN F, LIU H, et al. Planar diffractive lenses: Fundamentals, functionalities, and applications [J]. Advanced Materials, 2018, 30(26): 1704556. doi:  10.1002/adma.201704556
    [986] COLBURN S, ZHAN A, MAJUMDAR A. Metasurface optics for full-color computational imaging [J]. Science Advances, 2018, 4(2): eaar2114. doi:  10.1126/sciadv.aar2114
    [987] LIN R J, SU V-C, WANG S, et al. Achromatic metalens array for full-colour light-field imaging [J]. Nature Nanotechnology, 2019, 14(3): 227. doi:  10.1038/s41565-018-0347-0
    [988] LI C, ZHANG X, LI J, et al. The challenges of modern computing and new opportunities for optics [J]. PhotoniX, 2021, 2(1): 20. doi:  10.1186/s43074-021-00042-0
    [989] LIN X, RIVENSON Y, YARDIMCI N T, et al. All-optical machine learning using diffractive deep neural networks [J]. Science, 2018, 361(6406): 1004-1008. doi:  10.1126/science.aat8084
出版历程
  • 收稿日期:  2022-02-01
  • 修回日期:  2022-02-06
  • 网络出版日期:  2022-03-04
  • 刊出日期:  2022-02-25


    • 上帝说要有光,于是便有了光;光学“optics”一词源自古希腊字“ὀπτική”,意为“看见”、“视见”。三千年前,古埃及人与美索不达米亚人第一次将石英晶体磨光制成宁路德透镜(Nimrud lens),翻开了人类光学成像历史的第一页[1]。时光流转,如今我们手持搭载潜望式长焦镜头与人工智能算法的智能手机,就能拍摄皎洁白月与绚丽星空[2]。人类在享受光学成像技术带来的多姿多彩生活的同时,也一直在为看得“更远、更广、更清晰”这个永无止境的目标前赴后继。视觉是人类获取客观世界信息的主要途径,据估计人类感知的外界信息约有80%来自视觉;然而人眼受自身视觉机能所限,在时间、空间、灵敏度、光谱、分辨力等方面均存在局限性。光学成像技术利用各种光学成像系统,即获取客观景物图像的工具,如显微镜、望远镜、医疗CT、手机摄像机和照相机等(见图1),实现光信息的可视化,同时延伸并扩展人眼的视觉特性。

      一个典型的光学成像系统主要由光源、光学镜头组、光探测器三部分组成。光学镜头将三维场景目标发出或者透/反/散射的光线聚焦在探测器表面上,探测器像素与样品之间建立起直接的一一对应关系来获取图像;光场的强度由光探测器离散采集,并经图像处理器数字化处理后形成计算机可显示的图像,整个过程如图2所示。这种“所见即所得”的成像方式受强度成像机理、探测器技术水平、光学系统设计、成像衍射极限等因素限制,又受单视角、相位丢失、光谱积分、二维平面成像等因素制约,导致高维度样品信息的缺失。此外,光学镜头组通常还需与光学镜片、镜筒、光圈以及调焦系统等部件配合使用以获得清晰的图像,这大大增加了成像装置的体积和复杂度。

      图  1  常见的光电成像系统

      Figure 1.  Common optoelectronic imaging systems

      图  2  传统光学成像系统的成像过程

      Figure 2.  Conventional optical imaging process
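
      为便于与后文的计算成像模型相对照,这里给出传统非相干强度成像过程的一个简化数学表述(仅为笔者的示意性写法,符号为本节自拟,未考虑色彩滤波、量化等环节):

$$I(x,y)=\mathcal{S}\big\{\big(|h|^{2}\otimes O\big)(x,y)\big\}+n(x,y)$$

      其中,$O$ 为目标的强度分布,$|h|^{2}$ 为成像系统的点扩散函数,$\otimes$ 表示卷积,$\mathcal{S}\{\cdot\}$ 表示探测器的离散采样,$n$ 为噪声。可见探测器最终记录的只是经点扩散函数模糊并离散采样后的强度信息,相位、光谱、偏振等维度在该过程中被积分或直接丢弃,这正是上述高维信息缺失的根源。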

      光学成像技术的出现延伸并扩展了人眼的视觉特性,其以成像分辨率(时间、空间、光谱)的提高、成像维度的拓展、探测灵敏度的提升作为技术发展目标(图3)。在当今电子信息时代,高性能、低成本、小体积、轻重量的光学成像系统受到越来越广泛的重视,需求也日益增长。商用相机和手机摄像头因其光学系统结构小巧,价格低廉,已成为人们不可或缺的日常用品。然而传统光学成像系统因受强度成像机理、探测器技术水平、光学系统设计、成像衍射极限等因素制约,在空间分辨、时间分辨、光谱分辨、信息维度与探测灵敏度等方面仍存在一定局限性。人们对成像系统功能与性能的不断追求,以及军用和民用领域日益增长的高分辨、高灵敏度以及多维高速成像的应用需求,也对光学成像技术提出了更具挑战性的要求:例如在显微成像领域,一方面需要显微成像系统能够对无色透明的生物细胞组织实现无标记、多维度、高分辨、宽视场成像观察,另一方面需要显微成像系统小型化、便携化,以满足当今迅速增长的即时检验与远程医疗的应用需求。在空间科技领域,同样需要光学成像系统不断减小重量和体积,以节省运载空间或降低运载成本。在工业制造领域,需要视觉检测仪能够实现高精度、高分辨、高速实时的三维成像与传感,以满足快速在线检测与机器人视觉导航等应用需求。在医疗诊断领域,如内窥镜等设备,在保证清晰成像观测的同时,需要将设备做得更小,以减轻患者的痛苦与不适。在地质勘探领域,如在光线较暗的环境下进行探测时,需要光学成像系统对光具有更高的透过率、响应灵敏度和动态范围,以提高图像的亮度与成像的信噪比。沿用传统光学成像系统的设计思路,即使只想获得少量的成像性能提升,通常也意味着硬件成本的急剧增加,甚至难以实现工程化应用。另一方面,光探测器的规模尺寸、像元大小、响应灵敏度等已接近物理极限,很难满足这些极具挑战性的需求。

      图  3  光学成像技术的五方面发展目标

      Figure 3.  Five goals for the development of optical imaging technology

      随着成像电子学的发展与计算机数据处理能力的增强,光场调控、孔径编码、压缩感知、全息成像等光、电信息处理技术取得了重大进展;另一方面,经过漫长的演化,自然界已经形成多类能够适应不同生存需求的生物视觉系统,从生物视觉系统中获得灵感无疑可以为新一代光学成像技术的发展带来有益的启示。在此背景下,20世纪90年代中期光学成像界和计算机视觉界的许多研究人员不约而同地探索出了一种新型成像模式:即图像形成不再仅仅依赖于光学物理器件,而是前端光学和探测后信号处理的联合设计[3],这种技术就是现在广为人知的“计算成像”(Computational imaging)技术。计算成像将光学调控与信息处理有机结合,为突破上述传统成像系统中的诸多限制性因素提供了新手段与新思路[3]。对于“计算成像”,目前国际上并没有清晰的界定和严格的定义。较为普遍接受的一种说法是:计算成像通过光学系统和信号处理的有机结合与联合优化来实现特定的成像系统特性,它所得到的图像或信息是二者简单相加所不能达到的;它可以摆脱传统成像系统的限制,并且能够创造新颖的图像应用[48]。这种成像技术的实现方法与传统成像技术有着实质上的差别,给光学成像领域注入了新的活力[9]。21世纪初,计算成像技术在斯坦福大学、麻省理工学院、哥伦比亚大学、杜克大学、南加州大学、微软研究院等国际著名研究机构研究学者的推动下得以迅猛发展,相继发展出波前编码成像、光场成像、时间编码成像、孔径编码成像、偏振成像、高光谱成像、单像素成像、结构光三维成像、数字全息成像、无透镜成像、定量相位成像、衍射层析成像、穿透散射介质成像等一系列计算光学成像的新概念与新体制。近年来,光学成像技术已经由传统的强度、彩色成像进入计算光学成像时代,通过将光学系统的信息获取能力与计算机的信息处理能力相结合,实现相位、光谱、偏振、光场、相干度、折射率、三维形貌等高维度视觉信息的高性能、全方位采集。现如今,计算光学成像已发展为一门集几何光学、信息光学、计算光学、计算机视觉、现代信号处理等理论于一体的新兴交叉技术研究领域,成为光学成像领域的一大国际研究重点和热点。

      必须说明的是:“计算成像”这个新兴词汇很容易被误解为“计算机成像”,或者仅仅被误认为是“传统成像”与“数字图像处理”技术的延伸,笔者认为有必要在此加以强调与区分。传统光学成像是为了获得可满足人眼或者机器视觉要求的图像,所以在进行图像采集时就需要保证获取高质量的图像数据。而实际操作中由于种种原因,成像效果往往达不到理想预期,所以通常还需要借助数字图像处理技术对采集图像进行进一步加工。从学术级的MATLAB、ImageJ,到专业级的Adobe Photoshop,乃至大众都在使用的“美图秀秀”,都属于典型的数字图像处理软件的范畴。在此过程中,光学成像过程与数字图像处理是独立且串行的关系,算法被认为是后处理过程,并不纳入成像系统设计的考虑之中,如图4所示。这就决定了传统成像技术无法从根本上通过图像处理技术挖掘出更多场景的本质信息。简言之,如果成像前端所获取的图像数据缺失或者质量不理想(如严重离焦、噪声污染),后端仅依靠图像处理技术很难加以弥补,因为信息并不会凭空产生,正所谓“巧妇难为无米之炊”。

      图  4  传统数字图像处理往往仅作为成像的后处理过程

      Figure 4.  Conventional digital imaging processing is only a post-processing step in the whole imaging process

      与传统光学成像系统“先成像,后处理”的成像方式截然不同,计算光学成像采用的是“先调制,再拍摄,最后解调”的成像方式。其将光学系统(照明、光学器件、光探测器)与数字图像处理算法作为一个整体考虑,并在设计时一同进行综合优化。前端成像元件与后端数据处理二者相辅相成,构成一种“混合光学—数字计算成像系统”,如图5所示。不同于传统光学成像的“所见即所得”,计算光学成像通过对照明与成像系统人为引入可控的编码或者“扭曲”,如结构照明、孔径编码、附加光学传函、子孔径分割、探测器可控位移等,并将其作为先验知识,目的是将物体或者场景更多的本质信息调制到传感器所能拍摄到的原始图像信号中(又被称作中间像,Intermediate image,因为该图像往往无法直接使用或观测)。在解调阶段,在几何光学、波动光学等理论的基础上,对场景目标经光学系统成像再到探测器这一完整图像生成过程建立精确的正向数学模型,再通过求解该正向成像模型的“逆问题”,以计算重构的方式来获得场景目标的高质量图像或者所感兴趣的其它物理信息。正如其名,“计算成像”中的图像并不是直接拍摄到的,而是计算出来的。这种计算成像方法实质上就是在场景和图像之间建立了某种特定的联系,这种联系可以是线性的也可以是非线性的,可以突破一一对应的直接采样形式,实现非直接的采样形式,使得采样形式更加灵活,更能充分发挥不同传感器的特点与性能。如果说光电成像技术延伸并扩展了人眼的视觉特性,那么计算成像技术则进一步延伸并扩展了光电成像器件的成像维度与探测性能。这种新型的成像方式将有望突破传统光学成像技术对光学系统以及探测器制造工艺、工作条件、功耗成本等因素的限制,使其在功能(相位、光谱、偏振、光场、相干度、折射率、三维形貌、景深延拓、模糊复原、数字重聚焦、改变观测视角)、性能(空间分辨、时间分辨、光谱分辨、信息维度与探测灵敏度)、可靠性、可维护性等方面获得显著提高,有助于实现成像设备的高性能、微型化、智能化。

      图  5  计算光学成像系统的成像过程

      Figure 5.  Computational optical imaging process
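
      作为上述“先调制,再拍摄,最后解调”流程的一个极简数值示意,下面给出一段 Python 草图。需要说明的是,这只是笔者为说明基本原理而构造的一维“玩具”示例,其中的随机二值编码掩模、高斯模糊核、噪声水平与正则化参数均为任意假设,并不对应文中任何具体的计算成像系统:先以若干编码掩模与模糊核构造正向模型并模拟得到编码后的“中间像”,再通过求解带 Tikhonov 正则项的最小二乘逆问题重构场景。

import numpy as np

rng = np.random.default_rng(0)
n, K = 64, 4                                    # 场景采样点数与编码测量次数(假设值)
x_true = np.zeros(n)
x_true[20:28], x_true[40:44] = 1.0, 0.5         # 理想的一维“场景”

# 光学编码:K 个随机二值掩模(调制)与一个高斯模糊核(成像系统)
masks = rng.integers(0, 2, size=(K, n)).astype(float)
kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)
kernel /= kernel.sum()

def forward(x):
    # 正向模型 H:先掩模调制,再与点扩散函数卷积
    return np.stack([np.convolve(m * x, kernel, mode="same") for m in masks])

def adjoint(y):
    # H 的伴随(转置)算子,用于梯度计算
    return sum(m * np.convolve(yk, kernel[::-1], mode="same")
               for m, yk in zip(masks, y))

# “拍摄”:得到带噪声的编码原始测量(中间像)
y = forward(x_true) + 0.01 * rng.standard_normal((K, n))

# “解调”:梯度下降求解 0.5*||Hx - y||^2 + 0.5*lam*||x||^2 的最小值
lam, step, x_hat = 1e-2, 0.2, np.zeros(n)
for _ in range(1000):
    x_hat = x_hat - step * (adjoint(forward(x_hat) - y) + lam * x_hat)

print("重构RMSE:", float(np.sqrt(np.mean((x_hat - x_true) ** 2))))

      实际系统中,正向模型需依据具体的光学编码方式(结构照明、孔径编码、光场采样等)精确建模与标定,正则项也常换用稀疏约束、全变分乃至数据驱动的先验,但“编码—测量—解码重构”的基本框架与上述示意是一致的。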

      近年来,计算光学成像也已逐步进入了我国从事光学成像、光学测量、光信息学以及计算机视觉领域科研人员的视野,在光学信息获取与处理领域占据了越来越重要的地位。2017年,在包括我们在内的计算成像领域的同行学者的一致建议下,国家自然科学基金委结合未来学科的发展方向和趋势首次将“计算成像”列入信息科学部四处F05学科代码下F0501光学信息获取、显示与处理研究方向,并作为一个独立的子方向(F050109),见图6(注:2021年基金委信息科学部优化学科布局,调整代码之后仅保留一级和二级代码,已不再设立三级代码)。近年来,以“计算光学成像”为议题的国际会议与专题研讨会在国内也逐步兴起,国内外各类学术期刊均争相推出了相关专刊与专栏,广大从业人员对此领域的兴趣与热情日益高涨,相位成像、全息成像、光谱成像、偏振成像、三维成像、光场成像、超分辨成像、无透镜成像、单像素成像(鬼成像)、穿透散射介质成像等前沿热点研究方向层出不穷。因此,现阶段迫切需要对此蓬勃发展且充满前景的新领域进行梳理、归纳与总结,并基于此为我国相关领域研究人员在计算光学成像技术及其应用方面提供一些有益的参考。

      图  6  从2017年修订后的国家自然科学基金委学科代码,其中“计算成像”被列入信息科学部四处下一个独立的子方向(F050109)

      Figure 6.  The revised discipline code of the National Natural Science Foundation of China in 2017. “Computational imaging” has been listed as an independent sub-direction of Information Science (F050109)

      在此背景下,本文作为本期《红外与激光工程》——南京理工大学专刊“计算光学成像技术”专栏的首篇论文,概括性地综述了计算光学成像领域的历史沿革(何来)与发展现状(何处),并展望其未来发展方向(何去)与所依赖的核心赋能技术(何从)。在第一章中,我们首先简要回顾光学成像技术的历史以及计算光学成像的发展由来,计算光学成像被认为是继光化学成像时代、胶片成像时代、数码成像时代之后的第四次成像革命。第二章是文中的要点内容,我们将综述计算光学成像技术的发展现状:按照采用计算成像技术的“动机”,或者说计算成像技术所带来的成效,对计算光学成像技术体系进行细分,并对每种成像技术或者方法的基本原理、发展现状、代表性成果以及典型应用进行概述。第三章评估了计算光学成像的当前优势和劣势,以及未来发展面临的机会和威胁。第四章分析了计算成像未来发展所依赖的核心赋能技术。最后,我们在第五章中给出了文中的总结性评论。值得说明的是,由于本文涉及内容广泛,且受作者水平、精力与文章篇幅所限,文中难免存在疏漏与错误之处,在此由衷期望读者不吝指正。

    • 光学成像的诞生与发展是时代的必然产物。科学技术的进步、人们对长驻影像的渴望、对影像记录和信息传播的需求催生了光学成像技术的诞生;同时,光学成像技术的诞生又反过来进一步推动了科技的发展,并不断催生新的需求。光学成像技术并不是某一个人发明出来的,而是数代人共同努力的成果,它是适应社会需求的必然产物。在摄影术诞生后180余年的今天,摄像头已经成为我们日常生活不可分割的一部分:打开微信、支付宝扫一扫支付,拍一张自拍发个朋友圈,拍一段宠物视频上传抖音,用淘宝拍照识别商品,已经成为人们的生活常态。而你可曾知道,历史上第一张照片的曝光时间长达8 h;现如今手机标配摄像头,在屏幕上按下快门的那一瞬间,一张清晰的照片就出炉了。光学成像技术是怎么演变到如今这个阶段的?有哪些人、公司和产品推动了演变的发生?演变带走了什么,留下了什么?带着这些疑问,我们回顾了一番光学成像技术的演变历程。

    • 早在公元前四百多年,中国哲学家墨子观察到小孔成像的现象,并记录在他的著作《墨子•经下》中,这是有史以来对小孔成像最早的研究和论述,为摄影的发明奠定了理论基础。墨子之后,古希腊哲学家亚里士多德和数学家欧几里得、战国时期法家韩非子、西汉淮南王刘安、北宋科学家沈括等中外学者都对针孔成像有颇多论述。针孔影像虽早已为人察觉乃至运用,但只可观察,无法记录。在15~16世纪文艺复兴时期,欧洲出现了供绘画用的“成像暗箱”(Camera obscura),如图7所示(最初由意大利人阿贝尔第(Leon Battista Alberti)研制)。由于暗箱的发明,历史记载中很多从来没有系统学过绘画的人都“突然之间”摇身一变成了绘画天才,写实技巧骤然提升。在那个摄影技术尚未出现的时代,涌现出了大量能与“单反照片”媲美的杰作。

      图  7  16世纪用于绘图的暗箱装置

      Figure 7.  Camera obscura box, 16th century

    • 1725年,德国纽伦堡附近阿尔特多夫大学的医学教授亨利希·舒尔茨(Heinrich Schulze)发现硝酸银溶液在光作用下会变黑,并于1727年发表论文《硝酸银与白垩混合物对光的作用》,讨论了硝酸银混合物在光作用下记录图案的功能,德国人称之为现代摄影的始祖。1793年,法国发明家尼埃普斯(Joseph Nicéphore Niépce)和他的兄弟一起开始了对感光材料的实验。1822~1824年期间,他通过实验发现把沥青涂在玻璃板和金属板上能够实现感光。1825年,他成功地利用可以感光的纸把铜版画上的影像制作成了一幅图片,由此诞生了世界上第一张照片——《牵马少年》,如图8所示。这幅以牵马人为对象的图片虽然不是用照相机“照”出来的,但它预示着感光材料在实际运用方面迎来了一个新时代。然而在摄影技术诞生的初期,由于感光材料的灵敏度很低,拍摄一张照片往往需要曝光几个小时。

      图  8  尼埃普斯使用的暗箱相机和所拍摄的《牵马少年》

      Figure 8.  The Camera Obscura box used by Joseph Nicephore Niépce and his photo “the man with a horse”

      1825年,尼埃普斯委托法国光学仪器商人夏尔·雪弗莱(Charles Chevalier)为他的暗箱制作光学镜片,并于1826年(也有说1827年)将其发明的感光材料放进暗箱,拍摄出现存最早的景物照片——《Le Gras窗外的景色》(见图9)。这幅作品在他位于法国勃艮第(Burgundy)的家中完成,透过阁楼上的窗户拍摄,使用暗箱曝光的时间超过8小时。尼埃普斯把这种用日光将影像永久记录在金属板上的摄影方法叫做“日光摄影法”(Heliography)——“Helios”来自希腊语,意即太阳;“Graphein”意即记录、描绘。

      图  9  尼埃普斯所拍摄的《窗外景色》

      Figure 9.  “Window at Le Gras” taken by Joseph Nicephore Niépce

      19世纪20年代,法国发明家、画家和舞台背景设计师路易·达盖尔(Louis-Jacques-Mandé Daguerre)开始热衷于寻找把暗箱投影固定下来的方法。他于1827年结识了尼埃普斯,两人于1829年12月开始正式合作,订立了为期10年的合作契约,在尼埃普斯先前日光摄影法的基础上,共同研究和改进留住影像的工艺。1833年7月5日,尼埃普斯在没有取得丝毫新成果的情况下溘然辞世。但幸运的是,他的笔记留给了继续这项工作的达盖尔。1837年的一天,达盖尔在药品箱中找药品时,突然看到过去曝过光的底片上的影像变得十分清晰。他猜想很可能是药箱里的某种药品在发生作用。为了找到答案,他每天晚