The Chinese Academy of Sciences (CAS), Hefei Institutes of Physical Science, has developed a single-pixel imaging technique that resists motion blur when imaging fast-moving objects. The technique takes advantage of the broad spectral response and high sensitivity of a single-pixel detector, and helps break through the bottleneck that has limited single-pixel imaging of fast-moving objects.

The research was published in Optics Letters.

Wang Yingjian, the leader of the team, stated that the study was a proof-of-principle demonstration of capturing imaging information.

Single-pixel imaging has made great strides in capturing static and slow-moving objects, but motion blur remains the biggest obstacle to its use in practical engineering applications.

The researchers proposed a multitasking scheme to track and image moving targets. At first, the single-pixel detector collects only a small amount of information, which is used to track and locate the moving target. As the amount of detected information accumulates over time, motion-blur correction and imaging of the fast-moving object can be carried out simultaneously.
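The article does not give the team's algorithm, but the idea of locating a target from only a handful of single-pixel measurements can be illustrated with a toy sketch. Here, hypothetical geometric-moment illumination patterns are used: three bucket measurements (total light, x-weighted light, y-weighted light) are enough to estimate the target's centroid without reconstructing any image. The scene, pattern choice, and sizes are all illustrative assumptions, not the authors' method.

```python
import numpy as np

# Illustrative scene: a small bright target on a dark background
# (sizes and position are arbitrary assumptions for this sketch)
n = 32
scene = np.zeros((n, n))
scene[20:24, 6:10] = 1.0  # target occupies rows 20-23, columns 6-9

# Hypothetical moment patterns projected onto the scene; the single-pixel
# detector records one number (the total transmitted light) per pattern.
ones = np.ones((n, n))                            # zeroth-order pattern
xs = np.tile(np.arange(n, dtype=float), (n, 1))   # column-index pattern
ys = xs.T                                         # row-index pattern

total = (ones * scene).sum()  # bucket value for the flat pattern
mx = (xs * scene).sum()       # bucket value for the x-moment pattern
my = (ys * scene).sum()       # bucket value for the y-moment pattern

# Centroid of the target from just three detector readings
cx, cy = mx / total, my / total
print(cx, cy)  # -> 7.5 21.5 (center of the bright block)
```

With only three measurements per frame, such a scheme can localize a target far faster than full image reconstruction, which is the intuition behind using a small detection stream for tracking.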

The proposed technique fully exploits single-pixel detection, enabling rapid positioning, clear imaging, and recognition of fast-moving targets from the system's detection data stream. Its technical roadmap of "tracking before imaging" reverses the time-sequence relationship of traditional methods.
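For the imaging half of the pipeline, a common single-pixel approach is correlation (ghost-imaging-style) reconstruction: the scene is illuminated with many known patterns, the detector records one intensity per pattern, and the image is recovered by correlating the measurement stream with the patterns. The sketch below is a generic minimal example under assumed sizes and random binary patterns; it is not the authors' reconstruction method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed 16x16 scene with a bright square target
n = 16
scene = np.zeros((n, n))
scene[5:9, 5:9] = 1.0

# Random binary illumination patterns; the single-pixel detector
# records only the total transmitted light (one scalar) per pattern
m = 4000
patterns = rng.integers(0, 2, size=(m, n * n)).astype(float)
measurements = patterns @ scene.ravel()

# Correlation reconstruction: average the mean-subtracted measurements
# against the patterns, so pixels inside the target stand out
recon = ((measurements - measurements.mean())[:, None] * patterns).mean(axis=0)
recon = recon.reshape(n, n)

# The target region should be clearly brighter than the background
print(recon[5:9, 5:9].mean() > recon[0:4, 0:4].mean())  # -> True
```

Because each measurement adds information, the reconstruction sharpens as more patterns are acquired, which is why accumulating detection data over time can support both blur correction and imaging.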

Dr. Matthew Edgar, formerly of the University of Glasgow, said that "the experimental results are encouraging" and "I am certain that future research in this area will compare the efficacy of the authors' approach to single-pixel sampling and reconstruction methods for real-world applications where there is dynamic and rapid motion of objects in a scene."

Professor Randy Bartels of Colorado State University stated that this strategy allows for rapid tracking of objects and can be scaled to very high speeds.

Shi Dongfeng et al., "Radon single-pixel imaging using projective sampling," Optics Express (2019). DOI: 10.1364/OE.27.014594