Real-time video of hidden scenes around corners now possible

Figure 1: Real-time NLOS virtual image processing pipeline. The imaging system sends a virtual phasor field (PF) signal to the visible wall and captures the signal returning from the hidden scene back to the wall. The raw photon stream is recorded by a SPAD (single-photon avalanche diode) array. Raw photons from all channels are virtually remapped to the full aperture. The remapped data is then transformed into the frequency domain (Fourier domain histogram, FDH) and propagated by the fast Rayleigh-Sommerfeld diffraction (RSD) algorithm. A final temporal frame average yields constant SNR over the entire reconstructed volume, and the result is displayed. Credit: DOI: 10.1038/s41467-021-26721-X
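
The pipeline in Figure 1 can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: the array sizes, the 50-picosecond bin width, and the random stand-in photon counts are all assumptions made here for illustration.

```python
import numpy as np

C = 299_792_458.0   # speed of light (m/s)
DT = 50e-12         # histogram bin width: 50 ps (assumed)
N_BINS = 256        # time bins per transient histogram (assumed)

# SPAD channels virtually remapped onto an 8x8 grid spanning the wall aperture.
xs = np.linspace(-1.0, 1.0, 8)
aperture = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)

# 1) Remapped raw data: one transient histogram per aperture point
#    (Poisson noise stands in for real photon counts).
rng = np.random.default_rng(0)
transients = rng.poisson(1.0, size=(len(aperture), N_BINS)).astype(float)

# 2) Fourier-domain histogram (FDH): FFT each transient along the time axis.
fdh = np.fft.rfft(transients, axis=1)
freqs = np.fft.rfftfreq(N_BINS, d=DT)  # temporal frequencies (Hz)

# 3) Rayleigh-Sommerfeld diffraction: propagate one frequency component
#    from all aperture points (on the wall at z=0) to a hidden-scene voxel.
def rsd_propagate(k_idx, voxel):
    k = 2.0 * np.pi * freqs[k_idx] / C  # wavenumber for this component
    pts = np.column_stack([aperture, np.zeros(len(aperture))])
    r = np.linalg.norm(pts - voxel, axis=1)
    return np.sum(fdh[:, k_idx] * np.exp(1j * k * r) / r)

voxel = np.array([0.0, 0.0, 1.0])       # a voxel 1 m behind the wall
frame = abs(rsd_propagate(k_idx=10, voxel=voxel))

# 4) Temporal frame average: smooth SNR across successive reconstructions.
averaged = np.mean([frame, frame])      # stand-in for a stream of frames
```

A real reconstruction would evaluate step 3 over a full voxel grid and many frequency components, which is where the "fast RSD" algorithm earns its name.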

As Ji Hyun Nam slowly tosses a stuffed cat toy into the air, a real-time video captures the playful scene, looking like an early webcam clip at a mere five frames per second.

The twist? Nam is hidden around the corner from the camera. The video of the stuffed animal was created by capturing light that bounced off a wall, reflected off the toy, and bounced back again, in a science-fiction-turned-reality technique known as non-line-of-sight imaging.

And at five frames per second, the video is a sharp improvement over previous hidden-scene imaging, which took minutes to reconstruct even a single still image.

The new technology combines multiple ultra-fast, highly sensitive light sensors with an improved video reconstruction algorithm to dramatically speed up how quickly hidden scenes can be displayed. Researchers at the University of Wisconsin-Madison, who produced the video, say the advance opens the technology to economical, real-world applications for both near and distant scenes.

Those future applications include disaster relief, medical imaging, and military uses. The technology could also find use beyond around-the-corner imaging, such as in improving autonomous vehicle imaging systems. The work was funded by the US Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation.

Graduate students Ji Hyun Nam (left) and Ton Le work with Andreas Velten, assistant professor and principal investigator in the Computational Optics Lab. Credit: Bryce Richter

Andreas Velten, a professor of biostatistics and medical informatics at the UW School of Medicine and Public Health, and his team published their findings on November 11 in Nature Communications. Nam, a former doctoral student in the Velten lab, is the report's first author. UW-Madison researchers Eric Brandt and Sebastian Bauer, along with colleagues from the Polytechnic University of Milan in Italy, also contributed to the new research.

Velten and his former advisor first demonstrated non-line-of-sight imaging a decade ago. Like other light- or sound-based imaging, the technique obtains information about a scene by bouncing light off a surface and sensing the echoes that come back. But to look around corners, the technique focuses not on the first echo, but on the reflections of those echoes.
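
The echolocation idea comes down to round-trip arithmetic. The sketch below is a toy illustration with made-up path lengths, not the authors' reconstruction method: a direct echo gives distance from half the round-trip time, while a third-bounce photon's total path length constrains where the hidden object can be.

```python
C = 299_792_458.0  # speed of light, m/s

# Direct echo: a surface's distance is half the round-trip path.
def echo_distance(round_trip_s):
    return C * round_trip_s / 2.0

# Around-the-corner echo: laser -> wall -> hidden object -> wall -> sensor.
# With the laser-to-wall and wall-to-sensor legs known, the remaining
# path length splits evenly over the two wall-to-object legs.
def hidden_object_distance(total_time_s, laser_to_wall_m, wall_to_sensor_m):
    total_path_m = C * total_time_s
    return (total_path_m - laser_to_wall_m - wall_to_sensor_m) / 2.0
```

For example, a photon returning 20 nanoseconds after the pulse, with 1 m legs to and from the wall, implies a hidden object roughly 2 m from the wall; real reconstruction intersects many such constraints from different wall points.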

“It’s basically echolocation, but using additional echoes, like with reverb,” says Velten, who also holds an appointment in the Department of Electrical and Computer Engineering.

In 2019, members of Velten’s lab demonstrated they could take advantage of existing imaging algorithms by rethinking the math of the system. The new math allowed them to use a laser rapidly scanning across a wall as a sort of “virtual camera” that provides visibility into the hidden scene.

The algorithms that reconstruct scenes are fast, and Brandt, a doctoral student in the lab of study co-author Eftychios Sifakis, further improved how they process the hidden visual data. But data collection was painfully slow for earlier non-line-of-sight imaging techniques, because the light sensors often had only a single pixel.

As Ji Hyun Nam slowly tosses a stuffed cat toy into the air, a real-time video captures the playful scene from around a corner. And with refinement, the technology could find use in search and rescue, defense and medical imaging. (Caution: The video contains flashing lights, which can be a problem for some people with photosensitive epilepsy or a history of migraines and headaches.) Credit: University of Wisconsin-Madison

To move toward real-time video, the team needed specialized light sensors, and more of them. Single-photon avalanche diodes, or SPADs, are now common, finding their way into even the latest iPhones. Capable of detecting individual photons, they provide the sensitivity needed to capture the very weak reflections of light bouncing around corners. But commercial SPADs are roughly 50 times too slow for this application.

Working with colleagues in Italy, Velten’s lab spent years perfecting new SPADs that can distinguish photons arriving just 50 trillionths of a second apart. That ultra-fast time resolution also provides depth information, allowing 3D reconstruction. The sensors can also be switched on and off very quickly to help isolate different reflections.
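
To put that timing figure in perspective, here is simple back-of-envelope arithmetic (not a figure from the paper) for the depth resolution implied by 50-picosecond timing:

```python
C = 299_792_458.0            # speed of light, m/s
timing_resolution_s = 50e-12 # 50 trillionths of a second (50 ps)

# Light travels out and back, so depth resolution is half the
# distance light covers within one timing bin.
depth_resolution_m = C * timing_resolution_s / 2.0
print(f"{depth_resolution_m * 1000:.1f} mm")  # prints "7.5 mm"
```

In other words, timing at that scale resolves depth to well under a centimeter, which is what makes 3D reconstruction of the hidden scene possible.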

“If I send a light pulse to a wall, I get a very bright reflection that I have to ignore. I have to look for the very weak light coming from the hidden scene,” Velten says.

Using 28 SPAD pixels, the team can collect light quickly enough to enable real-time video with a latency of just one second.

The resulting video is monochrome and fuzzy, yet capable of resolving motion and distinguishing objects in 3D space. In successive scenes, Nam demonstrates that the videos can resolve foot-wide letters and track human limbs during natural movements. The projected virtual camera can even accurately distinguish a mirror from what it is reflecting, which is technically challenging for a real camera.

“It’s really enjoyable to play with our NLOS (non-line-of-sight) imaging setup,” Nam says. “While standing in the hidden scene, you can dance, jump, exercise and watch your video on the monitor in real time.”

While the demonstration video captures objects only a few meters from the wall, the same technique can be used to image objects hundreds of meters away, as long as they are large enough to be seen at that distance.

“If you’re in a dark room, the size of the scene is no longer limited,” says Velten. Even with the room lights on, the system can capture nearby objects.

Although the Velten team uses custom equipment, the light sensor and laser technology required for around-the-corner imaging is ubiquitous and inexpensive. With further engineering refinement, the technology could be deployed creatively in many areas.

“Nowadays you can find integrated time-of-flight sensors in smartphones like the iPhone 12,” Nam says. “Can you imagine that you could take a picture around a corner with your phone? There are still many technical challenges, but this work takes us to the next level and opens up possibilities.”

More information:
Ji Hyun Nam et al, Low-latency time-of-flight non-line-of-sight imaging at 5 frames per second, Nature Communications (2021). DOI: 10.1038/s41467-021-26721-X

Provided by University of Wisconsin-Madison

Citation: Real-time video of hidden scenes around corners now possible (2021, November 12), retrieved March 30, 2022.

This document is subject to copyright. No part may be reproduced without written permission, except for any fair use for the purpose of personal study or research. The content is provided for information purposes only.
