Engineering team develops approach to enable simple cameras to see in 3D

Stanford engineers enable simple cameras to see in 3D.

The lab-based prototype lidar system the research team built successfully captured megapixel-resolution depth maps using a commercially available digital camera. Credit: Andrew Broadhead

Standard image sensors, such as those already installed in practically every smartphone in use today, capture light intensity and color. Built on common, off-the-shelf CMOS technology, these cameras have grown smaller and more powerful over the years and now offer tens-of-megapixel resolution. But, so far, they have only seen in two dimensions, capturing flat images like a drawing.

Researchers at Stanford University have created a new approach that allows standard image sensors to see light in three dimensions. That is, these common cameras could soon be used to measure the distance to objects.

The engineering possibilities are dramatic. Measuring distance to objects with light is currently possible only with specialized and expensive lidar (short for "light detection and ranging") systems. If you've seen a self-driving car tooling around, you can spot it right away by the hump of technology on the roof. Much of that gear is the car's lidar crash-avoidance system, which uses lasers to determine the distance to surrounding objects.

Lidar is like radar, but with light instead of radio waves. By bouncing laser beams off objects and measuring the light that returns, it can tell how far away an object is, how fast it is traveling, whether it is moving closer or farther away and, most critically, whether the paths of two moving objects will intersect at some point in the future.
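To make the principle concrete, here is a minimal sketch of direct time-of-flight ranging, the basic idea behind lidar: the distance to a target is half the round-trip travel time of the light multiplied by the speed of light. This is a generic illustration, not the Stanford team's implementation.

```python
# A generic direct time-of-flight calculation (illustrative only, not the
# Stanford team's implementation): distance is half the round-trip travel
# time of the light multiplied by the speed of light.

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_round_trip_s: float) -> float:
    """Distance to a target given the light pulse's round-trip time."""
    return C * t_round_trip_s / 2.0

# A pulse returning after 200 nanoseconds implies a target roughly 30 m away.
print(distance_from_round_trip(200e-9))  # ~29.98
```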

"Existing lidar systems are large and heavy, but someday, if you want lidar capabilities in millions of autonomous drones or light robotic vehicles, you want them to be very small, very energy efficient and offer high performance," said Okan Atalar, a doctoral candidate in electrical engineering at Stanford and first author of the new paper in Nature Communications that describes this compact, energy-efficient device that can be used for lidar.

For engineers, the advance offers two intriguing opportunities. First, it could enable megapixel-resolution lidar, something not possible today. Higher resolution would allow lidar to identify targets at greater range; an autonomous car, for example, might distinguish a cyclist from a pedestrian from farther away, giving the car more time to avoid an accident. Second, any image sensor available today, including the billions already in smartphones, could capture rich 3D images with minimal additional hardware.

Changing the way machines see

One approach to adding 3D imaging to standard sensors is to add a light source (easily done) and a modulator (not so easily done) that turns the light on and off very quickly, millions of times every second. By measuring the variations in the returning light, engineers can calculate distance. Existing modulators can do this, but they require relatively large amounts of power. So large, in fact, that it makes them entirely impractical for everyday use.
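The scheme described here is essentially amplitude-modulated (indirect) time-of-flight: the light is switched at a known frequency, and the phase lag of that modulation in the returning light encodes distance. The sketch below shows the basic arithmetic; the numbers are illustrative examples, not values from the paper.

```python
import math

# Illustrative sketch of amplitude-modulated (indirect) time-of-flight
# ranging: the light source is switched on and off at a known frequency,
# and the phase lag of that modulation in the returning light encodes
# distance. Example numbers only, not values from the paper.

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_lag_rad: float, f_mod_hz: float) -> float:
    """Distance implied by the phase lag of the returned modulated light."""
    # The round trip delays the modulation by 2*pi*f_mod*(2*d/C); solve for d.
    return C * phase_lag_rad / (4.0 * math.pi * f_mod_hz)

def max_unambiguous_range(f_mod_hz: float) -> float:
    """Range beyond which the phase wraps around and distances alias."""
    return C / (2.0 * f_mod_hz)

# Example: a 10 MHz modulation and a quarter-cycle (pi/2) phase lag
# correspond to about 3.75 m, with an unambiguous range of about 15 m.
print(distance_from_phase(math.pi / 2, 10e6))
print(max_unambiguous_range(10e6))
```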

The solution, devised by the Stanford team in a collaboration between the Laboratory for Integrated Nano-Quantum Systems (LINQS) and ArbabianLab, relies on a phenomenon known as acoustic resonance. The team built a simple acoustic modulator using a thin wafer of lithium niobate, a transparent crystal prized for its electrical, acoustic and optical properties, coated with two transparent electrodes.

Crucially, lithium niobate is piezoelectric. That is, when electricity is applied through the electrodes, the crystal lattice at the heart of its atomic structure changes shape. It vibrates at very high, very predictable and very controllable frequencies. And, when it vibrates, lithium niobate strongly modulates light; with a few polarizers added, this new modulator effectively turns light on and off several million times a second.

"What's more, the geometry of the wafers and electrodes defines the frequency of the light modulation, so we can fine-tune the frequency," Atalar says. "Change the geometry and you change the frequency of modulation."
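As a rough illustration of how geometry sets the frequency, the sketch below uses the textbook thickness-mode resonance approximation f ≈ v/(2t); the acoustic velocity is an approximate literature value for lithium niobate, not a figure from the paper, and the formula is a simplification rather than the authors' device model.

```python
# Back-of-the-envelope sketch of how wafer geometry sets the modulation
# frequency, assuming the textbook thickness-mode resonance f ~ v / (2 * t).
# The acoustic velocity is a rough literature value for lithium niobate,
# not a figure taken from the paper.

V_ACOUSTIC_LINBO3 = 7_300.0  # approximate longitudinal acoustic velocity, m/s

def fundamental_resonance_hz(thickness_m: float) -> float:
    """Fundamental thickness-mode acoustic resonance of a thin wafer."""
    return V_ACOUSTIC_LINBO3 / (2.0 * thickness_m)

# A 0.5 mm wafer would resonate around 7 MHz; halving the thickness roughly
# doubles the modulation frequency.
print(fundamental_resonance_hz(0.5e-3))  # ~7.3e6
```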

In technical terms, the piezoelectric effect creates an acoustic wave through the crystal that rotates the polarization of light in desirable, tunable and usable ways. It is this key technical departure that enabled the team's success. A polarizing filter placed after the modulator then converts this rotation into intensity modulation, making the light brighter and darker, effectively turning it on and off millions of times a second.
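The conversion from polarization rotation to brightness follows Malus's law: the transmitted intensity varies as the cosine squared of the angle between the light's polarization and the polarizer axis. A minimal sketch, assuming an ideal polarizer:

```python
import math

# A minimal sketch, assuming an ideal polarizer, of how polarization rotation
# becomes intensity modulation: by Malus's law, transmitted intensity varies
# as the cosine squared of the angle between the light's polarization and the
# polarizer axis. Illustrative only, not the authors' device model.

def transmitted_intensity(i_in: float, rotation_rad: float) -> float:
    """Intensity after an ideal polarizer for a given polarization rotation."""
    return i_in * math.cos(rotation_rad) ** 2

# As the acoustic wave swings the rotation back and forth, the output light
# brightens and dims: an on/off intensity modulation at the drive frequency.
for angle_deg in (0, 45, 90):
    print(angle_deg, transmitted_intensity(1.0, math.radians(angle_deg)))
```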

"While there are other ways to turn the light on and off," Atalar says, "this acoustic approach is preferable because it is extremely energy efficient."

Practical results

Best of all, the design of the modulator is simple and integrates into a proposed system that uses off-the-shelf cameras, like those found in everyday cellphones and digital SLRs. Atalar and adviser Amin Arbabian, associate professor of electrical engineering and the project's senior author, think it could become the basis for a new type of compact, low-cost, energy-efficient lidar, a "standard CMOS lidar," as they call it, that could find its way into drones, extraterrestrial rovers and other applications.

The impact of the proposed modulator is huge; they say it has the potential to add the missing 3D dimension to any image sensor. To prove it, the team built a prototype lidar system on a lab bench that used a commercially available digital camera as the receptor. The authors report that their prototype captured megapixel-resolution depth maps while requiring only a small amount of power to operate the optical modulator.

Better yet, with additional refinements, Atalar says the team has since reduced the energy consumption by at least 10 times, well below the figure reported in the paper, and he believes a reduction of several hundred times more is within reach. If that happens, a future of small-scale lidar with standard image sensors, and of 3D smartphone cameras, could become a reality.

More information:
Okan Atalar et al, Longitudinal piezoelectric resonant photoelastic modulators for efficient intensity modulation at megahertz frequencies, Nature Communications (2022). DOI: 10.1038/s41467-022-29204-9

Provided by Stanford University

Citation: Engineering team develops approach to enable simple cameras to see in 3D (2022, March 28), retrieved 29 March 2022 from https://techxplore.com/news/2022-03-team-approach-enable-simple-cameras.html

This document is subject to copyright. No part may be reproduced without written permission, except for any fair use for the purpose of personal study or research. The content is provided for information purposes only.
