
The camera on the first iPhone in 2007 was only 2 megapixels. And it only had a rear camera; there wasn't even a selfie shooter out front. These days, you'll find multiple cameras on the front and back of phones, some with resolutions as high as 108 megapixels, like the main camera on the Samsung Galaxy S21 Ultra.
But while the sensor size and megapixel count of smartphone cameras have increased significantly over the past decade, not to mention the advances in computational photography software, the lenses that focus light onto those sensors remain fundamentally unchanged.
A new company called Metalenz, which is coming out of stealth mode today, aims to disrupt smartphone cameras with a single, flat lens system that uses a technology called optical metasurfaces. A camera built around this new lens technology can produce images of the same quality as traditional lenses, if not better, gather more light for brighter photos, and even enable new forms of sensing in phones, all while taking up less space.
A flat lens
How does it work? Well, first it's important to understand how phone camera lenses work today. The imaging system on the back of your smartphone can have multiple cameras (the latest iPhone 12 Pro has three on the back), but each camera contains multiple lenses, or lens elements, stacked on top of one another. The main camera on the aforementioned iPhone 12 Pro uses seven lens elements. A multi-lens design like the iPhone's is superior to a single-lens setup; as light passes through each successive element, the image becomes sharper and brighter.

“The optics in smartphones today usually consist of four to seven lens elements,” says Oliver Schindelbeck, innovation manager at optics manufacturer Zeiss, which is known for its high-quality lenses. “If you have a single lens element, the physics will give you aberrations like distortion or dispersion in the image.”
With more lenses, manufacturers can compensate for irregularities such as chromatic aberration (when colors appear at the edges of an image) and lens distortion (when straight lines in a photo appear curved). However, stacking multiple lens elements on top of each other requires more vertical space in the camera module. It’s one of the many reasons why the camera “bump” on smartphones has gotten bigger and bigger over the years.
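To make that trade-off concrete, here is a minimal sketch of the classic two-element fix for color fringing, the achromatic doublet. This is standard textbook optics rather than anything from the article, and the glass dispersion values (Abbe numbers) are assumed for illustration:

```python
# Achromatic doublet: split a desired lens power between two glasses with
# different dispersion (Abbe number V) so the chromatic focal shift cancels.
# The Abbe numbers below are typical crown/flint values, assumed for illustration.

def doublet_powers(total_power: float, v_crown: float, v_flint: float):
    """Return (crown, flint) element powers satisfying the achromatic
    condition  P_c / V_c + P_f / V_f = 0  with  P_c + P_f = total_power."""
    p_crown = total_power * v_crown / (v_crown - v_flint)
    p_flint = -total_power * v_flint / (v_crown - v_flint)
    return p_crown, p_flint

# Example: a 25 mm focal length lens (power = 1/f, in 1/mm)
total = 1 / 25.0
crown, flint = doublet_powers(total, v_crown=64.2, v_flint=33.8)
print(f"crown element: f = {1/crown:.1f} mm, flint element: f = {1/flint:.1f} mm")

# A single element has no second degree of freedom, so its focal length
# necessarily shifts with wavelength -- the color fringing described above.
```

The same logic repeats for distortion, field curvature, and the other aberrations: each extra element buys another degree of freedom, which is why conventional phone cameras keep stacking them.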
“The more lens elements you want to put in a camera, the more space it needs,” says Schindelbeck. Other reasons for the bump’s size include larger image sensors and more zoom cameras, which require extra space.
Phone makers like Apple have increased the number of lens elements over time, and while some, like Samsung, now use folded optics to build “periscope” lenses with greater zoom capabilities, companies have generally stuck with the tried-and-true stacked lens element system.
“The optics got more advanced, you added more lens elements, you created strongly aspherical elements to achieve the necessary reduction in space, but there has been no revolution in this area in the last 10 years,” says Schindelbeck.
Introducing Metalenz
This is where Metalenz comes in. Rather than using plastic and glass lens elements stacked over an image sensor, Metalenz’s design uses a single lens built on a glass wafer measuring between 1×1 and 3×3 millimeters. Look very closely under a microscope and you will see nanostructures measuring one-thousandth the width of a human hair. Those nanostructures bend light rays in a way that corrects many of the shortcomings of single-lens camera systems.
The core technology grew out of a decade of research conducted while co-founder and CEO Robert Devlin was completing his Ph.D. at Harvard University with acclaimed physicist and Metalenz co-founder Federico Capasso. The company was spun out of the research group in 2017.
Light passes through these patterned nanostructures, which at a microscopic level resemble millions of circles of different diameters. “Just as a curved lens speeds up and slows down light to bend it, each one of these allows us to do the same thing, so we can bend and shape light just by changing the diameters of these circles,” Devlin says.
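As a rough illustration of the principle (a simplified textbook model, not Metalenz’s proprietary design; the wavelength and focal length below are assumed), a flat metalens can be described by the phase shift each nanostructure must impart at its position on the wafer in order to focus light:

```python
import numpy as np

# Idealized flat metalens: to focus a plane wave at focal length f, the surface
# must impart phase  phi(r) = -(2*pi/lam) * (sqrt(r**2 + f**2) - f)
# at radial position r. Each nanostructure's diameter is chosen to realize the
# required phase (modulo 2*pi). Values here are assumptions for illustration.

lam = 0.94e-6   # 940 nm, a typical near-infrared 3D-sensing wavelength (assumed)
f = 2.0e-3      # 2 mm focal length (assumed)
r = np.linspace(0, 1.0e-3, 5)   # sample radial positions across a ~2 mm aperture

phase = -(2 * np.pi / lam) * (np.sqrt(r**2 + f**2) - f)
phase_wrapped = np.mod(phase, 2 * np.pi)   # phase each nanostructure must supply

for ri, p in zip(r, phase_wrapped):
    print(f"r = {ri*1e3:.2f} mm -> required phase = {p:.2f} rad")

# In a real device, a lookup table maps each required phase value to a
# nanostructure diameter; varying those diameters across the wafer is what
# "bends and shapes" the light, as Devlin describes.
```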

The resulting image quality is as sharp as what you’d get from a multi-lens system, and the nanostructures do the job of reducing or eliminating many of the image-degrading aberrations common to traditional cameras. And the design doesn’t just save space. Devlin says a Metalenz camera can deliver more light to the image sensor, allowing for brighter and sharper images than what you’d get with traditional lens elements.
Another advantage? The company has partnered with two semiconductor leaders (which can currently produce a million Metalenz “chips” per day), meaning the optics are made in the same foundries that produce chips for consumer and industrial devices, a significant step toward simplifying the supply chain.
New forms of sensing
Metalenz will go into mass production toward the end of the year. The first application will be to serve as the lens system of a 3D sensor in a smartphone. (The company has not disclosed the phone maker’s name.)
Devlin says current 3D sensors, such as Apple’s TrueDepth camera for Face ID, actively illuminate a scene with lasers to scan faces, which can drain a phone’s battery. Because Metalenz can bring more light to the image sensor, he claims it can help save power.
More good news? If it’s a 3D sensor on the front of a phone for facial authentication, Devlin says the Metalenz system could eliminate the need for a bulky camera notch cutting into the screen, like the one on current iPhones. The space saved by doing away with traditional lens elements could allow more phone makers to place sensors and cameras under a device’s glass screen, something we’ll see more of this year.
Devlin says the applications for Metalenz extend beyond smartphones. The technology can be used in everything from healthcare instruments to augmented- and virtual-reality cameras to the cameras in cars.
Take spectroscopy as an example. A spectrometer is used to precisely detect different wavelengths of light and is often used in medical tests to identify certain molecules in the blood. Because metasurfaces allow you to “collapse a tabletop of optics into a single surface,” Devlin argues that with Metalenz you could put the right sensors in a smartphone to do the same kind of work.
“You can look at the chemical signature of fruit with a spectrometer and see if it’s ripe,” Devlin says. “It’s really not just an image anymore; you’re actually accessing all sorts of different forms of sensing, seeing and communicating with the world, bringing a whole new set of information into the cell phone.”
This story originally appeared on wired.com.