At Tuesday’s unveiling of the iPhone 12, Apple touted the capabilities of its new lidar sensor. Apple says lidar will enhance the iPhone’s camera by allowing more rapid focus, especially in low-light situations. And it may enable the creation of a new generation of sophisticated augmented reality apps.
Tuesday’s presentation offered little detail about how the iPhone’s lidar actually works, but this isn’t Apple’s first device with lidar. Apple first introduced the technology with the refreshed iPad in March. And while no one has done a teardown of the iPhone 12 yet, we can learn a lot from recent iPad teardowns.
Lidar works by sending out laser light and measuring how long it takes to bounce back. Because light travels at a constant speed, the round-trip time can be translated into a precise distance estimate. Repeat this process across a two-dimensional grid and the result is a three-dimensional “point cloud” showing the location of objects around a room, street, or other location.
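The distance math here is simple; the hard engineering is measuring the tiny delays involved. A minimal sketch of the time-of-flight calculation (the timing value below is an illustrative assumption, not a spec from any vendor):

```python
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def round_trip_to_distance(round_trip_seconds: float) -> float:
    """Convert a laser pulse's round-trip time into a one-way distance."""
    # The pulse travels to the object and back, so halve the total path.
    return C * round_trip_seconds / 2.0

# A return after ~33 nanoseconds corresponds to an object about 5 meters away,
# so even a short-range sensor must resolve delays of tens of nanoseconds.
print(round_trip_to_distance(33.4e-9))  # ≈ 5.0
```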
A June analysis by System Plus Consulting found that the iPad’s lidar sends out light using an array of vertical cavity surface-emitting lasers (VCSELs) made by Lumentum. It then detects the return flash using an array of sensors called single-photon avalanche diodes (SPADs) supplied by Sony. I’ll explain what these are in the next section.
I found Apple’s announcement particularly interesting because I’ve been working on a story about companies that are using the same combination of technologies—VCSEL lasers and SPAD detectors—to build much more powerful lidar for the automotive market. One of the big selling points of VCSELs and SPADs is that they can be created using conventional semiconductor fabrication techniques. As a result, they benefit from the huge economies of scale in the semiconductor industry. As VCSEL-based sensors become more common, they are likely to steadily get cheaper and better.
Two of the companies working on high-end VCSEL-based lidar—Ouster and Ibeo—have already gotten more traction than most companies in the crowded lidar business. Apple’s decision to adopt the technology—and the possibility that other smartphone vendors could follow Apple’s lead—will provide them with a nice tailwind in the coming years.
VCSELs helped Apple make radically simpler lidar
The first three-dimensional lidar sensor was introduced by Velodyne more than a decade ago. The spinning unit cost around $75,000 and was significantly larger than a smartphone. Apple needed to make lidar sensors radically cheaper and smaller in order to put one in each iPhone, and VCSELs helped the company do it.
What’s a VCSEL? If you’re building a laser using conventional semiconductor fabrication techniques, you have two basic options. You can make a laser that transmits light out the side of the wafer (known as an edge-emitting laser) or from the top (a vertical cavity surface emitting laser, or VCSEL).
Traditionally, edge-emitting lasers have been more powerful. VCSELs have been used for decades in everything from optical mice to optical networking gear, but they were long considered unsuitable for high-end applications that require a lot of light. As the technology has matured, however, VCSELs have become more powerful.
Making an edge-emitting laser typically requires cutting the wafer to expose the emitter. This adds cost and complexity to the manufacturing process and limits the number of lasers that can be made on one wafer. By contrast, VCSELs emit light perpendicular to the wafer, so they don’t need to be individually cut or packaged. That means a single chip can hold dozens, hundreds, or even thousands of VCSELs. In principle, a chip with thousands of VCSELs shouldn’t cost more than a few dollars when produced at large scale.
The story is similar with single-photon avalanche diodes. As the name implies, these detectors are sensitive enough to register a single photon. That sensitivity comes at a cost: SPADs suffer from a lot of noise, so sophisticated post-processing is needed to use them for an application like lidar. But one big advantage of SPADs is that—like VCSELs—they can be fabricated using conventional semiconductor techniques, and thousands of them can be packed onto a single chip.
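One common way to tame that noise, used in time-correlated single-photon counting, is to fire many pulses, bin the photon arrival times into a histogram, and take the peak bin as the true return. Whether any particular vendor's pipeline works this way is not public; the sketch below is a toy illustration with made-up numbers:

```python
import random

random.seed(42)

NUM_BINS = 100       # histogram covers a 100-nanosecond time window, 1 ns per bin
TRUE_RETURN_NS = 33  # simulated round-trip time of the real laser return

def measure_once() -> int:
    """Simulate one SPAD detection: often noise, sometimes the real return."""
    if random.random() < 0.3:  # 30% chance we catch the actual laser return
        return TRUE_RETURN_NS
    return random.randrange(NUM_BINS)  # otherwise an ambient-light photon

def estimate_return(num_pulses: int = 1000) -> int:
    """Accumulate detections over many pulses and pick the most common bin."""
    histogram = [0] * NUM_BINS
    for _ in range(num_pulses):
        histogram[measure_once()] += 1
    return max(range(NUM_BINS), key=lambda i: histogram[i])

print(estimate_return())  # the histogram peak recovers the return despite noise
```

Individual detections are mostly noise, but the noise spreads evenly across the time window while real returns pile up in one bin.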
The combination of VCSELs and SPADs enables a dramatic simplification of conventional lidar designs. Velodyne’s original three-dimensional lidar mounted 64 individually packaged lasers in a column on a spinning gimbal. Each laser had a matching detector. The complexity of this design and the need to precisely align each laser with its corresponding detector was one reason Velodyne’s early lidar units were so expensive.
More recently, a number of companies have experimented with using small mirrors to “steer” a laser beam in a scanning pattern. This design requires only a single laser instead of 64. But it still involves at least one moving part.
By contrast, Apple, Ouster, and Ibeo are building lidar sensors with no moving parts at all. With hundreds or thousands of lasers on a chip, VCSEL-based lidars can have a dedicated laser for each point in the lidar’s field of view. And because all of these lasers come pre-packaged on one chip, assembly is much simpler than for Velodyne’s classic spinning design.
Recent iPhones already had another 3-D sensor, the TrueDepth camera, which enables Apple’s FaceID feature. It also uses an array of VCSELs, reportedly supplied by Lumentum. TrueDepth works by projecting a grid of more than 30,000 dots onto a subject’s face and then estimating the face’s three-dimensional shape from the way the grid pattern is deformed.
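This dot-deformation approach is essentially structured-light triangulation: a dot projected from one vantage point shifts sideways in the camera's view by an amount that depends on depth. A simplified pinhole-camera sketch (all numbers are illustrative assumptions, not Apple's actual parameters):

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Structured-light depth estimate: closer objects shift the dot more.

    focal_px: camera focal length in pixels
    baseline_m: distance between projector and camera in meters
    disparity_px: how far the dot shifted from its reference position
    """
    return focal_px * baseline_m / disparity_px

# e.g. a 600-pixel focal length, a 2.5 cm projector-camera baseline, and a
# dot shifted by 30 pixels puts the surface about half a meter away.
print(depth_from_disparity(600, 0.025, 30))  # 0.5
```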
The iPad’s lidar sensor projects far fewer laser dots than the TrueDepth camera. An iFixit video made with an infrared camera showed the lidar projecting a grid of only a few hundred dots. But while the TrueDepth system infers depth from the way its pattern deforms on a subject’s face, the iPad’s lidar measures distance directly, timing how long light takes to bounce off an object and return to the sensor. This direct approach likely yields both more precise depth measurements and longer range.
More powerful lidar also uses VCSELs and SPADs
The performance of Apple’s lidar is far behind that of high-end sensors sold by specialized lidar companies. Velodyne, the company that invented three-dimensional lidar, touts a range of more than 200 meters for its most powerful sensor, while Apple’s has a range of around five meters.
Other VCSEL-based lidars are significantly more powerful than Apple’s, too. Ouster’s most powerful VCSEL-based lidar, for example, boasts a range around 100 meters for detecting objects with 10-percent reflectivity.
Ouster’s current sensors are all Velodyne-style spinning units. Each has 16 to 128 VCSELs in a row on a single chip, which is then mounted vertically on a spinning gimbal like Velodyne’s units. The simplicity of this chip-based design has allowed Ouster to undercut Velodyne on price and emerge as one of its biggest rivals. But Ouster’s spinning lidar sensors still cost thousands of dollars—too expensive for mainstream cars, to say nothing of smartphones.
Last week, Ouster announced plans to ship a new solid-state lidar with no moving parts. Instead of arranging 16 to 128 lasers in a row as in Ouster’s current lidar, Ouster’s new unit will have more than 20,000 VCSELs arranged in a two-dimensional grid.
Ibeo is pursuing a similar strategy and may be ahead of Ouster. Ibeo designed the very first lidar ever shipped in a mass-market car—the Audi A8. That lidar was primitive, with only four lines of vertical resolution. But Ibeo is now developing a new model called ibeoNext that will have a laser grid of 128 by 80 pixels—roughly half the resolution of Ouster’s planned sensor but significantly larger than Ibeo’s past offerings. Ibeo says its sensor will have a 150-meter range for objects with 10-percent reflectivity.
A final contender that’s worth mentioning here is Sense Photonics, which we covered back in January. Like the other companies we’ve discussed, Sense is using VCSELs and SPADs for its lidar. However, Sense is using a technique called micro-transfer printing to spread its lasers out. This allows the lasers to use more power without running into heat and eye-safety problems. So far, Sense’s lidars have not had long range, but Sense CEO Shauna McIntyre told Ars the company is aiming for a 200-meter range for a forthcoming sensor that it will announce in early 2021.
Lidar is about to invade the automotive market
Ibeo, Sense, and Ouster are all rolling out new, low-cost designs because they expect an explosion of demand from the automotive industry. Lidar sensors could dramatically improve vehicles’ advanced driver assistance systems (ADAS).
For example, many people see Tesla as having one of the industry’s most advanced ADAS systems. But the company has a persistent problem with its vehicles crashing into stationary objects—occasionally with fatal results. Lidar is better than cameras or radar at detecting stationary objects, so adding lidar to cars could prevent many of these crashes while making ADAS systems more convenient for drivers.
Until now, lidar was considered too expensive for the automotive market, but that has started to change, with multiple companies promising lidar sensors that cost less than $1,000 in the next few years.
Ouster is aiming to have its ES2 sensor ready for mass automotive production in 2024. The company says it will initially cost $600 in volume, with the price falling to $100 in subsequent years.
Ibeo hasn’t announced a price for the ibeoNext, but the company says it has already scored a deal with Great Wall Motors, a major automaker in China, to begin volume manufacturing in 2022.
Companies with non-VCSEL lidar designs are rushing into this market as well. One of the most prominent is Luminar, which announced a deal with Volvo in May. Volvo is aiming to have cars with Luminar lidar available in 2022.
These designs have different strengths and weaknesses. So far, Luminar’s lidar has boasted longer range—as much as 250 meters. This is possible because Luminar uses lasers at a wavelength of 1550nm—far outside the visible light range. The fluid in the human eye is opaque to 1550nm light, which means that Luminar’s lidar can use a lot more laser power without creating an eye safety hazard. Luminar’s lidar also offers a wider field of view than Ouster’s.
The biggest question for Luminar is whether it can hit its $1,000 price target. When I interviewed Luminar CEO Austin Russell two years ago, he said that Luminar would need to “get down to low single-digit thousands” to reach a mass market. I assumed this meant that Luminar’s lidar cost more than “low single-digit thousands” at the time. But Luminar now says it’s on track to get below $1,000 in the next few years.
By contrast, Ouster and Ibeo shouldn’t have too much trouble making their lidars cheap. The big challenge is likely to be achieving the 200-meter range that’s generally considered necessary for autonomous operation at highway speeds.
“VCSELs are not a very bright laser compared to a traditional lidar,” Ouster CEO Angus Pacala told me in 2018. “If you make a physical model and you plug a SPAD array and a VCSEL array, you get really poor performance out.” However, Pacala said, Ouster has come up with “some fundamental IP on many different levels” to make the combination work. Pacala said that included “exceptional” suppression of out-of-band light and “putting a huge amount of signal processing directly next to the SPADs” to help distinguish returned laser flashes from ambient noise.
So the big challenge that Ouster, Ibeo, and Sense will face in the next couple of years is to push the performance of the VCSEL and SPAD combination enough to achieve the same 200-meter range touted by other lidars. If they can do it, then the low cost and simplicity of semiconductor chips might give them a decisive advantage. If they can’t, they might be relegated to a lower tier of the market.