LiDAR, the new laser eye for mobile phones

The use of lasers is widespread in industrial or security applications and the technology is also central to the development of the autonomous car. But the integration of a laser sensor (LiDAR) in smartphones has the potential to be a momentous step forward, both for the photographic capabilities of mobile devices and for a new (and perhaps definitive?) breakthrough for augmented reality.

The incorporation of laser sensors in mobiles improves the performance of cameras and opens the field to mass-market augmented reality. Credit: Apple.


MARÍA GÓMEZ BRAVO | Tungsteno

The continuous evolution of mobile phones has brought a connectivity boost in 2020 (driven by the arrival of 5G) and a growing number of increasingly sophisticated built-in cameras. While manufacturers' most significant efforts have focused on these features, as well as on battery life and charging speed, a trend has emerged to incorporate sensors so that these devices can better understand the world around them, adding an old technological acquaintance to new handsets.

The megapixel and optical zoom wars, which digital cameras went through in the previous decade, have now returned with a vengeance to the field of smartphones, which aspire to dethrone those cameras definitively. In this context, the manufacturers of the Android family (Xiaomi, Huawei and Samsung, primarily) have unveiled significant new features this year, while their rival Apple doesn't seem to have taken major steps in this race, preferring to capitalise on the area it leads: the performance of its devices' brains, the processors, and their integration with the operating system, both designed by the manufacturer from Cupertino (USA). In 2020, however, Apple's latest iterations have added a LiDAR sensor, currently reserved for its top-of-the-range devices (the iPad Pro and iPhone Pro).

Both LiDAR and the ToF systems that preceded it gather information from the outside world and process it with artificial intelligence to obtain the best image. Credit: Sony.

A legacy technology to change the mobile landscape

The use of this technology, however, is not new. A LiDAR (Light Detection and Ranging) scanner establishes the distance to an object by emitting a beam of pulsed laser light and measuring the time it takes for the light to bounce off the object and return to the sensor. Unlike radar, which uses radio waves to detect objects, LiDAR collects information as a 360° map, generating a cloud of points with which it recreates, with a high degree of accuracy, a 3D model of the physical space surrounding the sensor.
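
The underlying calculation is simple: the pulse travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. Below is a minimal sketch of that time-of-flight arithmetic (the function name and sample value are illustrative, not part of any real sensor API):

```swift
import Foundation

/// Speed of light in a vacuum, in metres per second.
let speedOfLight = 299_792_458.0

/// Estimate the distance to an object from the round-trip time
/// of a laser pulse (time of flight), in seconds.
/// The pulse travels out and back, so we halve the total path.
func lidarDistance(roundTripTime: TimeInterval) -> Double {
    return speedOfLight * roundTripTime / 2.0
}

// A pulse that returns after ~33 nanoseconds corresponds to ~5 metres.
let distance = lidarDistance(roundTripTime: 33e-9)
print(String(format: "Estimated distance: %.2f m", distance))
```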

This system is an old acquaintance of industrial applications and advanced security systems, but also of aviation, where it is used to map the Earth's surface, and even of archaeology, where it reveals ruins hidden by vegetation or buried underground. The scanner has also become an integral part of autonomous vehicles, giving them the "vision" to detect obstacles and the distance at which they are located, so that the vehicle can avoid them and adjust its course. This is the device chosen, for example, by Google for its Waymo cars, but also by other manufacturers such as General Motors, Ford, Hyundai and Volkswagen for the driving assistants in their conventional cars.

This type of sensor is a step towards the goal of improving how devices interact with their environment. In fact, what all these applications have in common is a balance between data collection and analysis, something that can be very useful when applied to tablets and smartphones. In other words, a LiDAR sensor enables these devices to go one step further in recognising the context that surrounds them, for example when an image is taken, or in interpreting that context and enriching it with extra information, as is the case with augmented reality (AR).

Before Apple incorporated LiDAR into its mobile phones, other manufacturers were already using different depth-sensor systems. An example is the ToF (Time of Flight) sensor, which uses infrared beams of light to map a scene in 3D, measuring the depth of the whole scene and of the objects within it without the need for several separately focused shots; the resulting information is then processed with artificial intelligence. These sensors have been used in our devices for some years now, enabling features such as facial recognition or photography modes like portrait and night shots, avoiding blurred, coarse-grained and low-definition images. This is the system used by most Android models.
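
To give an idea of how a depth map feeds a feature like portrait mode, here is a simplified sketch: pixels whose measured depth exceeds a threshold are treated as background, which a camera pipeline could then blur. The values and threshold are invented for illustration; real pipelines combine depth with machine-learning segmentation:

```swift
// Simplified sketch: classify each pixel of a depth map (in metres)
// as subject or background using a fixed depth threshold.
// Real portrait pipelines are far more sophisticated than this.
let depthMap: [[Double]] = [
    [0.9, 1.0, 3.2],
    [0.8, 1.1, 3.5],
    [0.9, 3.4, 3.6]
]
let subjectThreshold = 2.0  // metres; anything farther is "background"

let backgroundMask = depthMap.map { row in
    row.map { depth in depth > subjectThreshold }
}
// `true` marks pixels a camera app might blur for a bokeh effect.
print(backgroundMask)
```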

The LiDAR approach collects more information than ToF sensors: instead of a single light pulse to map an entire space, the scanner emits many points of light to scan its surroundings, producing a faster, higher-resolution and more accurate result. Applied to photography, this depth information allows several captures taken with different exposures to be analysed and interpreted, merging the best of each into a new image almost instantaneously.
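
A hedged sketch of how such a scan becomes a 3D reconstruction: each depth sample is back-projected through a simple pinhole camera model into a point in space. The intrinsics used here (focal lengths fx, fy and principal point cx, cy) are made-up values for illustration, not real device parameters:

```swift
import Foundation

/// A point in the 3D space reconstructed from the depth scan.
struct Point3D { let x, y, z: Double }

/// Back-project a depth map into a point cloud using a pinhole
/// camera model: X = (u - cx) * z / fx, Y = (v - cy) * z / fy.
func pointCloud(from depth: [[Double]],
                fx: Double, fy: Double,
                cx: Double, cy: Double) -> [Point3D] {
    var points: [Point3D] = []
    for (v, row) in depth.enumerated() {
        // Skip pixels with no laser return (depth 0).
        for (u, z) in row.enumerated() where z > 0 {
            let x = (Double(u) - cx) * z / fx
            let y = (Double(v) - cy) * z / fy
            points.append(Point3D(x: x, y: y, z: z))
        }
    }
    return points
}

let cloud = pointCloud(from: [[1.0, 1.2], [1.1, 0.0]],
                       fx: 500, fy: 500, cx: 1.0, cy: 1.0)
print("Reconstructed \(cloud.count) points")
```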

The arrival of LiDAR ushers in a new era in which devices are designed to better understand their environment. Credit: Wikimedia Commons.

The road to mass-market augmented reality

The step taken by Apple launches a new race in the mobile device market, focused on computational photography and augmented reality, and encourages other manufacturers to work on new versions of ToF sensors for their handsets. This is the case with Samsung, for example, which has already unveiled its own ISOCELL Vizion 33D sensor, aimed especially at improving the functionality of its cameras. It also opens the door to new augmented reality projects, in which players such as Google now have a path to follow. The tech giant had previously used ToF sensors to recreate a mobile augmented reality experience with its Tango project, but ended up replacing it with the ARCore platform, which delivered that experience with existing cameras and applications.

These new sensors open up a huge field in the development of apps to incorporate external elements into a map or to scan spaces, something particularly useful for interior design, shopping, tourism or, of course, the video game arena. This technology, combined with artificial intelligence, provides smartphone users with "instant" augmented reality—without the need for AR glasses—which saves costs and simplifies access to these representations of reality.
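
On Apple's platforms, for instance, an app can opt into LiDAR-backed scene understanding through ARKit. The following is a minimal sketch of what that setup might look like (it assumes an iOS app that already has a RealityKit ARView; availability checks are simplified):

```swift
import ARKit
import RealityKit

// Sketch: enable LiDAR-backed scene reconstruction in an ARKit session.
// `arView` is assumed to be an existing RealityKit ARView in the app.
func startLidarSession(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Only LiDAR-equipped devices support mesh reconstruction.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    // Per-frame depth data from the LiDAR scanner, when available.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    arView.session.run(configuration)
}
```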

How will LiDAR and ToF sensors change the way we use smartphones? While at the moment these innovations are reserved for manufacturers’ higher-end devices, it does seem clear that the focus will likely remain on the cameras. The next challenge is to get ever closer to the precision of the human eye under any lighting conditions. Our phones will take pictures that we will perceive as more real, because they will understand the environment better and better; and beyond that, such understanding will open a window to new perceptions of what is real, those of augmented reality.

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

