Insights from industry

LiDAR Sensors & Sensors Expo: Louay Eldada

Louay Eldada
CEO & Co-Founder of Quanergy

In this interview, AZoSensors spoke to Louay Eldada about Quanergy, LiDAR sensors and Sensors Expo.

What are LiDAR Sensors - what are their applications, what are they used for and how do they work?

LiDAR sensors are the only 3D sensors. They do a direct measurement of the range or distance of objects, as opposed to radar, which is 1D, and a camera, which is 2D. People use camera detection to analyze 2D pixel arrays and estimate distance based on what they're seeing in 2D - how far away an object potentially is, how large it might be and what it might be. With 2D video, eventually the whole thing disappears to one pixel, so the resolution degrades rapidly with distance.

LiDAR performs a direct time-of-flight measurement – you are physically bouncing laser pulses off objects and measuring the round-trip time. It's a direct measurement with an active device, so we control the signal that we send, as opposed to cameras, which are passive and rely on ambient lighting.
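As a rough illustration of the time-of-flight principle described here (generic physics, not a Quanergy-specific implementation), the range follows directly from the measured round-trip time of a laser pulse:

```python
# Minimal time-of-flight range calculation - an illustrative sketch only.
# Assumes the pulse travels at the speed of light and the measured time
# covers the full round trip (out to the target and back).

C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Return target distance in metres for a measured round-trip time."""
    return C * t_seconds / 2.0  # divide by 2: the pulse travels out and back

# Example: a return after roughly 667 nanoseconds corresponds to ~100 m.
print(range_from_round_trip(667e-9))  # ≈ 100.0
```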

The resolution stays very high with distance, because when you have a well collimated laser beam you maintain high resolution at distance, as opposed to video where, as I said, the whole thing eventually disappears to one pixel because of perspective. You could be standing in complete darkness - under a bridge on the highway at night because your car broke down - and I could still see every detail of how you're moving, behaving and waving at me; I could detect really small details in the hand you are raising.

I can see everything in your hand even at 100 meters away with a LiDAR. With a camera you see absolutely nothing when you're in the dark, and even when you have ambient light, the whole human body at 100 meters would be essentially one or two pixels - you cannot even tell that it's a human. LiDAR is by far the most capable spatial sensor.

What kind of applications does LiDAR have at the moment?

We have identified and are going after over 50 applications. As you can imagine any application where you need to be aware of your environment can take advantage of LiDAR.

Autonomous vehicles of course are very popular, it's a hot topic and that's an obvious application, but the applications far exceed that. We're using LiDAR in security and surveillance, we're using it in industrial automation for mining, agriculture, factory automation, warehouse automation, delivery drones and so on and so forth. We also use it for 3D mapping and surveying, which includes surveying infrastructure for structural integrity.

Given the resolution that you're detecting with, using a vehicle or a drone that flies around a bridge or power line you could detect hairline cracks starting to develop, or detect sagging in power lines that might be about to reach trees, which could cause interruptions in power, and so on. Those are very specific examples based on what some of our customers are already doing with our LiDAR.

Security and surveillance includes smart homes - it could be just lifestyle applications, or something used purely for personal benefit or comfort. You walk into your house, you are recognized via the alarm system, and it automatically turns off because the system recognizes who you are. It sets your preferred settings for lighting, temperature, open or closed curtains and so on. Someone might be on the couch and it looks like they're taking a nap, so it turns off the TV - you can imagine that you have this virtual butler at home that reacts to everything you do without you having to worry about it or even ask anything. It's observing every move and it's private, in the sense that you don't have to save the data when you're monitoring the scene - it's just a smart home.

When we talk about privacy in public spaces - in smart homes, smart buildings, smart cities, smart nations and smart borders - we can use our LiDAR to control cameras.

As long as our LiDAR is the only sensor monitoring, you have privacy. We can see that there are human beings - you could be detecting 10,000 or more human beings in a crowded area at sporting events, in a large urban area and so on - and we can track each person one by one, each with a unique ID number.

We maintain privacy because we only know it's a human being, we don't know who they are. But the moment someone does something suspicious they lose their right to privacy, so the live feed from a PTZ (Pan Tilt Zoom) camera pans over and zooms in on the individual; you identify them based on their facial features and you can track them.

They may have been detected as, for example, leaving a package behind on the sidewalk, dropping a package in the garbage bin, or maybe walking back and forth by the entrance of a government building in a suspicious manner that's not normal. Once you see that kind of behavior the algorithm picks up on it and you can track them with the LiDAR and the video that's controlled by the LiDAR. I could talk all day about applications, but that gives you an idea.

How do you see the sensors developing in the future? How do you think they'll progress?

At Quanergy we make solid state LiDAR. Solid state LiDAR has no moving parts on any scale - on a macro scale or a micro scale. It's based on an optical phased array (OPA), which is the optical equivalent of phased array radar. Phased array radar is the end point of several decades of radar research. It has multiple antenna elements, and by controlling the phase of the signal emitted by each antenna you control how the beams interfere, generating a beam that's steered in one direction or the other based on the phase values, without any moving parts. We do the same with optical phased arrays.

Whereas in phased array radar you might have 8, 16 or 32 antenna elements, in optical phased arrays we have as many as 1,024 optical antenna elements, and the dimensions of the antenna elements are about 20,000 times smaller than in radar - everything scales with the wavelength, going from radar wavelengths to the infrared. We steer beams by controlling the phase values of the optical beams emitted by the 1,024 optical antenna elements. An optical antenna element is simply a particular location on a silicon chip where light exits into free space.

By setting the phase relationships between those 1,024 elements you control the direction of the beam and the spot size. You can zoom in, zoom out, and hop around and across the scene in a random-access manner. Because it's not mechanical - you're not oscillating a mirror or spinning a lens - it's easy to go from pointing in one direction to any other direction simply by changing all the phase values.
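To make the beam-steering idea concrete, here is a minimal sketch using the standard uniform-linear-array relation from phased-array theory. It is generic textbook math, not Quanergy's design; the element pitch below is an assumption chosen only for illustration (1,024 elements and 905 nm are the figures mentioned in this interview):

```python
import math

# Textbook phased-array beam steering: to point a uniform linear array of
# emitters at angle theta off the array normal, element n is given a phase
# offset of phi_n = -2*pi*n*d*sin(theta)/lambda. Illustrative values only.

WAVELENGTH_M = 905e-9     # operating wavelength mentioned in the interview
ELEMENT_PITCH_M = 2e-6    # assumed spacing between optical antenna elements
NUM_ELEMENTS = 1024       # element count mentioned in the interview

def steering_phases(theta_deg: float) -> list[float]:
    """Per-element phase offsets (radians, wrapped to [0, 2*pi)) that steer
    the emitted beam to theta_deg."""
    theta = math.radians(theta_deg)
    k = 2 * math.pi / WAVELENGTH_M  # free-space wavenumber
    return [(-k * n * ELEMENT_PITCH_M * math.sin(theta)) % (2 * math.pi)
            for n in range(NUM_ELEMENTS)]

# Re-pointing the beam is just recomputing 1,024 phase values - no moving parts.
phases = steering_phases(10.0)
print(phases[:4])
```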

That's important because it drives reliability. When you have no moving parts, you go from a few thousand hours of operation in terms of the mean time between failures to well over 100,000 hours of operation between failures, so reliability is significantly higher and the life span is significantly longer. Also, the price goes down. Everything that we do is based on silicon and we don't use any exotic semiconductors or materials. We have scale in production, so you can guarantee low cost and the highest possible reliability. We are able to reduce the size, reduce the price and increase the reliability of the product, and we have a roadmap in place to keep improving those key parameters.

Is that what makes Quanergy’s products so unique?

We're the only ones who have a solid state LiDAR - a true solid state - that meets performance, reliability and cost requirements. We are not aware of any other LiDAR that can do that, and our customers can confirm that.

We made the concept of solid state LiDAR popular. We started this whole trend in the industry, and our customers started to ask other LiDAR suppliers - they said "We like what Quanergy is doing, can you make solid state LiDAR?" So many people tried to quickly put together solid state LiDARs. They bought MEMS micro-mirrors off the shelf, they bought the detectors off the shelf, and they decided to redefine 'solid state' from meaning truly solid state (having no moving parts) to meaning something made of semiconductors. So, when you have MEMS micro-mirrors etched in silicon, although you have mechanical motion, they decided to really stretch the definition so they can say that they have a solid state solution.

If you search for solid state LiDAR, you might see twenty companies supplying it. In reality we are one of only three, and the other two companies do not use optical phased arrays, which give you the performance, reliability and cost benefits - they take the flash approach.

Flash LiDAR is also solid state, but with the optical phased array we have a small spot, which provides very high resolution over distance, and a very fast actuation mechanism that can jump around half a million times per second. That means we can cover the scene by jumping around very rapidly with a small spot, and that allows you to keep the power level low.
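As a back-of-the-envelope illustration of what that half-million-points-per-second figure buys, here is the point-budget arithmetic; the frame rate and scan-grid shape below are assumptions for illustration, not product specifications:

```python
# Rough point-budget arithmetic for a scanned (non-flash) LiDAR.
# The 500,000 points/second rate comes from the interview; the frame rate
# and scan-grid shape are illustrative assumptions only.

POINTS_PER_SECOND = 500_000
FRAME_RATE_HZ = 10                                     # assumed frames per second
points_per_frame = POINTS_PER_SECOND // FRAME_RATE_HZ  # 50,000 points per frame

# For example, an assumed 250 x 200 grid of beam positions fits in one frame:
print(points_per_frame, 250 * 200 <= points_per_frame)  # 50000 True
```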

Our LiDAR is eye-safe while operating at a wavelength below 1,100 nanometers. Silicon detects light up to 1,100 nanometers, and below 1,100 nanometers, which is where we operate (we are at 905 nanometers), it's possible to get very low-cost lasers and you can detect with silicon. This is very important because when the detector has to use exotic materials rather than silicon, the cost can increase by up to 1,000 times.
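The 1,100-nanometer silicon cut-off mentioned above follows from silicon's bandgap of roughly 1.1 eV. A quick sanity check (standard physics, not specific to any product):

```python
# Why silicon detectors stop responding near 1,100 nm: a photon must carry
# at least the bandgap energy to be absorbed. Standard constants; this is a
# generic check, not tied to any particular LiDAR product.

H_C_EV_NM = 1239.84            # h*c expressed in eV·nm
SILICON_BANDGAP_EV = 1.12      # approximate silicon bandgap at room temperature

cutoff_nm = H_C_EV_NM / SILICON_BANDGAP_EV      # longest wavelength silicon absorbs
photon_energy_905_ev = H_C_EV_NM / 905.0        # photon energy at 905 nm

print(round(cutoff_nm))                # ~1107 nm
print(round(photon_energy_905_ev, 2))  # ~1.37 eV, comfortably above the bandgap
```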

The other two solid state LiDAR companies, which use the flash LiDAR approach, illuminate the entire scene with one big, intense beam of light. The problem with that is that they have to go to 1,550 nanometers, which is the safest eye-safe region when you have that high an intensity, because for a LiDAR to be eye-safe you have to be able to get very close to it - essentially stick your eyeball to the LiDAR - and still have it be safe for the eye.

For the flash approach, if you do it at 1,550 nanometers you cannot detect in silicon, so your only option is to use other materials, which can be up to 1,000 times more expensive.

So, to cut a long story short, there is only one other approach that's truly solid state, the flash LiDAR approach, but it's either long range and very expensive, or it's all silicon and the range is then only on the order of 10 meters.

What can attendees expect to learn and find out about at your session at Sensors Expo?

Usually we have a very diverse audience because of the applicability of our LiDAR in many industries. The audience is firstly heavily automotive, because the space is just so important and there is so much activity in it. We also get roboticists, people who ask about drones, and people from the security industry as well as from industrial automation. We address all these applications because it's really the same hardware and the same core software.

The perception is the same - we have perception software called Qortex, which does object detection, tracking and classification of each object. That's the same for all applications, whether you're monitoring a border to see if people are crossing illegally and want to look at the origin of each person and where they're coming from, or you are driving an autonomous vehicle and want to look at each pedestrian's behavior and even predict whether or not they might be about to cross the street.

This core perception software really combines human physiology with human psychology. It's the same for all applications, and we extend it beyond people, vehicles and animals to other objects. All of the layers of the software are the same, the hardware is identical, and it's just that last layer where what you see in the scene determines how you react. In a vehicle you might decide to steer, slow down or stop. In a security application you might decide to send a live feed to law enforcement, where that feed keeps tracking the individual and they get intercepted if anything looks illegal or suspicious.
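For readers unfamiliar with how a detect-classify-track LiDAR perception pipeline of this kind is typically structured, the sketch below shows the stages in a heavily simplified form. It is an illustrative skeleton only - the function names, data structures and the nearest-centroid association rule are assumptions for this example and do not represent the Qortex API:

```python
# Illustrative skeleton of a LiDAR perception loop (object detection,
# classification and anonymous ID tracking). This is NOT the Qortex API;
# every name and the simple association rule here are hypothetical, shown
# only to make the detect/classify/track flow concrete.
from dataclasses import dataclass
from itertools import count
import math

Point = tuple[float, float, float]

@dataclass
class Track:
    track_id: int    # persistent anonymous ID - no identity information
    centroid: Point
    label: str       # e.g. "pedestrian", "vehicle"

_ids = count(1)

def centroid(cluster: list[Point]) -> Point:
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n,
            sum(p[1] for p in cluster) / n,
            sum(p[2] for p in cluster) / n)

def classify(cluster: list[Point]) -> str:
    # Placeholder heuristic: small clusters -> pedestrian, large -> vehicle.
    return "pedestrian" if len(cluster) < 200 else "vehicle"

def update_tracks(tracks: list[Track], clusters: list[list[Point]],
                  max_dist: float = 1.0) -> list[Track]:
    """Greedy nearest-centroid association; unmatched clusters start new tracks."""
    for cluster in clusters:
        c = centroid(cluster)
        nearest = min(tracks, default=None,
                      key=lambda t: math.dist(t.centroid, c))
        if nearest is not None and math.dist(nearest.centroid, c) < max_dist:
            nearest.centroid = c  # update an existing anonymous track
        else:
            tracks.append(Track(next(_ids), c, classify(cluster)))
    return tracks
```

In a real system each stage would be far more sophisticated, but the privacy property described in the interview comes from the fact that a track carries only an ID, a position and a coarse class - no identity information.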

All of that is to say that though we address many applications we are not really stretched thin as a company. The core development – certainly most of it - is really the same for all the various applications.

About Louay Eldada

Louay is the CEO and co-founder of Quanergy, a company creating high-performing, low-cost LiDAR sensors. He holds a B.S. and M.S. in Electrical Engineering and a Ph.D. in Optoelectronics from Columbia University.

Disclaimer: The views expressed here are those of the interviewee and do not necessarily represent the views of AZoM.com Limited (T/A) AZoNetwork, the owner and operator of this website. This disclaimer forms part of the Terms and Conditions of use of this website.

Written by

Zoe Peterkin

Upon graduating from the University of Exeter with a BSc Hons. in Zoology, Zoe worked for a market research company, specialising in project management and data analysis. After a three month career break spent in Australia and New Zealand, she decided to head back to her scientific roots with AZoNetwork. Outside of work, Zoe enjoys going to concerts and festivals as well as trying to fit in as much travelling as possible!

