LiDAR vs. Camera Only – What is the best sensor suite combination for full autonomous driving?

THIS MATERIAL IS A MARKETING COMMUNICATION.

The autonomous driving industry has been exploring different combinations of sensors to support the development of full self-driving systems. In this article we examine and compare different approaches to sensor suite setup.

Mainstream sensor combination

The mainstream sensor suite includes a combination of LiDAR, camera and radar. Waymo, for example, is one of the leading autonomous driving companies that includes LiDAR in its sensor suite. The company launched its 5th-generation platform in March 2020, featuring an in-house developed 360° LiDAR that provides high-resolution 3D pictures of the surroundings. Four perimeter LiDAR units are placed around the sides of the vehicle, offering broad coverage with a wide field of view to detect nearby objects. These short-range LiDAR units provide enhanced spatial resolution and accuracy to navigate tight gaps in city traffic and cover potential blind spots. There are 29 cameras on board to help with scene understanding, and the vehicle is also equipped with six radars which can detect objects at greater distances with a wide field of view.

Full autonomous driving with LiDAR:

Superior low-light performance. LiDAR carries its own light source, so its performance is not affected in low-light conditions. Even in complete darkness, LiDAR continues to work.
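
As a rough illustration of why ambient light does not matter, the range a LiDAR reports depends only on the round-trip time of its own laser pulse (distance = speed of light × time / 2). The sketch below uses a made-up pulse timing for illustration, not a specification from any particular sensor.

# Minimal sketch: LiDAR range from pulse round-trip time (d = c * t / 2).
SPEED_OF_LIGHT_M_S = 299_792_458

def lidar_range_m(round_trip_time_s: float) -> float:
    # The pulse travels to the target and back, so halve the round-trip distance.
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# Illustrative value only: a 200 ns round trip corresponds to roughly 30 m,
# regardless of ambient light level.
print(lidar_range_m(200e-9))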

More accurate 3D measurement. Human eyes estimate distance by triangulating between the two eyes and the object, and a camera-only setup can likewise recover 3D measurements of the environment (see the sketch below). However, LiDAR offers a more accurate measurement of the 3D environment, and thus helps boost the confidence of the autonomous driving system.
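
To illustrate the camera-only route to 3D, the short sketch below applies the standard stereo pinhole relation (depth = focal length × baseline / disparity). The focal length, baseline and disparity figures are made-up numbers for illustration and do not come from Waymo's or Tesla's systems.

import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    # Pinhole stereo model: Z = f * B / d, with invalid (zero) disparities mapped to infinity.
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_length_px * baseline_m / disparity_px[valid]
    return depth

# Illustrative numbers: 720 px focal length, 12 cm baseline, 9 px disparity -> ~9.6 m.
print(depth_from_disparity(np.array([9.0]), focal_length_px=720.0, baseline_m=0.12))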

Full autonomous driving without LiDAR: 

Tesla is one of the leading companies in autonomous driving system development that does not include LiDAR in its sensor suite. The company uses a combination of eight external cameras, one radar, and 12 ultrasonic sensors for the system. (Tesla 2020) Arguments for not having LiDAR include:

LiDAR faces the same penetration handicap as cameras. LiDAR operates at wavelengths between visible and infrared light, closer to the former, which means its laser scatters in a similar way to visible light (what the camera collects). In heavy fog or rain, LiDAR is also impacted, as a laser at that wavelength cannot penetrate fog, rain, or anything else that blocks visible light, similar to what happens to cameras.

LiDAR provides limited information compared to cameras. The modern driving environment is designed for vision-based systems (i.e. humans). LiDAR presents a precise 3D measurement of the surroundings but lacks information such as colour and text, and on its own is therefore not sufficient to provide a good understanding of the environment. Images collected by cameras, on the other hand, contain richer information which can be interpreted by well-trained neural nets.

Our take:

Companies choose LiDAR as it provides more accurate 3D measurement. From the arguments above we can see that one key problem for full autonomous driving is retrieving 3D environment data from 2D images. This requires a well-trained machine learning model that understands driving conditions as well as a human at all times. Without a significant amount of well-annotated, high-quality data, especially data on long-tail scenarios, it is impossible to train such models. The problem for most companies, including Waymo, is that they do not currently have access to sufficient annotated real-world data, so for now they adopt LiDAR in their sensor suite to make up for this shortcoming.
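
To make the "3D from 2D images" step concrete, the sketch below shows the geometry only: assuming a neural network has already predicted a per-pixel depth map for a single camera image, the pinhole camera model lifts each pixel into a 3D point in the camera frame. The depth values and camera intrinsics are placeholder numbers, and a real pipeline would add calibration, ego-motion and temporal fusion on top.

import numpy as np

def back_project(depth_m, fx, fy, cx, cy):
    # Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth.
    h, w = depth_m.shape
    v, u = np.mgrid[0:h, 0:w]
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.stack([x, y, depth_m], axis=-1)  # (h, w, 3) point cloud in the camera frame

# Placeholder inputs: a flat 10 m depth map and made-up intrinsics for a 640x480 image.
depth = np.full((480, 640), 10.0)
points = back_project(depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(points.shape)  # (480, 640, 3)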

Full autonomous driving may be achieved earlier with LiDAR. Waymo has been operating without safety drivers in Phoenix since 2017, (Waymo 2018) while Tesla only launched its Full Self-Driving beta in the US at the end of 2020. (Tesla 2020) Waymo largely focuses its testing in the US, whereas Tesla vehicles are tested all over the world. Waymo’s approach, with high-precision maps and LiDAR, will enable it to remove safety drivers sooner in areas which are well mapped and trained by the company.

Both sensor combinations will achieve full autonomy. Waymo has effectively reached its goal in Phoenix, and the remaining challenge is to replicate that in more cities and areas. We think vision-based machine learning (ML) models will eventually be mature enough for full autonomous driving. For Tesla, we think it will be able to roll out full autonomous driving much faster once its models are mature, as the company collects significantly more data from far more diverse environments all over the world.

AUTHORED BY
Edward Chan
Investment Analyst – Hardware, Semiconductor

Date: April 28, 2021
Category: Robotics & AI

Disclaimer

This document has been prepared for presentation, illustration and discussion purposes only and is not legally binding. Whilst compiled from sources Mirae Asset Global Investments believes to be accurate, no representation, warranty, assurance or implication to the accuracy, completeness or adequacy from defect of any kind is made. Unless indicated to the contrary, all figures are unaudited. The division, group, subsidiary or affiliate of Mirae Asset Global Investments which produced this document shall not be liable to the recipient or controlling shareholders of the recipient resulting from its use. The views and information discussed or referred to in this report are as of the date of publication, are subject to change and may not reflect the current views of the writer(s). The views expressed represent an assessment of market conditions at a specific point in time, are to be treated as opinions only and should not be relied upon as investment advice regarding a particular investment or markets in general. In addition, the opinions expressed are those of the writer(s) and may differ from those of other Mirae Asset Global Investments’ investment professionals.

The provision of this document shall not be deemed as constituting any offer, acceptance, or promise of any further contract or amendment to any contract which may exist between the parties. The issuer of this article is Mirae Asset Global Investments (HK) Limited (“we”) which the individual, or we or our managed funds may hold the mentioned securities. It should not be distributed to any other party except with the written consent of Mirae Asset Global Investments. Nothing herein contained shall be construed as granting the recipient whether directly or indirectly or by implication, any license or right, under any copy right or intellectual property rights to use the information herein. This document may include reference data from third-party sources and Mirae Asset Global Investments has not conducted any audit, validation, or verification of such data. Mirae Asset Global Investments accepts no liability for any loss or damage of any kind resulting out of the unauthorized use of this document. Investment involves risk. Past performance figures are not indicative of future performance. Forward-looking statements are not guarantees of performance. The information presented is not intended to provide specific investment advice. Please carefully read through the offering documents and seek independent professional advice before you make any investment decision. Products, services, and information may not be available in your jurisdiction and may be offered by affiliates, subsidiaries, and/or distributors of Mirae Asset Global Investments as stipulated by local laws and regulations. Please consult with your professional adviser for further information on the availability of products and services within your jurisdiction.

Australia: The information contained on this document is provided by Mirae Asset Global Investments (HK) Limited (“MAGIHK”), which is exempt from the requirement to hold an Australian financial services license under the Corporations Act 2001 (Cth) (Corporations Act) pursuant to ASIC Class Order 03/1103 (Class Order) in respect of the financial services it provides to wholesale clients (as defined in the Corporations Act) in Australia. MAGIHK is regulated by the Securities and Futures Commission of Hong Kong under Hong Kong laws, which differ from Australian laws. Pursuant to the Class Order, this document and any information regarding MAGIHK and its products is strictly provided to and intended for Australian wholesale clients only. The content of this document is prepared by Mirae Asset Global Investments (HK) Limited and has not been reviewed by the Australian Securities & Investments Commission.

Hong Kong: Before making any investment decision to invest in the Fund, investors should read the Fund’s Prospectus and the Information for Hong Kong Investors of the Fund for details and the risk factors. Investors should ensure they fully understand the risks associated with the Fund and should also consider their own investment objective and risk tolerance level. Investors are also advised to seek independent professional advice before making any investment. This document is issued by Mirae Asset Global Investments and has not been reviewed by the Hong Kong Securities and Futures Commission.

Singapore: It is not intended for general public distribution. The investment is designed for Accredited Investors as defined under the Securities and Futures Act of Singapore. Please consult with your professional adviser for further information on the availability of products and services within your jurisdiction.