FREQUENTLY ASKED QUESTIONS
Lidar is an acronym for “light detection and ranging”; it can also be thought of as a combination of “light” and “radar”. It is used to measure the distance to a target, and with 3D scanning it can create an image of the surveyed environment. Lidar is now widely used in areas such as autonomous vehicles, autonomous forklifts, service robots, 3D mapping, and UAVs/drones.
The lidar fires pulses of laser light at objects in the environment; the pulses bounce back and are received by the lidar. A sensor on the lidar records the time between the transmitted and the backscattered pulse, and the distance to the object is computed from the speed of light. By firing laser pulses at a rate of millions per second, a lidar can build a 3D “map” of the environment it is surveying in the form of a point cloud.
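The time-of-flight calculation described above can be sketched in a few lines: the pulse travels to the target and back, so the measured round-trip time is halved. The 667 ns example value below is illustrative, not from the original text.

```python
# Time-of-flight ranging: distance = c * t / 2
# (divide by 2 because the pulse travels out to the target and back)
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the target, given the pulse round-trip time in seconds."""
    return C * round_trip_time_s / 2.0

# Illustrative example: a pulse returning after ~667 ns means the target
# is roughly 100 m away.
print(tof_distance_m(667e-9))
```

This is why lidar range measurements can be so precise: timing a pulse to within a nanosecond corresponds to about 15 cm of one-way distance.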
The main parameters of a lidar include range, field of view (FOV), angular resolution, accuracy, and data rate. Range determines how far the lidar can see; FOV determines how wide an area it can see; angular resolution, accuracy, and data rate determine the perception quality. Today, a 3D lidar can see as far as a few hundred meters. The FOV varies by application: the horizontal FOV can reach 360°, and the vertical FOV can be up to 180°. Vertical angular resolution can be finer than 1°, and horizontal angular resolution finer than 0.1°; accuracy can be within 2 cm, and the data rate can reach a few million points per second with a single return.
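To make these parameters concrete, the single-return point rate can be estimated from the FOV, angular resolution, and frame rate. The numbers used below (360° × 30° FOV, 0.1° × 1° resolution, 10 Hz) are illustrative assumptions, not specifications from the text.

```python
def points_per_second(h_fov_deg: float, v_fov_deg: float,
                      h_res_deg: float, v_res_deg: float,
                      frame_rate_hz: float) -> float:
    """Estimate single-return point rate from FOV, angular resolution, and frame rate."""
    points_per_frame = (h_fov_deg / h_res_deg) * (v_fov_deg / v_res_deg)
    return points_per_frame * frame_rate_hz

# Illustrative sensor: 360° x 30° FOV, 0.1° x 1° resolution, 10 frames/s
rate = points_per_second(360, 30, 0.1, 1.0, 10)
print(f"{rate:,.0f} points/s")  # about 1,080,000 points/s
```

This back-of-the-envelope estimate shows how a sensor with the resolutions quoted above lands in the “few million points per second” range once frame rate and FOV are factored in.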
A typical autonomous driving system is shown in the very high-level diagram below. It includes a perception system, a planning system, and a control system.
Lidar belongs to the perception system, responsible for “seeing” what is happening around the vehicle: other road users such as cars, trucks, pedestrians, and bicycles, along with their size, speed, and direction, as well as lane markings and road curbs. The planning system decides where to go and how to get there.
Mission planning can be seen as the internal navigation module of the autonomous vehicle software system: it determines the route the vehicle takes from its starting point to its destination. For autonomous vehicles, it depends heavily on an HD map.
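Route determination of this kind is commonly framed as a shortest-path search over a weighted road graph. The sketch below uses Dijkstra's algorithm on a toy graph of made-up intersections; the node names and costs are hypothetical, and a real mission planner would search an HD-map road network with far richer cost terms.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path on a road graph: {node: [(neighbor, cost), ...]}."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Reconstruct the route by walking predecessors back from the goal
    route, node = [goal], goal
    while node != start:
        node = prev[node]
        route.append(node)
    return route[::-1]

# Toy road graph with invented intersections A-D and travel costs
roads = {"A": [("B", 2.0), ("C", 5.0)],
         "B": [("C", 1.0), ("D", 4.0)],
         "C": [("D", 1.0)]}
print(shortest_route(roads, "A", "D"))  # ['A', 'B', 'C', 'D'] (total cost 4)
```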
Behavior planning receives the output of mission planning, as well as inputs from the perception module and the map. After analyzing those inputs, the behavior module decides how the car should react on the road, such as following another vehicle, overtaking, avoiding pedestrians, or crossing an intersection.
Motion planning can be divided into trajectory planning and speed planning. Over a short time period t, it decides the trajectory the vehicle should follow to move from A to B, including the specific path points the vehicle should pass through and its speed, orientation, and acceleration at each point.
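The output described above can be pictured as a list of timed path points, each carrying position, speed, and orientation. The sketch below samples a single straight segment with a linearly ramped speed; it is a simplified illustration (real planners use curved paths and smooth acceleration profiles), and all the names and values are assumptions.

```python
import math

def plan_straight_segment(ax, ay, bx, by, v0, v1, n=5):
    """Sample a straight segment from A to B into n+1 trajectory points,
    each with position, a speed linearly ramped from v0 to v1, and
    orientation (heading toward B)."""
    heading = math.atan2(by - ay, bx - ax)
    points = []
    for i in range(n + 1):
        s = i / n  # fraction of the segment covered
        points.append({
            "x": ax + s * (bx - ax),
            "y": ay + s * (by - ay),
            "speed": v0 + s * (v1 - v0),
            "heading": heading,
        })
    return points

# Illustrative segment: drive 10 m along x while accelerating from 5 to 8 m/s
traj = plan_straight_segment(0.0, 0.0, 10.0, 0.0, v0=5.0, v1=8.0, n=5)
print(traj[0]["speed"], traj[-1]["speed"], traj[-1]["x"])  # 5.0 8.0 10.0
```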
The control module connects directly to the CAN bus of the autonomous vehicle. It converts the inputs from the motion planning module into control signals for the accelerator, brake, and steering wheel, executing the trajectory selected by motion planning.
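One common way such a conversion works for the longitudinal (speed) part is feedback control. The minimal PI controller below is a hypothetical illustration of turning a planned speed into a throttle/brake command, not the design of any particular vehicle; gains and the [-1, 1] actuator range are assumptions.

```python
class SpeedController:
    """Minimal PI controller (illustrative only) that turns a speed error into
    a single longitudinal command: positive = throttle, negative = brake."""

    def __init__(self, kp=0.5, ki=0.1):
        self.kp, self.ki, self.integral = kp, ki, 0.0

    def step(self, target_speed, current_speed, dt):
        error = target_speed - current_speed
        self.integral += error * dt
        u = self.kp * error + self.ki * self.integral
        return max(-1.0, min(1.0, u))  # clamp to actuator range [-1, 1]

# Vehicle at 8 m/s, planner asks for 10 m/s: controller commands acceleration
ctrl = SpeedController()
cmd = ctrl.step(target_speed=10.0, current_speed=8.0, dt=0.1)
print(cmd)  # positive command -> press the accelerator
```

Lateral (steering) control is typically a separate loop tracking the planned path's heading and cross-track error.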
Nowadays these algorithms run on a computer; in the future they will be integrated into a chip, and the chip will be mounted on the lidar PCB.