Is lidar robust enough to power autonomous cars?

Published: 27 April 2018

► How lidar tech works in cars
► Powering autonomous cars
► But is it safe enough for self-driving? 

It doesn’t take a genius to spot a parked fire engine in the road ahead of you. So you might expect a car packing a state-of-the-art driver assistance system to do just that. Yet a Tesla Model S recently had a high-profile prang on a dry, bright morning in Los Angeles. The car is fitted with Autopilot – so even if the driver is distracted, you’d hope the car could avoid the collision, or at least reduce its severity.

The driver reportedly told fire officers at the scene that the car was in Autopilot mode. Autopilot does not drive the car, despite what the name might suggest; rather, it’s designed to assist the driver. As a Tesla spokesperson told us: ‘Autopilot is intended for use only with a fully attentive driver.’

Its features include lane keeping and adaptive cruise control. It could have applied automatic emergency braking, but wouldn’t have automatically steered the car around the fire engine, which was attending the scene of an earlier accident.  

Tesla Model S crashes in Los Angeles

Whatever actually happened in this case, it highlights the limitations of current tech.

Assistance systems such as Autopilot are programmed to disregard stationary vehicles in most situations. The focus is on keeping a safe distance from moving vehicles, without slamming the brakes on every time the system catches sight of a parked car.

Radar transmits radio waves and interprets the reflections that bounce back from objects. While it can detect large objects, and easily calculate speed and distance in all weather and light conditions, it can’t distinguish colours or tell the difference between large and small objects. A radar image of a street would be a mass of objects, from cars to lamp posts, cyclists to litter bins. It’s down to the software engineers to program the system so that it ignores the insignificant ones. There’s clearly room for error here.
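
To make the speed part concrete, here’s a minimal Python sketch of how radar reads a target’s relative speed straight from the Doppler shift of the reflected radio wave. The 77GHz carrier frequency, function name and sample values are illustrative assumptions typical of automotive radar, not figures from any particular system.

```python
# Illustrative sketch of how radar reads relative speed from the Doppler
# shift of the reflected radio wave. The carrier frequency and sample
# values are assumptions, not taken from any specific product.

C = 299_792_458.0   # speed of light in m/s (radio waves travel at c)
CARRIER_HZ = 77e9   # assumed 77GHz automotive radar band

def radial_speed_ms(doppler_shift_hz: float) -> float:
    """Relative speed along the radar beam, in m/s (positive = closing)."""
    wavelength = C / CARRIER_HZ
    return doppler_shift_hz * wavelength / 2.0  # halved for the out-and-back path

# A reflection shifted up by 5.1kHz is closing at roughly 10 m/s (about 22mph).
print(f"{radial_speed_ms(5.1e3):.1f} m/s")
```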

So if radar isn’t enough, what about the system being talked about as the next step up from radar: ‘light detection and ranging’, or lidar? It uses pulsed laser beams to build up a 3D digital representation of the area around the car. It can detect specific objects and calculate the distance to them, as well as ‘seeing’ the edge of the road or white lines.
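
As a rough illustration of how those pulses become a 3D picture, here’s a short Python sketch: the sensor knows the direction each pulse was fired in and the distance it measured, so a simple spherical-to-Cartesian conversion places a point in space. The function name and sample figures are assumptions for illustration, not any manufacturer’s API.

```python
import math

# Illustrative: turn one lidar return (the measured range plus the known
# direction the pulse was fired in) into a 3D point relative to the sensor.
# Thousands of these per frame knit together the digital picture described above.

def lidar_point(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert a single range measurement and beam angles into x, y, z in metres."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forwards
    y = range_m * math.cos(el) * math.sin(az)   # left/right
    z = range_m * math.sin(el)                  # up/down
    return x, y, z

# A return 20m away, 15° to one side and 2° below horizontal:
print(lidar_point(20.0, -15.0, -2.0))
```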

Lidar sensors in a Volvo

It can, however, be affected by rain, snow and fog. At the moment it’s also expensive and bulky – you’ll have seen pictures of development prototypes equipped with lidar hardware that looks like a traffic cone attached to the roof. Those cones house lasers rotating through 360° to knit together an image of the car’s surroundings, and the hardware costs about £50,000 per car.

The next generation of lidar, soon to be seen in semi-autonomous vehicles, replaces those cones with solid-state hardware the size of a paperback book, using a tiny microelectromechanical system (MEMS) to direct the laser pulses.

But lidar is by no means perfect. It doesn’t like fog. Some experts warn of potential eyesight damage from the lasers. Others believe lidar is susceptible to ‘seeing’ non-existent ‘ghost’ cars when close to other lidar-kitted vehicles. We’ll never know whether a lidar-equipped car would have avoided that fire engine, but what we can be certain of is that the path to fully autonomous driving will be strewn with challenges that need to be overcome before it becomes a reality for everyday transport.

How lidar works: the radar and sensors powering autonomous cars

Lidar sensors build up a picture of a car's surroundings, like on this Ford

1) Radar has weak links...

A car’s radar system will usually be programmed to detect moving objects and ignore stationary ones – such as a parked fire engine that suddenly appears ahead as other traffic changes lane. Numerous warning systems are built into Autopilot, but a Tesla driver – like any other road user – is still required to pay attention and be ready to assume full command of the car.
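
To illustrate why a parked vehicle can slip through, here’s a hypothetical Python sketch of the kind of filtering an adaptive-cruise radar might apply: detections whose speed over the ground is close to zero get treated as roadside clutter and discarded. This is a deliberate simplification of the general idea, not Tesla’s actual logic; all names and numbers are invented.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of why a parked vehicle can be ignored: many cruise-control
# radars discount returns that aren't moving over the ground, so they don't brake
# for signs, bridges and parked cars. This is NOT Tesla's actual logic.

@dataclass
class Detection:
    range_m: float           # distance to the object
    closing_speed_ms: float  # how fast it is approaching us, as the radar sees it

def moving_targets(detections: List[Detection], ego_speed_ms: float,
                   threshold_ms: float = 1.0) -> List[Detection]:
    """Keep only detections whose speed over the ground is clearly non-zero."""
    kept = []
    for d in detections:
        ground_speed = ego_speed_ms - d.closing_speed_ms   # ~0 for a parked vehicle
        if abs(ground_speed) > threshold_ms:
            kept.append(d)
    return kept

# At 25 m/s (about 56mph) a parked fire engine closes at 25 m/s, so its ground
# speed comes out at zero and it is filtered away; only the moving car survives.
traffic = [Detection(60.0, 25.0), Detection(45.0, 5.0)]
print(moving_targets(traffic, ego_speed_ms=25.0))
```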

2) Imperfect present

Tesla’s Autopilot uses a combination of forward-facing radar and a camera with three lenses of differing focal lengths – wide angle, standard and long distance – to view the road ahead. It also has forward- and rearward-looking side cameras and a ring of sonars around the car to give a 360° view of its surroundings.

3) Tomorrow's solution

Lidar can send 150,000 pulses of laser light per second; a sensor measures the time taken for each pulse to bounce back. Because light travels at a constant, known speed – about 186,000 miles per second – the system can accurately calculate the distance to the object.
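
That time-of-flight sum can be written out in a couple of lines. The Python sketch below simply halves the measured round-trip time and multiplies by the speed of light; the sample timing value is made up for illustration.

```python
# Illustrative time-of-flight calculation for a single lidar pulse.
# The timing value is invented; only the speed of light is real.

SPEED_OF_LIGHT = 299_792_458.0   # m/s - the article's 186,000 miles per second

def distance_m(round_trip_s: float) -> float:
    """The pulse travels out and back, so halve the round trip."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A pulse that bounces back after one microsecond hit something ~150m away.
print(f"{distance_m(1e-6):.0f} m")
```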

4) The day after tomorrow

Solid-state lidar uses a MEMS chip, in place of spinning mechanical lasers, with micro mirrors to scan the scene and direct the laser. Current units have a range of about 150 metres, but systems with 200-220 metres of range and a 120° field of view are under development.
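
For a rough sense of what those figures mean for scan density, here’s a back-of-envelope Python sketch that divides an assumed pulse rate across frames and a 120° field of view to estimate the spacing between measurements. The frame rate and scan-line count are illustrative assumptions, not any manufacturer’s specification.

```python
# Back-of-envelope scan-density sketch for a solid-state lidar. The pulse rate
# and field of view come from the article; the frame rate and scan-line count
# are illustrative assumptions, not a manufacturer's specification.

PULSES_PER_SECOND = 150_000   # from the article
FRAMES_PER_SECOND = 10        # assumed
FIELD_OF_VIEW_DEG = 120       # from the article
SCAN_LINES = 16               # assumed number of horizontal lines per frame

points_per_frame = PULSES_PER_SECOND // FRAMES_PER_SECOND      # 15,000
points_per_line = points_per_frame // SCAN_LINES               # ~937
horizontal_spacing_deg = FIELD_OF_VIEW_DEG / points_per_line   # ~0.13 degrees

print(f"{points_per_frame} points per frame, "
      f"~{horizontal_spacing_deg:.2f} degrees between points along each line")
```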

By Ian Adcock

CAR's engineering whizz, making sense of oily bits and megabytes
