► Autonomous car test in the UK
► We ride in self-driving Nissan Leaf
► Does the tech work on real roads?
Riding in autonomous cars is almost old-hat round these parts: I was first driven by a robot back in 2013, but it’s always been confined to the relative safety of a test track. Now Nissan has let us loose in its latest creation - the autonomous Leaf electric car - which drove us around public roads and lanes near its Cranfield R&D base in Bedfordshire. All on its own…
It’s a significant move, reflecting the growing confidence in the algorithms and sensors that power driverless cars. Are we about to enter a brave new era of autonomous driving (AD)? Will drivers soon be redundant? Read on for our full assessment.
Explained: what the different autonomous car levels mean
The autonomous Nissan Leaf: the story so far
Nissan was always going to pick an electric car for its driverless showcase. The Leaf is the tech flagship of the Japanese brand's range, it's now built at Nissan's Sunderland factory in the UK, and if you're going to waft around in a robot-driven urban cocoon, you may as well do so in saintly and silent EV mode.
The project is a significant autonomous car scheme on several counts and has won a total of £13.5 million of public funding: it's a UK-led consortium, bringing together an array of talents from the automotive, academic, technology and governmental spheres. The HumanDrive project name reveals how it's aimed at making the car drive more naturally, rather than robotically. (That's human in the positive sense, not in the phone-wielding, outside-lane-flashing commuter-rage fashion.)
When will autonomous cars launch?
CAR magazine was invited to Nissan's European tech HQ to learn more about the autonomous Leaf - and to pop some brave pills and let the computers take us for a spin in the very same car that drove 230 miles from Cranfield to the Sunderland factory in the north east on 28 November 2019. It managed 99.4% of the distance in autonomous mode with only minimal human intervention, despite the rain, mist and roadworks en route. The team were delighted with the result - and confident enough to let a select few journalists hop in for a ride this month.
How do autonomous cars work?
Self-driving cars rely on a battery of sensors to understand their place in the world, monitor the behaviour of other road users and calculate a safe and efficient path through both. To that end, our Leaf was festooned with sensors, relatively crudely lashed on to this hard-working prototype in its AD lab, some attached to the same Thule roofbars you'll find on a common or garden Qashqai:
- 8 lidar sensors
- 7 cameras
- 2 high-accuracy GPS units
- 1 radar
That information is processed on board to identify the car's position on the road, plus the other traffic and pedestrians in the habitat around the Leaf. Hyper-accurate mapping is required to pinpoint the precise location, so instead of a regular global positioning system there's military-spec RTK (real-time kinematic) and differential GPS, accurate to centimetres rather than metres.
Powering all this tech is a small server’s worth of computers stuffed in the Leaf’s boot. Luggage space is non-existent in this prototype, but engineers assure us that any production version would shrink the hardware to make space for chattels.
So what’s it like to let an autonomous car drive you?
We climb into the last-gen 30kWh Nissan Leaf, picked because it was the current model when the HumanDrive project kicked off in 2017. The cabin is peppered with extra wiring and displays sprout from the dashboard like a Doctor Who props department spin-off - but the fundamental controls are all still recognisably Leaf.
There are two parts to our test. First up, we’re driven on a private test road where the car displays its humanistic path-planning. Up and down the quarter-mile we plough, circling a roundabout at one end and negotiating a parked car along the straight, seemingly unaffected by the February mist that’s making visibility a little woolly beyond 100m. The laser, radar and optical sensors can see through the murk better than a pair of human eyeballs.
We have a monitor showing the car’s route-planning and I’m struck by how the car picks a slightly different line each time it overtakes the obstruction and negotiates the mini roundabout. This is what Nissan means when it talks about more natural human behaviour; it’s worked closely with the University of Leeds’ Institute of Transport Studies to understand the kind of driving we generally like (distance to parked cars, safety margins, closing speeds). Researchers played back different styles to drivers to test preferences, and have modelled the type of driving that feels safe, natural and repeatable in any given situation.
This Leaf drives smoothly enough, and for the majority of our passenger laps the safety driver's hands are kept well away from the wheel. Movements are generally smooth, acceleration crisp and decisive, and hazard perception appears, ahem, laser-sharp. Robo-Driver could do with a bit more practice at steering wheel shuffling - the parked car occasions the odd jerky overtake, and the test driver intervenes when an unexpected bicycle comes towards us out of the mist.
How does it feel in a self-driving car on the public road?
Next up, we switch to a different prototype autonomous Nissan Leaf - the actual car used for the Cranfield-Sunderland journey dubbed the Grand Drive. There's one crucial difference: this one has a new Hitachi-developed layer of artificial intelligence. It's next-level humanistic driving behaviour, in short, and uses machine learning to better understand how to cope with new routes, obstacles and traffic situations.
Progress is again smooth and predictable most of the time. The EV shuffles silently out of junctions once it's satisfied the chosen route is clear, and I find myself comparing it with my own driving; the computer is more cautious entering roundabouts and T-junctions, but remarkably frictionless once on the move.
The sternest test we encounter is a major roundabout joining the M1 motorway, where lorries and cars are zooming in from all angles and the painted white lines have faded to nothing in parts. The Leaf is unfazed and joins the serried ranks waiting at the traffic lights controlling the flow on the roundabout. It feels weird to be surrounded by articulated lorries and other vehicles while waiting for the lights to go green, and my toes twitch as I learn to trust the bots.
It's worth pointing out that our HumanDrive prototype is, at this demonstration stage, following a closely prescribed route that had been driven by a test engineer earlier - we're still a long way from being let loose in fully autonomous vehicles on any route of your choosing. This Leaf was playing back an earlier recording, albeit coping with new traffic scenarios.
The future of autonomous driving: when will it be on sale?
Don’t go expecting self-driving cars to be a reality any time soon. The technology is still in its relative infancy and there are a host of hurdles to clear, from agreed regulatory frameworks and consistent road markings to social preparation and common human/car handovers. But the message is clear: manufacturers, governments, technologists and other influential players are betting big on connected and increasingly autonomous cars, and the direction of travel is set.
We'd predict an increasing drip-feed of ever-cleverer semi-autonomous systems in your next car. Cruise control will get cleverer. Lane control will become super-sophisticated. Stop-start control in traffic jams will be widespread. And assisted driving will creep into the mainstream. We're going to inch into the driverless future, not arrive overnight with a single giant leap for carkind.
The HumanDrive project proves that car makers are at the vanguard of this revolution, but the big question is whether the Silicon Valley tech giants will get there first. The future role of the car in society is at stake, and I for one am hoping the vehicle manufacturers can use their decades of experience to stay in the race. I don’t much fancy a future where a faceless GoogleBot drives me home every night on a pre-ordained route in a bland vanilla pod.
Do you want autonomous cars? Be sure to sound off in the comments below