AT THE Consumer Electronics Show in Las Vegas two years ago a leading car maker unveiled a machine that it said was a vision of the future. It certainly looked the part, with a sleek silver body shell, a steering wheel that retracted into the dashboard and four lounge-style chairs that could rotate to face one another. The most startling feature, though, was its self-driving ability.
It was filmed navigating through San Francisco shortly before its futuristic doors swung open to journalists. We stepped onto the car's wooden floor and looked at a calming forest projected onto the windows as the car drove itself along the runway of a nearby airbase. It felt as if the future had arrived. Then the engineers revealed the car's secret: it was simply following a programmed route, from which pedestrians and other vehicles were being kept well clear. And it hadn't been driving itself through San Francisco at all.
That was impossible: the amount of computing equipment required simply would not have fitted in the vehicle and would have had to be hauled round on a trailer. Not quite so sleek.
The problem is the amount of information needed to recreate a driver’s awareness. To replicate human eyesight, for example, a driverless car requires ultrasonic sensors, radar, cameras with night vision and a laser-based lidar system (a light-based version of radar). In 2015 processing all this data in real time so that a driverless car could react instantly was not possible without a truckload of PCs.
One of the biggest challenges for car makers is data processing, which is why an electronics company better known for chips used in graphics-rich computer gaming is involved in producing a "supercomputer for driverless cars" that will support the development of artificial intelligence (AI). It is a crucial part of the jigsaw. Fully driverless cars won't be robots on wheels; they will have to think like us.
“There’s no way you could write a computer program that could take account of everything that happens around your car,” says Danny Shapiro, head of automotive at Nvidia. “Instead we’re using artificial intelligence — or more specifically a technique called deep learning — to train the car to understand its environment and operate safely.”
Deep learning imitates the way our brains function. Instead of programming a car to remain within white lines and to stop when it sees an octagonal red sign with white writing, AI allows a computer to learn from what it sees and what we do.
The system compares data from the car’s sensors with the actions of a human developer driving normally on public roads. That knowledge can then be carried over to unfamiliar situations. During testing, Nvidia’s driverless car spotted a lorry waiting to join the road but hidden behind a parked car. The autonomous vehicle slowed cautiously, even though it had never previously been presented with such a situation.
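The training Shapiro describes amounts to supervised learning: the model maps the car's sensor readings to whatever steering command the human driver produced at the same instant, then applies that learned mapping to situations it has never seen. A minimal sketch in Python, with a linear model standing in for Nvidia's deep network and all data simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training data: each row is a snapshot of sensor features
# (e.g. lane offset, heading error, distance to the vehicle in front);
# the label is the steering angle the human driver chose at that moment.
n_samples, n_features = 1000, 3
X = rng.normal(size=(n_samples, n_features))
true_w = np.array([-0.8, -0.5, 0.1])   # the hidden "human policy"
y = X @ true_w + rng.normal(scale=0.01, size=n_samples)

# Fit by gradient descent on mean squared error -- the same idea,
# at toy scale, as training a deep network to imitate the driver.
w = np.zeros(n_features)
lr = 0.1
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / n_samples
    w -= lr * grad

# The learned policy then generalises to readings it never saw,
# just as the test car did with the hidden lorry.
new_situation = np.array([0.5, -0.2, 1.0])
steering = float(new_situation @ w)
print("learned weights ~", np.round(w, 2))
print(f"steering command for unseen situation: {steering:.2f}")
```

A real system learns from camera pixels rather than three hand-picked features, and uses a deep network rather than a straight line, but the training loop is the same shape: compare the model's output with the human's action, nudge the weights, repeat.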
As well as other vehicles, driverless cars must learn to recognise everything else they might encounter on the road. “In Australia it’s important to train a system to recognise kangaroos, but also the [animals’] behaviour is something that AI can be taught to anticipate,” says Shapiro. “If there’s a cow at the side of the road, it’s usually grazing, but if there’s a kangaroo, it could jump the fence and run into the road. A mailbox is not going to move, but a child, dog or bicycle does, and they move in different ways.”
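Shapiro's point is that recognising *what* an object is tells the car how cautiously to treat it. One way a planner might encode that is as motion priors attached to each recognised class. The classes and caution rules below are purely illustrative, not Nvidia's actual taxonomy:

```python
# Hypothetical motion priors for recognised roadside objects:
# a mailbox cannot move; a cow moves but predictably; a kangaroo
# (or a child, or a dog) can dart into the road without warning.
MOTION_PRIOR = {
    "mailbox":  {"can_move": False, "erratic": False},
    "cow":      {"can_move": True,  "erratic": False},
    "cyclist":  {"can_move": True,  "erratic": False},
    "kangaroo": {"can_move": True,  "erratic": True},
    "child":    {"can_move": True,  "erratic": True},
}

def caution_level(detected_class: str) -> str:
    """Map a detected object to an illustrative caution level.
    Unknown classes default to the most cautious treatment."""
    prior = MOTION_PRIOR.get(detected_class,
                             {"can_move": True, "erratic": True})
    if not prior["can_move"]:
        return "ignore"
    return "slow_down" if prior["erratic"] else "monitor"

print(caution_level("kangaroo"))  # slow_down
print(caution_level("cow"))       # monitor
print(caution_level("mailbox"))   # ignore
```

In practice these priors would be learned from observed behaviour rather than written in by hand, which is exactly the anticipation Shapiro says AI can be taught.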
The question of the right response when there’s no right response is a thorny topic. If an accident is inevitable, do you run over the pair of schoolchildren in front of the car or the group of pensioners on the pavement? Fans of heated ethical debates or expensive litigation will be disappointed to learn that no one we spoke to planned to program cars to kill one group at the expense of another; or to scan faces and use profiling to class some road users as less worthy than others.
Bosch — best known in Britain for its power tools and washing machines but also a leader in the field of autonomous cars — says the solution is simply to slam on the brakes, cutting the energy of an impact as much as possible, instead of acting as executioner.
“We have to avoid accidents,” says Dirk Hoheisel, a board member at Bosch with responsibility for driverless cars. “That is [priority] No 1. No 2 is that we avoid accidents, and No 3 is the same.” Accident prevention is the reason the first driverless cars won’t be cheap. A lidar sensor alone costs $7,500 (£5,700), a hefty sum to add to the price of a car. “We must invest and install enormously expensive equipment,” Hoheisel says. “It may cost 20, 50 or even 100 times as much as the electronics of existing cars.”
And cost is one reason the public’s first taste of fully driverless technology is unlikely to be in their own vehicles but in “robo-cabs”. These will be designed to zip from, for example, station to city centre, hailed by app and offering the convenience of a taxi service — without the chatter from the driving seat.
Bosch has teamed up with manufacturers to begin development of a small electric car that could function as a robo-cab, and discussed with London’s transport authority how it could be used. It envisages a fleet of black robo-cabs with orange lights on top, a homage to the capital’s famous black cabs. In Milton Keynes, the plan is to have taxi pods that drive through pedestrianised areas, as well as on roads. A trial of the technology has just begun.
And if this convenient, cheap inner-city transport takes hold, privately owned cars could be banned from city centres — an extension of the clean air zones that are being set up to discourage owners of old, polluting cars from driving into areas of poor air quality.
“If there’s a change from ‘I want to buy a car’ to ‘I want to lease some mobility’, the impact will be profound,” says Nathan Marsh of Atkins, a design and engineering consultancy. “At the moment, privately owned cars spend about 96% of their time static. We are going to see very different cities and highways: cars will be moving.”
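Marsh's 96% figure invites a back-of-envelope calculation of how many private cars one shared vehicle could, in principle, replace. The private-car utilisation comes from the article; the robo-cab utilisation is an assumption of ours, and the sum ignores peak demand and empty repositioning trips:

```python
# Fleet arithmetic from Marsh's figure: a private car is static
# 96% of the time, i.e. driving only 4% of the time.
private_utilisation = 0.04
robocab_utilisation = 0.50   # assumed, for illustration only

# Hours of actual driving each vehicle supplies per day.
private_hours = 24 * private_utilisation   # 0.96 h/day
robocab_hours = 24 * robocab_utilisation   # 12 h/day

cars_replaced = robocab_hours / private_hours
print(f"one robo-cab supplies ~{cars_replaced:.1f} private cars' "
      "worth of driving time")
```

Even with generous allowances for rush hour, the asymmetry explains why planners expect "very different cities and highways".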
Despite the imminent arrival of driverless technology, there are plenty of gaps to be filled, not least the question of how a driverless car can hope to navigate busy streets, which have been designed around the nuances of human perception.
How will a driverless car tell that a group of people standing by a zebra crossing are just chatting rather than waiting to cross? Will it understand signs from traffic police beckoning cars on? And how will it respond to rude gestures from other motorists?
These are questions that car manufacturers are researching now. The Audi Aicon concept vehicle, for example, has lights at the front that can resemble eyes. These can follow pedestrians as they walk past, which is meant to provide reassurance that the car knows they are there and isn't about to run them over.
Little discussed in this vision of a driverless future is one influential group of people for whom autonomous cars could be vital. “There is an expectation that the early adopters will be young city-dwellers, but first across the line may be the elderly population who are beginning to lose the ability to drive,” says Marsh.
The economic benefits of bringing personal transport to the elderly and disabled could be enormous. Google released a video of one of its early driverless cars going to a drive-thru. The man behind the wheel had nothing to do but choose his food. For him, it was a revelation, because he was blind.
Fully autonomous cars will be some of the most advanced machines yet created when they arrive. But is there a danger that we will slip into the world of the film Wall-E, in which robots fulfil every need while we sit slumped in our cars? That’s not going to happen just yet. People will still be driving — for pleasure.
From cruise control to total control
Traditional cars and the five levels of autonomy
Level 0
Cars that are completely controlled by the driver, save basic functions such as cruise control.

Level 1
Features such as adaptive cruise control, which will automatically slow the car down to keep a safe distance from vehicles in front.
First seen 1997 (Toyota Celsior in Japan)
Available in cars such as Ford Fiesta, Vauxhall Astra, Peugeot 3008…

Level 2
Systems that can take over steering, accelerating and braking functions on some roads but must still be monitored by a driver.
First seen 2003 (Honda Inspire in Japan)
Available in cars such as BMW 5-series, Mercedes E-class and Volvo XC90…

Level 3
A big leap, because these vehicles don't need human supervision in certain conditions (initially on motorways at slower speeds). Drivers have to remain alert, as they may need to take back control with a few seconds' warning.

Level 4
More advanced technology that can drive the car on most roads in most conditions. If it starts to struggle, the car will pull over until the driver can take control.
Due 2020 (estimated)

Level 5
No need for steering wheel, accelerator or brakes. Or a human.
Due 2021 (estimated)
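The taxonomy above maps naturally onto code. The sketch below assumes the widely used SAE-style numbering implied by the table; the level names and the helper function are our own shorthand, not an industry API:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """The five levels of autonomy (plus level 0) from the table."""
    NONE        = 0  # driver does everything, bar basic cruise control
    ASSISTED    = 1  # adaptive cruise control keeps a safe distance
    PARTIAL     = 2  # steers, accelerates and brakes; driver monitors
    CONDITIONAL = 3  # no supervision in set conditions; driver on standby
    HIGH        = 4  # most roads, most conditions; pulls over if stuck
    FULL        = 5  # no wheel, no pedals, no human required

def driver_must_monitor(level: AutonomyLevel) -> bool:
    """Below level 3, a human must watch the road at all times."""
    return level < AutonomyLevel.CONDITIONAL

print(driver_must_monitor(AutonomyLevel.PARTIAL))      # True
print(driver_must_monitor(AutonomyLevel.CONDITIONAL))  # False
```

The hard boundary in the table, and in the code, sits between levels 2 and 3: everything below it assists a driver, everything above it replaces one.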
From The Future of Transport, in association with Audi.