It happened on a three-lane road on the West Coast of the United States, in California, near San Francisco, more precisely in Mountain View, close to Google's headquarters.
A Lexus travelling in the right lane encountered sandbags. To avoid them, it slowed down in order to move over to the left. Then, after letting several vehicles pass, it edged very slowly into the center lane.
The car assumed that the bus it was cutting in front of would yield to it. The bus driver probably assumed, conversely, that the sheer weight of his vehicle would deter the Lexus from pulling out. And bang: both vehicles ended up in the same place at the same time.
A textbook case
This is the story of an ordinary accident. But it made headlines around the world, and for good reason: the Lexus had no driver. It was guided by software developed by Google. The person behind the steering wheel, a test driver employed by the Internet giant, had orders to intervene only in case of danger. And for the first time, Google, which tests its prototypes on public roads and had already been involved in accidents, admitted responsibility. It is therefore a case study.
“This is a classic example of the negotiation that is an integral part of driving. We all try to anticipate each other's movements,” said Google.
But such human judgment calls are sometimes hard for a robot to grasp. The company immediately announced a change to its artificial-intelligence software to better weigh the mass of other vehicles when judging who has priority. But can it anticipate every situation? This accident reveals the changes that the driverless-car revolution will bring.
The end of the no-claims bonus for drivers
The driverless car means the end of a well-established principle: the driver's responsibility. Today, to ensure that accident victims are compensated, it is the person behind the wheel who ends up in court.
“Ninety percent of accidents are caused by driver error,” explains François Nedey, technical director for liability and insurance at Allianz France. “It is rare for a vehicle defect to be the cause.”
Since Mountain View, things are different. If the center of decision-making moves from the driver's brain to the car's computer, the same shift applies to responsibility before the law.
“It is hard to blame the driver for choosing the wrong vehicle or the wrong software,” notes François Nedey.
Under these conditions, the current no-claims bonus system, which rewards good drivers with a discount on their insurance and penalizes bad ones, will become less and less relevant.
A driving license for software
The other lesson from Mountain View concerns the software itself.
“Nobody today checks that the algorithms determining the vehicle's reactions comply with the law, the highway code, and the ethics of driving,” says François Nedey.
And it is clear that respecting the highway code is not enough to avoid accidents. You also need a code of conduct.
In the future, the authorities will likely set up a certification system for software. A driving license for robots! Insurers will demand it if they are to cover manufacturers. But a very complex problem will have to be solved.
To certify a vehicle, it must be tested in as many situations as possible. But for an autonomous car, the number of such situations is almost infinite. Impossible to anticipate everything!
The other solution is to validate the artificial-intelligence system itself. But this would require software publishers to agree to hand over their precious algorithms to others … and that those others have the expertise to understand and judge them.
No more licenses for drivers?
The other fundamental question posed by the Mountain View accident is the place of humans in the autonomous car. Should they be able to intervene at any time? In Europe today, the Vienna Convention requires the driver to keep both hands on the wheel, just in case. It will be amended to leave room for experimentation with autonomous cars. But before autonomous cars can be marketed, the whole body of rules will need rethinking.
In the meantime, Google is showing its good faith by assuming responsibility. What is the point of an autonomous car if you must stay alert at the wheel? What good is a car without a driver if its occupant is forbidden to phone, read, or smoke? Why test parallel parking if all cars park themselves? Eventually, the car will automatically slow down in town, always signal, and never honk. There will probably be no more need to learn the rules of the road, or even to know how to drive.
A difficult cohabitation
The driverless car for everyone is not coming right away, however.
“The day a vehicle can be considered truly autonomous,” says François Nedey, “it will know how to avoid a small roadworks site on the Place de l'Étoile in Paris.”
Some say 2050, others 2100. Sooner, Google certainly hopes. In the meantime, cars with and without drivers will have to coexist. And this phase looks delicate. As the Mountain View incident shows, it is hard for robots to anticipate drivers' reactions, and vice versa.
During this phase, apportioning responsibility will prove complex. François Nedey imagines a strict-liability regime, as exists in construction, where many players are involved. The driver would remain liable, but could turn against the manufacturer, which in turn would turn to its suppliers: parts makers, but also software companies like Google. To apportion blame, each vehicle would carry a black box, as in aircraft. These would collect data binding on all companies involved in an accident. This phase has, in fact, almost begun.
“Some manufacturers are asking us to cover their active-safety systems,” says the expert.
Since the spread of ABS, airbags, and cruise control, the industry has made great progress. Parking, keeping safe following distances, emergency braking and evasive maneuvers, and lane keeping are now automated on recent high-end models. When the car becomes fully autonomous on part or all of the road network, manufacturers will be fully covered.
With the proliferation of autonomous cars, another form of coexistence will have to be organized: between different software packages, with different algorithms. Faced with an obstacle, if vehicle A is programmed to swerve left, vehicle B to swerve right, and the two are heading toward each other … they will collide.
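The conflict can be made concrete in a few lines. Here is a minimal sketch, a hypothetical one-dimensional model and not any manufacturer's actual code: when two cars face each other, "left" for one is "right" for the other, so opposite avoidance policies resolve to the same physical lane.

```python
# Toy illustration of conflicting avoidance algorithms (hypothetical model).
# Lanes are numbered on a fixed axis; each car has a heading of +1 or -1.
# "Left" is defined relative to the car's own direction of travel.

def avoid(lane, policy, heading):
    """Return the lane a car picks to bypass an obstacle in its lane."""
    shift = -1 if policy == "left" else 1
    return lane + shift * heading

# Both cars approach the same obstacle in lane 1 from opposite directions.
lane_a = avoid(1, "left", heading=+1)   # vehicle A: programmed to swerve left
lane_b = avoid(1, "right", heading=-1)  # vehicle B: programmed to swerve right

print(lane_a, lane_b)            # both resolve to lane 0
print(lane_a == lane_b)          # True: same place, same time
```

The point is not the arithmetic but the coordination problem: without a shared convention between the two software packages, individually sensible policies produce a joint collision.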
Even if robots drive more safely than humans, they will still have to be made to respect rules, and those rules are sometimes hard to determine. Must the software choose between the life of the driver and that of a family of pedestrians?