Have you read any of the accident reports where a self-driving car resulted in a fatality? In every case so far the human would have avoided the accident.
You seem to be under the illusion that a zero-accident rate should be attained before deploying these.
That's never happened with any new tech before, so why should it happen now? So far they seem to be doing quite well.
The best-documented accident where the machine failed and the human driver would have done much better is the single-car crash in California where a self-driving car ran into a tree, killing the driver. The driver had documented and reported the problem on a stretch of road where the self-driving software continually failed to stay on the road.
I'm sure the software has been updated.
There are so many similarities between the Therac-25 and what's going on with self-driving cars.
Did you not read what happened?
With the Therac, when humans were in control of the machine there were no deaths.
But with the Therac-25 the machine was automated, humans no longer had control, and deaths occurred. Society and the courts found it unacceptable that a machine that was supposed to be saving people's lives wound up killing them.
Or perhaps you want something a bit more modern and car-related: Toyota had multiple bugs in their software.
And you do know about the five times computer systems falsely reported to their human operators either that the US had launched nuclear missiles against the Soviet Union/Russia, or that the US was under attack by the Soviet Union?
In one case it turned out to be the failure of a $0.49 chip. In another it was sunlight reflecting off clouds.
And I hope you know about the bug in the Boeing 787. Thankfully it was detected before a plane loaded with passengers fell out of the sky and crash-landed on a city. It was just a minor bug: every 248 days of continuous power (a counter overflowing at 2^31, as I recall), all of the computers would reboot. During the boot process all displays and flight controls would be inoperative. It didn't matter whether the plane was parked, in flight, taking off, or landing.
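The arithmetic behind that 248-day figure works out if the counter was a signed 32-bit integer ticking in hundredths of a second. A quick sketch (my own illustration of the overflow math, not Boeing's actual code):

```python
# Illustration only: a signed 32-bit counter incremented 100 times per
# second overflows after roughly 248 days of continuous operation.
SIGNED_32BIT_MAX = 2**31 - 1    # largest value a signed 32-bit int can hold
TICKS_PER_SECOND = 100          # counter ticks in hundredths of a second
SECONDS_PER_DAY = 86_400

seconds_until_overflow = SIGNED_32BIT_MAX / TICKS_PER_SECOND
days_until_overflow = seconds_until_overflow / SECONDS_PER_DAY

print(f"{days_until_overflow:.1f} days")  # prints "248.6 days"
```

Which is why the workaround at the time was simply to power-cycle the plane before the counter got there.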
And yes, like the Therac, the defense computers, Toyota's and Tesla's software, the Boeing software was in service endangering the public.
The question still remains with self-driving software: why is it that the software still mistakes a tree or parked cars for the road? This hasn't happened once, but multiple times. Even someone drunk or high on drugs wouldn't make that mistake as often as the self-driving car software does.
I think the latest statistics say a self-driving car would get into an accident every 21 days if there were no human in the car. Imagine if all 16 million cars in the US were self-driving: every 21 days there would be 16 million accidents. Car repair companies would love it.
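Put as per-day arithmetic, a back-of-the-envelope sketch using the figures above (the interval and fleet size are the claims under discussion, not independently verified data):

```python
# Back-of-the-envelope: if every car in the fleet has an accident once
# every 21 days, how many accidents per day does that imply?
fleet_size = 16_000_000          # cars, per the figure quoted above
days_between_accidents = 21      # per-car accident interval, no human aboard

accidents_per_day = fleet_size / days_between_accidents
print(f"{accidents_per_day:,.0f} accidents per day")  # prints "761,905 accidents per day"
```

That's hundreds of thousands of accidents every single day under those assumptions.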
Did you not learn the story of the Therac 25 in school?