The last hurdle is regulatory. We can’t let AV manufacturers use “there’s no driver” as a way to escape responsibility, externalizing the harms AVs cause onto society.
The question is how to achieve fairness. If a human driver commits vehicular manslaughter, the book gets thrown at them. What about an AV? A $10 million fine? Executives going to jail? What if a $10 million fine per X AV miles driven turns out to be an acceptable cost of doing business?
For example, every time a Waymo picks me up from my apartment, it blocks a full lane of traffic on an extremely busy street, rather than pulling into a much quieter side street that an Uber driver will always use. I suspect (though I can't prove it) that a lot of these low-level annoyances are invisible to someone looking only at aggregated crash statistics, ride times, etc.
I suspect the AI future might be better in many of the ways we can measure, but worse in the ways that aren't legible to statistics.
If the violations are intentional and easily fixable, then just pass laws/regulations requiring AVs to follow the rules or else cease operations entirely.
If the violations are unintentional but happen only rarely in weird edge-case situations, then just set low frequency thresholds below which they're allowed, the same way we allow tiny amounts of rodent hairs in peanut butter. If AV companies exceed the threshold, they get fined at first and eventually lose their permit -- but these aren't tickets for individual violations; it's a yearly fine for exceeding the yearly threshold.
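A minimal sketch of how such a scheme might work, assuming entirely hypothetical numbers (the allowed rate, the strike count, and the function name are all made up for illustration):

```python
# Sketch of the yearly-threshold idea: violations aren't ticketed
# individually. They're tallied per year, normalized by miles driven,
# and compared against an allowed rate. Repeated breaches cost the
# permit. All thresholds below are illustrative, not real regulation.

def yearly_assessment(violations: int, miles: float,
                      prior_breaches: int = 0,
                      allowed_per_million: float = 5.0,
                      strikes: int = 3) -> str:
    """Return the regulator's action for one company-year (hypothetical)."""
    rate = violations / (miles / 1_000_000)  # violations per million miles
    if rate <= allowed_per_million:
        return "compliant"
    if prior_breaches + 1 >= strikes:
        return "permit revoked"
    # Exceeding the threshold draws one yearly fine, not per-incident tickets.
    return "yearly fine"
```

Normalizing by miles driven matters here: a company driving ten times as many miles would otherwise hit any fixed yearly count ten times sooner.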
If the violations are intentional but not easily fixable -- e.g. the AV is stopping where it's not allowed because there's no legal place to stop within 15 blocks -- then the laws/regulations are bad, and tickets are essentially an unfair tax. That's the case in my city, where moving trucks are essentially illegal: it's illegal to double-park them, but there's usually no legal parking available within any distance that movers could reasonably carry furniture. So you just know that the cost of moving includes a "tax" of a parking ticket, unfair as it is.
Finally, if the violations are unintentional but happen all the time, the AV company should lose its permit because its software sucks.
I don't see how ticketing AVs for individual violations makes any sense.
EDIT: for those who think I'm letting AV companies get off too easily, it's precisely the opposite. I'm saying that if AV companies are violating traffic rules all the time and can't fix it, they should be banned. Ticketing is not the answer, because ticketing isn't holding these vehicles to a high enough standard. It's letting the companies get off the hook by merely paying occasional tickets instead of improving their software.
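The four cases above can be summarized as a decision function (a sketch of my argument, with category names of my own choosing):

```python
# Sketch of the four-case argument as a decision rule. Dimensions:
# intentional vs. not, easily fixable vs. not, frequent vs. rare.
# Note there is no branch that returns "ticket the individual violation".

def regulatory_response(intentional: bool, easily_fixable: bool,
                        frequent: bool) -> str:
    if intentional and easily_fixable:
        return "require compliance or cease operations"
    if intentional:  # intentional but not easily fixable
        return "fix the bad law; tickets are an unfair tax"
    if frequent:     # unintentional and constant
        return "revoke permit: the software isn't good enough"
    return "allow under a yearly threshold; fine if exceeded"
```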
So I do wonder what happens in the future when roads and cars are all automated and city funding from this channel dries up.
Waymos currently drop off and pick up passengers in the bike lane, which is not legal (because it is dangerous), though many ride-share drivers do this too. As somebody who is commonly a biker/pedestrian, I am excited that AVs will likely make many things safer for that class of user. That said, I do worry about how we encode these "social understandings" of laws:

- A Waymo I rode in on a highway was happy to go slightly above the speed limit.
- At stop signs, Waymo seems to prefer being slightly aggressive to make it through rather than following the letter of the law.
It seems silly that we sometimes have to teach robots to break certain laws, but parking in bike lanes and failing to yield to pedestrians are violations that human drivers commit all the time, and I hope the mechanisms mentioned in the article prevent us from programming robots with anti-social but common behavior.
https://futurism.com/future-society/waymo-bike-lanes-traffic
Archive link in case of random paywalling like I got: https://archive.ph/xHMDO
1) If these companies get enough points on their license, their license is revoked. Not just for that vehicle, but for all of their vehicles. (The number of points would need to be adjusted for the number of miles driven.)
2) Senior executives could be held criminally liable for vehicular manslaughter the way normal drivers are. A death doesn't mean someone is going to prison, but there would be a police investigation. If an exec decided to ship a product with a known bug that led to someone's death, it should be treated with the same seriousness as a drunk driver killing someone.
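Point 1) could be made concrete along these lines (a sketch only; the allowed rate and function name are invented for illustration):

```python
# Hypothetical fleet-wide "points on the license" model: points accrue
# across ALL of a company's vehicles and are normalized per million
# miles driven, so large fleets aren't penalized merely for driving
# more. The default allowed rate is an illustrative number.

def fleet_license_revoked(total_points: int, fleet_miles: float,
                          allowed_per_million: float = 2.0) -> bool:
    """True if the company-wide points rate exceeds the allowed rate."""
    points_per_million = total_points / (fleet_miles / 1_000_000)
    return points_per_million > allowed_per_million
```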
We'll just run someone over with our "driverless" car and pay a fine - capitalism, baby!
> the car is basically a taxi and the taxi service is to blame for any mistakes
@skybrian - Agreed! but if you read the article, the CA DMV is ticketing the manufacturer, not the operator.
None of my concerns hold if the operator were ticketed; in fact, existing regulations are set up exactly that way, so no new regulation was even necessary. Something's not adding up.
> Right now, no one can independently own and operate an AV the way Waymo or Tesla does
@ourspacetabs - Sure but the regulation seems to be specifically addressed at the manufacturer, not the operator.
I would have no concern if the regulation were addressed to the operator. The article, at least, doesn't imply that's the case.
---
> The state's Department of Motor Vehicles (DMV) has announced new regulations on autonomous vehicles (AVs), including a process for police to issue a "notice of AV noncompliance" directly to the car's manufacturer.
> Under the new rules, police can cite AV companies when their vehicles commit moving violations. The rules will also require the companies to respond to calls from police and other emergency officials within 30 seconds, and will issue penalties if their vehicles enter active emergency zones.
These are new frontiers in automotive regulation. Typically, if a car failed because of a manufacturer issue, the driver would be ticketed. For example: if Hyundai sold vehicles whose engines exploded around 50k miles and that caused an accident, the driver of the vehicle would be ticketed for it.
Now if we take the human out of it, it is Hyundai that would be ticketed for it. Insurance companies are certainly going to take notice and adjust their risk models accordingly.
I imagine there will be a lot of finger-pointing by manufacturers toward customers.
In the worst case, this is the end of customers servicing their own autonomous vehicles.
If we imagine that most vehicles in the next 15 years will be autonomous, this would mean that customers who service their own autonomous vehicles would have to comply with regulation aimed at multibillion-dollar companies, or else give up on servicing their own vehicles entirely and just rent them instead.