Tesla's "self driving" software disables itself just before crashes to sneakily avoid liability https://fortune.com/2022/06/10/elon-musk-tesla-nhtsa-investigation-traffic-safety-autonomous-fsd-fatal-probe/
@yogthos Surprised that doesn't create additional legal liability, if intentionally disabling autopilot in that final second invariably results in reduced braking and increased velocity on impact.
@yogthos New tech leads to new laws needed to keep up with unprecedented situations.
Here’s hoping countries get on shit like this fast, because fuck a company that makes grandiose promises about the safety of its product while designing it not to ensure that safety, but to ensure the company isn’t liable for the accidents the product can lead people into.
@yogthos The autopilot suddenly turning itself off seems like it could be the actual cause of those crashes. The car is switching behaviour without warning.
@yogthos That's like a driver jumping out of a vehicle at the last second before a crash and then claiming that he can't have liability for the accident because he wasn't in control of the vehicle at the time of the crash.
I'm pretty sure that's not how liability works.
But typically, federal agencies like NHTSA don't publicly assert something like this unless they feel they have significant evidence and a significant public safety issue. Otherwise they normally handle this kind of thing privately with the company; NHTSA would far rather Tesla issue a voluntary recall than force them to.
@yogthos it *could* activate the brakes
but it cares more about corporate liability than about people's lives
@yogthos From a technical point of view, I would expect the autopilot to shut down if it finds itself in an "I don't know what to do anymore" situation. It would be worse if it continued in any way after realizing it isn't capable of resolving the situation. From a legal point of view, this is bullshit.
There's a "me handling my responsibilities" joke somewhere in here.
That's the only funny thing about this, though. This is some dystopian-level crap.
@yogthos still wrong.
Autopilot is NEVER liable.
And Tesla counts every crash as "on autopilot" when it was active anytime in the 5 seconds before impact.
Funny thing is: even being rear-ended counts as an AP crash.
Still 3 times safer than a human 🤷♀️
So please stop sharing those lies. There is enough to criticize without blatantly misrepresenting the facts. 😕