@yogthos surprised that doesn't create additional legal liability if intentionally disabling autopilot during that 1 second invariably results in reduced braking and increased velocity on impact.

@hobson @yogthos if there isn't any law to cover such a situation, it might very well be legal (ianal and so on, of course).

I'm just thinking that since very few legislators know the first thing about physics, I'm not sure it's going to be easy to even explain why this is a problem.

@timjan @yogthos There are generic laws about engineering a device that intentionally causes harm. Expert witnesses are used to explain the physics. Source code will show intent.

@yogthos New tech leads to new laws needed to keep up with unprecedented situations.

Here’s hoping countries get on shit like this fast, because fuck a company making grandiose promises about the safety of their product while designing it not to ensure that safety, but to ensure the company isn’t liable for the accidents the product can lead people into.

@yogthos I'm literally laughing out loud because it's my remaining coping mechanism.

@yogthos the autopilot suddenly turning itself off seems like it would be the actual cause of those crashes. The car is switching behaviour without warning.

@yogthos The capitalist version of, "Stop hitting yourself, stop hitting yourself!"

@yogthos musk is gradually becoming a cartoon villain.

@yogthos That's like a driver jumping out of a vehicle at the last second before a crash and then claiming that he can't have liability for the accident because he wasn't in control of the vehicle at the time of the crash.

I'm pretty sure that's not how liability works.

@1dalm @yogthos

Insurance companies have a lot of lawyers, and judges tend to just rubber-stamp what the insurance companies assert. It will be interesting to see who wins this battle of the titans, should this ever come to court.

@dfloyd888 @yogthos It is a battle of Titans: Tesla vs. the US Federal Government.

But typically, Federal agencies like NHTSA don't publicly assert something like this unless they feel they have significant evidence and this is a significant public safety issue. Otherwise they normally deal with this kind of thing privately through the company. NHTSA would far rather Tesla issue a voluntary recall than force them to do so.

@yogthos it *could* activate the brakes

but it cares more about corporate liability than about people's lives

@yogthos From a technical point of view, I would expect that the autopilot shuts down if it finds itself in a "I don't know what to do anymore"-situation. It would be worse if it continues in any way, after realizing it isn't capable of resolving a situation. From a legal point of view this is bullshit.

@yogthos Wow Musk is so devout, he even programmed in a Jesus Take the Wheel mode!

@clacke @yogthos If I had a dollar for every time Jesus stole my steering wheel, I'd have a harder time getting into heaven than a camel trying to fit through a needle.

@yogthos
There's a "me handling my responsibilities" joke somewhere in here.

That's the only funny thing about this though. This is some dystopian-level crap.

@yogthos still wrong.
Autopilot is NEVER liable.
And Tesla counts every crash as "on autopilot" when it was active at any time in the 5 seconds before impact.
Funny thing is: even being rear-ended counts as an AP crash.
Still 3 times safer than a human 🤷‍♀️

So please stop sharing those lies. There is enough to criticize without blatantly misrepresenting facts. 😕

@yogthos https://www.youtube.com/watch?v=eu2ewxZ9IbY came across this Dutch insurance company advertisement, and had to think of this post. ;)