So it turns out Teslas on Autopilot will disengage Autopilot when a crash is imminent, so the statistics can show "Autopilot was not engaged at the time of the accident."
this company, man.
https://electrek.co/2025/03/17/tesla-fans-exposes-shadiness-defend-autopilot-crash/
@kyonshi According to reporting from The Verge, this is a bit more nuanced. Autopilot will disengage; that part is true. But if a crash happens within 5 seconds after Autopilot disengages, it is still always counted as an Autopilot crash. This was a scandal years ago, and what followed was a lengthy NHTSA investigation, which did not find anything relevant with regard to this behavior. Still, fuck Tesla. It’s a terrible company and no one should buy one.
Sources
https://www.theverge.com/tesla/631308/mark-rober-tesla-youtube-autopilot-lidar-fake-claims
@kyonshi Well, that's what airliners do: if the computers get really upset with the state of the world, they turn the autopilot off.
@TimWardCam I recall someone explaining that as the computer saying "I've done what I can, you fix this!".
The difference is that in an airplane, things have to get *really* bad for the pilots not to have a reasonable amount of time (up to several minutes) to sort things out. And even if the autopilot disconnects, you've got plenty of other systems to help you decide what to do and which way to point the airplane. It doesn't work out every time, but it's light-years better than in a car.
@mkj @kyonshi Oh yes, better than a car: in an aircraft you're pretty well never a fraction of a second and a few feet away from crashing into a vehicle coming the other way at a closing speed over 100mph, whereas in a car it happens all the time.
Most times when something goes wrong in an aircraft [give or take EFATO and seriously fucked up approaches] you've got seconds and seconds to think about how to respond. (Fixed wing, that is, choppers are, I'm told, different (never flown one myself).)
@kyonshi … blame the victim; nothing could be more endemic to capitalism.
@kyonshi We've known that for years (see e.g. https://futurism.com/tesla-nhtsa-autopilot-report ) - has something new been found out?
@kyonshi Oh here, from the NHTSA in June 2022:
"On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact"
@kyonshi I mean, that's literally the inherent flaw of partial-autonomy driving.
Tesla - despicable as they are, as he is - are just the first to exploit it.
Nobody with an autonomous car will always be ready to take over in an unexpected emergency.
Is one of my major gripes with self-driving cars, too, @larsmb.
People compare self-driving cars to autopilots in planes, but there is a multitude of relevant differences. Not least recurring training and regularly practicing emergency scenarios.
In a self-driving car, even if the driver has the *time* to take over, they are going to be out of practice in a complex, possibly highly dynamic situation (or the self-drive wouldn't disconnect). That's unlikely to make things better.
@kyonshi I re-watched Mark Rober's video; you can just about see Autopilot disengage right before hitting the mannequin in the fog test too.
@kyonshi this has been known for years, and no one has done anything about it.
@kyonshi This is what I was wondering about regarding liability and onus in the event of a collision involving a car with automation engaged.
The rhetoric often being peddled is that a machine can drive better than a human can.
@kyonshi This reminds me of that family who died when their Model X veered into the gap between diverging highway lanes. Their deaths were largely due to a poorly maintained guardrail, but autopilot arranged the meeting.
Anyway, Tesla's defense was that the log showed the driver wasn't attentive for up to 7 seconds before the crash. Tesla's attention-monitoring system at the time consisted of measuring turning force on the steering wheel. So I guess the driver wasn't tugging on the steering wheel for those 7 seconds. Cool, Tesla.
@kyonshi
Ugh, I wanted to read this, but it quickly turned into a play-by-play of the author's Twitter feed, and I seriously cannot give any shits about that.
@kyonshi shit people shit company
@kyonshi Wouldn't it be better to hit the brakes or something 'when a crash is imminent'?
@kyonshi and now you know why #SelfDrivingCars are illegal in #Germany!
@kyonshi This reminds me of working on an organic-chicken ad campaign and learning that the US poultry label "antibiotic free" means only that antibiotics were not given prior to slaughter.