Analysis | Testing Tesla’s Autopilot recall, I don’t feel much safer — and neither should you (www.washingtonpost.com)
from L4s@lemmy.world to technology@lemmy.world on 31 Dec 2023 22:00
https://lemmy.world/post/10175407

#technology

autotldr@lemmings.world on 31 Dec 2023 22:05

This is the best summary I could come up with:


The recall was supposed to force drivers to pay more attention while using Autopilot by sensing hands on the steering wheel and checking for eyes on the road.

The underlying issue is that while a government investigation prompted the recall, Tesla got to drive what went into the software update — and it appears not to want to alienate some customers by imposing new limits on its tech.

My Washington Post colleagues found that at least eight fatal or serious crashes have involved Tesla drivers using Autopilot on roads where the software was not intended to be used, such as streets with cross traffic.

In fine print and user manuals most drivers probably haven’t pored over, Tesla says that Autosteer “is designed for use on highways that have a center divider, clear lane markings, and no cross-traffic.” It adds: “Please use it only if you will pay attention to the road, keep your hands on the steering wheel, and be prepared to take over at any time.”

Looking at my own before and after photos, I can see these newer messages — which often ask you to apply slight force to the wheel — have larger type, include an icon and now show up in the upper third of the screen.

Tesla’s recall release notes also suggest the warnings will come more often, saying there is increased “strictness” of driver attentiveness requirements when Autosteer is active and the car is approaching “traffic lights and stop signs off-highway.”


The original article contains 1,964 words, the summary contains 247 words. Saved 87%. I’m a bot and I’m open source!

farcaster@lemmy.world on 01 Jan 2024 00:43

As a previously mostly-content Tesla owner, these Autopilot updates are a big step back in safety. The “checking for eyes on the road” feature is now so aggressive it will loudly beep at me and flash warnings anytime I even look at the touchscreen to change the temperature while autopilot is engaged. On a straight freeway. Then what’s the goddamn use of having a single touchscreen control everything?

Honestly, Autopilot/Autosteer was fine. For years. It’s a driver aid which, just like regular old dumb cruise control, makes driving a little less tiring but still obviously requires the user to pay attention. It’s just irresponsible drivers and Tesla’s own “it’s self driving” advertising ruining it for everyone.

abhibeckert@lemmy.world on 01 Jan 2024 01:07

It’s just irresponsible drivers and Tesla’s own “it’s self driving” advertising ruining it for everyone.

Maybe if they marketed it as something less than “full self driving” people would be more responsible.

farcaster@lemmy.world on 01 Jan 2024 01:15

Yeah. Well, I think they do, technically. Autopilot is just adaptive cruise control and lane keeping, features increasingly seen in many other vehicles as well, and it’s totally separate from the “full self driving” feature. But their confusing messaging over the years (in particular from one highly erratic source…) seems to have convinced some people that all Tesla vehicles are self-driving miracle cars, which in turn I suspect has led them to use Autosteer everywhere, all the time, without paying attention, with predictable consequences…

I never thought too much about it because Autopilot in my Model 3 was fine when used normally, but now, due to all this, it’s getting quite annoying…