Tesla Full Self-Driving Veers Off Road, Hits Tree, and Flips Car for No Obvious Reason (No Serious Injuries, but Scary) (fuelarc.com)
from KayLeadfoot@fedia.io to technology@lemmy.world on 23 May 00:02
https://fedia.io/m/technology@lemmy.world/t/2204897

A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.

I just don't see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

#autonomy #avs #fsd #selfdriving #technology #tesla


yoshisaur@lemm.ee on 23 May 00:06 next collapse

Yikes. Glad they’re ok

KayLeadfoot@fedia.io on 23 May 00:14 collapse

Ditto! They were about 1 foot from hitting the tree head-on rather than glancing off; it could have easily been fatal. Weirdly small axes of random chance that the world spins on

Corkyskog@sh.itjust.works on 23 May 00:50 collapse

I still don’t understand what made it happen. I kept watching shadows and expecting it to happen earlier.

KayLeadfoot@fedia.io on 23 May 00:55 next collapse

It makes no damn sense! There were worse shadows; it was totally unpredictable

IllNess@infosec.pub on 23 May 02:08 next collapse

I thought it might be following the tire tracks but no. It just decided to veer completely off.

Phen@lemmy.eco.br on 23 May 04:03 next collapse

There’s some difference in the fences on the left side at the exact moment the other car passed by in the other lane. My guess is that the timing of the other car made the software interpret those changes in the input as something moving instead of simply something being different.
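A toy sketch of that hypothesis, assuming nothing about Tesla’s actual stack (which is a learned network, not frame differencing; all values here are invented): to a naive frame-difference view of a camera feed, “the scene changed between frames” is indistinguishable from “something moved”.

```python
import numpy as np

# Two consecutive "camera frames" as tiny grayscale arrays (invented values).
# Nothing moves between them; the scene content itself changes, like a
# different stretch of fence coming into view as another car passes.
frame_a = np.zeros((4, 8))
frame_b = np.zeros((4, 8))
frame_b[:, 5] = 1.0  # a fence slat that is simply different in the next frame

# A naive motion cue: per-pixel change between frames.
changed = np.abs(frame_b - frame_a) > 0.5
print(changed.any())  # True -- reads the same as an object entering the road
```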

echodot@feddit.uk on 23 May 05:40 collapse

They seriously need to pull FSD. If it were just a matter of people risking their own lives I wouldn’t mind, but they’re risking everyone else’s by driving this glitch machine around.

sidtirouluca@lemm.ee on 23 May 00:21 next collapse

Self-driving is the future, but I’m glad I’m not a beta tester.

KayLeadfoot@fedia.io on 23 May 00:49 next collapse

You're probably right about the future, but like damn, I wish they would slow their roll and use LiDAR

FaceDeer@fedia.io on 23 May 02:00 collapse

Elon Musk decided they absolutely would not use lidar, years ago when lidar was expensive enough that a decision like that made economic sense to at least try making work. Nowadays lidar is a lot cheaper but for whatever reason Musk has drawn a line in the sand and refuses to back down on it.

Unlike many people online these days I don't believe that Musk is some kind of sheer-luck bought-his-way-into-success grifter, he has been genuinely involved in many of the decisions that made his companies grow. But this is one of the downsides of that (Cybertruck is another). He's forced through ideas that turned out to be amazing, but he's also forced through ideas that sucked. He seems to be increasingly having trouble distinguishing them.

tyler@programming.dev on 23 May 02:52 next collapse

They removed their lidar sensors after the prices had already come down.

LadyAutumn@lemmy.blahaj.zone on 23 May 03:48 next collapse

He really hasn’t. He purchased companies that were already sitting on profitable ideas. He is not an engineer. He is not a scientist. He has no training in any design discipline. He takes credit for the ideas of people he pays. He takes credit for the previous achievements of companies he’s purchased.

What is it going to fucking take for people to finally actually see the grifter for what he is? He’s never had a single good fucking R&D idea in his life 🙃 He has wasted billions of dollars researching and developing absolutely useless ideas that have benefited literally no one and have not made him any money. It is absolutely incredible how powerful his mythos is, that people still believe him to be, or to have been, some kind of engineer or something. He’s a fucking racist nepo baby. He’s never done a single useful thing in his life. He wasn’t the sole individual involved in creating PayPal (and was entirely uninvolved in turning it into the successful business it became), he didn’t found Tesla, nor is he responsible for any of the technological developments it made (except for forcing through his shitty charger design that notoriously breaks down and charges at half the speed competitors do), he did not found SpaceX, and by all accounts he has been loathed by everyone at the company for the past decade for continuously committing workers’ rights violations and fostering a racist, sexist, and ableist work environment. The man has done nothing but waste people’s time stoking his ego and sexually abusing a slew of employees for the past two and a half decades.

echodot@feddit.uk on 23 May 05:35 next collapse

He’s forced through ideas that turned out to be amazing, but he’s also forced through ideas that sucked.

He’s utterly incapable of admitting that one of his ideas is garbage.

There is a reason he fawns all over Trump and that’s because both of them are of a type. Both of them have egos large enough to have their own gravitational fields but lack any real talent. Look his family up, they’re all like that.

Buffalox@lemmy.world on 23 May 19:00 collapse

Musk has drawn a line in the sand and refuses to back down on it.

From what I heard, the upcoming Tesla robotaxi test cars based on the Model Y are supposed to have LIDAR. But it’s ONLY the robotaxi version that has it.

He seems to be increasingly having trouble distinguishing them.

Absolutely, seems to me he has been delusional for years, and it’s getting worse.

madcaesar@lemmy.world on 24 May 12:04 collapse

Self driving via cameras IS NOT THE FUTURE!! Cameras are basically slightly better human eyes and human eyes suck ass.

sidtirouluca@lemm.ee on 24 May 12:19 collapse

i agree

postnataldrip@lemmy.world on 23 May 00:26 next collapse

“It crashed!”

“Yes but it did it all by itself!”

kambusha@sh.itjust.works on 23 May 06:08 collapse

Except for the last 0.05 seconds before the crash where the human was put in control. Therefore, the human caused the crash.

RandomStickman@fedia.io on 23 May 01:33 next collapse

Anything outside of freshly painted and paved LA roads at high noon while it’s sunny isn’t ready for self-driving, it seems

Bonesince1997@lemmy.world on 23 May 01:47 next collapse

Or silly tunnels you can’t get out of.

Zwuzelmaus@feddit.org on 23 May 04:04 collapse

Tunnels are extra dangerous. Not because of the likelihood of an accident, but because of the situation if an accident happens: it easily blocks the tunnel, fills it with smoke, and kills hundreds.

Except newly built tunnels in rich countries.

SpaceNoodle@lemmy.world on 23 May 02:48 collapse

Actual self-driving vehicles, sure. Just not whatever the fuck Tesla is doing.

Kusimulkku@lemm.ee on 23 May 08:12 collapse

I’m not sure about even the more advanced self-driving cars. Shit gets fucked with snow and all kinds of other stuff.

Flummoxes many human drivers too tbh.

SpaceNoodle@lemmy.world on 23 May 14:16 collapse

I’m confident that they all still need lots of work for advanced weather, but you’re not seeing a Waymo or a Zoox drive into a tree for no reason.

harrys_balzac@lemmy.dbzer0.com on 23 May 01:43 next collapse

Full Self-Destruct

orca@orcas.enjoying.yachts on 23 May 01:56 next collapse

The worst part is that this problem has already been solved by using LIDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.

Texas_Hangover@sh.itjust.works on 23 May 09:48 next collapse

Are those the ones that you can completely immobilize with a traffic cone?

SoftestSapphic@lemmy.world on 23 May 09:57 next collapse

Yes lol

KayLeadfoot@fedia.io on 23 May 09:59 next collapse

Probably Zoox, but conceptually similar, LiDAR backed.

You can immobilize them by setting anything large on them. Your purse, a traffic cone, a person :)

Probably makes sense to be a little cautious with the gas pedal when there is anything on top of the vehicle.

SynopsisTantilize@lemm.ee on 23 May 10:37 collapse

That and if you put your toddler on the roof or trunk of the car for a quick second to grab something from your pocket…VROooOMMM, baby gone.

jamesjams@lemmy.world on 26 May 04:32 collapse

Do I need to pay extra for that feature?

SynopsisTantilize@lemm.ee on 26 May 12:55 collapse

Nope! That one’s free.

Chocobofangirl@lemmy.world on 23 May 15:42 next collapse

You say that like it’s a bad thing lol. If it kept going, that cone would fly off and hit somebody.

ayyy@sh.itjust.works on 23 May 20:07 next collapse

The same is true when you put a cone in front of a human driver’s vision. I don’t understand why “haha I blocked the vision of a driver and they stopped driving” is a gotcha.

TheGrandNagus@lemmy.world on 24 May 07:25 collapse

A human also (hopefully anyway) wouldn’t drive if you put a cone over their head.

Like yeah, if you purposely block the car’s vision, it should refuse to drive.

NotMyOldRedditName@lemmy.world on 23 May 18:50 next collapse

I wouldn’t really called it a solved problem when waymo with lidar is crashing into physical objects

www.msn.com/en-us/autos/news/…/ar-AA1EMVTF

NHTSA stated that the crashes “involved collisions with clearly visible objects that a competent driver would be expected to avoid.” The agency is continuing its investigation.

It’d probably be better to say that lidar is the path to solving these problems, or a tool that can help solve them. But not solved.

Just because you see a car working perfectly, doesn’t mean it always is working perfectly.

M137@lemmy.world on 24 May 14:57 collapse

call*

FreedomAdvocate@lemmy.net.au on 24 May 08:03 collapse

Lidar doesn’t completely solve the issue lol. Lidar can’t see line markings, speed signs, pedestrian crossings, etc. Cars equipped with lidar crash into things too.

orca@orcas.enjoying.yachts on 24 May 11:32 collapse

I oversold it in my original comment, but it still performs better than using regular cameras like Tesla did. It performs better in weather and other scenarios than standard cameras. Elon is dumb though and doesn’t think LiDAR is needed for self-driving.

FreedomAdvocate@lemmy.net.au on 24 May 14:31 collapse

Let me guess……you watched Mark Rober’s video? Lol

orca@orcas.enjoying.yachts on 24 May 18:03 collapse

I’ve watched a few random ones over the years. No idea who he is.

otacon239@lemmy.world on 23 May 02:11 next collapse

I fear the day I’m on the receiving end of a “glitch.” It’s ridiculous that anyone can think these are safe after how many of these videos I’ve seen.

beejjorgensen@lemmy.sdf.org on 23 May 04:21 collapse

As a motorcyclist… Yeah.

xenomor@lemmy.world on 23 May 03:14 next collapse

There’s an obvious reason. It’s a fucking Tesla.

LadyAutumn@lemmy.blahaj.zone on 23 May 03:41 next collapse

I am never getting into a self driving car. I don’t understand why we are investing money into this technology when people can already drive cars on their own, and we should be moving towards robust public transportation systems anyway. A waste of time and resources to… what exactly? Stare at your phone for a few extra minutes a day? Work from home and every city having robust electric transit systems is what the future is supposed to be.

underline960@sh.itjust.works on 23 May 03:56 next collapse

Back when I still believed, I was excited because I wanted to get in my car and take a 90-minute nap until I arrived at work.

With public transportation, you can only be half-asleep or you’ll miss your stop.

beejjorgensen@lemmy.sdf.org on 23 May 04:20 next collapse

I used to dream of watching a movie then falling asleep in bed while my car drove the 8 hours to my folks’ house.

But I’d want that beast to be bristling with sensors of every kind. None of this “cameras only” idiocy.

Someday. Maybe.

cestvrai@lemm.ee on 23 May 04:46 collapse

I have a 45 minute high speed train commute to a busy end-of-line station. I can sleep, read, work, or just stare out the window and think.

Same commute is probably twice as long by car during rush hour.

echodot@feddit.uk on 23 May 05:06 collapse

I wish I lived in a place that took rail infrastructure seriously. But all our trains appear to be built out of sheet iron and about four nails, and all movement is accompanied by eeeeeeeeeccccchhhhhhhheeeeeekkkkkkkscccreeeeeeeekkkkeeeeek

slaneesh_is_right@lemmy.org on 23 May 09:39 next collapse

I’m not a fan of self driving cars, but saying that people are able to drive cars is a stretch.

LadyAutumn@lemmy.blahaj.zone on 23 May 12:58 collapse

In general I am opposed to machines being in direct control of weapons. I am also definitely of the opinion that there are lots of people who shouldn’t be driving.

FreedomAdvocate@lemmy.net.au on 24 May 08:11 collapse

People crash cars far, far, far more than Tesla FSD crashes “per capita”. People are terrible drivers on average.

venusaur@lemmy.world on 23 May 04:22 next collapse

It was in stunt mode

Skyrmir@lemmy.world on 23 May 04:28 next collapse

I use autopilot all the time on my boat. No way in hell I’d trust it in a car. They all occasionally get suicidal. Mine likes to lull you into a sense of false security, then take a sharp turn into a channel marker or cargo ship at the last second.

echodot@feddit.uk on 23 May 05:04 next collapse

Isn’t there a plane whose autopilot famously keeps trying to crash into the ground? The general advice is to just not let it do that: whenever it looks like it’s about to crash into the ground, pull up instead.

kameecoding@lemmy.world on 23 May 07:20 next collapse

The Boeing 787 Max did that when the sensor got faulty and there was no redundancy for the sensors, because that was in an optional add-on package

mbtrhcs@feddit.org on 23 May 13:13 collapse

Even worse, the pilots and the airlines didn’t even know the sensor or associated software control existed.

Skyrmir@lemmy.world on 23 May 11:42 next collapse

Pretty sure that’s the Boeing 777 and they discovered that after a crash off Brazil.

GamingChairModel@lemmy.world on 23 May 13:11 collapse

All the other answers here are wrong. It was the Boeing 737-Max.

They fit bigger, more fuel efficient engines on it that changed the flight characteristics, compared to previous 737s. And so rather than have pilots recertify on this as a new model (lots of flight hours, can’t switch back), they designed software to basically make the aircraft seem to behave like the old model.

And so a bug in the cheaper version of the software, combined with a faulty sensor, would cause the software to take over and try to override the pilots and dive downward instead of pulling up. Two crashes happened within 5 months, to aircraft that were pretty much brand new.

It was grounded for a while as Boeing fixed the software and hardware issues, and, more importantly, updated all the training and reference materials for pilots so that they were aware of this basically secret setting that could kill everyone.
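For what it’s worth, the missing safeguard is conceptually tiny. A minimal sketch of the sensor cross-checking idea, not Boeing’s actual MCAS logic (the threshold and readings are invented for illustration):

```python
# Cross-check two angle-of-attack sensors before letting software act on them.
# Threshold and readings are invented; this is not Boeing's actual logic.
def crosschecked_aoa(sensor_a: float, sensor_b: float,
                     max_disagreement_deg: float = 5.0):
    """Return a usable AoA reading, or None if the sensors disagree."""
    if abs(sensor_a - sensor_b) > max_disagreement_deg:
        return None  # distrust both readings; hand control back to the pilots
    return (sensor_a + sensor_b) / 2.0

print(crosschecked_aoa(4.0, 4.5))   # 4.25 -> plausible, safe to use
print(crosschecked_aoa(4.0, 74.0))  # None -> the single-faulty-sensor case
```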

dependencyinjection@discuss.tchncs.de on 23 May 05:38 next collapse

Exactly. My car doesn’t have AP, but it does have a shedload of sensors, and sometimes it just freaks out about stuff being too close to the car for no discernible reason. Really freaks me out; I’m like, what do you see, bro? We’re just driving down the motorway.

ayyy@sh.itjust.works on 23 May 20:10 collapse

For mine, it’s the radar seeing the retro-reflective stripes on utility poles being brighter than it expects.

SynopsisTantilize@lemm.ee on 23 May 10:36 collapse

They have autopilot on boats? I never even thought about that existing. Makes sense, just never heard of it until now!

AnUnusualRelic@lemmy.world on 23 May 11:55 next collapse

They’ve had it forever. Tie a rope to the wheel. Presto. Autopilot.

KayLeadfoot@fedia.io on 23 May 13:41 next collapse

I'll point this post out to Wall Street Bets, Maersk stock will pop 10%+ overnight.

Akasazh@feddit.nl on 23 May 15:07 collapse

That’s not how boats (outside of Hollywood) work, tho

JohnEdwa@sopuli.xyz on 23 May 16:01 collapse

They’ve technically had autopilots for over a century; the first was the oil tanker J.A. Moffett in 1920. Though the main purpose is to keep the vessel going dead straight, as wind and currents would otherwise turn it, so using modern car terms I think it would be more accurate to say they have lane assist? Commercial ones can often do waypoint navigation, following a set route on a map, but I don’t think that’s very common on personal vessels.
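For the curious, the core of such a heading hold is just a feedback loop on compass error. A minimal sketch, with gains, rudder limits, and the wrap helper as illustrative values rather than anything from a real autopilot:

```python
# Minimal heading-hold loop: steer the rudder to cancel heading error.
# Gains and the rudder limit are illustrative values, not from a real unit.
def wrap_deg(angle: float) -> float:
    """Wrap an angle difference into [-180, 180) degrees."""
    return (angle + 180.0) % 360.0 - 180.0

class HeadingHold:
    def __init__(self, kp=1.0, ki=0.02, kd=4.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def rudder(self, target_hdg: float, actual_hdg: float, dt: float) -> float:
        error = wrap_deg(target_hdg - actual_hdg)
        self.integral += error * dt                  # fights steady wind/current
        derivative = (error - self.prev_error) / dt  # damps overshoot
        self.prev_error = error
        cmd = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-30.0, min(30.0, cmd))            # clamp to rudder throw
```

The integral term is what counters a constant crosscurrent; without it the boat settles slightly off the set heading, which is exactly the drift described above.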

SynopsisTantilize@lemm.ee on 23 May 21:56 collapse

That’s similar to commercial airliners right?

Sterile_Technique@lemmy.world on 23 May 04:45 next collapse

I mean, if Elon was my dad, I’d probably have some suicidal tendencies too.

embed_me@programming.dev on 23 May 06:24 collapse

More like the abusive step-father

winni@lemmy.world on 23 May 04:49 next collapse

You’ve got this wrong, it’s a flying car

minorkeys@lemmy.world on 23 May 06:15 next collapse

But does it do it less often than humans?

Matty_r@programming.dev on 23 May 06:53 next collapse

I have done it zero times, and I am definitely a Human man.

FordBeeblebrox@lemmy.world on 23 May 07:01 collapse

Can confirm, also a human and never caused a car crash

ByteJunk@lemmy.world on 23 May 07:13 next collapse

A good point, but I’m not sure that’s where the bar is. How does it compare to other self-driving systems that have lidar, for instance?

minorkeys@lemmy.world on 23 May 16:33 collapse

Depends on the issue at hand. To get these approved and widespread, better than humans may be the bar.

innermachine@lemmy.world on 23 May 11:09 collapse

I’m gonna answer your question with a question, as I don’t have your answer. When a human wrecks, it’s their fault. Whose fault is it when something like this happens? Should it still be the person in the driver’s seat?

minorkeys@lemmy.world on 23 May 16:37 collapse

No idea how that will turn out. Fully auto, but you have to maintain the vehicle and ensure it’s road worthy as the owner.

vegeta@lemmy.world on 23 May 10:15 next collapse

<img alt="" src="https://lemmy.world/pictrs/image/a61658d6-5c2e-45cf-8a50-2e3994398f30.png">

tfm@europe.pub on 23 May 10:28 next collapse

“I’m confident that Save full self driving (SFSD) will be ready next year”

itisileclerk@lemmy.world on 23 May 10:35 next collapse

Why would someone be a passenger in a self-driving vehicle? Do they know that they are test subjects, part of a “car trial” (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe self-driving solutions, like high-speed trains.

dan1101@lemm.ee on 23 May 18:04 collapse

I have no idea, I guess they have a lot more confidence in self driving (ESPECIALLY Tesla) than I do.

melsaskca@lemmy.ca on 23 May 11:47 next collapse

I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s

nyan@lemmy.cafe on 23 May 13:12 next collapse

Why would you inflict that guy on a poor innocent kitty?

postmateDumbass@lemmy.world on 23 May 18:21 collapse

That tree cast shade on his brand.

It had to go.

KingCake_Baby@lemmy.world on 23 May 14:51 next collapse

Don’t drive Tesla

rottingleaf@lemmy.world on 23 May 16:21 next collapse

It’s full self-driving, doesn’t need roads. Put into wrong car.

gamermanh@lemmy.dbzer0.com on 23 May 17:29 next collapse

No serious injuries

How unfortunate

TeddE@lemmy.world on 23 May 18:50 collapse

Look, I respect where you’re coming from. May I presume your line of reasoning is in the vein of “elon musk sucks and thus anyone who buys their stuff is a Nazi and should die” - but that is far, far too loose of a chain of logic to justify sending a man to death alone. Perhaps if you said that they should be held accountable with the death penalty on the table? But c’mon - are you really the callous monster your comment paints you as?

ayyy@sh.itjust.works on 23 May 20:00 next collapse

These aren’t passive victims, they are operating harmfully dangerous machines at high speeds on roads shared with the rest of us.

echodot@feddit.uk on 23 May 21:21 collapse

Right but they believe that the car is safe, because of the advertising and because the product is legally sold.

If anyone is to blame here it’s not the owner of the car, it’s the regulators who allow such a dangerous vehicle to exist and to be sold.

KayLeadfoot@fedia.io on 23 May 21:34 collapse

Yea, this subthread is morally ass.

I don't think it's morally wrong to be a sucker. If you fall for the lie, you think you're actually doing a good thing by using FSD and making the road both safer today and potentially radically safer into the future.

Problem is, it's a lie. Regulators exist to sort that shit out for you, car accidents are rare enough that the risk is hard to evaluate as a lone-gun human out here. The regulators biffed this one about as hard as an obvious danger can be biffed.

gamermanh@lemmy.dbzer0.com on 24 May 02:04 collapse

I give 0 ducks about Nazis who drive the Nazi car. The more of them that oven themselves in them the better

DarrinBrunner@lemmy.world on 23 May 17:41 next collapse

It got the most recent update, and thought a tunnel was a wall.

Buffalox@lemmy.world on 23 May 18:30 next collapse

Took me a second to get it, but that’s brilliant.
I wonder if there might even be some truth to it?

GSV_Sleeper_Service@lemmy.world on 24 May 08:25 collapse

Wonder no more. Someone did this on YouTube using cardboard boxes; the Tesla drove straight through them. Skip to around the 15-minute mark to watch it drive through the “wall” without even touching the brakes.

Edit: thought the person you were replying to said it thought a wall was a tunnel, not the other way round. Still funny to watch it breeze through a wall with a tunnel painted on it though.

Buffalox@lemmy.world on 24 May 10:33 collapse

Yes I know the video, what I was wondering is if it could be true that they tried to make the AI detect a wall with a road painted on it, and it falsely believed there was a wall, and made an evasive maneuver to avoid it.

smeenz@lemmy.nz on 23 May 20:41 collapse

… and a tree was a painting.

postmateDumbass@lemmy.world on 23 May 18:20 next collapse

HAL9000 had Oh Clementine!

Has Tesla been training their AI with the lumberjack song?

Buffalox@lemmy.world on 23 May 18:28 next collapse

The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

What I don’t get is how this false advertising for years hasn’t caused Tesla bankruptcy already?
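To put numbers on “faster than any human could possibly correct it”: at the 55 mph the driver says police cited for this crash, the car covers a lot of road during ordinary human reaction time. A quick sketch; the reaction times are common textbook ranges, not measurements from this incident:

```python
# Distance covered before a driver can even begin to correct the car.
# 55 mph is the speed cited for this crash; reaction times are textbook ranges.
speed_m_per_s = 55 * 1609.344 / 3600  # ~24.6 m/s

for reaction_s in (0.75, 1.5, 2.5):   # alert, typical, distracted
    print(f"{reaction_s:>4} s -> {speed_m_per_s * reaction_s:.0f} m traveled")
# 0.75 s -> 18 m, 1.5 s -> 37 m, 2.5 s -> 61 m
```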

NikkiDimes@lemmy.world on 23 May 19:42 next collapse

Well, because 99% of the time, it’s fairly decent. That 1%'ll getchya tho.

ayyy@sh.itjust.works on 23 May 19:59 next collapse

To put your number into perspective, if it only failed 1 time in every hundred miles, it would kill you multiple times a week with the average commute distance.
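A quick sanity check of that framing, with the commute distance as an invented round number:

```python
# If a failure happened once per hundred miles, a routine commute would
# hit several failures every week. The commute length is an assumption.
failures_per_mile = 1 / 100
commute_miles_per_day = 40   # assumed round trip
workdays_per_week = 5

per_week = failures_per_mile * commute_miles_per_day * workdays_per_week
print(f"{per_week:.1f} expected failures per week")  # -> 2.0
```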

KayLeadfoot@fedia.io on 23 May 21:15 next collapse

Someone who doesn't understand math downvoted you. This is the right framework to understand autonomy, the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not demonstrated non-negative value in a credible way.

FreedomAdvocate@lemmy.net.au on 24 May 07:58 next collapse

What is the failure rate? Unless you know that you can’t make that claim.

bluewing@lemm.ee on 24 May 12:22 collapse

You are trying to judge the self driving feature in a vacuum. And you can’t do that. You need to compare it to any alternatives. And for automotive travel, the alternative to FSD is to continue to have everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD, (as bad as it is). So, FSD doesn’t need to be perfect-- it just needs to be a bit better than what the average driver can do driving manually. And the last time I saw anything about that, FSD was that “bit better” than you statistically.

FSD isn’t perfect. No such system will ever be perfect. But, the goal isn’t perfect, it just needs to be better than you.

echodot@feddit.uk on 24 May 13:53 collapse

FSD isn’t perfect. No such system will ever be perfect. But, the goal isn’t perfect, it just needs to be better than you.

Yeah, people keep bringing that up as a counterargument, but I’m pretty certain humans don’t swerve off a perfectly straight road into a tree all that often.

So unless you have numbers to suggest that humans are less safe than FSD then you’re being equally obtuse.

bluewing@lemm.ee on 25 May 12:11 next collapse

A simple Google search (which YOU could have done yourself) shows it’s about one accident per 1.5 million miles driven with FSD vs one per 700,000 miles driven for manually driven cars. I’m no Teslastan (I think they are overpriced and deliberately for rich people only), but that’s an improvement, a noticeable improvement.

And as an old retired medic who has worked his share of car accidents over nearly 20 years: yes, yes humans swerve off of perfectly straight roads and hit trees and anything else in the way too. And do so at a higher rate.
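Taking the quoted figures at face value (and saying nothing about whether FSD miles and human-driven miles cover comparable road conditions), the ratio works out like this:

```python
# Accidents per mile implied by the figures quoted above (taken at face value).
fsd_miles_per_accident = 1_500_000
human_miles_per_accident = 700_000

ratio = fsd_miles_per_accident / human_miles_per_accident
print(f"Humans crash about {ratio:.1f}x as often per mile")  # -> ~2.1x
```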

jamesjams@lemmy.world on 26 May 04:36 collapse

Humans do swerve off perfectly straight roads into trees, I know because I’ve done it!

echodot@feddit.uk on 26 May 08:49 collapse

Can you confirm that to the best of your knowledge you are not a robot?

KayLeadfoot@fedia.io on 26 May 08:54 collapse

This little subthread <img alt="looks like this" src="https://i.makeagif.com/media/12-21-2015/yI0NDt.gif">.

echodot@feddit.uk on 23 May 21:19 next collapse

Even with the distances I drive and I barely drive my car anywhere since covid, I’d probably only last about a month before the damn thing killed me.

Even ignoring fatalities and injuries, I would still have to deal with the fact that my car randomly wrecked itself, which has to be a financial headache.

NikkiDimes@lemmy.world on 24 May 01:18 next collapse

…It absolutely fails miserably fairly often and would likely crash that frequently without human intervention, though. Not to the extent here, where there isn’t even time for human intervention, but I frequently had to take over when I used to use it (post v13)

Buffalox@lemmy.world on 24 May 10:37 collapse

Many Tesla owners are definitely dead many times, on the inside.

echodot@feddit.uk on 23 May 21:16 collapse

That’s probably not the failure rate odds but a 1% failure rate is several thousand times higher than what NASA would consider an abort risk condition.

Let’s say that it’s only 0.01% risk; that’s still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and would not involve any bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to things they crashed into (lamp posts, shop windows, etc.), would be so high as to exceed any benefit of the technology.
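One way a “several thousand” figure can fall out, with the 0.01% read as a per-drive risk and the fleet numbers invented purely for illustration:

```python
# Fleet-level crash count if 0.01% of drives ended in a failure.
# Fleet size and trips per day are invented round numbers.
failure_rate_per_trip = 0.0001
active_fsd_vehicles = 100_000
trips_per_vehicle_per_day = 2

crashes_per_year = (failure_rate_per_trip * active_fsd_vehicles
                    * trips_per_vehicle_per_day * 365)
print(f"{crashes_per_year:,.0f} crashes per year")  # -> 7,300
```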

It wouldn’t be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they’re never going to add lidar scanners, so it’s literally never going to get any better; it’s always going to be this bad.

KayLeadfoot@fedia.io on 23 May 21:20 next collapse

...is literally never going to get any better; it's always going to be this bad.

Hey now! That's unfair. It is constantly changing. Software updates introduce new regressions all the time. So it will be this bad, or significantly worse, and you won't know which until it tries to kill you in new and unexpected ways :j

FreedomAdvocate@lemmy.net.au on 24 May 07:56 collapse

Saying it’s never going to get better is ridiculous and demonstrably wrong. It has improved in leaps and bounds over generations. It doesn’t need LiDAR.

The biggest thing you’re missing is that with FSD the driver is still supposed to be paying attention at all times, ready to take over like a driving instructor does when a learner is doing something dangerous. Just because it’s in FSD Supervised mode doesn’t mean you should just sit back and watch it drive you off the road into a lake.

echodot@feddit.uk on 24 May 12:00 collapse

You’re saying this on a video where it drove into a tree and flipped over. There isn’t time for a human to react. That’s like saying we don’t need emergency stops on chainsaws; the operator just needs to not drop it.

echodot@feddit.uk on 23 May 21:13 next collapse

Because the US is an insane country where you can straight up just break the law and as long as you’re rich enough you don’t even get a slap on the wrist. If some small startup had done the same thing they’d have been shut down.

What I don’t get is why teslas aren’t banned all over the world for being so fundamentally unsafe.

Buffalox@lemmy.world on 24 May 07:04 collapse

What I don’t get is why teslas aren’t banned all over the world for being so fundamentally unsafe.

I’ve argued this point the past year, there are obvious safety problems with Tesla, even without considering FSD.
Like the blinker being on the steering wheel, manual door handles that are hard to find in emergencies, and distractions from common operations being behind menus on the screen instead of having directly accessible buttons. With Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead! Which can also create dangerous situations.

FreedomAdvocate@lemmy.net.au on 24 May 07:47 collapse

What false advertising? It’s called “Full Self Driving (Supervised)”.

Buffalox@lemmy.world on 24 May 10:34 collapse

For many years the “supervised” was not included; AFAIK Tesla was forced to add it.
And in this case “supervised” isn’t even enough, because the car made an abrupt unexpected maneuver, instead of asking the driver to take over in time to react.

FreedomAdvocate@lemmy.net.au on 24 May 14:30 collapse

The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.

SkyezOpen@lemmy.world on 24 May 15:36 next collapse

The attention required to prevent these types of sudden crashes negates the purpose of FSD entirely.

Buffalox@lemmy.world on 24 May 18:41 collapse

No if you look at Waymo as an example, they are actually autonomous, and stop to ask for assistance in situations they are “unsure” how to handle.

But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? There was clear road ahead, and nothing in view to indicate any kind of problem, when the car made a sudden abrupt left, causing it to roll upside down.

FreedomAdvocate@lemmy.net.au on 24 May 23:04 collapse

They can’t stop and ask for assistance at 100km/h on a highway.

I hope Tesla/Musk address this accident and get the telemetry from the car, cause there’s no evidence that FSD was even on.

Buffalox@lemmy.world on 25 May 07:47 collapse

According to the driver it was on FSD, and it was using the latest software update available.

www.reddit.com/user/SynNightmare/

They can’t stop and ask for assistance at 100km/h on a highway.

Maybe the point is then, that Tesla FSD shouldn’t be legally used on a highway.
But it probably shouldn’t be used anywhere, because it’s faulty as shit.
And why can’t it slow down to let the driver take over in a timely manner, when it can brake for no reason?
It was tested in Germany on the Autobahn, where it did that 8 times within 6 hours!!!

FreedomAdvocate@lemmy.net.au on 25 May 08:04 collapse

According to the driver, with zero evidence backing up the claim. With how much of a hard-on everyone has for blaming Elon Musk for everything, and trying to drag Tesla’s stock down, this accident is a surefire way to thousands of Internet karma and e-fame on sites like Reddit and Lemmy. Why doesn’t he just show us the interior camera?

Looking at his profile he’s milking this for all it’s worth - he’s posted the same thread to like 8 different subs lol. He’s karma whoring. He probably wasn’t even the one involved in the crash.

Looked at his twitter which he promoted on there too, and of course he tags mark rober and is retweeting everything about this crash. He’s loving the attention and doing everything he can to get more.

Also he had the car for less than 2 weeks and said he used FSD “all the time”……in a brand-new car he’d basically never driven……and then it does this catastrophic failure? Yeh nah lol. Also, as others in some of the threads have pointed out, the version of FSD he claims it was on wasn’t out at the time of his accident.

Dude’s lying through his teeth.

Buffalox@lemmy.world on 25 May 08:09 collapse

There have been other similar cases lately, which clearly indicate problems with the car.
The driver has put up the footage from all the cameras of the car, so he has done what he can to provide evidence.

www.reddit.com/r/TeslaFSD/…/1328_fsd_accident/

It’s very clear from the comments, that some have personally experienced similar things, and others have seen reporting of it.
This is not an isolated incident. It just has better footage than most.

FreedomAdvocate@lemmy.net.au on 25 May 08:26 next collapse

Just no footage from the interior camera, no proof of FSD being used.

Others have pointed out critical holes in his story - namely that he claims that he was on a version of FSD that was not released at the time of his crash.

Buffalox@lemmy.world on 25 May 08:55 collapse

The link I gave you is the place he posted this. And you can see what version he says he was using:

I find it entertaining honestly there is so many conspiracy theories around my incident people saying it’s not a Tesla, the robo taxi drama, both political sides think I’m anti Elon, others saying that I wasn’t on 13.2.8 I definitely put mine on the beta channel to get that version as quick as I did

www.notateslaapp.com/fsd-beta/
So you are parroting bullshit, the current version is 13.2.9.

Just no footage from the interior camera, no proof of FSD being used.

Funny how people in the thread I linked you to, who drive Teslas themselves, don’t question this?
Some people believe the FSD saw the shadow of the pole as a curb in the road, or maybe even the base of a wall. And that’s why the FSD decided to “evade”.
There are plenty of examples in the comments from people who drive Teslas themselves, about how it steers into oncoming traffic; one describes how his followed black skid marks in the road, veering wildly left to right; another describes how his made an evasive maneuver because of a patch in the road. It just goes on and on with how faulty FSD is.

IDK which Tesla cars have which cameras. But I’ve seen plenty of reporting on Tesla FSD, and none of it is good.
So why do you believe it’s more likely to be human error, when a human not paying attention would be much more likely to veer slowly, rather than making an abrupt idiotic maneuver?

To me it seems you are the one who lacks evidence for your claims.
And the problem with Tesla logging is that it’s a proprietary system that only Tesla has access to; that system needs to be open for everybody to examine.

FreedomAdvocate@lemmy.net.au on 25 May 22:46 collapse

The link I gave you is the place he posted this. And you can see what version he says he was using:

What he says is what’s questionable though, because he says he was on 13.2.8 but that wasn’t available to the public at the time of his crash. Being on the beta channel means nothing, he still wouldn’t have gotten it because it wasn’t available on the beta channel on that day.

www.reddit.com/r/TeslaFSD/comments/…/mtlj5ki/

So you are parroting bullshit, the current version is 13.2.9.

How am I parroting bullshit? The current version is 13.2.9? Great - that has nothing to do with what I said.

Funny how people in the thread I linked to you, who drive Tesla themselves don’t question this?

Some of them did, because he says he was on 13.2.8 which added different camera recordings which aren’t available on his car. He clearly didn’t know this when he lied to say he was on 13.2.8, because if he was on it then these recordings would have been there. Reddit being reddit, and the internet being the internet - people lie. Most of the people in there claiming that their teslas tried to do the same thing are likely 12 year olds who’ve never driven a car before. Don’t believe everything you read on the internet, ESPECIALLY on reddit lol.

Also if you haven’t realized, hating on anything tesla/musk related is the cool thing to do at the moment, especially people on left-leaning sites like Reddit and Lemmy.

Some people believe the FSD saw the shadow of the pole as a curb in the road, or maybe even the base of a wall. And that’s why the FSD decided to “evade”.

Cool, doesn’t mean they’re right. The FSD saw shadows of poles and trees and power lines all the way up to that second too. Well it would have if it was engaged.

To me it seems you are the one who lacks evidence in your claims.

My claim is that it wasn’t using FSD because he has provided ZERO evidence that he WAS using FSD. I’m the one asking for evidence to support the claim, and he has provided NONE. Remember - I’m not the one making the initial claim. I’m the one asking for evidence to support the claim.

Even if I was the one that needed to provide evidence, I have done so - he lied about the version. No interior footage to prove it wasn’t driver error. He didn’t reach out to Tesla about their car trying to kill him.

And problem with Tesla logging is that it’s a proprietary system that only Tesla has access to, that system needs to be open for everybody to examine.

Tesla have a program you can download to review your logs supposedly.

FreedomAdvocate@lemmy.net.au on 25 May 08:34 collapse

Also this happened in February. He never reached out to Tesla? He never requested the data to show that FSD was engaged? In that thread he says he only just did it. There’s also an official Tesla software program you can use to get the full logs, but as expected he hasn’t done that.

Dude’s lying for sure.

Buffalox@lemmy.world on 25 May 08:58 next collapse

February explains why he wasn’t on 13.2.9.
Why would he reach out to Tesla? That’s not his job; that’s the insurance company’s.
But there is no point, because Tesla never takes responsibility in these cases.

FreedomAdvocate@lemmy.net.au on 25 May 22:38 collapse

It also means that he couldn’t have been on version 13.2.8 which he claims he was on though.

www.reddit.com/r/TeslaFSD/comments/…/mtlj5ki/

My point in saying he didn’t reach out to Tesla is that if I owned a car that drove itself off the road into a tree, I’d reach out to Tesla and ask them to investigate and see what they’d do for me after almost killing me. Insurance is a completely different story; they’ll go and do their thing regardless.

Buffalox@lemmy.world on 25 May 09:09 collapse

You are so full of shit, I just checked it out:

www.reddit.com/r/technology/comments/…/mtmbkvm/?c…

Ay that’s me thank you for tagging me. I know that there’s a lot of skepticism about my accident. I leased the car at the beginning of February and this happened at the end of February. I was using FSD every chance it would let me. I did not have time to react the cop said it was going 55 miles when it crashed me. I requested the data log today as somebody suggested to me.

He never claimed it was recent.

Every claim you make you never provide sources, because you are probably just parroting hearsay.

FreedomAdvocate@lemmy.net.au on 25 May 22:28 collapse

He never claimed it was recent.

Did you not even read my post? Literally THE FIRST SENTENCE OF MY POST IS THIS:

Also this happened in February.

The part of mine that you misunderstood as me saying it was recent was, I assume, this:

He never requested the data to show that FSD was engaged? In that thread he says he only just did it.

Again, you misread and misunderstood - a common theme with you apparently - me saying that he only just requested the data from Tesla as me saying “he only just crashed the car”. The quote you posted of the guy literally confirms what I said - that the crash was in February and that he only just requested the data lol

You are so full of shit, I just checked it out:

Every claim you make you never provide sources, because you are probably just parroting hearsay.

Care to apologize?

Gammelfisch@lemmy.world on 23 May 20:19 next collapse

Typical piss poor quality from Leon Hitler. F Tesla.

rational_lib@lemmy.world on 23 May 21:11 next collapse

To be fair, that grey tree trunk looked a lot like a road

KayLeadfoot@fedia.io on 23 May 21:13 next collapse

GPS data predicted the road would go straight as far as the horizon. Camera said the tree or shadow was an unexpected 90 degree bend in the road. So the only rational move was to turn 90 degrees, obviously! No notes, no whammies, flawless

LePoisson@lemmy.world on 23 May 21:41 collapse

It’s fine, nothing at all wrong with using just camera vision for autonomous driving. Nothing wrong at all. So a few cars run off roads or don’t stop for pedestrians or drive off a cliff. So freaking what, that’s the price for progress my friend!

I’d like to think this is unnecessary but just in case here’s a /s for y’all.

RaptorBenn@lemmy.world on 23 May 21:35 next collapse

Not really worth talking about unless the crash rate is higher than the human average.

KayLeadfoot@fedia.io on 23 May 21:45 collapse

Imagine if people treated airbags that way XD

If Ford airbags just plain worked, and then Tesla airbags worked 999 times out of 1,000, would the correct answer be to say "well, them's the breaks, there is no room for improvement, because dangerously flawed airbags are way safer than no airbags at all"?

Like, no. No, no, no. Cars get recalled for flaws that are SO MUCH less dangerous.

LanguageIsCool@lemmy.world on 24 May 02:44 next collapse

“Kill me,” it said in a robotic voice that got slower, glitchier, and deeper as it drove off the road.

zebidiah@lemmy.ca on 24 May 11:44 collapse

EXTERMINAAAAAATE!!!

FreedomAdvocate@lemmy.net.au on 24 May 07:45 next collapse

Why was the driver not paying attention, and why didn’t they just steer it back into the lane? It’s not called “Full Self Driving (Supervised)” for no reason. Hopefully Tesla gets all the telemetry and shares why it didn’t stay on the road, and also checks whether the driver was sleeping or distracted.

rabber@lemmy.ca on 24 May 11:52 collapse

Watch the video. It happens insanely quickly, and on a straight road that should be no issue, so the person’s guard was down.

FreedomAdvocate@lemmy.net.au on 24 May 14:33 collapse

Watched it. Is there even any evidence that it was in FSD mode?

GladiusB@lemmy.world on 24 May 14:52 collapse

Can you prove that it wasn’t in self-driving mode? Since you are being so obtuse about a technology that has had several other reports of doing the same thing. You are defending the car, which is a known POS.

FreedomAdvocate@lemmy.net.au on 24 May 22:10 collapse

The claim is that it was in FSD. Where’s the evidence?

KayLeadfoot@fedia.io on 24 May 22:52 next collapse

That's not how that works, New-account-with-negative-2500-karma. You supply evidence for your own claims, others can review the evidence.

FreedomAdvocate@lemmy.net.au on 25 May 02:02 collapse

The claim was that FSD did this, but no evidence was provided to say it did.

My claim is that there’s no evidence to show FSD was enabled.

GladiusB@lemmy.world on 24 May 23:54 collapse

The fact that many other Teslas have shown the same reported behavior. Are you new to how facts work?

FreedomAdvocate@lemmy.net.au on 25 May 01:34 collapse

That’s not evidence lol. The claim is that FSD did this, but no evidence was provided to show that it was.

The owner of this Tesla could have also posted the internal cabin camera footage. Wonder why they didn’t?….

rabber@lemmy.ca on 24 May 11:53 next collapse

Elon took the wheel because that person made a mean tweet about him

sickofit@lemmy.today on 24 May 14:48 next collapse

This represents the danger of expecting driver override to avoid accidents. If the driver has to be prepared to take control in an accident like this AT ALL TIMES, then the driver is required to be more engaged than they would be if they were just driving manually, because they have to be constantly anticipating not just what other hazards (drivers, pedestrians, …) might be doing; they have to be anticipating in what ways their own vehicle may be trying to kill them.

Bytemeister@lemmy.world on 24 May 15:07 collapse

Absolutely.

I’ve got a car with level 2 automation, and after using it for a few months, I can say that it works really well, but you still need to be engaged to drive the car.

What it is good at… Maintaining lanes, even in tricky situation with poor paint/markings. Maintaining speed and distance from the car in front of you.

What it is not good at… Tricky traffic, congestion, or sudden stops. Lane changes. Accounting for cars coming up behind you. Avoiding road hazards.

I use it mostly like an autopilot. The car takes some of the monotonous workload out of driving, which allows me to move my focus from driving the car to observing traffic, other drivers, and road conditions.

RememberTheApollo_@lemmy.world on 24 May 14:54 next collapse

The problem with automation is complacency, especially in something that people already have a very hard time taking seriously, like driving, where cell phone distraction, conversations, or just zoning out are super common.

phoenixz@lemmy.ca on 24 May 15:13 next collapse

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week

That’s because it won’t, and that’s because Elmo Musk is, gasp, a liar. Always has been. That robotaxi is actually an older lie he used a couple of years prior, but he dusted it off and re-used it.

Anytime Elmo says that he’s confident they can do it now, he means that they’re nowhere near a real product. Anytime he says “next year”, it means that it won’t ever happen. Anytime he says that they already have a product and it just needs to be produced, it means that it’ll never happen.

He is a vaporware con man who has been cheating people (and mostly the US government) out of billions

Literally look at all of his promises over the last decade, you start seeing patterns. It’s always almost there.

SpaceX, arguably his most successful company, the one he actually led himself, is a shit show of lies. According to him we’d have colonies on Mars by now; it’s what he took 3 billion dollars in funding for, and he literally isn’t at 1% of that. Yet he keeps claiming: within a few years now! Three billion dollars, and he managed to blow up a banana over the Indian Ocean and obliterate a launch pad.

If I commit fraud in the thousands, take thousands and then don’t deliver, I go to jail. He does it with countless billions and he’s still out there. But alas, his behavior is finally catching up with him; Tesla is going off a cliff now that nobody wants to drive a Nazi brick anymore.

atmorous@lemmy.world on 24 May 23:51 next collapse

If it were open-source tech, people could check it and see for themselves if it really is capable, but because it’s not, we don’t know what it’s missing to be way, way better.

phoenixz@lemmy.ca on 25 May 14:58 collapse

Nah, on the 5 levels of autonomous driving, Teslas are at level 2.

Elmo isn’t even close, but that won’t stop him from just lying about it, because that is what Elmo does best.

demonsword@lemmy.world on 26 May 12:24 collapse

Literally look at all of his promises over the last decade, you start seeing patterns. It’s always almost there.

Cheers for the guy/gal that maintains an updated list of all his bullshit, check it out sometime

phoenixz@lemmy.ca on 29 May 04:38 collapse

Holy shit that is a treasure trove! Thanks kind stranger!

AA5B@lemmy.world on 24 May 23:44 next collapse

It’s ready, but you’re assuming an entirely general taxi service. It will be carefully constrained like Waymo was: limited to easy streets and times, probably lower speeds, where there is less chance of problems. It’s ready for that.

There’s always a reason. I agree with the author: most likely it misinterpreted a shadow as a solid obstacle. I’m not excusing it but humans do that too, and Tesla will likely ensure it doesn’t come up in their taxi service.

Remember that robotaxi doesn’t actually exist yet. I’m pretty sure the plan is to start with Model Ys with human safety drivers. It’s ready for that.

I did a trial to find out for myself and my reason for it not being ready yet is a bit different. Full self-driving did perfectly under “normal” conditions, and every time it made me nervous was an edge case. However it made me realize driving is all edge cases. It’s not ready and may never be

atmorous@lemmy.world on 24 May 23:50 collapse

For no reason?

They are running proprietary software in the car, and people don’t even know what is happening in the background of it. Every electric car needs to be turned into an open-source car so that the car cannot be tampered with, with no surveillance, etc. etc.

Everyone should advocate for that, because the alternative is this, with Tesla. And I know nobody wants this happening to other car manufacturers’ cars either.