Self-Driving Tesla Nearly Hits Oncoming Train, Raises New Concern On Car's Safety (www.ibtimes.co.uk)
from rosschie@lemmy.zip to technology@lemmy.world on 21 May 2024 23:46
https://lemmy.zip/post/15884075

Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla rapidly approaching a train with no apparent deceleration. He insisted his Tesla was in Full Self-Driving mode when it barreled towards the train crossing without slowing down.

#technology

threaded - newest

lavacake1111@lemmy.world on 22 May 2024 00:07 next collapse

Did the owner not put their car in “Do not ram train” mode?

IphtashuFitz@lemmy.world on 22 May 2024 00:21 next collapse

That’s an optional software upgrade. It’ll cost you $12.95 a month.

Excrubulent@slrpnk.net on 22 May 2024 01:41 collapse

Oh come on, who wouldn’t pay for that? To not run into trains? That’s a bargain! Thanks Daddy Musk 🥰

laurelraven@lemmy.blahaj.zone on 22 May 2024 04:23 collapse

And just think, the rest of us rubes have to manually not drive into trains, like barbarians

Daft_ish@lemmy.world on 22 May 2024 19:44 collapse

Why even live anymore.

drives into train

AmidFuror@fedia.io on 22 May 2024 04:44 next collapse

So sick of people referring to "Do not ram train" mode. You see it all over social media, but especially Lemmy. It's "Do not ram train (Supervised)" mode, and you'd have to be living under a rock for the last 5+ years to think you don't have to actually take control of the wheel to stop it from ramming a train.

[deleted] on 22 May 2024 09:01 collapse

.

magnetosphere@fedia.io on 22 May 2024 00:14 next collapse

Man, am I glad that I couldn’t afford a Tesla when I thought they were cool and didn’t find Musk repulsive.

Beaver@lemmy.ca on 22 May 2024 00:41 next collapse

Bro I wouldn’t trust Elon to make me a sandwich.

SuckMyWang@lemmy.world on 22 May 2024 03:09 next collapse

He’d keep telling you it’s nearly ready but you have already been waiting for hours.

Logh@lemmy.ml on 22 May 2024 04:10 next collapse

The hypersub sounds difficult, but it’s really easy. I know you are hungry now, but it’s worth it to wait a few years. We’ll make enough hypersubs for everyone by 2028.

Evotech@lemmy.world on 22 May 2024 05:52 next collapse

And when you get it, it’s clearly missing some of the ingredients, but he tells you to trust him

rollerbang@lemmy.world on 22 May 2024 08:52 next collapse

The lack of waiters is not his fault.

Aceticon@lemmy.world on 22 May 2024 11:26 collapse

And when you get it, the bread is cardboard and the cheese is cheap factory stuff, but there’s a little flag pinned on top, and sauces with quite daring appearances (some kind of gel with little stars, and a bright fluorescent yellow one) have been applied with clear decorative intent.

sp3tr4l@lemmy.zip on 22 May 2024 11:06 collapse

At this point, I believe I know more about sandwich making than any human alive on Earth.

Scene: a gigantic pop-up tent with diesel generators in a desert, featuring many granite counters as well as top-of-the-line kitchen appliances and gamer lighting, all being set up by workers who are immediately laid off once the jobsite is completed

Elon enters the tent with 4 Tesla bots slowly shambling behind him. One lags out when its remote link to a human controller is severed. Minutes later the remnants of a starlink satellite crash through the far end of the tent kitchen.

3 remaining Tesla bots proceed to bumble around like idiots, unable to open packets of deli ham, entirely ripping off the tops of deli mustard containers

Elon is awkwardly smiling and doing jazz hands the whole time

A neuralink mind controlled pig walks in as one Tesla bot wields a knife. Elon raises his hand to his ear, nods, then pushes a button on some phone app

The pig screams, then a popping noise is heard, and the pig collapses to the ground with smoke and blood coming out of its ears and nose

knife wielding tesla bot attempts to cut the pig’s flank, falls, cannot recover

the two remaining tesla bots continue in vain to open a loaf of bread without ripping the entire loaf apart. One slips and falls backwards, the other one runs out of battery and is frozen in place, holding a single piece of wonder bread

Elon curses, reaches into a refrigerator and hands you a crustable

Furbag@lemmy.world on 22 May 2024 16:08 collapse

Cut him a break, he’s pioneering sandwich making with this innovative tech. A few hiccups are to be expected.

sp3tr4l@lemmy.zip on 23 May 2024 01:23 collapse

Don’t say “cut him a break” around a Tesla product, god knows what command it’ll interpret that as.

dis_honestfamiliar@lemmy.world on 22 May 2024 03:36 next collapse

Same

EtherWhack@lemmy.world on 22 May 2024 05:29 next collapse

More and more, they are starting to look like someone just vacuum-molded the body over a chassis rather than putting any sort of real artistry into it. It also makes me wonder: if Elon is saying the look is “sexy,” does he have some sort of spandex fetish? Maybe he stole his mum’s pantyhose when alone and strutted around the house wearing them, thinking of himself as such. Maybe he still does…

Scotty_Trees@lemmy.world on 22 May 2024 14:13 collapse

Just watched the video. Maybe it’s just me, but if you’re the driver in a Tesla, and it’s foggy as fuck outside, maybe, just maybe, don’t use the self driving aspect when visibility is that bad. The amount of people willing to trust Tesla with their lives (and others on the road…) is too damn high!

Scolding7300@lemmy.world on 22 May 2024 14:45 collapse

I’d put the blame on the branding. Full self drive should mean full self drive, not most-conditions self drive without explicitly stating the limitations. Even airplane autopilot systems, which solve a simpler problem, have explicit limitations stated.

assassin_aragorn@lemmy.world on 22 May 2024 15:01 collapse

There really needs to be legal pressure for them to change the name. I don’t see how it’s not false advertising.

Talaraine@fedia.io on 22 May 2024 00:15 next collapse

AI said kill this guy right here

Beaver@lemmy.ca on 22 May 2024 00:41 collapse

It noticed he didn’t like Elon’s tweet that day.

joekar1990@lemmy.world on 22 May 2024 03:20 collapse

You know… I’d believe Elon is petty enough to actually put something like that in.

Daft_ish@lemmy.world on 22 May 2024 19:47 collapse

What? Every critic of tesla has been in a fata----

car bursts through wall

TheFeatureCreature@lemmy.world on 22 May 2024 00:22 next collapse

Link?

0110010001100010@lemmy.world on 22 May 2024 00:28 next collapse

Looks like the initial post is simply a person on a forum: teslamotorsclub.com/tmc/threads/…/page-8#post-825…

There are dashcam videos here but I’m not signing up for a dropbox to view them.

The best (subjective, I know) article that I could quickly find seems to be Yahoo: yahoo.com/…/tesla-full-self-driving-mode-16050072…

The videos here don’t load for me, could be an extension blocking them.

Not saying it’s not credible, but I would take it with a grain of salt.

That said, knowing the issues with FSD I would be shocked to learn this was made up, lol.

karrbs@kbin.social on 22 May 2024 00:24 next collapse

If he had time to notice it not slowing down, he had time to brake and take it out of Full Self-Driving. I understand, as someone who is sceptical about FSD mode, that I am more proactive at taking over than those who trust it a little more. I just feel that if a company tells you to supervise it, you should supervise it.

I still find fsd to be very finicky and vastly oversold

assembly@lemmy.world on 22 May 2024 00:28 next collapse

If something is sold as fully self driving, I would like to think it should be capable of fully self driving and not a feature that will drive me face first into a train.

halcyoncmdr@lemmy.world on 22 May 2024 00:45 next collapse

Regardless of the naming, because everyone gets so stuck on fucking names and seems to ignore everything else because that makes for a quick comment with a ton of votes and feel good bullshit.

It is sold as a work-in-progress piece of software that is constantly being updated and still needs to be supervised. It has a ton of warnings about its capabilities, and lack thereof, when activating it. There is no question when actually setting up FSD in the vehicle that it is something still in testing and not to be treated as a full replacement for paying attention. It constantly watches you and will warn you if you aren’t paying active attention for too long. If you ignore those warnings enough, it will deactivate itself, forcing you to drive, and with enough deactivations it will remove the capability entirely.

Image of the activation screen.

Image of the free trial page for those that have not purchased it, thereby avoiding notes on the standard sales pages

They’ve even updated the setting in the vehicle to be more specific, showing it as “Full Self-Driving (Supervised)”…
tesla.com/…/GUID-2CB60804-9CEA-4F4B-8B04-09B99136…

All of these reported situations are from people actively ignoring numerous safety and attention warnings, yet no one seems to ever put any blame on the driver in comments or articles. It’s always about blaming everything on Tesla when they’re actively telling every driver that it needs to be supervised because it will make mistakes.

Shrank7242@lemmy.zip on 22 May 2024 00:53 next collapse

Well said, and thanks for posting the examples. It’s something that bothers me about any social-media kind of site, especially here on Lemmy. Nobody gives a damn about the incredible amount of negligence the drivers must have. It immediately becomes an anti-Elon circlejerk every time.

It’s similar with news articles, which this post doesn’t even link to; most of the articles name-drop Tesla or Elon just because otherwise it’s not a story. “Somebody hit a car / person / train because they weren’t paying attention to the road” isn’t story-worthy. But as soon as doubt can be cast on an Elon company, it becomes a must-post thing. I can’t stand Musk’s antics either, but he gets too much free rent in people’s minds. It’s wild.

/rant

baru@lemmy.world on 22 May 2024 02:38 collapse

At the same time, there are too many people who say that Full Self Driving obviously doesn’t mean the vehicle can fully drive itself. Though for unknown reasons it is totally fine to keep using the name Full Self Driving.

Thorny_Insight@lemm.ee on 22 May 2024 04:03 next collapse

It’s called Full Self Driving (Supervised) nowadays. They changed the name.

The vehicle is capable of driving you to the grocery store on the other side of the city and back, sometimes with zero interventions from the driver. If that’s not Full Self Driving then I don’t know what is.

AmidFuror@fedia.io on 22 May 2024 04:39 next collapse

It's like how they changed the cybertruck windows to Unbreakable Glass (Fragile).

Thorny_Insight@lemm.ee on 22 May 2024 07:16 collapse

I can’t find any source for that claim

icy_mal@lemmy.world on 22 May 2024 06:58 collapse

That’s Full Self Driving (Sometimes).

Thorny_Insight@lemm.ee on 22 May 2024 07:14 collapse

Then what is full self driving to you? How good does the system need to be to qualify?

dream_weasel@sh.itjust.works on 22 May 2024 10:37 collapse

And that is between Tesla and the NTSB to sort even though I agree. The car itself doesn’t mince words describing it to you, and at the time you’re driving it, the required supervision is unambiguous.

If it were called “Tesla Drive” or something else, everyone would still be here taking a shit on it nonetheless.

Excrubulent@slrpnk.net on 22 May 2024 01:59 next collapse

You’re right that it’s just a name, which means it’s within Tesla’s power to not call it “full self driving”. Like maybe keep the word “full” for when it’s better than “full self driving brackets not really”.

The reason it’s called that is so when you’re buying the car, you can read “full self driving”, the salesman can call it “full self driving”, and then you can get excited and think you’re getting full self driving and pay stupid amounts of money for an iPhone on wheels.

It’s also so Musk can get up on stage and lie for years about how you’ll be able to go coast to coast while you sleep by the end of the year or whatever it is. Having a bunch of warnings in the software setup is not enough for someone gargling Musk’s jizz to cough it up and see it for the bullshit that it is.

You can give us all this extra info but you can’t change the core reality that the name is a lie.

halcyoncmdr@lemmy.world on 22 May 2024 02:26 next collapse

So tired of the same arguments. They don’t mean anything in the real world. Complain all you want about what the shit is called, it makes no real world difference.

Legally, the driver is responsible for the fucking vehicle and these articles and comments like yours just give the impression you think they shouldn’t be responsible because of what it’s called. You’re giving shitty drivers a pass because they’re actively being stupid and you don’t like what Tesla names the software. That is the stupidest take in the world.

Go ahead in a court of law and claim you are not responsible for an accident that happens while FSD is activated and let’s see whether the name matters for your liability.

baru@lemmy.world on 22 May 2024 02:35 next collapse

So tired of the same arguments. They don’t mean anything in the real world

The second sentence is a fallacy.

And it does matter that the company is calling it full self driving while it doesn’t fully self drive. That it would have that capability is something Musk has promised for many years. It’s also a reason that Tesla stock is worth so much.

Go ahead in a court of law and claim you are not responsible for an accident

That would be a very specific case. Tesla has been reminded multiple times that they need to take into account how people use their vehicles. The company is also under investigation for possible fraud because they are selling something that doesn’t do what people would expect it to do.

You’re focusing on one thing, but there’s multiple ways that the company could be liable. There’s been multiple articles explaining that the company is either under investigation or that the company has been warned to change things or else.

Thorny_Insight@lemm.ee on 22 May 2024 04:17 next collapse

How does it not fully self drive? What’s your definition of full self driving then?

Mercedes Drive Pilot is Level 3, and even it will prompt you to take over when necessary; does it not fully self drive then either? What about Waymo/Cruise? They have remote operators controlling the vehicles when they get stuck. Not fully self driving either? Is the standard that it needs to be absolutely flawless and never fail, or what is it?

mojofrododojo@lemmy.world on 22 May 2024 07:29 collapse

Well put. It’s a funny thing, words have meaning and if you advertise your products with those words, some portion of the population (gasp!) might believe you!

Excrubulent@slrpnk.net on 22 May 2024 02:36 next collapse

I notice you danced around the question of whether it was a lie to focus on bullshit legal stuff, which isn’t the arbiter of truth and reality. As I once heard a judge say, “You don’t come here for justice, you come here for a judgment.”

Anyway since you think it matters, the lawsuit you’re talking about is happening and a judge has ruled that the case has merit to continue: reuters.com/…/tesla-must-face-vehicle-owners-laws…

halcyoncmdr@lemmy.world on 22 May 2024 03:05 collapse

Merit to continue just means it’s not clearly a bullshit lawsuit that should be thrown out to avoid wasting court time. Your linked lawsuit also is not about whether someone is legally responsible for a vehicle driving itself, it is again about the marketing.

I don’t give a shit whether their marketing is a lie. Marketing is not the issue at hand, as much as you all want it to be for whatever reason instead of actually blaming the shitty drivers. It has no bearing on whether someone is responsible for the vehicle they are in the driver seat of hitting something or someone.

Why do you not want to put any blame on these drivers? Drivers that ignore the warning when they turned on the function in the first place and that warns them every time they turn it on to still pay attention. Why are you so insistent that the blame should be on Tesla because of what they call it?

Makes me start to think you’re that type of driver and trying to justify your belief that you shouldn’t be responsible for the actions of a 2 ton murder machine traveling at high speed that you are in control of.

Excrubulent@slrpnk.net on 22 May 2024 04:18 collapse

I don’t give a shit whether their marketing is a lie.

I mean clearly. “Full self-driving” is marketing, and it is a lie. That’s the point being made here. You don’t have to care about it, but that doesn’t mean it’s not a lie.

I’m not not blaming the drivers. They were foolish enough to buy a Tesla and trust it with their lives for a start. But I am also blaming the marketing. Two things can be true.

0x0@programming.dev on 22 May 2024 09:10 collapse

You do know that marketing is, by its very definition, in practice all lies, right?

Wanna rent a VPN with military-grade encryption?

Excrubulent@slrpnk.net on 22 May 2024 09:17 collapse

So you admit that this is in fact a lie?

“Hey, every psychopathic corporate entity that functionally runs our society lies to us all the time,” is not a great sales pitch for why this psychopathic corporation should get a pass for its specific lies.

This is the depth capitalism apologists will stoop to with absolutely no self awareness.

0x0@programming.dev on 22 May 2024 09:53 collapse

I don’t have to admit to anything because a) i’m not on trial, b) don’t work or care for Tesla and c) don’t give a rat’s ass about your opinion on me :)

Marketing is and always has been mostly lies or half-truths. Yes, legislation has mitigated that to some extent but marketing is still marketing. You’d have to be cognitively impaired to take all that advertising to the letter (come to think of it, maybe that’s why coffee in some countries comes with a warning saying it’s hot… or microwave instructions must specify they’re not suitable for pets).

You’d be kinda excused for believing in all the hype when AI/Tesla were just hitting the scene but nowadays? Get a grip.

Bitch all you want about how Tesla misleads or mismarkets their products (they do, anyone with two brain cells knows that), but if you’re gonna use that as an excuse to trust a car to drive 100% independently and flawlessly because they said so… it’s totally on you. Or your corpse.

Excrubulent@slrpnk.net on 22 May 2024 10:49 collapse

You don’t have to admit to anything, but you just did.

Bitch all you want about how Tesla misleads or mismarkets their products (they do, anyone with two brain cells knows that)

See? That’s what I would call an “admission”.

but if you’s gonna use that as an excuse to trust a car to drive 100% independently and flawlessly because they said so… it’s totally on you. Or your corpse.

I… don’t? I already said that, if you’d been paying attention. It also does not release Tesla and Musk from culpability for lying to people about their product, many of whom went on to become corpses as a result of their negligent and malicious lies, many of them well before Tesla and Musk lost their credibility. I would also say that if your business model relies on selling dangerous bullshit to gormless rubes, then if those gormless rubes get killed by your dangerous bullshit, you are culpable for that dangerous bullshit by virtue of targeting the gormless rubes.

EDIT: Let’s not forget the bystanders who were killed by the dangerous bullshit along with the gormless rubes. People’s actions affect more than themselves. This is part of living in a society, which should actually protect its members and not leave them to the machinations of the worst psychopaths. You know, if it’s going to be a society worth living in.

Scammers target the mentally infirm by deliberately making their communications extremely unprofessional and laden with grammatical errors, that doesn’t mean we blame those people for being scammed.

Also if you have to invent an imaginary version of the other person - someone who drives a Tesla and believes it is actually fully self driving in this case - in order to make your point, then it’s probably not a very good point.

0x0@programming.dev on 22 May 2024 13:26 collapse

You don’t have to admit to anything, but you just did.

Did i now… if you say so… keep’em internet points, i attach no value to them. 😘

Excrubulent@slrpnk.net on 22 May 2024 14:23 collapse

The quote is right there, in your own words, and you’re not even saying anything anymore.

I don’t care about internet points either, but I do care about the truth. Apparently you don’t.

mojofrododojo@lemmy.world on 22 May 2024 07:27 next collapse

Hey everyone, Elon’s come to lemmy!

assassin_aragorn@lemmy.world on 22 May 2024 15:13 collapse

They don’t mean anything in the real world.

Uh. They mean everything in the real world? You get sued for false advertising and fraud. Fox News got sued heavily for knowingly lying about voting machines. There’s a reason that companies have PR departments. Words matter a lot in the real world.

Are the drivers stupid? Sure. They believed the FSD claim after all. But that doesn’t mean Tesla is off the hook. Deceiving stupid people is still deceit.

ShepherdPie@midwest.social on 22 May 2024 03:09 collapse

The reason it’s called that is so when you’re buying the car, you can read “full self driving”, the salesman can call it “full self driving”, and then you can get excited and think you’re getting full self driving and pay stupid amounts of money for an iPhone on wheels.

Who exactly are you describing here? Like someone who’s been living under a rock for the past 5+ years and thinks cars just drive themselves now, but who also has $50k-$100k to drop on a car and also doesn’t do any research beforehand? I think you’re blowing this way out of proportion.

All these systems are flawed to some extent, as nobody has cracked the code to L5 automation. There is some danger to it, but there are many dangers to driving and this eliminates some of them. People have died in Teslas, but many more have died in every other production vehicle that has ever existed. If this guy did the same thing in a Toyota Camry, do you think we’d even be talking about it right now? These systems can only get better with a lot of real-world usage.

I totally get where the other person is coming from. These arguments are so tired and meaningless at this point. If you want Musk to go away, then stop bringing him up at every turn, because it just makes people sound like bizarro-world Musk fans: they hate the guy but can’t stop talking about him and following his every move. Some of us want to discuss and debate the actual technology, and Musk had nothing to do with developing it.

Excrubulent@slrpnk.net on 22 May 2024 04:35 next collapse

Musk was involved in marketing and lying about it, though, and his extremely prolific public image is what gave it so much credibility. He’s lost a lot of that credibility now though, largely because people have spent a lot of time criticizing him. If you want him to disappear from the public eye that’s great, so do I, but he’s one of the most powerful men in the world, so that’s not going to happen.

ShepherdPie@midwest.social on 22 May 2024 21:08 collapse

Well why are you lending him further credibility by continuing to make him prolific? You’re just feeding into his goal of garnering more attention at every turn. Media outlets will continue to focus on all his insane ramblings because people like you are guaranteed to click on it.

Excrubulent@slrpnk.net on 23 May 2024 00:06 collapse

but he’s one of the most powerful men in the world, so that’s not going to happen.

0x0@programming.dev on 22 May 2024 09:06 collapse

nobody has cracked the code to L5 automation.

especially if you remove LIDAR.

ShepherdPie@midwest.social on 22 May 2024 21:04 collapse

The systems with LIDAR aren’t L5 either. It’s impossible to claim what is and isn’t needed when nobody has actually come up with a solution.

0x0@programming.dev on 23 May 2024 07:53 collapse

Never said they were, but LIDAR is a big improvement over “cameras and AI”. Most other manufacturers use it; Tesla stopped using it because of cost sav… because their AI kicks ass, and if eyes are enough for humans they’re enough for their cars too.

Hell any system where lives are involved should be triple-redundancy.

Thorny_Insight@lemm.ee on 22 May 2024 03:59 collapse

Even this “article” is about nothing happening. The driver was paying attention and took over when the vehicle was about to do something it shouldn’t. Just as they should.

Also, even if FSD was 10x safer than a human driver and we replaced every single car on the roads with Teslas there would still be 8 people dying every single day in the US alone. Linking articles about these accidents does not prove it being unsafe. It only feeds the confirmation bias of the person posting it and the people upvoting it. People want it to be unsafe so that they can shit on Elon. The standards they apply to Tesla are ridiculous compared to that of other companies. The extremely limited Mercedes Drive Pilot is praised as revolutionary tech while FSD already checks most boxes for Level 4 self-driving.

karrbs@kbin.social on 22 May 2024 01:13 next collapse

Just some insight from my POV. FSD is marketed as FSD (Supervised). I don't agree with the naming, but it is what it is. I know it does janky stuff, and it still forces you to pay attention. Do I believe this could happen? Yes, but I doubt the driver until proven otherwise.

I have had my Model Y yell at me to take control when I was already out of any auto/FSD mode. I have had many downs and many ups. I agree that the car shouldn’t actively steer you into a train. I was curious if anyone had a link to the dash footage, or even to an article with it.

mojofrododojo@lemmy.world on 22 May 2024 07:32 collapse

Fsd is marketed as FSD (Supervised)

It is now, but it was not marketed with any kind of parenthetical qualifier until recently.

halcyoncmdr@lemmy.world on 22 May 2024 07:50 collapse

Actually, it said Full Self Driving (BETA) until it was updated to (Supervised) recently.

If anything, the beta qualifier is actually better than just saying supervised since that term means not complete and still being developed.

mojofrododojo@lemmy.world on 22 May 2024 10:01 collapse

never heard musk refer to it as anything but full self drive, no qualifiers.

companies concerned with safety wouldn’t market the shit until it’s safe (see mercedes apparent lead).

Thorny_Insight@lemm.ee on 22 May 2024 10:50 collapse

Mercedes Drive Pilot is a hilariously limited system. For example, it needs a car in front of it that it can follow or else it won’t work. It also only works on a limited number of hand-picked highways in California and Nevada.

There’s a video on YouTube comparing FSD to Mercedes’ equivalent driver assistant software (not the level 3 one) and it’s not even a competition. The Mercedes system is completely unusable.

flyingjake@lemmy.one on 22 May 2024 03:46 next collapse

To be fair, it could have fully driven itself into the train: “fully self driving” <> “fully safe driving” /s

jas0n@lemmy.world on 22 May 2024 11:20 collapse

Found the SQL developer!

CrowAirbrush@lemmy.world on 22 May 2024 04:15 next collapse

I too come from a time where companies had to sell functional products or go bankrupt, but alas, those days are long gone.

Maddier1993@programming.dev on 22 May 2024 08:05 collapse

Those should be your expectations when you are on the shop floor, and that should allow you to reject the purchase if it’s a deal breaker for you. Not when you’re crossing railway tracks.

5C5C5C@programming.dev on 22 May 2024 10:36 collapse

Weird how this notion of “personal responsibility” applies to every person except the people who choose to intentionally misrepresent the product by branding it in ways that are misleading. The people running this company aren’t responsible for their role in misleading the public, just because the fine print happens to indicate that the product isn’t actually what it’s marketed as?

Now you’ll probably say something to the effect of “I never said that! You’re putting words in my mouth!” except what other motivation can you have to jump to the defense of the liar and blame people for being misled, except that you want to put all the responsibility on individuals for being misled and not on the company that is systematically and intentionally misleading them? Maybe you just manage to derive a smug sense of superiority thinking of yourself as someone who is invulnerable to this kind of tactic so blaming the victims lets you feel good about yourself.

Thorny_Insight@lemm.ee on 22 May 2024 10:42 collapse

You literally cannot buy FSD without being told that it needs driver supervision. It also tells you that every single time you enable it, and it’s constantly nagging you when you take your hands off the wheel, as well as when you’re looking at your phone, etc., and after enough warnings the system locks you out of it.

Has Musk been dishonest/misleading about its capabilities in the past? Yes. Is there a single Tesla owner with FSD who doesn’t know the truth? No.

5C5C5C@programming.dev on 22 May 2024 13:14 collapse

I’m sure you’re on the shop floor for every one of those conversations.

But anyway, enjoy being confidently incorrect: consumerreports.org/…/tesla-driver-monitoring-fai…

Thorny_Insight@lemm.ee on 22 May 2024 13:23 next collapse

“Confidently incorrect”

Then proceeds to link a two-year-old article, and even that acknowledges the existence of such a system in the title.

It has an indoor camera that is constantly monitoring the driver and nags when they’re not paying attention. That’s a fact. Nothing what I said has been proven incorrect.

How Tesla’s Driver Monitoring System Works

SaltySalamander@fedia.io on 22 May 2024 22:39 collapse

But anyway, enjoy being confidently incorrect

👆 This is what we call irony

cedarmesa@lemmy.world on 22 May 2024 00:29 next collapse

Elon here. Thanks for fighting the good fight. We will deposit 100 X bucks in your account.

karrbs@kbin.social on 22 May 2024 01:08 next collapse

Lol 😂

ZoopZeZoop@lemmy.world on 22 May 2024 01:36 collapse

I hear X bucks can be used in place of the three seashells. Does anyone know if that’s true?

NeoNachtwaechter@lemmy.world on 22 May 2024 03:02 next collapse

If he had time to notice it not slowing down he had time to brake and take it out of full self driving.

And that’s the way he survived. But wrecked that plastic box.

mojofrododojo@lemmy.world on 22 May 2024 07:24 collapse

full self driving

I just take serious issue with this label. It’s not fully self driving, it requires the user’s attention. the accidents and years of promises broken are just cherries on the shit sundae.

ShittyBeatlesFCPres@lemmy.world on 22 May 2024 00:38 next collapse

If self-driving A.I. models need a workforce to help identify trains, we could probably assemble an army of toddlers willing to be paid in cookies. My friend’s kid gets HYPE and yells “TRAIN!” when he sees one. He can also reliably identify cows. He calls most construction equipment “big truck” but that might be good enough. If a Tesla thinks a backhoe is a big truck, it’ll avoid it.

QuarterSwede@lemmy.world on 22 May 2024 02:47 next collapse

And that’s the biggest issue so far with “AI.” It’s about as intelligent as a newborn.

Thorny_Insight@lemm.ee on 22 May 2024 04:21 next collapse

It’s about as intelligent as a newborn.

Newborns can’t even utter one cohesive word. I don’t get the point of making such obviously false claims about anything.

ealoe@ani.social on 22 May 2024 05:08 collapse

A wise Jedi once said, “The ability to speak does not make you intelligent”

Thorny_Insight@lemm.ee on 22 May 2024 05:11 collapse

Intelligence is not binary, but a spectrum.

assassin_aragorn@lemmy.world on 22 May 2024 15:07 collapse

No, newborns rapidly take in new information and learn. “AI” is just a sophisticated text probability model. It doesn’t know anything. It isn’t learning how things work. It just regurgitates.

It’s like the difference between a student who understands the concepts versus memorizes the test answers.

QuarterSwede@lemmy.world on 23 May 2024 03:18 collapse

Ohhh I like that one.

tektite@slrpnk.net on 22 May 2024 06:32 next collapse

Please select all images containing TRAIN

littlewonder@lemmy.world on 22 May 2024 15:32 collapse

Autistic kid dream job.

Scolding7300@lemmy.world on 22 May 2024 00:52 next collapse

packaged-media.redd.it/…/m2-res_480p.mp4?m=DASHPl…

Video link

Neato@ttrpg.network on 22 May 2024 03:39 next collapse

So not even turning towards an intersection with a train or anything complicated. Tesla can’t even tell there’s a 12" steel wall in front of it. Fucking pathetic.

rsuri@lemmy.world on 22 May 2024 04:04 collapse

How would it though? It probably didn’t have any images like this in the train-ing data.

bladerunnerspider@lemmy.world on 22 May 2024 05:10 collapse

I guess some sort of radar could identify solid objects in nearly any condition… Hmm…

Thorny_Insight@lemm.ee on 22 May 2024 06:10 collapse

The new models with Hardware 4 (at least the Model S and X) have a radar, but then again humans can manage without one, so I have no doubt that a vision-based system will be more than sufficient in the end.

SaltySalamander@fedia.io on 22 May 2024 22:27 collapse

Yea...that driver is a complete and utter moron who needs his license revoked.

Rentlar@lemmy.ca on 22 May 2024 00:55 next collapse

Speculation here, but I wonder if the ditch lights and general light placement, which differ from a normal car’s, confused the self-driving module into thinking it’s on the wrong side of the road…?

<img alt="" src="https://www.railpictures.net/images/d2/2/6/4/7264.1457189772.jpg">

•w•

Snowpix@lemmy.ca on 22 May 2024 03:48 next collapse

The locomotive had long passed in the video of the incident, it just kept driving towards the train until it swerved and nearly took out a crossing signal. Funny as that would be, I doubt the ditch lights were the cause.

Rentlar@lemmy.ca on 22 May 2024 05:39 collapse

Hmm yeah it seems the oncoming movement from the side may throw it off somehow.

One of the many foreseeable problems with relying solely on cameras and visual processing.

0x0@programming.dev on 22 May 2024 09:15 next collapse

Not in this case, but it may have been when two drivers killed two motorcyclists.

PipedLinkBot@feddit.rocks on 22 May 2024 09:16 collapse

Here is an alternative Piped link(s):

two drivers killed two motorcyclists

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

Cornelius_Wangenheim@lemmy.world on 22 May 2024 09:39 next collapse

The dash cam footage is linked in the article. Looks to be mostly foggy conditions and their system completely ignoring the warning lights.

You999@sh.itjust.works on 22 May 2024 13:49 collapse

I work for a railroad and also own a Tesla. FSD doesn’t actually know what a train is at all. If you watch the visualization while in FSD, you’ll see trains rendered as a long string of semi trucks, and it sees the crossing arms as flashing red stop lights (i.e., treat like a stop sign).

Rentlar@lemmy.ca on 22 May 2024 14:33 collapse

trains as a long string of semi-trucks

So many long distance delivery trucks take the same route across the country. Why don’t we just string them all together, then have one big-ass truck engine in the front pulling it all? And to save on how big the motor needs to be, we’ll have steel on steel contact to reduce friction. Whoops, you’ve got a train all of a sudden. 😅

flashing red lights treated like a stop sign

Seems kind of dangerous to treat as a rule. Not just at railroad crossings: a stopped car with hazards on or a firetruck might confuse the self-driving module…

You999@sh.itjust.works on 22 May 2024 15:13 collapse

So many long distance delivery trucks take the same route across the country. Why don’t we just string them all together, then have one big-ass truck engine in the front pulling it all? And to save on how big the motor needs to be, we’ll have steel on steel contact to reduce friction. Whoops, you’ve got a train all of a sudden. 😅

Get out of here with your crazy ideas…

<img alt="" src="https://sh.itjust.works/pictrs/image/c5dd8677-ae7a-41df-b319-ba3a557da9b2.jpeg">

ohwhatfollyisman@lemmy.world on 22 May 2024 00:58 next collapse

looks like the engineers misunderstood what “training mode” was supposed to do.

they would want to improve their track record after this, otherwise the public would just choo them up.

jaybone@lemmy.world on 22 May 2024 13:20 next collapse

Usually with ML, you separate your train and test data.
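
For anyone who missed the pun, a minimal sketch of the train/test split the comment refers to; the helper and example data here are made up purely for illustration:

```python
import random

def train_test_split(data, test_fraction=0.2, seed=42):
    """Shuffle the data and split it into train and test sets,
    so a model is evaluated on examples it never saw during training."""
    rng = random.Random(seed)
    shuffled = data[:]          # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

# 100 labeled examples (e.g. images tagged "train" or "not a train")
examples = [(i, "train" if i % 2 == 0 else "not a train") for i in range(100)]
train_set, test_set = train_test_split(examples)

print(len(train_set), len(test_set))  # 80 20
```

Holding out a test set is the standard way to catch a model that merely memorizes its training data instead of generalizing.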

assassin_aragorn@lemmy.world on 22 May 2024 15:21 collapse

Oh the engineers probably perfectly understood what was going on. But they don’t have the ability to correct Musk when he’s spewing bullshit.

Ethically speaking though they’re supposed to refuse signing off on the work and whistleblow the issues, so they aren’t free of guilt.

Right now at my work we have a gap in our safety analysis with a contractor’s product, and we’ve had to fight the VP to explain how we can’t just say “it’s their problem so we won’t deal with it” if it’s part of our product. One of my colleagues had to go over a head to inform the head of safety that we were having issues. It’s still an ongoing fight, but I cannot in good conscience allow the product to be finalized when we know there’s a safety issue that needs to be addressed.

Don’t get me wrong, it isn’t an easy thing to do, and I’m really grateful that my coworker is very steadfast on this. But engineers aren’t supposed to approve of any work they know is unsafe.

cyberpunk007@lemmy.ca on 22 May 2024 01:14 next collapse

“new” concerns lol. There are so many of these articles with self driving cars crashing.

admin@lemmy.my-box.dev on 22 May 2024 03:58 next collapse

Counterpoint: we don’t get many articles about human drivers crashing, because we’re so used to it. That doesn’t make article count a good metric for judging safety.

Edit: Having said that, this wasn’t even an article. Just an unsourced headline with a photo. One should strongly consider the possibility of selection bias at work here.

Thorny_Insight@lemm.ee on 22 May 2024 04:25 collapse

80 people die every single day in traffic accidents in the US alone, yet we’re focusing on the leading company trying to solve this problem because their car almost hit a train.

cyberpunk007@lemmy.ca on 22 May 2024 05:54 next collapse

Let’s pretend it’s 50/50 humans driving cars and self-driving cars. The numbers would be a lot higher. It’s not really a fair comparison.

SaltySalamander@fedia.io on 22 May 2024 22:21 collapse

Unfounded conjecture. You can't spout your feelings as if they're objective fact.

Cornelius_Wangenheim@lemmy.world on 22 May 2024 09:51 next collapse

Tesla is not remotely close to being the leading company. That would be Google/Waymo.

Thorny_Insight@lemm.ee on 22 May 2024 10:33 collapse

What makes them the leader? You can’t even buy a car from them, and I would be willing to bet that the number of kilometers driven on Autopilot/FSD in Teslas is orders of magnitude greater than the competition’s and rapidly increasing each day. Even the most charitable view would place them on par with Tesla at best. Waymo and Cruise both have remote operators helping when their vehicles get stuck. Even the MB Drive Pilot will ask the driver to take over when needed. They’re no more fully functional self-driving vehicles than Teslas are.

piranhaphish@lemmy.world on 22 May 2024 12:30 collapse

I’ve never hit a train. And I’ve also never almost hit a train. I think I could go my entire life never almost hitting trains and I would still consider that the bare minimum for a mammal with two eyes and a brain.

SaltySalamander@fedia.io on 22 May 2024 22:22 collapse

Congratulations. Want a cookie? People drive into trains all the time. You can literally find dozens of videos online showing this very thing.

solrize@lemmy.world on 22 May 2024 03:03 next collapse

The good news is that we can finally see the light at the end of the tunnel…

Tronn4@lemmy.world on 22 May 2024 03:27 next collapse

“It’s just a freight train coming your waaaaayyyyyyy!” -Metallica

shortwavesurfer@monero.town on 22 May 2024 03:43 collapse

Seriously, you just had to throw that pun in there. LOL.

ElPenguin@lemmynsfw.com on 22 May 2024 06:02 next collapse

As someone with more than a basic understanding of technology and how self driving works, I would think the end user would take special care driving in fog since the car relies on cameras to identify the roads and objects. This is clearly user error.

[deleted] on 22 May 2024 06:11 next collapse

.

tb_@lemmy.world on 22 May 2024 06:26 next collapse

This is clearly user error.

When it’s been advertised to the user as “full self driving”, is it?

Furthermore, the car can’t recognize the visibility is low and alert the user and/or refuse to go into self driving?

Maddier1993@programming.dev on 22 May 2024 08:03 next collapse

When it’s been advertised to the user as “full self driving”, is it?

I wouldn’t believe an advertisement.

tb_@lemmy.world on 22 May 2024 08:52 next collapse

I wouldn’t trust Musk with my life either.

But, presumably, we have moved beyond the age of advertising snake oil and miracle cures; advertisements have to be somewhat factual.

If a user does as is advertised and something goes wrong I do believe it’s the advertiser who is liable.

0x0@programming.dev on 22 May 2024 09:00 collapse

But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.

Keyword presumably.

tb_@lemmy.world on 22 May 2024 09:19 next collapse

Right. But can you blame the user for trusting the advertisement?

0x0@programming.dev on 22 May 2024 09:43 next collapse

At the dealership? Kinda, yeah, it’s a dealership and news like this pop up every week.

On the road? I wouldn’t trust my life to any self-driving in this day and age.

SaltySalamander@fedia.io on 22 May 2024 22:18 collapse

I mean, yes. I blame anyone who falls for marketing hype of any kind.

jaybone@lemmy.world on 22 May 2024 13:48 collapse

If the product doesn’t do what it says it does, that’s the product/manufacturer’s fault, not the user’s fault. Wtf, lol, how is this even a debate?

helpmyusernamewontfi@lemmy.today on 22 May 2024 08:54 collapse

Problem is, most people do. Anybody remember Watch Dogs?

darganon@lemmy.world on 22 May 2024 10:19 collapse

There are many quite loud alerts when FSD is active in subpar circumstances about how it is degraded, and the car will slow down. That video was pretty foggy; I’d say the dude wasn’t paying attention.

I came up on a train Sunday evening in the dark, which I hadn’t had happen in FSD, so I decided to just hit the brakes. It saw the crossing arms as blinking stoplights, probably wouldn’t have stopped?

Either way that dude was definitely not paying attention.

noxy@yiffit.net on 22 May 2024 16:42 collapse

Leaving room for user error in this sort of situation is unacceptable at Tesla’s scale and with their engineering talent, as hamstrung as it is by their deranged leadership

SaltySalamander@fedia.io on 22 May 2024 22:20 collapse

If you are in the driver's seat, you are 100% responsible for what your car does. If you let it drive itself into a moving train, that's on you.

noxy@yiffit.net on 22 May 2024 22:44 collapse

I cannot fathom how anyone can honestly believe Tesla is entirely faultless in any of this, completely and totally free of any responsibility whatsoever.

I’m not gonna say they’re 100% responsible but they are at least 1% responsible.

SaltySalamander@fedia.io on 22 May 2024 23:31 collapse

If Tesla is at fault for an inattentive driver ignoring the myriad warnings he got to remain attentive when he enabled FSD and allowing the 2 ton missile he's sitting in to nearly plow into a train, then Dodge has to be responsible for the Challenger being used to plow into those protestors in Charlottesville.

God fucking damn it, why do you people insist on making me defend fucking Tesla?!

noxy@yiffit.net on 22 May 2024 23:40 collapse

Not defending Tesla is free, you can just immediately enjoy the immense benefits of not defending Tesla.

riodoro1@lemmy.world on 22 May 2024 07:10 next collapse

What a bunch of morons people were in 1912 to believe a ship could be unsinkable. Amirite guys?

Pazuzu@midwest.social on 23 May 2024 07:33 collapse

The Titanic probably wouldn’t have sunk if it hit the iceberg head on. Clearly the Tesla simply mistook the train for an iceberg and itself for an ocean-liner and opted for a more ideal collision. The driver should have disabled ‘sea mode’ if they didn’t want that behavior, it’s all clearly spelled out in the owners manual.

cestvrai@lemm.ee on 22 May 2024 07:10 next collapse

As a frequent train passenger, I’m not overly concerned.

Seems a bit too weak to derail, probably only delay.

FangedWyvern42@lemmy.world on 22 May 2024 07:41 next collapse

Every couple of months there’s a new story like this. And yet we’re supposed to believe this system is ready for use…

darki@lemmy.world on 22 May 2024 08:47 next collapse

It is ready because Musk needs it to be ready. Watch out, this comment may bring the morale down, and Elron will be forced to … Cry like a baby 😆

Buffalox@lemmy.world on 22 May 2024 12:57 collapse

Didn’t he recently claim Tesla robotaxi is only months away?
Well I suppose he didn’t say how many months, but the implication was less than a year, which has been his claim every year since 2016.

dustyData@lemmy.world on 22 May 2024 15:10 collapse

He said Teslas were an investment worth hundreds of thousands of dollars because owners would be able to use them as robotaxis when they weren’t driving and charge a small fee, all “by next year”… in 2019. Back then he promised 1 million robotaxis nationwide in under a year. Recently he gave the date August 8 to reveal a new robotaxi model. So, by Cybertruck estimates, I would say a Tesla robotaxi is a possibility by late 2030.

He is just spewing shit to keep the stock price afloat, as usual.

dual_sport_dork@lemmy.world on 22 May 2024 19:01 collapse

He also said they were ready to manufacture the 2nd-generation Tesla Roadster “now,” which was back in 2014. No points for guessing that as of yet (despite taking in millions of dollars in preorders) they have not produced a single one.

Given this very early and still quite relevant warning, I’m astounded that anyone is dumb enough to believe any promise Elon makes about anything.

Thorny_Insight@lemm.ee on 22 May 2024 09:19 next collapse

In what way is it not ready to use? Do cars have some other driver-assist features that are foolproof? You’re not supposed to blindly trust any of those. Why would FSD be an exception? The standards people are applying to it are quite unreasonable.

ammonium@lemmy.world on 22 May 2024 09:59 next collapse

Because it’s called Full Self Drive and Musk has said it will be able to drive without user intervention?

Thorny_Insight@lemm.ee on 22 May 2024 10:25 next collapse

It’s called Full Self Driving (Supervised)

Yeah, it will be able to drive without driver intervention eventually. At least that’s their goal. Right now, however, it’s level 2 and no one is claiming otherwise.

In what way is it not ready to use?

noxy@yiffit.net on 22 May 2024 16:36 collapse

Full Self Driving (sike!)

dream_weasel@sh.itjust.works on 22 May 2024 10:29 collapse

The naming is poor, but in no way does the car represent to you that no intervention is required. It also constantly asks you for input and even watches your eyes to make sure you pay attention.

Honytawk@lemmy.zip on 22 May 2024 17:58 collapse

The car maybe not, but the marketing sure does

dream_weasel@sh.itjust.works on 22 May 2024 18:08 collapse

Marketing besides the naming we have already established and Elon himself masturbating to it? Is there some other marketing that pushes this narrative, because I certainly have not seen it.

Holyginz@lemmy.world on 22 May 2024 10:21 next collapse

No, the standards people are applying to it are the bare minimum for a full self driving system like what musk claims.

Thorny_Insight@lemm.ee on 22 May 2024 10:27 collapse

It’s a level 2 self driving system which by definition requires driver supervision. It’s even stated in the name. What are the standards it doesn’t meet?

piranhaphish@lemmy.world on 22 May 2024 12:24 next collapse

It’s unreasonable for FSD to see a train? … that’s 20ft tall and a mile long? Am I understanding you correctly?

Foolproof would be great, but I think most people would set the bar at least as high as not getting killed by a train.

Thorny_Insight@lemm.ee on 22 May 2024 13:02 collapse

Did you watch the video? It was insanely foggy there. It makes no difference how big the obstacle is if you can’t even see 50 meters ahead of you.

Also, the car did see the train. It just clearly didn’t understand what it was and how to react to it. That’s why the car has a driver who does. I’m sure this exact edge case will be added to the training data so that this doesn’t happen again. Stuff like this takes ages to iron out. FSD is not a finished product. It’s under development and receives constant updates and keeps improving. That’s why it’s classified as level 2 and not level 5.

Yes. It’s unreasonable to expect brand-new technology to be able to deal with every possible scenario a car can encounter in traffic. Just because the concept of a train in fog makes sense to you as a human doesn’t mean it’s obvious to the AI.

piranhaphish@lemmy.world on 22 May 2024 13:39 collapse

In what way is it not ready to use?

To me it seems you just spent three paragraphs answering your own question.

can’t even see 50 meters ahead

didn’t understand what it was and how to react to it

FSD is not a finished product. It’s under development

doesn’t mean it’s obvious to the AI

If I couldn’t trust a system not to drive into a train, I don’t feel like I would trust it to do even the most common tasks. I would drive the car like a fully attentive human and not delude myself into thinking the car is driving me with “FSD.”

Thorny_Insight@lemm.ee on 22 May 2024 13:57 collapse

You can’t see 50 meters ahead in that fog.

piranhaphish@lemmy.world on 22 May 2024 14:42 next collapse

Completely true. And I would dictate my driving characteristics based on that fact.

I would drive at a speed and in a manner that would allow me to not almost crash into things. But especially trains.

Thorny_Insight@lemm.ee on 22 May 2024 14:53 collapse

I agree. In fact I’m surprised the vehicle even lets you enable FSD in that kind of poor visibility and based on the video it seemed to be going quite fast aswell.

Honytawk@lemmy.zip on 22 May 2024 17:57 collapse

LIDAR can

Thorny_Insight@lemm.ee on 22 May 2024 19:20 collapse

Yeah, there’s a wide range of ways to map the surroundings. Road infrastructure, however, is designed for vision, so I don’t see why cameras alone wouldn’t be sufficient. The issue here is not that it didn’t see the train - it’s on video, after all - but that it didn’t know how to react to it.

assassin_aragorn@lemmy.world on 22 May 2024 15:00 next collapse

You’re not supposed to blindly trust any of those. Why would FSD be an exception?

Because that’s how Elon (and by extension Tesla) market it. Full self driving. If they’re saying I can blindly trust their product, then I expect it to be safe to blindly trust it.

And if the fine print says I can’t blindly trust it, they need to be sued or put under legal pressure to change the term, because it’s incredibly misleading.

Thorny_Insight@lemm.ee on 22 May 2024 15:14 collapse

Full Self Driving (Beta), nowadays Full Self Driving (Supervised)

Which of those names invokes enough trust to put your life in its hands?

It’s not in fine print. It’s told to you when you purchase FSD, and the vehicle reminds you of it every single time you enable the system. If you’re looking at your phone, it starts nagging at you, eventually locking you out of the feature. Why would they put a driver monitoring system in place if you’re supposed to put blind faith in it?

That is such an old, beat-up strawman argument. Yes, Elon has said it would be fully autonomous in a year or so, which turned out to be a lie, but nobody today is claiming it can be blindly trusted. That simply is not true.

assassin_aragorn@lemmy.world on 22 May 2024 15:26 next collapse

Unfortunately, companies also have to make their products safe for idiots. If the system is in beta or must be supervised, there should be inherently safe design that prevents situations like this from happening even if an idiot is at the wheel.

Thorny_Insight@lemm.ee on 22 May 2024 15:40 next collapse

ESP is not idiot-proof either, to name one such feature that’s been available for decades. It assists the driver but doesn’t replace them.

Hell, cars themselves are not idiot proof.

SaltySalamander@fedia.io on 22 May 2024 22:14 collapse

Hell, cars themselves are not idiot proof.

Yup, almost always, there's an idiot in the driver seat.

VirtualOdour@sh.itjust.works on 22 May 2024 16:31 collapse

Yeah, and cars should have a system to stop idiots doing dumb things. The best we have is a license, so if that’s good enough for cars without added safety features, it’s good enough for cars with them.

Honytawk@lemmy.zip on 22 May 2024 18:02 collapse

It isn’t Full Self Driving if it is supervised.

It’s especially not Full Self Driving if it asks you to intervene.

It is false advertisement at best, deadly at worst.

Thorny_Insight@lemm.ee on 22 May 2024 19:38 collapse

It’s misleading advertising for sure. At no point have I claimed otherwise.

The meaning of what qualifies as “full self driving” is still up for debate, however. There are worse human drivers on the road than the current version of FSD. It’s by no means flawless, but it’s much better than most people realize. It’s a vehicle capable of self-driving, even if not fully.

noxy@yiffit.net on 22 May 2024 16:18 collapse

Of what words is FSD an acronym?

dream_weasel@sh.itjust.works on 22 May 2024 10:27 collapse

Every couple of months you hear about an issue like this, just like you hear about every airline malfunction. It ignores the base rate of accurate performance, which is very high.

FSD is imperfect but still probably more ready for use than a substantial fraction of human drivers.

buddascrayon@lemmy.world on 22 May 2024 10:58 next collapse

This isn’t actually true. The Tesla full self driving issues we hear about in the news are the ones that result in fatal and near fatal accidents, but the forums are chock full of reports from owners of the thing malfunctioning on a regular basis.

dream_weasel@sh.itjust.works on 22 May 2024 11:12 collapse

It IS actually true. It does goofy stuff in some situations, but on the whole it’s a little better than your typical relatively inexperienced driver. It gets it wrong about when to be assertive and when to wait, it sometimes thinks there’s enough space for a courteous merge when there isn’t (it does some Chicago-style merges), it follows the lines on the road like they’re gospel, and it doesn’t always properly judge how to come to a smooth, comfortable stop. These are annoying things, but not outrageous, provided you are paying attention like you’re obliged to do.

I have it, I use it, and I make lots of reports to Tesla. It is way better than it used to be and still has plenty of room to improve, but a Tesla can’t reboot without having a disparaging article written about it.

Also fuck elon, because I don’t think it gets said enough.

bane_killgrind@lemmy.ml on 22 May 2024 11:36 next collapse

typical relatively inexperienced driver

Look at the rates that teenagers crash, this is an indictment.

provided you are paying attention

It was advertised as fully autonomous dude. People wouldn’t have this much of a hard-on for trashing it if it wasn’t so oversold.

Thorny_Insight@lemm.ee on 22 May 2024 13:08 collapse

This fully-autonomous argument has been beaten to death already. Every single Tesla owner knows you’re supposed to pay attention and be ready to take over when necessary. That is such a strawman argument. Nobody blames the car when automatic braking fails to see the car in front of it. It might save your ass if you’re distracted, but ultimately it’s always the driver who’s responsible. FSD is no different.

Rekorse@lemmy.dbzer0.com on 23 May 2024 02:24 next collapse

You realize it can both be true that the driver is at fault when they crash and that the crash was more likely to happen because Elon constantly contradicts his own marketing team and confuses people.

He literally would take reporters in his car and take his hands off the wheel. He just fundamentally doesn’t care about safety now, and probably won’t later; he just saw a way to make some money.

Pazuzu@midwest.social on 23 May 2024 07:17 collapse

If it’s not fully capable of self driving then maybe they shouldn’t call it full self driving

Thorny_Insight@lemm.ee on 23 May 2024 08:23 collapse

Sure. Make them change the name to something different. I’m fine with that.

Though I still don’t know what most people actually mean by full self driving and how it’s different from what FSD can do right now.

buddascrayon@lemmy.world on 22 May 2024 17:51 collapse

Seriously you sound like a Mac user in the '90s. “It only crashes 8 or 9 times a day, it’s so much better than it used to be. It’s got so many great features that I’m willing to deal with a little inconvenience…” Difference being that when a Mac crashes it just loses some data and has to reboot but when a Tesla crashes people die.

dream_weasel@sh.itjust.works on 22 May 2024 18:05 next collapse

These are serious rate differences man.

Every driver, and even Tesla, will tell you it’s a work in progress, and you’d be hard pressed to find someone who has had an accident with it. I’d be willing to bet money that IF you find someone who has had an accident, they have a driving record that’s shitty without it too.

If you want to talk stats, let’s talk stats, but “it seems like Tesla is in the news a lot for near crashes” is a pretty weak metric, even from your armchair.

Rekorse@lemmy.dbzer0.com on 23 May 2024 02:19 collapse

Is 200-ish crashes and 6 deaths per year too many?

I know it’s an absolute number, but we are asking whether it’s worth sacrificing people for the potential of safer driving later.

Can you explain why you are so confident that this will all be worth it in the end?

Evidence that teslas are more dangerous than other cars: thedrive.com/…/tesla-drivers-have-the-highest-cra…

Evidence for the 200 crashes and 6 deaths a year claim for FSD: theverge.com/…/tesla-autopilot-fsd-nhtsa-investig…

frostysauce@lemmy.world on 22 May 2024 22:37 collapse

Seriously you sound like ~~a Mac user in the '90s~~ a Linux user today.

FTFY

lolcatnip@reddthat.com on 22 May 2024 14:17 collapse

You hear so much about the people Jeffrey Dahmer murdered, but never anything about all the people he didn’t murder!

dream_weasel@sh.itjust.works on 22 May 2024 17:51 collapse

Cute.

Here’s some actual information

People are terrible at probability estimation, and even at two fatal accidents a month, FSD is likely still safer per million miles driven than most of the people on the road.
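
The “per million miles driven” framing is just rate arithmetic; a minimal sketch with illustrative numbers (the FSD figures below are invented, and the human-driver ballpark is only approximate):

```python
def rate_per_million_miles(events, miles):
    """Events (e.g. fatal accidents) per million miles driven."""
    return events / (miles / 1_000_000)

# Hypothetical FSD numbers purely for illustration -- not real Tesla data:
# "two fatal accidents a month" over a year, spread over an assumed fleet mileage.
fsd_rate = rate_per_million_miles(events=24, miles=300_000_000)

# Rough US-wide ballpark: ~40k road deaths over ~3 trillion vehicle miles per year.
human_rate = rate_per_million_miles(events=40_000, miles=3_000_000_000_000)

print(fsd_rate, round(human_rate, 4))
```

Whichever way the comparison comes out depends entirely on the real event and mileage counts, which is exactly the base-rate point: raw headline counts tell you nothing without the denominator.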

lolcatnip@reddthat.com on 23 May 2024 00:52 collapse

I see you’ve decided to be condescending, and also made a falsifiable claim. This is the part where you bring some actual data or STFU.

dream_weasel@sh.itjust.works on 23 May 2024 02:00 collapse

Whatever you say Mr Dahmer joke instead of content. I see that was really all in good faith and maybe I unintentionally hurt your feelings by citing a source on base rate biases?

What data would you like me to bring for discussion since you’ve been so open thus far? Do you want me to bring some data showing that teslas spend more time not having accidents than having accidents? I’m happy to go do some homework to enrich this interaction.

It’s not as though you can just ask Tesla for every case of an FSD crash. The falsifiable claim is just me tossing a number, the point is that memorable bad press and bad stats are not the same.

Rekorse@lemmy.dbzer0.com on 23 May 2024 02:26 collapse

Why would you think there isn’t data on Tesla crashes? Are they hiding their broken cars from bystanders and police or something now?

helpmyusernamewontfi@lemmy.today on 22 May 2024 08:56 next collapse

What’s so “new” about this concern? I’d probably be able to afford a house if I had a dollar for every article I saw about Teslas wrecking or nearly wrecking because of FSD.

0x0@programming.dev on 22 May 2024 08:58 next collapse

TL;DR Tesla driver almost won Darwin award.

TammyTobacco@lemmy.ml on 22 May 2024 14:30 collapse

It was so foggy that I’m not surprised the car couldn’t figure out what was happening. The guy also said his car had driven towards trains twice before, so he’s definitely a dumbass for continuing to use self driving, especially in heavy fog.

skyspydude1@lemmy.world on 22 May 2024 16:38 collapse

If only there was some sort of sensing technology that wasn’t purely optical, that’d be pretty neat. Maybe even using something like radio, for detection and ranging. Too bad no one’s ever come up with something like that.

Akasazh@feddit.nl on 22 May 2024 10:09 next collapse

I don’t see any information about the crossing. Was it a crossing without gates? The sensors must’ve picked that up when driving towards it. If so, it’s a huge oversight not to put up gated crossings nowadays, certainly on busy roads, regardless of the performance of self-driving cars.

Woovie@lemmy.world on 22 May 2024 10:15 next collapse

The video in the article shows lowered arms flashing. Very visible with plenty of time to stop despite the foggy conditions. It just didn’t.

Akasazh@feddit.nl on 22 May 2024 10:16 next collapse

Ah. I’ve read it, but I have media turned off, so I didn’t see the video. Thanks for the clarification!

Woovie@lemmy.world on 22 May 2024 10:18 collapse

Yep of course!

neshura@bookwormstory.social on 22 May 2024 12:56 collapse

Stuff like that happens when you opt for visual obstacle detection instead of lidar.

ElderWendigo@sh.itjust.works on 22 May 2024 15:54 collapse

Not being able to identify a railroad crossing without a gate is a failing of the car not the train. Gated crossings are not guaranteed, nor should they be because they don’t make sense for every situation in which roads and tracks cross.

Akasazh@feddit.nl on 22 May 2024 19:57 next collapse

True, but it would be an exceptional failure if the car missed a gated crossing, as it turns out it was.

pirat@lemmy.world on 22 May 2024 21:19 next collapse

they don’t make sense for every situation in which roads and tracks cross.

www.youtube.com/watch?v=peXry-_B87g

PipedLinkBot@feddit.rocks on 22 May 2024 21:19 collapse

Here is an alternative Piped link(s):

https://www.piped.video/watch?v=peXry-_B87g

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

sp3tr4l@lemmy.zip on 22 May 2024 10:55 next collapse

This is horrible!

Obviously the Tesla’s cold gas thrusters must be malfunctioning! The Fully Autonomous Only Non Insane Car AutoPilot was clearly going to jump over the train Speed Racer style!

Thank goodness the driver realized the thruster failure light was on and was able to avoid the worst.

Edit: This is sarcasm! I hate using /s because I’d like to believe people can tell, but I guess not this time…

buddascrayon@lemmy.world on 22 May 2024 10:55 next collapse

When you look at the development notes on the self-driving at Tesla, anyone with a brain wouldn’t trust that shit, not even a little bit. Most of what they did was placate Musk’s petty whims and delusions. Any real R&D issues were basically glossed over or given quick software fixes.

Scolding7300@lemmy.world on 22 May 2024 14:39 next collapse

Are there notes available to read?

buddascrayon@lemmy.world on 22 May 2024 17:45 collapse

Here’s one of the few articles that wasn’t paywalled when I pulled up a Google search on it. It’s really not hard to find; the stories are all over the place.

businessinsider.com/elon-musk-tesla-autopilot-fsd…

VirtualOdour@sh.itjust.works on 22 May 2024 16:27 collapse

Demonstrate what you mean because it really sounds like you’re describing what you feel should be true to justify your emotions about the bad Twitter man.

And to be clear, I mean link the documents you claim to have read and the parts of them you claim demonstrate this.

buddascrayon@lemmy.world on 22 May 2024 17:44 collapse

Just need to Google “Tesla self-driving development engineers and Elon Musk”, and you’ll find lots of articles. Here’s one of the few that wasn’t paywalled.

businessinsider.com/elon-musk-tesla-autopilot-fsd…

WoahWoah@lemmy.world on 23 May 2024 00:50 next collapse

This is an article from 2021 about a book researched in 2019.

buddascrayon@lemmy.world on 23 May 2024 01:44 collapse

Yeah, during development of the Tesla self driving system.

WoahWoah@lemmy.world on 23 May 2024 02:37 collapse

Read the development notes from the first years of any technology you use. The research you’re “referencing” is six years old at this point.

What’s next? You going to criticize an iPod Nano to make a point about the broken screen on your iPhone 8? Criticize Google assistant from 2019 to harangue OpenAI?

Look at what six years of development means: youtu.be/qTDlRLeDxxM?si=dFZzLcO_a8wfy2QS

buddascrayon@lemmy.world on 23 May 2024 10:47 collapse

It’s not about how well or badly it worked when they were developing it, it’s about the developing process. It’s about the fact that they had to appease Elon Musk’s ego in all aspects of developing the self drive system. To a disastrous degree.

And again, there is a world of difference between the iPhone or OpenAI or Google Assistant not working right and a car driving itself not working right, because when those other things don’t work, nobody dies or gets hurt. But a car can maim and kill people extremely easily.

VirtualOdour@sh.itjust.works on 23 May 2024 10:45 collapse

That’s a very old article about even older opinions, now totally outdated as shown by statements like:

Almost five years on, Tesla still hasn’t completed its autonomous road trip — no car company has even come close.

You’re using unsubstantiated statements from the start of development, which is totally different from what you claimed before being asked for a source.

Current development FSD has hit huge milestones which competitors have not.

misterundercoat@lemmy.world on 22 May 2024 11:34 next collapse

He fortunately avoided the train, but unfortunately still owns a Tesla.

werefreeatlast@lemmy.world on 22 May 2024 12:32 next collapse

Oh! As a token of ah…of…aah… a knowledge mental acknowledgement, we the US people would like to gift this here Tesla to you all, Putin, and Iran leadership. You get a Tesla and you get a Tesla…and you get a Tesla!

Buffalox@lemmy.world on 22 May 2024 12:48 next collapse

Obvious strong blinking red light ahead, obvious train passing ahead…

Tesla FSD: Hmmm let’s not even slow down, I don’t see any signs of problems.

FSD is an acronym for Fool Self Driving.

nyan@lemmy.cafe on 22 May 2024 13:36 next collapse

Are there any classes of object left that Tesla FSD has not either hit or almost hit? Icebergs, maybe?

WhiskyTangoFoxtrot@lemmy.world on 22 May 2024 17:04 collapse

youtu.be/xll0VOsiE84

PipedLinkBot@feddit.rocks on 22 May 2024 17:04 collapse

Here is an alternative Piped link(s):

https://piped.video/xll0VOsiE84

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

nifty@lemmy.world on 22 May 2024 14:36 next collapse

For now, cars need more than computer vision to navigate, because adding cameras by itself doesn’t help a car orient itself spatially in its environment. What might help? I think the consensus is that the cameras need a 360-degree view of the surroundings, and the car needs a method for making sense of these inputs without that understanding being the focus of attention.

It seems Teslas do add sensors in appropriate locations to be able to do that, but there’s some disconnect in reconciling the information: notateslaapp.com/…/tesla-guide-number-of-cameras-…. A multi-modal sensing system would bypass reliance on getting everything right via CV.

Think of focusing on an object in the distance and moving toward it: while you’re using your eyes to look at it, you’re subconsciously computing relative distance and speed as you approach. It’s your subconscious memory of your 3D spatial orientation that helps you make corrections and adjustments to your speed and approach. Short of better hardware that can reconcile these different inputs, relying on multiple sensor inputs makes for the most robust approach for autonomous vehicles.

Humans essentially keep track of their body in 3D space and time without thinking about it, and actually most multicellular organisms have learned to do this in some manner.
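To sketch the idea (toy numbers, nothing Tesla-specific): the standard way to reconcile multiple noisy sensors is to weight each estimate by how much you trust it, Kalman-filter style, so a fogged-out camera gets outvoted by radar automatically:

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> tuple[float, float]:
    """Inverse-variance weighted fusion of two independent distance estimates.

    The noisier sensor (larger variance) contributes proportionally less.
    """
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is tighter than either input
    return fused, fused_var

# Fog-blinded camera guesses 40 m with variance 100 m^2; radar reports 62 m
# with variance 4 m^2. The fused distance lands near the radar's answer.
dist, var = fuse(40.0, 100.0, 62.0, 4.0)
```

With a camera-only stack there’s nothing to fuse; whatever the vision model hallucinates in the fog is the answer.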

itsonlygeorge@reddthat.com on 22 May 2024 14:47 next collapse

Tesla opted not to use LIDAR as part of its sensor package and instead relies on cameras, which are not enough to determine accurate location data for other cars, trains, etc.

This is what you get when billionaires cheap out on their products.

skyspydude1@lemmy.world on 22 May 2024 16:30 next collapse

Not only that, but they took out the radar, which, while it has its own flaws, would have had no issue seeing the train through the fog. They claimed it was because they had “solved vision” and didn’t need it anymore, but that’s bullshit, and their engineering team knew it. They were in the middle of sourcing a new radar, but because of supply chain limitations (like everyone in 2021) with both their old and potential new suppliers, waiting would have broken their “infinite growth” narrative and fElon wouldn’t get his insane pay package. They knew for a fact it would significantly hurt performance, but did it anyway so line could go up.

While no automotive company’s hands are particularly clean, the sheer level of willful negligence at Tesla is absolutely astonishing. I’ve seen and heard so many stories about their shitty engineering practices that the only impressive thing is how relatively few people have died as a direct result of their lax attitude towards basic safety practices.

Imalostmerchant@lemmy.world on 22 May 2024 17:23 next collapse

I never understood Musk’s reasoning for this decision. From my recollection it was basically “how do you decide who’s right when lidar and camera disagree?” And it felt so insane to say that the solution to conflicting data was not to figure out which is right but only to listen to one.

Jakeroxs@sh.itjust.works on 22 May 2024 18:46 next collapse

Also, LIDAR is more expensive than cameras, which means a higher end-user price, as far as I remember.

Imalostmerchant@lemmy.world on 22 May 2024 19:14 collapse

I wasn’t sure if he admitted that as being the reason (even though it obviously is)

Jakeroxs@sh.itjust.works on 22 May 2024 20:07 collapse

I thought that was his main justification, idk tho, I don’t listen to the earnings calls or interviews myself lol

Cornpop@lemmy.world on 22 May 2024 22:24 next collapse

All about saving a buck.

wirehead@lemmy.world on 22 May 2024 23:37 next collapse

I mean, I think he’s a textbook example of why not to do drugs and why we need to eat the rich, but I can understand the logic here.

When you navigate a car as a human, you are using vision, not LIDAR. Outside of a few edge cases, you aren’t even using parallax to judge distances. Ergo, a LIDAR is not going to see the text on a sign, the reflective stripes on a truck, etc. And it gets confused differently than the eye, absorbed by different wavelengths, etc. And you can jam LIDAR if you want. Thus, if we were content to wait until the self-driving-car is actually safe before throwing it out into the world, we’d probably want the standard to be that it navigates as well as a human in all situations using only visual sensors.

Except, there’s some huge problems that the human visual cortex makes look real easy. Because “all situations” means “understanding that there’s a kid playing in the street from visual cues so I’m going to assume they are going to do something dumb” or “some guy put a warning sign on the road and it’s got really bad handwriting”

Thus, the real problem is that he’s not using LIDAR as harm reduction for a patently unsafe product, where the various failure modes of the LIDAR-equipped self-driving cars show that those aren’t safe either.

smonkeysnilas@feddit.de on 23 May 2024 07:43 collapse

I mean, the decision was stupid from an engineering point of view, but the reasoning is not entirely off. Basically it follows the biological example: if humans can drive without lidar, using only their eyes, then that’s proof it is possible somehow. It’s only that current computer vision and AI tech is way worse than humans. Elon chose to ignore this, basically arguing that it is merely a software problem for his developers to figure out. I guess in reality it is a bit more complex.

Wrench@lemmy.world on 22 May 2024 17:55 collapse

LIDAR would have similarly been degraded in the foggy conditions that this occurred in. Lasers are light too.

While I do think Tesla holds plenty of responsibility for their intentionally misleading branding in FSD, as well as cost saving measures to not include lidar and/or radar, this particular instance boils down to yet another shitty and irresponsible driver.

You should not be relying on FSD over train tracks. You should not be allowing FSD to be going faster than conditions allow. Dude was tearing down the road in thick fog, way faster than was safe for the conditions.

WoahWoah@lemmy.world on 23 May 2024 00:43 next collapse

Well said.

FreddyDunningKruger@lemmy.ml on 23 May 2024 01:20 next collapse

One of the first things you learn to get your driver’s license is the Basic Speed Law, you must not drive faster than the driving conditions would allow. If only Full Self Driving followed the law and reduced its max speed based on the same.
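As a toy illustration (made-up numbers, obviously not any real FSD logic), “don’t outdrive your visibility” is a one-line physics calculation:

```python
def max_safe_speed(posted_limit_kph: float, visibility_m: float) -> float:
    """Cap speed so the car can stop within the distance it can actually see.

    Stopping distance is roughly v^2 / (2 * a); with an assumed full-braking
    deceleration of ~7 m/s^2, solve for v given the visible distance, then
    never exceed the posted limit.
    """
    decel = 7.0  # m/s^2, rough dry-pavement braking (assumption)
    v_ms = (2.0 * decel * visibility_m) ** 0.5
    return min(posted_limit_kph, v_ms * 3.6)

# With only 40 m of visibility in fog, ~85 kph is the ceiling even on a 100 kph road.
print(round(max_safe_speed(100.0, 40.0)))  # 85
```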

Rekorse@lemmy.dbzer0.com on 23 May 2024 01:58 collapse

If you were to take that rule strictly, you shouldn’t allow FSD to drive at all, as at any speed it’s more dangerous than the person driving it (given an average driver who’s not intoxicated).

Rekorse@lemmy.dbzer0.com on 23 May 2024 01:55 next collapse

A Tesla driver might get the impression that the car’s “opinion” is better than their own, which could cause them to hesitate before intervening, or to let the car drive in a way they’re uncomfortable with.

The misinformation about the car reaches the level of negligence because even smart people are being duped by this.

Honestly I think some people just don’t believe someone could lie so publicly, loudly, and often, that it must be something else besides a grift.

Pazuzu@midwest.social on 23 May 2024 07:05 collapse

Maybe it shouldn’t be called full self driving if it’s not fully capable of self driving

MacStache@programming.dev on 22 May 2024 14:52 next collapse

Those trains sure are weird and confusing, with their going back and forth on those tracks and all. Makes you wonder about train safety, it does!

n3m37h@sh.itjust.works on 22 May 2024 15:01 next collapse

10x safer than a human!

// added a fun link

angelmountain@feddit.nl on 22 May 2024 15:07 next collapse

I don’t want to disagree, but I would like a source to support this claim

raspberriesareyummy@lemmy.world on 22 May 2024 15:09 next collapse

That exclamation point in the comment you replied to should be your hint that it’s sarcasm.

n3m37h@sh.itjust.works on 22 May 2024 18:42 collapse

No, Musk said this at one point at some press conference

raspberriesareyummy@lemmy.world on 22 May 2024 21:19 collapse

Not the person I was replying to but okay… what has that got to do with the sarcasm of the comment I was referring to?

n3m37h@sh.itjust.works on 22 May 2024 23:15 collapse

Look @ original comment, Musk has stated many a time that FSD is safer than human drivers and I’m pretty sure at one point he said it was 10x safer… No sarcasm in that statement.

Oh here

n3m37h@sh.itjust.works on 22 May 2024 17:50 next collapse

Musk has stated multiple times that FSD is safer than human driving. I’m not gonna bother finding the vids as I’m at work.

n3m37h@sh.itjust.works on 22 May 2024 23:16 collapse

motherfrunker.ca/fsd/

Here ya go someone else did it for me

Gsus4@mander.xyz on 22 May 2024 15:41 collapse

*a drunken human

ArtemisimetrA@lemmy.duck.cafe on 22 May 2024 15:25 next collapse

Let them earn their Darwin awards

explodicle@sh.itjust.works on 22 May 2024 19:54 collapse

I’d rather see some Free Market Darwinism™ in the form of a lawsuit.

ArtemisimetrA@lemmy.duck.cafe on 22 May 2024 21:05 collapse

Yeah I guess I’d be ok with that. I may have lost faith in our judicial system to get that shit done

Furbag@lemmy.world on 22 May 2024 15:57 next collapse

Oh boy, and they just removed “steering wheel nag” in a recent update. I can’t imagine that will have any unintended consequences.

WoahWoah@lemmy.world on 23 May 2024 00:47 collapse

Not really. They just removed unprompted nag. If you’re not constantly keeping a hand on the wheel and looking at the road, it nags more and will pull you over if you ignore it.

If you turn off the internal driver monitoring camera, you can’t engage FSD or even use lane assist.

noxy@yiffit.net on 22 May 2024 16:39 next collapse

Feels like these things were more capable a decade ago when they had radar.

Not that they should be called “full self driving” either then or now, but at least radar can deal with fog better than regular ass cameras.

Honytawk@lemmy.zip on 22 May 2024 17:07 next collapse

i.imgur.com/ccksjT7.mp4

Jakeroxs@sh.itjust.works on 22 May 2024 18:48 collapse

This is showing it works or no? I can’t tell and there isn’t audio, it seems like it would be stopped correctly.

SaltySalamander@fedia.io on 22 May 2024 21:59 collapse

Definitely shows it working.

Jakeroxs@sh.itjust.works on 22 May 2024 22:45 collapse

Yeah, I was thinking maybe the weird flashing lights on the screen were being pointed to as not working right or something to that effect? Idk lol, no context provided at all.

Ozymati@lemmy.nz on 22 May 2024 21:19 next collapse

Guess it gained self awareness and realised it was a tesla

enleeten@discuss.online on 22 May 2024 23:22 next collapse

“Your father’s a Cybertruck!”

Etterra@lemmy.world on 23 May 2024 03:07 collapse

TFW even your Tesla thinks your Elon fanboy tweets are insufferable.

[deleted] on 22 May 2024 23:48 next collapse

.

Fades@lemmy.world on 23 May 2024 00:12 collapse

What a horrible thing to say, especially since Elon and Tesla have only relatively recently turned to absolute shit. There are a lot of Tesla drivers that don’t support what he has done to the company and all that.

Here you are advocating for the death of people because they purchased a vehicle. A lot of people bought Teslas as they were one of the better EVs at the time during Tesla’s climb to their peak (which they have since fallen very far from). They too deserve death?

captain_aggravated@sh.itjust.works on 23 May 2024 00:20 collapse

Here you are advocating for the death of people because they purchased a vehicle.

No; I’m expressing the same sentiment that I express for motorcycle riders that refuse to wear a helmet. I really, genuinely don’t care if they beat their brains out on the front bumper of a Hyundai, but I don’t think they get to force a Hyundai driver to hose brains off their car.

Teslas are death traps. Their owners can make that choice for themselves but I don’t think they get to make it for others, which is what they try to do every time they turn on that self-driving feature.

lemmyhavesome@lemmy.world on 23 May 2024 01:52 next collapse

Full Self Demolition

uebquauntbez@lemmy.world on 23 May 2024 07:56 next collapse

Hyperloops … hype … oops

gardylou@lemmy.world on 23 May 2024 08:03 collapse

“Sorry about your dead husband, trains weren’t in the training data. Our bad. Anyway, his loss is not in vain, as now that our engineers are aware that trains can be a potential driving hazard, we are going to fix this soon in a future software update.”