Tesla In 'Self-Drive Mode' Hit By Train After Turning Onto Train Tracks (www.jalopnik.com)
from Davriellelouna@lemmy.world to technology@lemmy.world on 23 Jun 15:11
https://lemmy.world/post/31867381

#technology

threaded - newest

Zwuzelmaus@feddit.org on 23 Jun 15:16 next collapse

Teslas do still have steering wheels, after all

You don’t say!

skvlp@lemm.ee on 23 Jun 15:35 next collapse

Working as expected then.

DarrinBrunner@lemmy.world on 23 Jun 15:45 next collapse

Can’t wait to hop in a Robotaxi! /s

What’s that? They’ll have human drivers in them? Still maybe no.

finitebanjo@lemmy.world on 23 Jun 15:51 collapse

Pretty sure this one also had a driver in it.

A family in Pennsylvania

TheFeatureCreature@lemmy.ca on 23 Jun 15:50 next collapse

Tesla’s self-driving is pretty shite, but they seem to have a particular problem with railway crossings, as also pointed out in the article. Of all the obstacles for the self-driving system to fail to detect, several thousand tons of moving steel is probably one of the worst.

kautau@lemmy.world on 23 Jun 16:47 next collapse

Maybe if they used LIDAR like they should have, instead of just cameras, it wouldn’t be such an issue, but they’re determined to minimize costs and maximize profits at the expense of consumers, as are all publicly traded companies

Empricorn@feddit.nl on 23 Jun 21:06 collapse

You don’t understand. Musk likes how they look, we can’t disturb that for “safety”!

kautau@lemmy.world on 23 Jun 21:15 collapse

Or it no longer has anything to do with making a vehicle look cool.

<img alt="" src="https://lemmy.world/pictrs/image/4d35f494-3c69-447c-80dc-097e331e41b9.webp">

The Lucid Air is equipped with up to 32 on-board sensors, including long-range Lidar, short-range radar, and surround-view monitoring cameras.

It’s because Musk treats all his businesses like startups, and no matter how successful they get, in the interest of “trimming the fat” he’d rather keep people buying inferior products at a higher profit margin than think about better investment and long-term growth, just like many companies.

PattyMcB@lemmy.world on 23 Jun 17:15 collapse

That, and little kids… and motorcycles… and school busses

wizardbeard@lemmy.dbzer0.com on 23 Jun 20:19 collapse

Don’t forget “styrofoam walls painted to look like tunnels”. Fucking looney tunes.

spankmonkey@lemmy.world on 23 Jun 15:52 next collapse

Paraphrasing:

“We only have the driver’s word they were in self driving mode…”

“This isn’t the first time a Tesla has driven onto train tracks…”

Since it isn’t the first time I’m gonna go ahead and believe the driver, thanks.

XeroxCool@lemmy.world on 23 Jun 16:06 next collapse

The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up gas/brake pedals in panic, downright lying to evade a speeding ticket, etc. were the cause in many cases.

Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don’t like a company? I don’t think so. But if Tesla has proof FSD was off, we’ll know in a minute when they invade the driver’s privacy and release driving events

spankmonkey@lemmy.world on 23 Jun 16:10 next collapse

Tesla has constantly lied about their FSD for a decade. We don’t trust them because they are untrustworthy, not because we don’t like them.

BlueLineBae@midwest.social on 23 Jun 16:26 next collapse

I have no sources for this so take it with a grain of salt… But I’ve heard that Tesla turns off self-driving just before an accident so they can say it was the driver’s fault. Now in this case, if it was on while it drove onto the tracks, I would think that would prove it’s Tesla’s faulty self-driving plus human error for not correcting it. Either way it would prove it was partly Tesla’s fault if it was on at the time.

SoleInvictus@lemmy.blahaj.zone on 23 Jun 16:32 next collapse

That would require their self driving algorithm to actually detect an accident. I doubt it’s capable of doing so consistently.

spankmonkey@lemmy.world on 23 Jun 16:35 next collapse

On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, IF turning onto the tracks wasn’t a drop-down of the same depth as the rails. Someone who is caught off guard isn’t going to be able to turn a passenger car off the tracks, because the rails are tall and there’s no angle the wheels can get to climb over them.

So while in a perfect world the driver would have slammed on the brakes before the car got onto the tracks, once even the front wheels were on the tracks (because they weren’t fast enough) it may have been impossible to recover, and going forward might have been their best bet. Depends on how the track crossing is built.

roguetrick@lemmy.world on 23 Jun 17:03 next collapse

I guess I’m a train now.

ayyy@sh.itjust.works on 23 Jun 17:50 collapse

If you’re about to be hit by a train, driving forward through the barrier is always the correct choice. It will move out of the way and you stay alive to fix the scratches in your paint.

spankmonkey@lemmy.world on 23 Jun 17:58 next collapse

Maybe you should read the article.

ayyy@sh.itjust.works on 23 Jun 17:59 collapse

I meant more in the general sense, I recognize that cars can get stuck places.

Ledericas@lemm.ee on 24 Jun 07:51 collapse

not if you’re in a tesslar.

meco03211@lemmy.world on 23 Jun 16:38 next collapse

Pretty sure they can tell the method used when disengaging FSD/AP, so they would know if it was manually turned off or if the system lost enough info and shut itself down. They should be able to tell, to within a few seconds of accuracy, the order of events. I can’t imagine a scenario that wouldn’t be blatantly obvious where the Tesla was able to determine an accident was imminent and shut off FSD/AP with enough time to “blame it on the driver”. What might be possible is that the logs show FSD shut off like a millisecond before impact, and then someone merely reported that FSD was not engaged at the time of the accident. Technically true, and Tesla lawyers might fight like hell to maintain that theory, but if an independent source is able to review the logs, I don’t see that being a possibility.

pixeltree@lemmy.blahaj.zone on 23 Jun 16:46 collapse

Of course they know, they’re using it to hide the truth. Stop giving a corporation the benefit of the doubt where public safety is concerned, especially when they’ve been shown to abuse it in the past

AA5B@lemmy.world on 23 Jun 17:36 collapse

They supposedly also have a threshold, like ten seconds - if FSD cuts out less than that threshold before the accident, it’s still FSD’s fault

AA5B@lemmy.world on 23 Jun 17:34 collapse

They promote it in ways that make people sometimes trust it too much… but in particular when it comes to releasing telemetry, I don’t remember that ever being an accusation

ayyy@sh.itjust.works on 23 Jun 17:47 collapse

It’s more about when they don’t release it/only selectively say things that make them look good and staying silent when they look bad.

lka1988@lemmy.dbzer0.com on 23 Jun 16:16 next collapse

The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up gas/brake pedals in panic, downright lying to evade a speeding ticket, etc were cause for many cases.

I owned an FJ80 Land Cruiser when that happened. I printed up a couple stickers for myself, and for a buddy who owned a Tacoma, that said “I’m not speeding, my pedal’s stuck!” (yes I’m aware the FJ80 was slow as dogshit, that didn’t stop me from speeding).

aramis87@fedia.io on 23 Jun 17:22 collapse

How is a manufacturer going to be held responsible for their flaws when musk DOGE'd <img alt="every single agency" src="https://i.imgur.com/t9AGtW7.jpg"> investigating his companies?

NuXCOM_90Percent@lemmy.zip on 23 Jun 16:37 next collapse

I mean… I have seen some REALLY REALLY stupid drivers, so I could totally see multiple people thinking they found a shortcut, or not realizing the road they are supposed to be on is 20 feet to the left and there is a reason their phone is losing its shit, all while their suspension is getting destroyed.

But yeah. It is the standard tesla corp MO. They detect a dangerous situation and disable all the “self driving”. Obviously because it is up to the driver to handle it and not because they want the legal protection to say it wasn’t their fault.

AA5B@lemmy.world on 23 Jun 17:42 collapse

At my local commuter rail station the entrance to the parking lot is immediately next to the track. It’s easily within margin of error for GPS, and if you’re only focusing immediately in front of you, the pavement at the entrance probably looks similar.

There are plenty of cues, so drivers shouldn’t be fooled, but perhaps FSD wouldn’t pay attention to them since it’s a bit of an outlier.

That being said, I almost got my Subaru stuck once because an ATV trail looked like the dirt road to a campsite from the GPS, and I missed any cues there may have been

NuXCOM_90Percent@lemmy.zip on 23 Jun 17:51 next collapse

You uh… don’t need to tell people stuff like that.

XeroxCool@lemmy.world on 24 Jun 04:55 collapse

Sounds reasonable to mix up dirt roads at a campsite. Idk why the other commenter had to be so uptight. I get the mixup in the lot if it’s all paved and smooth, especially if say you make a left into the lot and the rail has a pedestrian crossing first. Shouldn’t happen, but there’s significant overlap in appearance of the ground. The average driver is amazingly inept, inattentive, and remorseless.

I’d be amused if your lot is the one I know of where the train pulls out of the station, makes a stop for the crosswalk, then proceeds to just one other station.

But the part of rail that’s not paved between? That should always be identifiable as a train track. I can’t understand when people just send it down the tracks. And yet, it still happens. Even at the station mentioned above where they pulled onto the 100mph section. Unreal.

Pika@sh.itjust.works on 23 Jun 16:56 next collapse

Furthermore, with the amount of telemetry that those cars have, the company knows whether it was in self-drive or not when it went onto the track. So the fact that they didn’t go public saying it wasn’t means that it was in self-drive mode, and they want to save PR face and avoid liability.

IphtashuFitz@lemmy.world on 23 Jun 17:04 next collapse

I have a nephew that worked at Tesla as a software engineer for a couple years (he left about a year ago). I gave him the VIN to my Tesla and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information then clearly they are logging a LOT of data.

atomicbocks@sh.itjust.works on 23 Jun 18:54 collapse

Modern cars (in the US) are required to have an OBD-II Port for On-Board Diagnostics. I always assumed most cars these days were just sending some or all of the real-time OBD data to the manufacturer. GM definitely has been.

Pika@sh.itjust.works on 23 Jun 19:30 collapse

Dude, in today’s world we’re lucky if they stop at the manufacturer. I know of a few insurances that have contracts through major dealers and they just automatically get the data that’s registered via the cars systems. That way they can make better decisions regarding people’s car insurance.

Nowadays it’s a red flag if you join a car insurance company and they don’t offer to give you a discount if you put something like drive pass on, which logs your driving, because it probably means that your car is already sending that data to them.

CmdrShepard49@sh.itjust.works on 24 Jun 23:31 collapse

We just got back from a road trip in a friend’s '25 Tundra and it popped up a TPMS warning for a faulty sensor then minutes later he got a text from the dealership telling him about it and to bring it in for service.

catloaf@lemm.ee on 23 Jun 17:33 collapse

I’ve heard they also like to disengage self-driving mode right before a collision.

sturmblast@lemmy.world on 23 Jun 17:35 next collapse

That sounds a lot more like a rumor to me… it would be extremely suspicious and would leave them open to GIGANTIC liability issues.

sem@lemmy.blahaj.zone on 23 Jun 17:37 next collapse

It’s been well documented. It lets them say in their statistics that the owner was in control of the car during the crash

sturmblast@lemmy.world on 23 Jun 23:59 collapse

That’s my whole point

ayyy@sh.itjust.works on 23 Jun 17:39 next collapse

How so? The human in the car is always ultimately responsible when using level 3 driver assists. Tesla does not have level 4/5 self-driving and therefore doesn’t have to assume any liability.

Pika@sh.itjust.works on 23 Jun 19:12 next collapse

This right here is another fault in regulation that will eventually catch up, because especially with level three, where it’s primarily the vehicle driving and the driver just gives periodic input, it’s not the driver that’s in control most of the time. It’s the vehicle, so it should not be the driver at fault.

Honestly, I think everything up to level two should be the driver’s fault, because those levels require constant driver input. However, level three conditional driving and higher should be considered the company’s liability, unless the company can prove that the autonomous control handed control back to the driver in a human-capable manner (i.e. not within the last second, like Tesla currently does)

sturmblast@lemmy.world on 23 Jun 23:59 collapse

If you are monkeying with the car right before it crashes… wouldn’t that raise suspicion?

catloaf@lemm.ee on 23 Jun 17:39 collapse

In the report, the NHTSA spotlights 16 separate crashes, each involving a Tesla vehicle plowing into stopped first responders and highway maintenance vehicles. In the crashes, it claims, records show that the self-driving feature had “aborted vehicle control less than one second prior to the first impact”

futurism.com/tesla-nhtsa-autopilot-report

sylver_dragon@lemmy.world on 23 Jun 17:50 collapse

That actually sounds like a reasonable response. Driving assist means that a human is supposed to be attentive to take control. If the system detects a situation where it’s unable to make a good decision, dumping that decision on the human in control seems like the closest they have to a “fail safe” option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a “human take the wheel” followed by a “slam the brakes” if no input is detected in 2-3 seconds. While an emergency stop isn’t always the right choice, it probably beats leaving a several ton metal object hurtling along uncontrolled in nearly every circumstance.

elucubra@sopuli.xyz on 23 Jun 18:08 next collapse

I don’t know if that is still the case, but a lot of electronic stuff in the US had warnings, with pictures, like “don’t put it in the bath”, and the like.

People are dumb, and you should take that into account.

catloaf@lemm.ee on 23 Jun 18:20 next collapse

Yeah but I googled it after making that comment, and it was sometimes less than one second before impact: futurism.com/tesla-nhtsa-autopilot-report

zaphod@sopuli.xyz on 23 Jun 18:24 next collapse

That actually sounds like a reasonable response.

If you give the driver enough time to act, which tesla doesn’t. They turn it off a second before impact and then claim it wasn’t in self-driving mode.

whotookkarl@lemmy.world on 23 Jun 20:41 collapse

Not even a second, it’s sometimes less than 250-300ms. If I wasn’t already anticipating it to fail and disengage as it went through the 2-lane-wide turn, I would have gone straight into oncoming traffic

nthavoc@lemmy.today on 23 Jun 19:04 collapse

So, maybe a “human take the wheel” followed by a “slam the brakes” if no input is detected in 2-3 seconds.

I have seen reports where the Tesla logic appears to be “Human, take the wheel, since the airbag is about to deploy in the next 2 microseconds after solely relying on camera object detection, and this is totally YOUR fault, kthxbai!” If there were an option to let the bot physically bail out of the car as it rolls you onto the tracks while you’re still sitting in the passenger seat, that’s how I’d envision this autopilot safety function working.

Mouselemming@sh.itjust.works on 23 Jun 17:21 next collapse

Since the story has 3 separate incidents where “the driver let their Tesla turn left onto some railroad tracks” I’m going to posit:

Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.

Prove me wrong, Tesla

AA5B@lemmy.world on 23 Jun 17:33 next collapse

I mean… Tesla self-driving allegedly did this three times in three years, but we don’t yet have public data to verify that’s what happened, nor do we in any way compare it to what human drivers do.

Although one of the many ways I think I’m an above average driver (just like everyone else) is that people do a lot of stupid things at railroad crossings and I never would

Mouselemming@sh.itjust.works on 23 Jun 18:06 collapse

I’m pretty sure Tesla self-drive does a lot of stupid things you never would, too. That’s why they want you at the wheel, paying attention and ready to correct it in an instant! (Which defeats the whole benefit of self-drive mode imho, but whatever)

The fact that they can avoid all responsibilities and blame you for their errors is of course the other reason.

Tarquinn2049@lemmy.world on 23 Jun 19:22 collapse

Map data obtained and converted from other formats often ends up accidentally condensing labeling categories. One result is train tracks being categorized as generic roads instead of retaining their specific sub-heading. Another, unrelated to this, but common for people that play geo games is when forests and water areas end up being tagged as the wrong specific types.

Mouselemming@sh.itjust.works on 23 Jun 19:49 collapse

Aha. But that sounds correctable… So not having any people assigned to checking on railroads and making sure the system recognizes them as railroads would be due to miserliness on the part of Tesla then… And might also say something about why some Teslas have been known to drive into bodies of water (or children, but that’s probably a different instance of miserliness)

TheKingBee@lemmy.world on 23 Jun 22:07 collapse

Maybe I’m missing something, but isn’t it trivial to take it out of their bullshit dangerous “FSD” mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?

shaggyb@lemmy.world on 23 Jun 22:19 next collapse

Yes.

You hit the brake.

TachyonTele@piefed.social on 24 Jun 00:20 collapse

Ideally you hit the brakes before buying the Tesla.

spankmonkey@lemmy.world on 23 Jun 22:28 collapse

On some railroad crossings you might only need to go off the crossing to get stuck in the tracks and unable to back out. Trying to get out is another 30-40 feet.

Being caught off guard when the car isn’t supposed to do that is how to get stuck in the first place. Yeah, terrible driver trusting shit technology.

XeroxCool@lemmy.world on 23 Jun 15:58 next collapse

If only there was a way to avoid the place where trains drive.

I checked first. They didn’t make a turn into a crossing. It turned onto the tracks. Jalopnik says there’s no official statement that it was actually driving under FSD(elusion) but if it was strictly under human driving (or FSD turned itself off after driving off) I guarantee Tesla will invade privacy and slander the driver by next day for the sake of court of public opinion

egrets@lemmy.world on 23 Jun 19:17 collapse

They didn’t make a turn into a crossing. It turned onto the tracks.

Just to be clear for others, it did so at a crossing. That’s still obviously not what it should have done and it’s no defence of the self-driving feature, but I read your comment as suggesting it had found its way onto train tracks by some other route.

XeroxCool@lemmy.world on 23 Jun 22:27 collapse

Thanks. I could have clarified better myself. I meant “didn’t turn from a rail-parallel road onto a crossing to be met by a train it couldn’t reasonably detect due to bad road design”

Rooskie91@discuss.online on 23 Jun 16:47 next collapse

How symbolic

billwashere@lemmy.world on 23 Jun 16:59 next collapse

And who is going to willingly get into a Tesla Taxi?!?!

PattyMcB@lemmy.world on 23 Jun 17:12 next collapse

You still don’t have to get in for one to hit you. I ride a motorcycle and I’m always sketched out when there’s a Tesla behind me

billwashere@lemmy.world on 23 Jun 18:12 collapse

Very true.

Empricorn@feddit.nl on 23 Jun 21:08 collapse

I won’t get in one because I’m not giving a single dollar of business to Musk. He can go jump up his own asshole.

billwashere@lemmy.world on 23 Jun 22:09 collapse

Also a very valid point.

Bebopalouie@lemmy.ca on 23 Jun 17:06 next collapse

At this point, if anybody buys one of these vehicles from Tesla, they absolutely deserve what they get. It is absurd.

jsomae@lemmy.ml on 23 Jun 17:10 next collapse

Self-driving not being reliable yet is one of the biggest disappointments of the last decade.

explodicle@sh.itjust.works on 23 Jun 18:37 collapse

What did we even do all those ReCAPTCHAs for

apfelwoiSchoppen@lemmy.world on 23 Jun 17:10 next collapse

You could not pay me to drive a Tesla.

some_guy@lemmy.sdf.org on 23 Jun 17:12 next collapse

What a cool and futuristic car. It’s all computer!

I’m still waiting for Elon’s car to drive onto train tracks.

PartyAt15thAndSummit@lemmy.zip on 23 Jun 17:22 next collapse

Deregulation, ain’t it great.

explodicle@sh.itjust.works on 23 Jun 18:36 collapse

I’ll just fork it; we can do better than this.

sturmblast@lemmy.world on 23 Jun 17:34 next collapse

Teslas are death traps.

sundray@lemmus.org on 23 Jun 18:31 next collapse

Car drove itself on to the tracks, gets hit by a train. This is some Maximum Overdrive shit.

NotMyOldRedditName@lemmy.world on 23 Jun 18:35 next collapse

How the fuck do you let any level 2 system go 40 to 50 fucking feet down the railroad tracks.

Were they asleep?

MBech@feddit.dk on 23 Jun 19:29 next collapse

I’m not sure I’d be able to sleep through driving on railroad tracks. I’m going to guess this person was simply incredibly fucking stupid and thought the car would figure it out, instead of doing the bare fucking minimum of driving their goddamn 2-ton death machine themselves.

NotMyOldRedditName@lemmy.world on 23 Jun 19:36 next collapse

LOL right? Like deciding to see what the car will do ON RAILWAY TRACKS is absolutely fucking bonkers.

6nk06@sh.itjust.works on 24 Jun 06:06 collapse

I’m going to guess this person was simply incredibly fucking stupid

Well, the guy owned a Tesla, it was pretty obvious.

Darleys_Brew@lemmy.ml on 23 Jun 20:36 next collapse

I was gonna say it’s not so much the fact that the car was hit by a train, but that it turned onto the tracks… but 40 or 50 feet?

NotMyOldRedditName@lemmy.world on 23 Jun 21:03 collapse

Cop: WTF happened here?

Driver: It drove itself onto the tracks

Cop: Okay, but what about the other 49 feet of the 50 feet it’s on the tracks?

Driver: …

domi@lemmy.secnd.me on 23 Jun 21:06 collapse

youtu.be/x3LwHhDD6TY

NotMyOldRedditName@lemmy.world on 23 Jun 21:42 collapse

LOL wasn’t expecting that.

TeddE@lemmy.world on 23 Jun 18:39 next collapse

That … tracks

smeenz@lemmy.nz on 23 Jun 20:46 next collapse

Full stream ahead with the train puns

D_C@lemm.ee on 23 Jun 21:57 collapse

Whoa, that typo nearly knocked this discussion off the rails…

smeenz@lemmy.nz on 24 Jun 17:21 collapse

Ah crap. I guess my fingers need more train ing, and I will have to leave it there now.

pyre@lemmy.world on 23 Jun 23:31 collapse

Grandwolf319@sh.itjust.works on 23 Jun 20:39 next collapse

It simply saw a superior technology and decided to attack.

[deleted] on 24 Jun 00:07 collapse

.

merdaverse@lemmy.world on 23 Jun 21:02 next collapse

Damn. I hope the train is ok

EtherWhack@lemmy.world on 23 Jun 21:19 next collapse

Meanwhile my sister’s fiancé drank the whole pitcher and is back working there, totally believing Elon is some super genius and that the cars are capable of full self-driving. (I really have to bite my tongue when listening to him)

My camry can self-drive too, same outcome. (just off a cliff, rather than in front of a train)

shaggyb@lemmy.world on 23 Jun 22:23 next collapse

Driver failed to control their car and avoid a collision.

FTFY.

I’m sure the car did actually take the action. But there are TONS of unavoidable warnings and reminders to the driver to supervise and take control when FSD goes wrong.

Which you can do by such super-technical means as “hitting the brake” or “steering the other way” or “flipping the right stalk up”. Rocket science, I know.

Driver’s fault. Bad technology, yes. Worse driver.

Deflated0ne@lemmy.world on 24 Jun 00:05 next collapse

I’d have bailed out and waited for the insurance check. Then got a different car.

shaggyb@lemmy.world on 24 Jun 00:14 collapse

I wouldn’t have had time between pumping iron and getting head, myself.

[deleted] on 24 Jun 00:06 next collapse

.

faythofdragons@slrpnk.net on 24 Jun 00:53 collapse

I just don’t know how they’re getting away with calling it ‘full self driving’ if it’s not fully self driving.

shaggyb@lemmy.world on 24 Jun 02:15 collapse

I didn’t keep track of how that lawsuit turned out.

That said, it is labeled “Full Self-Driving (Supervised)” on everything in the car.

Supervised as in you have to be ready to stop this kind of thing. That’s what the supervision is.

altphoto@lemmy.today on 23 Jun 23:38 next collapse

Honey, are those train tracks? … Yes, looks like we’ll turn left onto the tracks for 1/2 a mile. It’s a detour.

[deleted] on 24 Jun 02:03 next collapse

.

NikkiDimes@lemmy.world on 24 Jun 02:03 collapse

Yeaaaaah, I mean fuck Tesla for a variety of reasons, but right here we’re looking at a car that drove itself onto a set of train tracks, continued down the train tracks, and the people inside did…nothing? Like, they grabbed their shit and got out when it got stuck. The car certainly should not have done this, but this isn’t really a Tesla problem. It’ll definitely be interesting when robotaxis follow suit though.

nanook@friendica.eskimo.com on 24 Jun 00:07 next collapse

@Davriellelouna I am sure it was all monitored in real time and a revised algorithm will be included in a future update.

6nk06@sh.itjust.works on 24 Jun 06:05 collapse

And maybe an update to the firmware on those expensive Logitech webcams powering the AI.

Cornelius_Wangenheim@lemmy.world on 24 Jun 00:29 next collapse

Also, the robotaxi has been live for all of a day and there’s already footage of it driving on the wrong side of the road: www.youtube.com/watch?v=_s-h0YXtF0c&t=420s

jj4211@lemmy.world on 24 Jun 14:31 collapse

The thing that strikes me about both this story and the thing you posted is that the people in the Tesla seem to be like “this is fine” as the car does some pretty terrible stuff.

In that one, Tesla failing to honor a forced left turn instead opting to go straight into oncoming lanes and waggle about causing things to honk at them, the human just sits there without trying to intervene. Meanwhile they describe it as “navigation issue/hesitation” which really understates what happened there.

The train one didn’t come with video, but I can’t imagine just letting my car turn itself onto tracks and going 40 feet without thinking.

My Ford even thinks about going too close to another lane and I’m intervening even if it was really going to be no big deal. I can’t imagine this level of “oh well”.

Tesla drivers/riders are really nuts…

kieron115@startrek.website on 24 Jun 01:38 next collapse

You all don’t seem to understand, this is just the cost of progress!

J52@lemmy.nz on 24 Jun 01:47 next collapse

Hope no one was hurt, regardless of whether they’re stupid, distracted or whatever! If we can’t build fail-safes into cars, what are our chances for real AI?

danhab99@programming.dev on 24 Jun 04:23 collapse

Okay I don’t want to directly disagree with you I just want to add a thought experiment:

If it is a fundamental truth of the universe that a human can literally not program a computer to be smarter than a human (because of some Neil deGrasse Tyson-esque interpretation of entropy), then no matter what, AIs will crash cars as often as real people.

And the question of who is responsible for an AI’s actions will always land on a person, because people can take responsibility and AIs are just machine-tools. This basically means there is a ceiling to how autonomous self-driving cars will ever be (because someone will have to sit at the controls and be ready to take over), and I think that is a good thing.

Honestly I’m in the camp that computers can never truly be “smarter” than a person in all respects. Maybe you can max out an AI’s self-driving stats, but then you’ll have no points left over for morality, or you can balance the two out and it might just get into less morally challenging accidents more often ¯\_(ツ)_/¯. There are lots of ways to look at this

mojofrododojo@lemmy.world on 24 Jun 07:31 collapse

a human can literally not program a computer to be smarter than a human

I’d add that a computer vision system can’t integrate new information as quickly as a human, especially when limited to vision-only sensing, which Tesla is strangely obsessed with even as the cost of these sensors is dropping and their utility has been proven by Waymo’s excellent record.

All in all, I see no reason to attempt to replace humans when we have billions. This is doubly so for ‘artistic’ ai purposes - we have billions of people, let artists create the art.

show me an AI driven system that can clean my kitchen, or do my laundry. that’d be WORTH it.

cy_narrator@discuss.tchncs.de on 24 Jun 01:52 next collapse

I am all in on testing these devices and improving waay before they recommend fully giving into AI

DempstersBox@lemmy.world on 24 Jun 01:56 next collapse

Where’s the video?

atlien51@lemm.ee on 24 Jun 05:07 next collapse

Elongated Musketon: UM THAT WAS JUST 1 FAULTY MODEL STOP CHERRY PICKING GUYS JUST BUY IT!!!1

ChairmanMeow@programming.dev on 24 Jun 05:58 next collapse

Clearly the train didn’t yield properly, time to ban trains.

hydroptic@sopuli.xyz on 24 Jun 14:06 collapse

I mean, he did specifically come up with his idiotic “Hyperloop” concept to kill California’s high speed rail project

lsibilla@lemm.ee on 24 Jun 07:56 collapse

For as much as I’d like to see Tesla stock crash these days, and without judging the whole autonomous car topic, this IS cherry-picking.

Human drivers aren’t exactly flawless either, but we don’t ban human-driven cars because some act recklessly or others have had a seizure while driving.

If statistically self driving cars are safer, I’d rather have them and reduce the risk of coming across another reckless driver.

leftytighty@slrpnk.net on 24 Jun 17:27 collapse

yes we should be doing more to reduce driving, it’s relatively unsafe and I’m sick of our lived environments being designed for cars and not people.

Eww@lemmy.world on 24 Jun 05:32 next collapse

Teslas have a problem with the lefts.

RedditIsDeddit@lemmy.world on 24 Jun 11:15 next collapse

Tesla’s new automatic suicide feature

hydroptic@sopuli.xyz on 24 Jun 14:06 next collapse

I think that’s called “murder”

Akasazh@feddit.nl on 24 Jun 16:17 collapse

Full Self Destruction

mintiefresh@lemmy.ca on 24 Jun 14:15 next collapse

How could the left do this /s

dollfacemenace@lemmy.world on 24 Jun 14:23 collapse

Next up: “Train is a communistic tool to restrict vehicular freedom. Banish trains! More highways!”

el_bhm@lemm.ee on 24 Jun 15:53 next collapse

Train often frequented by Hamas supporters canceled a Tesla.

daellat@lemmy.world on 24 Jun 16:27 next collapse

Just one more lane, bro!

stephen01king@lemmy.zip on 25 Jun 00:22 collapse

This is unironically something I’ve heard people argue about public transport in general, that it’s a tool to control people’s movement.

you_are_it@lemmy.sdf.org on 24 Jun 15:20 next collapse

Every thing seems to turn to shit

Etterra@discuss.online on 24 Jun 15:49 collapse

To be fair it is a Tesla. It started out as shit.

SkyezOpen@lemmy.world on 24 Jun 16:39 collapse

It started out promising, then was consigned to be shit when Elon swore off LIDAR. If he kept his shitty little hands away from management and let the engineers do their thing, it could’ve been great.

carpelbridgesyndrome@sh.itjust.works on 24 Jun 15:58 next collapse

It’s stupider than I thought from reading the headline: that car started driving down the fing tracks.

HugeNerd@lemmy.ca on 24 Jun 16:10 collapse

He wants to make us a multi transport mode species.