Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis (fuelarc.com)
from KayLeadfoot@fedia.io to technology@lemmy.world on 02 Apr 06:58
https://fedia.io/m/technology@lemmy.world/t/1997571

TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

#autonomy #fsd #selfdriving #technology #tesla


captainastronaut@seattlelunarsociety.org on 02 Apr 07:12 next collapse

Tesla self-driving is never going to work well enough without additional sensors - cameras are not enough. It's fundamentally dangerous and should not be driving unsupervised (or maybe at all).

KayLeadfoot@fedia.io on 02 Apr 07:44 next collapse

Accurate.

Each fatality I found where a Tesla killed a motorcyclist involved a cascade of 3 failures.

  1. The car's cameras don't detect the biker, or it just doesn't stop for some reason.
  2. The driver isn't paying attention to detect the system failure.
  3. The Tesla's driver alertness tech fails to detect that the driver isn't paying attention.

Taking out the driver will make this already-unacceptably-lethal system even more lethal.

jonne@infosec.pub on 02 Apr 07:51 next collapse

  4. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short timespan to rectify the situation.

KayLeadfoot@fedia.io on 02 Apr 08:03 next collapse

... Also accurate.

God, it really is a nut punch. The system detects the crash is imminent.

Rather than automatically try to evade... the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

jonne@infosec.pub on 02 Apr 08:06 collapse

Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.

NeoNachtwaechter@lemmy.world on 02 Apr 08:53 collapse

so it won’t show up in the stats

Hopefully they've wised up by now and record these stats properly…?

jonne@infosec.pub on 02 Apr 09:13 next collapse

If they ever fixed it, I'm sure Musk fired whoever was keeping score. He's going to launch the robotaxi stuff soon and it's going to kill a bunch of people.

KayLeadfoot@fedia.io on 02 Apr 09:13 collapse

NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

The companies themselves do all sorts of wildcat shit with their numbers. Tesla's claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than the average human driver; that's what they say on their stock earnings calls. Of course, that's not true - not based on any data I've seen - and they haven't published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its roughly 12x-safer-than-human system).

b3an@lemmy.world on 03 Apr 05:37 next collapse

Fascinating! I didn't know all this. Thanks

KayLeadfoot@fedia.io on 03 Apr 05:50 collapse

Any time :)

NotMyOldRedditName@lemmy.world on 03 Apr 06:50 collapse

So to drive with FSD is 8x safer than your average human driver.

WITH a supervising human.

Once it reaches a certain quality, it should be safer with a human properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast, vast majority of crashes are from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring; but supervised, it should be safer than a human alone, because it can also detect things the human will ultimately miss.

Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

[deleted] on 02 Apr 08:51 next collapse

.

NeoNachtwaechter@lemmy.world on 02 Apr 08:51 collapse

Even when it is just milliseconds before the crash, the computer turns itself off.

Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.

br3d@lemmy.world on 02 Apr 13:16 collapse

There’s at least two steps before those three:

-1. Society has been built around the needs of the auto industry, locking people into car dependency

  1. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody
grue@lemmy.world on 02 Apr 13:43 collapse

  0. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody

That’s a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.

You’re absolutely right about point -1 though.

explodicle@sh.itjust.works on 02 Apr 15:54 collapse

build, sell and drive

You two don’t seem to strongly disagree. The driver is liable but should then sue the builder/seller for “self driving” fraud.

grue@lemmy.world on 02 Apr 17:04 collapse

Maybe, if that two-step determination of liability is really what the parent commenter had in mind.

I’m not so sure he’d agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and software running on the car in general) be forced to be Free Software and put it squarely and completely within the control of the vehicle owner.

explodicle@sh.itjust.works on 02 Apr 20:31 collapse

I would assume everyone here would agree with that 😘

grue@lemmy.world on 02 Apr 21:43 collapse

I mean, maybe, but previously when I’ve said that it’s typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it’s somehow suddenly too dangerous to allow owners to control their property just because software is involved.

monarch@lemm.ee on 03 Apr 06:27 collapse

Lemmy is super pro FOSS.

ascense@lemm.ee on 02 Apr 09:08 next collapse

The most frustrating thing is that, as far as I can tell, Tesla doesn't even have binocular vision, which makes all the claims about humans being able to drive with vision alone even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?

TheGrandNagus@lemmy.world on 02 Apr 09:31 collapse

Tesla’s argument of “well human eyes are like cameras therefore we shouldn’t use LiDAR” is so fucking dumb.

Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient supercomputer far outclassing anything that humanity has ever devised, certainly more so than any computer added to a car.

And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.

NABDad@lemmy.world on 02 Apr 12:12 next collapse

They also happen to be linked up to a rapid and highly efficient supercomputer far outclassing anything that humanity has ever devised

A neural network that has been in development for 650 million years.

explodicle@sh.itjust.works on 02 Apr 15:51 collapse

Ok, maybe project managers are good for something.

bluGill@fedia.io on 02 Apr 14:34 collapse

Anyone who has driven (or walked) into a sunrise/sunset knows that human vision is not very good. I've also driven in blizzards, heavy rain, and fog - all times when human vision is terrible. I've also not seen green lights (I'm colorblind).

explodicle@sh.itjust.works on 02 Apr 15:49 next collapse

Bro I’m colorblind too and if you’re not sure what color the light is, you have to stop. Don’t put that on the rest of us.

bluGill@fedia.io on 02 Apr 16:58 collapse

I can see red clearly, so "not sure" means I can go.

I've only noticed issues in a few situations. The first was driving at night when a weirdly aimed streetlight suddenly turned yellow - until it changed, I didn't even know there was a stoplight there. The second was making a left turn at sunset (sun behind me): the green arrow came on, but the red light remained on, so I couldn't see that it was time/safe to go until my wife alerted me.

TheGrandNagus@lemmy.world on 02 Apr 15:52 collapse

Human vision is very, very, very good. If you think a camera installed in a car is even close to human eyesight, then you are extremely mistaken.

Human eyes are so far beyond that it's hard to even quantify.

And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colourblind people.

bluGill@fedia.io on 02 Apr 17:03 next collapse

And bullshit on you not being able to see the lights. They're specifically designed so that's not an issue for colour blind people

Some lights are, but not all of them. I often say I go when the light turns blue. However, not all lights have that blue tint, so I often cannot tell the difference between a white light and a green light by color (but white is not used in a stoplight, and I can see red/yellow just fine). Where I live, all stoplights have green on the bottom, so that is always a cheat I use, but that only works if I can see the relative position - in an otherwise dark situation I only see a light in front of me and not the rest of the structure, so I cannot tell. I have driven where stoplights are not green-on-bottom, and I can never remember if green is left/right.

Even when they try, though, not all colorblindness is the same. There may not be a single mitigation that works for two different people with different forms of colorblindness.

bluGill@fedia.io on 02 Apr 17:06 collapse

Human vision is very, very, very good. If you think a camera installed in a car is even close to human eyesight, then you are extremely mistaken.

Why are you trying to limit cars to just vision? That is all I have as a human. But robots have radar, lidar, radio, and other options; there is no reason they can't use them and get information eyes cannot. Every option has limits.

TheGrandNagus@lemmy.world on 02 Apr 17:18 collapse

Please read my comments before you respond to them.

scarabic@lemmy.world on 02 Apr 16:18 next collapse

These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s

Ledericas@lemm.ee on 03 Apr 00:14 collapse

they originally had lidar, or radar, but musk had them disabled in the older models.

NotMyOldRedditName@lemmy.world on 03 Apr 06:43 collapse

They had radar. Tesla has never had lidar, but they do use lidar on test vehicles to ground truth their camera depth / velocity calculations.

keesrif@lemmy.world on 02 Apr 07:43 next collapse

On a quick read, I didn't see the struck motorcycles listed. Last I heard, a few years ago, this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe this is mostly true for Harleys.

The theory I recall was that this rear light configuration made the Tesla assume it was looking (remember, only cameras without depth data) at a car that was further down the road - and that acceleration was safe as a result. It miscategorised the motorcycle so badly that it misjudged its position entirely.
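
To put rough numbers on that theory, here's a toy model of inferring distance from the pixel spacing of a pair of tail lights under a pinhole camera - NOT Tesla's actual pipeline, and all constants are hypothetical:

```python
# Toy model of the tail-light theory above - NOT Tesla's actual code.
# A single camera can only infer distance from apparent size: with a pinhole
# model, distance Z = f * W / w, where f is focal length in pixels, W is the
# assumed real-world spacing of the lights, and w is their spacing in pixels.

FOCAL_LENGTH_PX = 1000  # hypothetical focal length

def estimated_distance_m(light_spacing_px: float, assumed_spacing_m: float) -> float:
    """Distance implied by the pinhole model for an assumed light spacing."""
    return FOCAL_LENGTH_PX * assumed_spacing_m / light_spacing_px

# A motorcycle 10 m ahead with tail lights 0.25 m apart spans 25 px:
spacing_px = FOCAL_LENGTH_PX * 0.25 / 10.0

print(estimated_distance_m(spacing_px, 0.25))  # 10.0 m -- correct (motorcycle)
print(estimated_distance_m(spacing_px, 1.5))   # 60.0 m -- "car far down the road"
```

Mistake the bike for a car and the math says it's six times farther away than it really is - plenty of apparent room to keep accelerating.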

jonne@infosec.pub on 02 Apr 07:54 next collapse

Whatever it is, it’s unacceptable and they should really ban Tesla’s implementation until they fix some fundamental issues.

KayLeadfoot@fedia.io on 02 Apr 07:58 next collapse

I also saw that theory! That's in the first link in the article.

The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.

I didn't include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!

The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a "standard" bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the different motorcycles people ride on highways; every type is represented (sadly) in the fatalities.

I think you're onto something with the faulty depth sensors. Sensing distance is difficult with optical sensors. That's why Tesla would be alone in the motorcycle fatality bracket, and that's why it would always be rear-end crashes by the Tesla.

littleomid@feddit.org on 02 Apr 11:08 next collapse

At least in the EU, you can't turn off motorcycle lights. They're always on. In the EU since 2003, and in the US, according to the internet, since the '70s.

pirat@lemmy.world on 02 Apr 15:45 next collapse

I assume older motorcycles built before 2003 are still legal in the EU today, and that the drivers are responsible for turning on the lights when riding those.

KayLeadfoot@fedia.io on 02 Apr 18:17 collapse

Point taken: Feel free to amend my comment from "No lights at all" to "No lights visible at all."

grue@lemmy.world on 02 Apr 13:48 collapse

Because I do journalism, and sometimes I even do good journalism!

In that case, you wouldn’t happen to know whether or not Teslas are unusually dangerous to bicycles too, would you?

KayLeadfoot@fedia.io on 02 Apr 18:12 collapse

Surprisingly, there is a data bucket for accidents with bicyclists, but hardly any bicycle crashes are reported.

That either means they are not occurring (woohoo!), or they are being lumped into one of the multiple pedestrian buckets (not woohoo!), or they are in the absolutely fucking vast collection of "severity: unknown" accidents where we have no details and Tesla requested redaction to make finding the details very difficult.

grue@lemmy.world on 02 Apr 19:00 collapse

Thanks!

KayLeadfoot@fedia.io on 02 Apr 19:14 collapse

Any time :)

treadful@lemmy.zip on 02 Apr 07:59 next collapse

Still probably a good idea to keep an eye on that Tesla behind you. Or just let them past.

ExcessShiv@lemmy.dbzer0.com on 02 Apr 08:19 next collapse

The ridiculous thing is, it has 3 cameras pointing forward, and you only need 2 to get stereoscopic depth perception with cameras…why the fuck are they not using that!?

Edit: I mean, I know why: the three cameras have different lenses used for different things (normal, wide angle, and telephoto), so they're not suitable for it. But it just seems stupid not to utilise the concept when you insist on a camera-only solution.
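
For reference, here's the textbook relationship a matched camera pair buys you - a minimal sketch assuming an idealized, rectified stereo pair with a known baseline (numbers are hypothetical, not Tesla's hardware):

```python
# Stereo depth from two horizontally offset cameras (idealized pinhole pair).
# Depth Z = f * B / d: f = focal length (px), B = baseline between the two
# cameras (m), d = disparity (px), i.e. how far the object shifts between
# the left and right images. All constants here are hypothetical.

FOCAL_LENGTH_PX = 1000
BASELINE_M = 0.2

def depth_from_disparity(disparity_px: float) -> float:
    """Depth implied by the disparity between the left and right images."""
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

print(depth_from_disparity(20.0))  # 10.0 m
print(depth_from_disparity(2.0))   # 100.0 m -- small disparity means far away
```

The catch is exactly what the edit above says: the formula assumes two cameras with matched focal lengths and known geometry. A normal, a wide-angle, and a telephoto lens don't hand you a clean disparity.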

amorpheus@lemmy.world on 02 Apr 16:26 collapse

That seems like a spectacular oversight. How is it supposed to replicate human vision without depth perception?

KayLeadfoot@fedia.io on 02 Apr 20:52 next collapse

Little known fact: the Model S (P) actually stands for Polyphemus Edition, not Plaid Edition.

ExcessShiv@lemmy.dbzer0.com on 02 Apr 21:24 collapse

The video 0x0 linked to in another comment describes the likely method used to infer distance to objects without a stereoscopic setup, and why it (likely) had issues determining distance in the cases where they hit motorcycles.

NeoNachtwaechter@lemmy.world on 02 Apr 08:54 next collapse

Are you saying Harley drivers are fair game?

0x0@programming.dev on 02 Apr 11:03 collapse

This video proposes that theory.

keesrif@lemmy.world on 02 Apr 15:46 collapse

Ah, thanks for jogging my memory

misteloct@lemmy.world on 02 Apr 07:55 next collapse

I’m wondering how that stacks up to human drivers. Since the data is redacted I’m guessing not well at all.

KayLeadfoot@fedia.io on 02 Apr 08:00 collapse

If it were good, we'd be seeing regular updates on Twitter, I imagine.

Gork@lemm.ee on 02 Apr 07:57 next collapse

Lidar needs to be a mandated requirement for these systems.

echodot@feddit.uk on 02 Apr 10:36 next collapse

Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.

NotMyOldRedditName@lemmy.world on 03 Apr 07:00 collapse

The range on ultrasonics is too short. They only ever get used for parking type situations, not driving on the roadways.

HK65@sopuli.xyz on 02 Apr 11:32 next collapse

Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

Nastybutler@lemmy.world on 02 Apr 16:39 collapse

No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent

DarrinBrunner@lemmy.world on 02 Apr 12:29 collapse

How about we disallow it completely, until it’s proven to be SAFER than a human driver. Because, why even allow it if it’s only as safe?

explodicle@sh.itjust.works on 02 Apr 15:41 next collapse

As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.

scarabic@lemmy.world on 02 Apr 16:27 collapse

It’s hardly either / or though. What we have here is empirical data showing that cars without lidar perform worse. So it’s based in empirical results to mandate lidar. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality statistics targets.

explodicle@sh.itjust.works on 02 Apr 20:34 collapse

We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.

scarabic@lemmy.world on 02 Apr 21:41 next collapse

Those are ways to gather empirical results, though they rely on artificial, staged situations.

I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. That kind of thing can still be well founded in data.

NotMyOldRedditName@lemmy.world on 03 Apr 07:07 collapse

You mean like this Euro NCAP testing, where Tesla does stop and most others don’t including some vehicles with lidar?

youtu.be/4Hsb-0v95R4

scarabic@lemmy.world on 02 Apr 16:24 collapse

This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?

NotMyOldRedditName@lemmy.world on 03 Apr 07:13 collapse

There’s been 54 reported fatalities involving their software over the years in the US.

That’s around 10 billion AP miles (9 billion at end of 2024), and around 3.6 billion on the various version of FSD (beta / supervised). Most of the fatal accidents happened on AP though not FSD.

Lets just double those fatal accidents to 108 to make it for the world, but that probably skews high. Most of the fatal stuff I’ve seen is always in the US.

That equates to 1 fatal accident every 125.9 million miles.

The USA average per 100 million miles is 1.33 deaths, so even doubling the deaths it’s less than the current national average. That’s the equivalent of 1.33 deaths every 167 million miles with Tesla’s software.

Edit: I couldn’t math, fixed it. Also for FSD specifically, very few places have it. Mainly North America, and just recently, China. I wish we had fatalities for FSD specifically.

Buffalox@lemmy.world on 02 Apr 08:54 next collapse

Hey guys relax! It’s all part of the learning experience of Tesla FSD.
Some of you may die, but that’s a sacrifice I’m willing to make.

Regards
Elon Musk
CEO of Tesla

NeoNachtwaechter@lemmy.world on 02 Apr 10:26 next collapse

P.S. Volunteers needed for the Mars mission as well.

echodot@feddit.uk on 02 Apr 10:35 next collapse

Is Musk going? Because I vote to be on whatever planet he isn't.

monarch@lemm.ee on 03 Apr 06:30 collapse

If it's only fElon fanboys going, I'll take the hit, go along, and open the airlock halfway through.

JasonDJ@lemmy.zip on 02 Apr 12:00 collapse

News on the first mission: Meteoroid crashes into full SpaceX rocket mid-flight, killing all aboard.

[deleted] on 02 Apr 13:24 collapse

.

Gammelfisch@lemmy.world on 02 Apr 15:20 collapse

+1 for you. However, replace "Regards" with the more appropriate words from the German language. The first with an S, and the second an H. I will not type that shit. Fuck Leon, and I hope the fucking Nazi-owned Tesla factory outside of Berlin closes.

Buffalox@lemmy.world on 02 Apr 18:11 collapse

Yes I’m not writing that shit, even in a sarcastic post. Bu I get your drift.
On the other hand, since you are from Germany, VW group is absolutely killing it on EV recently IMO.
They totally dominate top 10 EV here in Denmark, with 7 out of 10 top selling models!!
They are competitively priced, and they are the best combination of quality and range in their price ranges.

lnxtx@feddit.nl on 02 Apr 09:48 next collapse

Stop dehumanizing the drivers who killed people.
The feature, wrongly called Full Self-Driving, must be supervised at all times.

SouthEndSunset@lemm.ee on 02 Apr 11:55 next collapse

If you’re going to say your car has “full self driving”, it should have that, not “full self driving (but needs monitoring.)” or “full self driving (but it disconnects 2 seconds before impact.)”.

Ulrich@feddit.org on 02 Apr 13:42 collapse

I think it’s important to call out inattentive drivers while also calling out the systems and false advertising that may lead them to become less attentive.

If these systems were marketed as “driver assistance systems” instead of “full self driving”, certainly more people would pay attention. The fact that they’ve been allowed to get away with this blatant false advertising is astonishing.

They’re also obviously not adequately monitoring for driver attentiveness.

expatriado@lemmy.world on 02 Apr 09:53 next collapse

As a daily rider, I must add having a Tesla behind me to the list of road hazards to look out for.

TexasDrunk@lemmy.world on 02 Apr 12:18 next collapse

I’m on mine far more often than I’m in a car. I think Tesla found out that I point and laugh at any cyber trucks I see at red lights while I’m out and is trying to kill me.

ThomasCrappersGhost@feddit.uk on 02 Apr 15:28 next collapse

I feel like that as a driver. Teslas do not move at a consistent speed, which drives me mad.

GoodLuckToFriends@lemmy.today on 03 Apr 05:15 collapse

You’re not wrong, but good luck watching out for a vehicle approaching you at a 30 mph differential (which is what I recall from fortnine covering the topic years ago) from behind.

Substance_P@lemmy.world on 02 Apr 10:34 next collapse

LIDAR vs Tesla.

jim3692@discuss.online on 02 Apr 12:28 collapse

I had ignored the video, as I didn't expect Mark to expose Tesla.

0x0@programming.dev on 02 Apr 10:59 next collapse

This is news? Fortnine talked about it two years ago.
TL;DR Tesla removed LIDAR to save a buck, and the cameras see two red dots that the 'puter thinks are a far-away car at night, when it's actually a close motorcycle.

AnUnusualRelic@lemmy.world on 02 Apr 11:30 next collapse

It could be two motorcycles side by side.

LesserAbe@lemmy.world on 02 Apr 13:03 next collapse

It’s helpful to remember that not everyone has seen the same stories you have. If we want something to change, like regulators not allowing dangerous products, then raising public awareness is important. Expressing surprise that not everyone knows about something can be counterproductive.

Going beyond that, wouldn’t the new information here be the statistics?

bluGill@fedia.io on 02 Apr 14:32 next collapse

like regulators not allowing dangerous products,

I include human drivers in the list of dangerous products I don't want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don't want regulators to pick favorites. I want them to find "the truth".

LesserAbe@lemmy.world on 02 Apr 19:37 collapse

Sure, we’re in agreement as far as that goes. My point was just the commenter above me was indicating it should be common knowledge that Tesla self driving hits motorcycles more than other self driving cars. And whether their comment was about this or some other subject, I think it’s counterproductive to be like “everyone knows that.”

[deleted] on 02 Apr 15:37 collapse

.

explodicle@sh.itjust.works on 02 Apr 15:37 next collapse

It can’t even perceive the depth of the lights?

0x0@programming.dev on 02 Apr 16:02 next collapse

Not with cameras alone, no.

AA5B@lemmy.world on 03 Apr 00:27 collapse

Why not? It’s got multiple cameras so could judge distances the same way humans do.

However there have been both hardware and software updates since most of those, so the critical question is how much of a problem is it still? The article had no info or speculation on that

EndlessNightmare@reddthat.com on 03 Apr 04:54 collapse

The argument is that humans can drive with just 2 eyes, so cameras are enough. I disagree with this position, given the limitations of a camera-only system. But that's what it is.

Different sensors excel at different tasks and different conditions, and cameras are not always it.

Visstix@lemmy.world on 02 Apr 11:35 next collapse

Why is self-driving even allowed?

kameecoding@lemmy.world on 02 Apr 12:16 next collapse

Because muh freedum, EU are a bunch of commies for not allowing this awesome innovation on their roads

(I fucking love living in the EU)

UnderpantsWeevil@lemmy.world on 02 Apr 13:19 next collapse

Bribes to local governments and police, mostly.

Bytemeister@lemmy.world on 02 Apr 13:22 next collapse

Because the march of technological advancement is inevitable?

In light of recent (and, let's face it, long-ago) cases, Tesla's "Full Self Driving" needs to be downgraded to level 2 at best.

Level 2: Partial Automation

The vehicle can handle both steering and acceleration/deceleration, but the driver must remain engaged and ready to take control.

Pretty much the same level as other brands' self-driving features.

AngryCommieKender@lemmy.world on 02 Apr 14:01 collapse

The other brands, such as Audi and VW, work much better than Tesla's system. Their LIDAR systems aren't blinded by fog and rain the way the Tesla is. Someone recently tested an Audi with its system against a Tesla with its system. The Tesla failed either 3/5 or 4/5 tests. The Audi passed either 3/5 or 4/5. Neither system is perfect, but the one that doesn't rely on just cameras is clearly superior.

Edit: it was Mark Rober.

youtu.be/IQJL3htsDyQ

Bytemeister@lemmy.world on 02 Apr 14:42 collapse

It’s hard to tell, but from about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR, for object detection, even multiple RADAR systems for parking. There may be some which includes a TimeOfFlight sensor, which is like LIDAR, but static and lacks the resolution/fidelity. My Mach-E which has level 2 automation uses a combination of computer vision, RADAR and GPS. I was unable to locate a LIDAR sensor for the vehicle.

The LIDAR system in Mark’s video is quite clearly a pre-production device that is not affiliated with the vehicle manufacturer it was being tested on.

Adding, after more searching, it looks like the polestar 3, some trim levels of the Audi A8 and the Volvo EX90 include a LiDAR sensor. Curious to see how the consumer grade tech works out in real world.

Please do not mistake this comment as “AI/computer vision” evangelisim. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else’s to that system.

AngryCommieKender@lemmy.world on 02 Apr 14:45 next collapse

The way I understand it, Audi, Volvo, and VW have had the hardware in place for a few years. They are collecting real-world data about how we drive before they allow the systems to be used at all. There are also legal issues with liability.

KayLeadfoot@fedia.io on 02 Apr 18:33 collapse

Mercedes uses LiDAR. They also operate the sole Level 3 driver automation system in the USA. Two models only, the new S-Class and EQS sedans.

Tesla alleges they'll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We'll see.

Bytemeister@lemmy.world on 02 Apr 18:51 collapse

Yeah, keep in mind that Elon couldn’t get level 3 working in a closed, pre-mapped circuit. The robotaxis were just remotely operated.

Rivalarrival@lemmy.today on 02 Apr 14:09 next collapse

Because the only thing worse than self driving is human driving.

bluGill@fedia.io on 02 Apr 14:30 next collapse

Humans are terrible drivers. The open question is whether self-driving cars are overall safer than human-driven cars. So far the only people talking either don't have data, or have reason to cherry-pick only the parts of the data that make self-driving look good. This is the one exception, where someone seemingly independent has done analysis - the question is whether they are unbiased, or cherry-picking data to make self-driving look bad (I'm not familiar with the source, so I can't answer that).

Either way more study is needed.

Rhaedas@fedia.io on 02 Apr 15:23 next collapse

Humans are terrible. The human eye and brain are good at detecting certain things, though, allowing a reaction where computer vision - especially when using only one method of detection - often fails. There are times when an automated system will prevent a problem before a human could even see it. So far neither is the clear winner; human driving just has a legacy that automation has to beat by a great length, not just match.

On the topic of human drivers, I think most on the road drive reactively and not based on prediction and anticipation. Given the speed and possible detection methods, a well-designed automated system should excel at this. It costs more and is more complex to design such a thing, so we're getting the bare minimum that tech can give us right now, which again is not a replacement for all cases.

KayLeadfoot@fedia.io on 02 Apr 18:38 collapse

I am absolutely biased. It's me, I'm the source :)

I'm a motorcyclist, and I don't want to die. Also just generally, motorcyclists deserve to get where they are going safely.

I agree with you. Self-driving cars will overall greatly improve highway safety.

I disagree with you when you suggest that pointing out flaws in the technology is evidence of bias, or "cherry picking to make self driving look bad." I think we can improve on the technology by pointing out its systemic defects. If it hits motorcyclists, take it off the road, fix it, and then save lives by putting it back on the road.

That's the intention of the coverage, at least: I am hoping to apply pressure to improve rather than remove. Read my Waymo coverage, I'm actually a big automation enthusiast, because fewer crashes is a good thing.

bluGill@fedia.io on 02 Apr 19:42 collapse

I wasn't trying to suggest that you are biased, only that I have no clue and so it is possible you are somehow unfairly doing something.

KayLeadfoot@fedia.io on 02 Apr 20:55 collapse

Perfectly fair. Sorry, I jumped the gun! Good on you for being incredulous and inspecting the piece for manipulation, that's smart.

Not_mikey@lemmy.dbzer0.com on 02 Apr 15:06 collapse

Robots don’t get drunk, or distracted, or text, or speed…

Anecdotally, I think the Waymos are more courteous than human drivers. Though Waymo seems to be the best one out so far; idk about the other services.

brygphilomena@lemmy.dbzer0.com on 02 Apr 15:23 collapse

Don’t waymos have remote drivers that take control in unexpected situationsml?

dogslayeggs@lemmy.world on 02 Apr 15:34 collapse

They have remote drivers that CAN take control in very corner-case situations that the software can't handle. The vast majority of driving is done without humans in the loop.

DragonTypeWyvern@midwest.social on 02 Apr 16:24 next collapse

So they say

NotMyOldRedditName@lemmy.world on 02 Apr 22:33 collapse

They don’t even do that, according to Waymo’s claims.

They can suggest what the car should do, but they aren’t actually doing it. The car is in complete control.

It's a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.

KayLeadfoot@fedia.io on 02 Apr 22:40 collapse

Interesting! I did not know that - I assumed the teleoperators took direct control, but that makes much more sense for latency reasons (among others)

NotMyOldRedditName@lemmy.world on 02 Apr 22:47 collapse

I always just assumed it was their way to ensure the vehicle was really autonomous. If you have someone remotely driving it, you could argue it isn’t actually an AV. Your latency idea makes a lot of sense as well though. Imagine taking over and causing an accident due to latency? This way even if the operator gives a bad suggestion, it was the car that ultimately did it.

Ulrich@feddit.org on 02 Apr 12:18 next collapse

I’m not sure how that’s possible considering no one manufactures self-driving cars that I know of. Certainly not Tesla.

DarrinBrunner@lemmy.world on 02 Apr 12:26 next collapse

Five years ago, you could not have brought this up without Musk simps defending it.

Hominine@lemmy.world on 02 Apr 16:23 next collapse

There seem to be people/bots down-voting critical takes up and down this very thread. What chumps.

Akasazh@feddit.nl on 07 Apr 10:17 collapse

On Reddit perhaps

MedicPigBabySaver@lemmy.world on 02 Apr 12:45 next collapse

Musk = POS Nazi. Who couldn’t care less about people being killed by his shit companies.

werefreeatlast@lemmy.world on 02 Apr 13:21 next collapse

Every captcha…can you see the motorcycle? I would be afraid if they wanted all the squares with small babies or maybe just regular folk…can you pick all the hottie’s? Which of these are body parts?

theSisko@sh.itjust.works on 02 Apr 15:13 next collapse
AJ1@lemmy.ca on 02 Apr 16:39 collapse

can you pick all the hottie’s?

… the hottie’s what?

9488fcea02a9@sh.itjust.works on 02 Apr 13:37 next collapse

Sounds like NHTSA needs a visit from DOGE!

RememberTheApollo_@lemmy.world on 02 Apr 17:10 collapse

Gotta get rid of the evidence.

Litebit@lemmy.world on 02 Apr 15:36 next collapse

Elon needs to take responsibility for their deaths.

WanderingThoughts@europe.pub on 02 Apr 17:44 collapse

That’s why Tesla’s full self driving is officially still a level 2 cruise control. But of course they promise to jump directly to level 4 soon™.

Critical_Thinker@lemm.ee on 02 Apr 15:42 next collapse

Let’s get this out of the way: Felon Musk is a nazi asshole.

Anyway, it should be criminal to do these comparisons without showing human-driver statistics for reference. I'm so sick of articles that leave out hard data. Show me deaths per billion miles driven for Tesla, competitors, and humans.

Then there’s shit like the boca raton crash, where they mention the car going 100 in a 45 and killing a motorcyclist, and then go on to say the only way to do that is to physically use the gas pedal and that it disables emergency breaking. Is it really a self driving car at that point when a user must actively engage to disable portions of the automation? If you take an action to override stopping, it’s not self driving. Stopping is a key function of how self driving tech self drives. It’s not like the car swerved to another lane and nailed someone, the driver literally did this.

Bottom line, I look at the media around self-driving tech as sensationalist. Danger drives clicks. Felon Musk is a nazi asshole, but self-driving tech isn't made by the guy; it's made by engineers. I wouldn't buy a Tesla unless he had no stake in the business, but I do believe people are far more dangerous behind the wheel in basically all typical driving scenarios.

Gladaed@feddit.org on 02 Apr 16:10 next collapse

“Critical Thinker” Yikes. Somehow the right made that a forbidden word in my mind because they hide behind that as an excuse for asking terrible questions etc.

Anyway. Allegedly the statistics are rather mediocre for self-driving cars. But sadly I haven't seen a good statistic about that, either. The issue here is that automatable tasks are lower-risk driving situations, so having a good statistic is near impossible. E.g., miles driven are heavily skewed when the system is only used on highways. There are no simple numbers that will tell you anything of worth.

That being said, the title should be about the mistake that happened, without fundamental statements (i.e., "self-driving is bad because motorcyclists die").

Critical_Thinker@lemm.ee on 02 Apr 18:17 collapse

Did I ask a terrible question, or do you just not like anything being objective about the issue? I'm so far over on the left side ideologically that you'd be hard-pressed to find an issue I'm conservative on. I don't fit the Dem mold though; I'm more of a Bernie type… though I am very critical in general. I don't just take things at face value. Anywho…

Saying that the statistics aren’t great just lends credence to the fact that we can’t objectively determine how safe or unsafe anything is without good data.

Nastybutler@lemmy.world on 02 Apr 16:33 next collapse

He may not be an engineer, but he's the one who made the decision to use strictly cameras rather than lidar, so yes, he's responsible for these fatalities that other companies don't have. You may not be a fan of Musk, but it sounds like you're a fan of Tesla.

KayLeadfoot@fedia.io on 02 Apr 18:22 collapse

In Boca Raton, I've seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.

Insanely, you can slam on the gas in Tesla's self-driving mode, accelerate to 100MPH in a 45MPH zone, and strike another vehicle, all without the vehicle's "traffic aware" automation effectively applying a brake.

That's not sensationalist. That really is just insanely designed.

Critical_Thinker@lemm.ee on 02 Apr 18:50 collapse

FTFA:

Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.

That’s how you would strike a motorcyclist at such extreme speed, simply press the accelerator and all other inputs are apparently overridden.

If the guy smashes the gas, then just like in cruise control, I would not expect the vehicle to stop itself.

The guy admitted to being intoxicated and held the gas down… what's the self-driving contribution to that?

KayLeadfoot@fedia.io on 02 Apr 19:12 collapse

I know what's in the article, boss. I wrote it. No need to tell me FTFA.

TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.

I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.

Here's the manual, if you're curious. It doesn't work in bright sunlight, fog, excessively curvy roads (???), situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit, and anything else - if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.

https://www.tesla.com/ownersmanual/2012_2020_models/en_us/GUID-50331432-B914-400D-B93D-556EAD66FD0B.html#:~:text=Traffic%2DAware%20Cruise%20Control%20determines,maintains%20a%20set%20driving%20speed.

Critical_Thinker@lemm.ee on 02 Apr 19:53 collapse

So do you expect self-driving tech to override human action? Or do you expect human action to override self-driving tech?

I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited and experimental implementations that are not road-legal nationwide. I kind of expect human input to override the robot, given the fear of robots making mistakes - even when the humans behind them get in drunk and hold down the throttle until they turn motorcyclists into red mist. But that's my assumption.

With the Boca Raton one specifically, the guy got in his car inebriated. That was the first mistake that caused the problem, and it should never have happened. If the car were truly self-driving and automated, with no user input, this wouldn't have happened. It wouldn't have gone nearly 2.5x the speed limit. It would have braked long in advance of hitting someone in the road.

I have a Ninja 650. We all know the danger comes from things we cannot control, such as other people. I'd trust an actually automated car over a human driver always, even with limited modern tech. The second the user gets an input though? Zero trust.

KayLeadfoot@fedia.io on 02 Apr 20:42 collapse

The driver being drunk doesn't mean the self-driving feature should not detect motorcycles. The human is a fallback to the tech. The tech had to fail for this fatal crash to occur.

If the system is advertised as overriding the human's speed inputs (Traffic Aware Cruise Control is supposed to brake when it detects traffic, regardless of pedal inputs), then it should function as advertised.

Incidentally, I agree, I broadly trust automated cars to act more predictably than human drivers. In the case of specifically Teslas and specifically motorcycles, it looks like something is going wrong. That's what the data says, anyhow. If the government were functioning how it should, the tech would be disabled during the investigation, which is ongoing.

SocialMediaRefugee@lemmy.world on 02 Apr 15:58 next collapse

Trucks in general have gotten so big they are pedestrian deathtraps

lka1988@lemmy.dbzer0.com on 02 Apr 16:51 next collapse

Good to know, I’ll stay away from those damn things when I ride.

Nicochucha@lemm.ee on 02 Apr 17:05 next collapse

Commuting in CA feels like I’m navigating a minefield 🤡

dual_sport_dork@lemmy.world on 02 Apr 20:01 next collapse

I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.

Just be sure to carefully watch your six when you’re sitting at a stoplight. I’ve gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I’ll have to scoot out of the way of some imbecile who’s coming in hot. That’s hard to do when your front tire is 24" away from the license plate of the car in front of you.

lka1988@lemmy.dbzer0.com on 02 Apr 20:14 collapse

For me it depends which bike I’m riding. If it’s my 49cc scooter, I’ll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I’ll just filter to the front (legal in Utah).

Korhaka@sopuli.xyz on 02 Apr 21:44 collapse

I filter to the front on my leg-powered bike; most traffic light setups here have a region for bikes in front of the cars.

EndlessNightmare@reddthat.com on 03 Apr 04:48 collapse

Good luck. They’re fucking everywhere, at least where I live.

[deleted] on 02 Apr 17:11 next collapse

.

Mustakrakish@lemmy.world on 02 Apr 18:02 next collapse

WHY CAN’T WE JUST HAVE PUBLIC TRANSIT, FUCK! TRAINS EXIST!

Knock_Knock_Lemmy_In@lemmy.world on 02 Apr 18:09 next collapse

Why? Crash rates for Self-Driving Cars (when adjusted for crash severity) are lower.

Removing sensors to save costs on self-driving vehicles should be illegal.

IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com on 02 Apr 21:21 next collapse

teslas aren’t even worthy of the designation “self-driving”. They use cheap cameras instead of LIDAR. It should be illegal to call such junk “self-driving”.

Lumbardo@reddthat.com on 02 Apr 21:44 collapse

Shouldn’t be an issue if drivers used it as a more advanced cruise control. Unless there is catastrophic mechanical or override failure, these things will always be the driver’s fault.

AnimalsDream@slrpnk.net on 02 Apr 17:18 next collapse

I imagine bicyclists must be æffected as well if they’re on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.

Time to go to Netherlands.

poopkins@lemmy.world on 02 Apr 17:33 next collapse

*affected

nulluser@lemmy.world on 02 Apr 18:02 next collapse

Thank you for your service.

AnimalsDream@slrpnk.net on 02 Apr 18:05 next collapse
NikkiDimes@lemmy.world on 02 Apr 18:11 collapse

Affectively, does it realy mater if someone has slite misstakes in there righting?

AngryRobot@lemmy.world on 02 Apr 21:36 collapse

I think i had a stroke reading that. Take your upvote and get out!

NikkiDimes@lemmy.world on 02 Apr 23:18 collapse

I’m not going to lie, I almost had a stroke writing it…

KayLeadfoot@fedia.io on 02 Apr 23:21 collapse

I upvoted every comment in this sub-thread shitshow and hated all of it.

[You monster gif]

AngryRobot@lemmy.world on 02 Apr 23:46 collapse

Yer welcome!

xor@lemmy.dbzer0.com on 02 Apr 21:19 next collapse

Human-driven cars still target bicyclists on purpose, so I don't see how Teslas could be any worse…

p.s. painting a couple of lines on the side of the road does not make a safe bike lane… they need a physical barrier separating the road from them… like how curbs separate the road from sidewalks…

AnimalsDream@slrpnk.net on 03 Apr 00:33 collapse

I mean yeah, I just said above that someone almost killed me. They were probably a human driver. But that's a "might happen, never know." If self-driving cars are rear-ending people, that's an inherent artifact of their programming, even though they're not intentionally programmed to do that.

So it's like, things were already bad. I already do not feel safe doing any biking anymore. But as self-driving cars become more prevalent, that threat upgrades to a kind of de facto "Oh, these vast stretches of land are places where only cars and trucks are allowed. Everything else is roadkill waiting to happen."

EndlessNightmare@reddthat.com on 03 Apr 04:45 collapse

this makes me never want to bike in the US again.

I live close enough to work for it to be a very reasonable biking distance. But there is no safe route. A high-speed “stroad” with a narrow little bike lane. It would only be a matter of time before some asshole with their face in their phone drifts into me.

I am deeply resentful of our automobile-centric infrastructure in the U.S. It’s bad for the environment, bad for our wallets, bad for our waistlines, and bad for physical safety.

IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com on 02 Apr 21:19 next collapse

Remember, you have the right to self-defence, against both rogue robots and rogue humans.

RaptorBenn@lemmy.zip on 02 Apr 21:36 collapse

How do you plan to defend yourself against a vehicle?

Korhaka@sopuli.xyz on 02 Apr 21:39 next collapse

Propane cylinder. Mutually assured destruction.

RaptorBenn@lemmy.zip on 02 Apr 21:48 next collapse

Noice.

Eheran@lemmy.world on 03 Apr 05:51 collapse

It will do nothing. By the time a propane cylinder ruptured, even if we assume it actually ignited too, it would add very little to a massive crash that already killed everyone and disintegrated everything.

mutual_ayed@sh.itjust.works on 03 Apr 11:23 next collapse

Don’t stop… I’m almost there

Simulation6@sopuli.xyz on 03 Apr 11:47 collapse

Claymore and trip wire?

KayLeadfoot@fedia.io on 03 Apr 20:40 collapse

The new "Start Seeing Motorcycles" merch just dropped!

IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com on 02 Apr 21:51 next collapse

The Arnold Method

RaptorBenn@lemmy.zip on 02 Apr 23:28 collapse

1000 fake internet points to you, sir.

Test_Tickles@lemmy.world on 03 Apr 01:51 collapse

If it’s a Tesla truck, I guess I could splash it with half a Dixie cup full of water…

RaptorBenn@lemmy.zip on 02 Apr 21:35 next collapse

Makes sense: statistically smaller sample to be trained on. Relatively easy fix - just retrain with more motorcycles in the data.
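
For what "retrain with more motorcycles" might look like in the simplest case, here's a toy sketch of oversampling a rare class to rebalance a training set (the label counts are invented for illustration, and this is obviously nothing like a real perception pipeline):

```python
# Toy rebalancing of a skewed training set by oversampling the rare class.
# Label counts are made up; a real pipeline would reweight or augment
# labeled camera frames rather than duplicate strings.
from collections import Counter

frames = ["car"] * 950 + ["motorcycle"] * 50   # hypothetical frame labels

counts = Counter(frames)
max_count = max(counts.values())               # 950

balanced = []
for label, count in counts.items():
    repeats = max_count // count               # 1 for car, 19 for motorcycle
    balanced += [label] * (count * repeats)

print(Counter(balanced))  # Counter({'car': 950, 'motorcycle': 950})
```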

NotMyOldRedditName@lemmy.world on 02 Apr 22:05 next collapse

For what it’s worth, it really isn’t clear if this is FSD or AP based on the constant mention of self driving even when it’s older collisions when it would definitely been AP, and is even listed as AP if you click on the links to the crash.

So these may all be AP, or one or two might be FSD, it’s unclear.

Every Tesla has AP as well, so the likelihood of that being the case is higher.

psivchaz@reddthat.com on 02 Apr 23:58 next collapse

That’s not good though, right? “We have the technology to save lives, it works on all of our cars, and we have the ability to push it to every car in the fleet. But these people haven’t paid extra for it, so…”

NotMyOldRedditName@lemmy.world on 03 Apr 00:20 collapse

Well, only 1 or 2 of those were in a time frame where I'd consider FSD superior to AP; it's a more recent development where that's likely the case.

But to your point, at some point I expect Tesla to use the FSD software for AP, for the exact reasons you mentioned. My guess is they'd just do something like disable making left/right turns, so you wouldn't be able to use it outside of straight stretches, like AP today.

AA5B@lemmy.world on 03 Apr 00:08 collapse

In this case, does it matter? Both are supposed to follow a vehicle at a safe distance

I’d be more interested in how it changes over time, as new software is pushed. While it’s important that know it had problems judging distance to a motorcycle, it’s more important to know whether it still does

NotMyOldRedditName@lemmy.world on 03 Apr 00:31 collapse

In this case, does it matter? Both are supposed to follow a vehicle at a safe distance

I think it does matter. While both are supposed to follow at safe distances, the FSD stack is doing it in a completely different way. They haven't really been making any major updates to AP for many years now; all focus has been on FSD. I think the only real changes it's had for quite a while have been around making sure people are paying attention better.

AP is looking at the world frame by frame, each individual camera on its own, while FSD is taking the input of all cameras, turning it into 3D vector space, and then driving based off that. Doing that on city streets and highways is only a pretty recent development. Updates for doing it this way on highways and streets only went out to all cars with FSD in the past few months. For a long time it was on city streets only.

I’d be more interested in how it changes over time, as new software is pushed.

I think that’s why it’s important to make a real distinction between AP and FSD today (and specifically which FSD versions)

They’re wholly different systems, one that gets older every day, and one that keeps getting better every few months. Making an article like this that groups them together over the span of years muddies the water on what / if any progress has been made.

KayLeadfoot@fedia.io on 03 Apr 07:17 collapse

Fair enough!

At least one of the fatalities is Full Self-Driving (it was cited by name in the police reports). The remainder are Autopilot. So, both systems kill motorcyclists. Tesla requests that this data be redacted from their NHTSA reporting, which specifically makes it difficult for consumers to measure which system is safer, or whether incremental safety improvements are actually being made.

You're placing a lot of faith in the incremental updates being improvements without equivalent regressions. That data is specifically being concealed from you, and I think you should probably ask why. If there were good news behind those redactions, they wouldn't be redactions.

I didn't publish the software version data point because I agree with AA5B; it doesn't matter. I honestly don't care how it works. I care that it works well enough to safely cohabit the road with my manual-transmission Cro-Magnon self.

I'm not a "Tesla reporter," I'm not trying to cover the incremental changes in their software versions. Plenty of Tesla fans doing that already. It only has my attention at all because it's killing vulnerable road users, and for that analysis we don't actually need to know which self-driving system version is killing people, just the make of car it is installed on.

NotMyOldRedditName@lemmy.world on 03 Apr 07:42 collapse

I’d say it’s a pretty important distinction to know if one or both systems have a problem and the level of how bad that problem is.

Also are you referencing the one in Seattle in 2024 for FSD? The CNBC article says FSD, but the driver said AP.

And especially back then, there’s also an important distinction of how they work.

FSD on highways wasn’t released until November 2024, and even then not everyone got it right away. So even if FSD was enabled, the crash may have been under AP.

Edit: Also, if it was FSD for real (that 2024 crash would have had to happen on city streets, not a highway), then that's 1 motorcycle fatality in 3.6 billion miles. The other 4 happened over 10 billion miles. Is that not an improvement? (Edit again: I should say we can't tell it's an improvement yet, as we'd have to pass 5 billion, so the jury is still out, I guess, IF that crash was really on FSD.)

Edit: I will cede though that as a motorcyclist, you can’t know what the Tesla is using, so you’d have to assume the worst.

Edit: Just correcting myself: I was wrong about FSD in 2024. The changeover to neural nets happened in November, but FSD was still FSD on highways when this accident happened. It was even earlier than that when FSD became AP as you transitioned to highways.

KayLeadfoot@fedia.io on 03 Apr 08:15 collapse

Police report for 2024 case attached, it is also linked in the original article: https://www.opb.org/article/2025/01/15/tesla-may-face-less-accountability-for-crashes-under-trump/

It was Full Self Driving, according to the police. They know because they downloaded the data off the vehicle's computer. The motorcyclist was killed on a freeway merge ramp.

All the rest is beyond my brief. Thought you might like the data to chew on, though.

NotMyOldRedditName@lemmy.world on 03 Apr 08:26 next collapse

The motorcyclist was killed on a freeway merge ramp.

I’d say that means it’s a very good chance that yes, while FSD was enabled, the crash happened under the older AP mode of driving, as it wasn’t until November 2024 that it was moved over to the new FSD neural net driving code.. I was wrong here, it actually was FSD then, it just wasn’t end to end neural nets then like it is now.

Also yikes… the report says the AEB kicked in, and the driver overrode it by pressing on the accelerator!

KayLeadfoot@fedia.io on 04 Apr 10:00 collapse

No shit on that yikes. That blew my fucking mind.

Half the time when your AEB activates, you are unconscious or dazed, flailing around your cabin like a rag doll, because you've crashed. If your foot happens to flail into the accelerator, get ready for a very exciting (if short-lived) application of that impressive 0-to-60 time.

NotMyOldRedditName@lemmy.world on 03 Apr 08:41 collapse

Okay, so I’m going to edit my earlier replies but replying again so you see, as I was wrong.

Version 11/12 in 2023/2024 wasn't using the AP code; it just wasn't using the neural nets. So it was legitimately FSD, but it was running different code on the freeways (non-neural-net) vs. on city streets (neural net).

But it was indeed FSD. Version 11.x was the change where it stopped using AP when you left city streets.

PalmTreeIsBestTree@lemmy.world on 02 Apr 22:29 next collapse

This is another reason I’ll never drive a motorcycle. Fuck that shit.

KayLeadfoot@fedia.io on 02 Apr 22:43 next collapse

It's like smoking: if you haven't started, don't XD

mutual_ayed@sh.itjust.works on 03 Apr 00:49 collapse

As a fellow meat crayon I agree

KayLeadfoot@fedia.io on 03 Apr 05:16 next collapse

Bahaha, that one is new to me.

Back when I worked on an ambulance, we called the no helmet guys organ donors.

This comment was brought to you by PTSD, and has been redacted in a rare moment of sobriety.

mutual_ayed@sh.itjust.works on 03 Apr 08:28 collapse

I also rammed 10cc spikes at the back of the bus. The world needs organ donors, and motorcycles provide a great service for that. Hope your EMT career was short-lived but rewarding.

KayLeadfoot@fedia.io on 03 Apr 10:09 collapse

My EMT career was both short-lived and rewarding, right back at ya :)

Excrubulent@slrpnk.net on 03 Apr 06:05 next collapse

I remember finding a motorcycle community on reddit that called themselves “squids” or “squiddies” or something like that.

Their whole thing was putting road tyres on dirtbikes and riding urban environments like they were offroad obstacles. You know, ramping things, except on concrete.

They loved to talk about how dumb & short-lived they were. I couldn’t ever find that group again, so maybe I misremembered the “squid” name, but I wanted to find them again, not to ever try it - fuck that - but because the bikes looked super cool. I just have a thing for gender-bent vehicles.

real_squids@sopuli.xyz on 03 Apr 08:02 collapse

Calamari Racing Team. It’s mostly a counter-movement to r/Motorcycles, where most of the posters are seen as anti-fun. Their whole thing is that counter-culture, not just a specific way to ride; they also have a legendary commenter who pays money for pics in full leather.

Excrubulent@slrpnk.net on 03 Apr 15:01 collapse

That’s the one! Thanks, that was un-googleable for me.

I guess the road-tyres-on-dirt-bikes thing was maybe a trend when I saw the sub.

Klear@lemmy.world on 03 Apr 06:33 collapse

Negative. I’m a meat popsicle.

mutual_ayed@sh.itjust.works on 03 Apr 08:30 collapse

Corbin?

Psythik@lemm.ee on 03 Apr 00:05 collapse

As someone who likes the open sky feeling, this is why I drive a convertible instead.

spacesatan@leminal.space on 02 Apr 22:54 next collapse

Unless it’s a higher rate than human drivers per mile or hour driven, I do not care. The article doesn’t have those stats, so it’s clickbait as far as I’m concerned.

chetradley@lemm.ee on 02 Apr 23:18 next collapse

The fact that the other self driving brands logged zero motorcyclist fatalities means the technology exists to prevent more deaths. Tesla has chosen to allow more people to die in order to reduce cost. The families of those five dead motorcyclists certainly care.

KayLeadfoot@fedia.io on 02 Apr 23:24 collapse

[Edit: oh, my bad, I replied to you very cattily when I meant to reply to Satan. Sorry! Friendly fire! XD ]

KayLeadfoot@fedia.io on 02 Apr 23:19 next collapse

Thanks, 'Satan.

Do you know the number of miles driven by Tesla's self-driving tech? Because I don't. Tesla won't say; they're a remarkably non-transparent company where their tech is concerned. Near as I can tell, nobody does (other than folks locked up tight with NDAs). If the ratio of accidents-per-mile-driven looked good, you know as a flat fact that Elon would be Tweeting all about it.

Sorry you didn't find the death of 5 Americans newsworthy. I'll try harder for the next one.

spacesatan@leminal.space on 03 Apr 04:39 collapse

You’re right, 5 deaths isn’t newsworthy in the context of tens of thousands killed by human drivers each year.

Whether it’s worse than human drivers is the only relevant point of comparison, and the article doesn’t make it.

AA5B@lemmy.world on 03 Apr 00:19 next collapse

Same goes for the other vehicles. They didn’t even try to cover miles driven, and it’s quite likely Tesla has far more miles of self-driving than anyone else.

I’d even go so far as to speculate that the zero accidents of other self-driving vehicles could just be zero information, because we don’t have enough information to call it zero.

KayLeadfoot@fedia.io on 03 Apr 05:01 collapse

No, the zero accidents for other self-driving vehicles is actually zero :) You may have heard of this little boutique automotive manufacturer, Ford Motor Company. They're one of the primary competitors, and they are far above the mileage where you would expect a fatal accident if they were as safe as a human.

Ford has reported self-driving crashes (many of them!). Just no fatal crashes involving motorcycles, because I guess they don't fucking suck at making self-driving software.

I linked the data, it's all public governmental data, and only the Tesla crashes are heavily redacted. You could... IDK... read it, and then share your opinion about it?

AA5B@lemmy.world on 03 Apr 13:20 collapse

And how did it compare self-driving time or miles? Because on the surface, if Tesla is responsible for 5 such accidents and Ford zero, but Tesla has significantly more than five times the self-driving time or miles, then we just don’t have data yet… and I see an announcement that Ford expects full self-driving in 2026, so it can’t have been used much yet.

KayLeadfoot@fedia.io on 03 Apr 20:35 collapse

I don't think anyone has reliable public data on miles travelled. If it existed, I would use it. The fact that it doesn't exist tells you what you need to know about Level 2 ADAS system safety ;)

The only folks who are being real open with their data, near as I can tell, is Waymo. And Waymo has zero motorcycle fatalities, operating mostly in California, where the motorcycle driving culture is... absolutely fucking nuts, uh, uniquely risk-accepting.

kreskin@lemmy.world on 03 Apr 00:23 collapse

Cybertrucks have 17 times the mortality rate of the Ford Pinto.

motherjones.com/…/report-cybertruck-safety-ford-p…

spacesatan@leminal.space on 03 Apr 04:44 next collapse

Completely irrelevant to whether or not FSD is safer than human drivers.

KayLeadfoot@fedia.io on 03 Apr 04:54 collapse

I wrote the original analysis Mother Jones is citing there. Hah, how about that! Delights me to see it cited in the wild.

kreskin@lemmy.world on 04 Apr 04:41 collapse

nice work, worth feeling a bit of pride over.

KayLeadfoot@fedia.io on 04 Apr 07:40 collapse

Thanks! :j

Ledericas@lemm.ee on 03 Apr 00:13 next collapse

The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.

RagingRobot@lemmy.world on 03 Apr 00:56 collapse

I wonder if it’s happened yet

Ledericas@lemm.ee on 03 Apr 08:31 collapse

There was an article where one sliced a deer in half.

kreskin@lemmy.world on 03 Apr 00:19 next collapse

I wonder if a state court judge could rule its use unsafe?

slaneesh_is_right@lemmy.org on 03 Apr 06:36 collapse

They are illegal in every developed country.

merdaverse@lemmy.world on 03 Apr 08:44 collapse

But muh innovation! How are genius CEOs supposed to innovate if they can’t use the public at large as guinea pigs??

arrakark@lemmy.ca on 03 Apr 06:05 next collapse

What bike is that in the photo?

KayLeadfoot@fedia.io on 03 Apr 06:39 next collapse

My partner and I were actually debating that exact question before I posted it!

It's just stock art, but of a rider in the Midwest. Custom exhaust, custom saddle and rack for that cafe racer look, and I just barely can't make out the model on the engine fairing.

Here it is all big, let me know if you can figure it out: https://unsplash.com/photos/a-person-riding-a-motorcycle-on-a-city-street-kPfwWyUWubA

Looks hot, that's why I picked it.

arrakark@lemmy.ca on 11 Apr 14:56 collapse

Found it! Thanks to your image.

It’s a 2016 - 2018 SYM Wolf Classic 150

lando55@lemmy.world on 03 Apr 13:39 collapse

It looks a great deal like a Royal Enfield, but I couldn’t tell you which model. A Bullet, maybe?

arrakark@lemmy.ca on 11 Apr 14:56 collapse

I think it’s a 2016 - 2018 SYM Wolf Classic 150

Redex68@lemmy.world on 03 Apr 11:33 next collapse

Cuz other self-driving cars use LIDAR, so it’s basically impossible for them to not realise that a bike is there.

jdeath@lemm.ee on 04 Apr 11:45 collapse

unless it’s foggy, etc.

the_three_tomatoes@lemmy.world on 03 Apr 12:00 next collapse

You mean they are providing organ donations more than any other car. Silver lining. /s

KayLeadfoot@fedia.io on 03 Apr 20:41 collapse

They call it the Model 3 because the Tesla Organ-Harvester didn't translate well to Chinese

SkunkWorkz@lemmy.world on 03 Apr 13:29 next collapse

It’s because the system has to rely on visual cues, since Teslas have no radar. The system looks at the tail light when it’s dark to gauge the distance from the vehicle. And since some bikes have a double light, the system thinks it’s a car far ahead of it, when in reality it’s a bike up close. Also remember the AI is trained on human driving behavior, which Tesla records from their customers. And we all know how well the average human drives around two-wheeled vehicles.
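
Here’s a toy pinhole-camera calculation of why that double light fools a distance estimate. The light spacings and distances are just illustrative numbers I picked, not anything from Tesla:

```python
# Toy pinhole geometry: for small angles, a pair of lights with physical
# spacing s at distance d subtends an angle of roughly s / d radians.
# The spacing/distance numbers below are illustrative assumptions only.

def apparent_separation_rad(spacing_m: float, distance_m: float) -> float:
    """Small-angle approximation of the angle between two lights."""
    return spacing_m / distance_m

car_far   = apparent_separation_rad(spacing_m=1.5,  distance_m=60.0)  # car tail lights, far away
bike_near = apparent_separation_rad(spacing_m=0.25, distance_m=10.0)  # twin-light bike, up close

print(f"car at 60 m:  {car_far:.4f} rad")    # 0.0250
print(f"bike at 10 m: {bike_near:.4f} rad")  # 0.0250
```

Both work out to the same 0.025 radians: to a camera that assumes a pair of lights means a car-width vehicle, the close motorcycle is indistinguishable from a distant car. Radar or LIDAR measures distance directly, which is why the vision-only setup gets called out upthread.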

aeternum@lemmy.blahaj.zone on 11 Apr 03:46 collapse

If only there were a government department to investigate these kinds of things… Too soon?

KayLeadfoot@fedia.io on 11 Apr 03:49 collapse

Disbanded!

... for efficiency!