Robot cars can be crashed with tinfoil and painted cardboard (www.theregister.com)
from sverit@lemmy.ml to technology@lemmy.world on 07 Jun 2024 13:10
https://lemmy.ml/post/16577168

A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – were able to turn an autonomous vehicle (AV) running the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and suggest the attack could be applied to other self-driving cars.

#technology


Fiivemacs@lemmy.ca on 07 Jun 2024 13:26 next collapse

And people are demanding Chinese EVs… people don’t realize it’s not a car anymore, but a computer

snooggums@midwest.social on 07 Jun 2024 13:30 next collapse

This is a general issue with fooling sensors, not an issue specific to Chinese EVs.

NegativeInf@lemmy.world on 07 Jun 2024 14:31 collapse

I just want a cheap non-internal-combustion car. I don’t care about self-driving bullshit. I have eyes, arms, legs, and a brain, I’ll do the driving.

Atelopus-zeteki@kbin.run on 07 Jun 2024 14:36 next collapse

I'm thinking Toyota Hi-Lux EV.

SpaceNoodle@lemmy.world on 07 Jun 2024 14:36 next collapse

You also get distracted, tired, and emotional.

XeroxCool@lemmy.world on 07 Jun 2024 14:48 collapse

FUCK YOU NO I’M NOT 😢

BalooWasWahoo@links.hackliberty.org on 08 Jun 2024 17:00 collapse

FUCK YOU NO I’M NOT. HANG ON… yaaaaaaaaawn NOW WHERE FUCKING WAS I?!

Imgonnatrythis@sh.itjust.works on 07 Jun 2024 17:34 collapse

Self driving bullshit will save more lives than penicillin.

NegativeInf@lemmy.world on 07 Jun 2024 18:20 collapse

I’m not saying I don’t want it ever. I’m saying I don’t want it yet. An EV is not inherently a self-driving car. I’m saying the feature does not yet fully exist, so don’t force half-assed versions on us as safety features.

SpaceNoodle@lemmy.world on 07 Jun 2024 19:24 collapse

Actual autonomous driving won’t really be feasible as an add-on feature to consumer vehicles given the cost and maintenance requirements. It’ll be offered as a service like Uber, but without a human driver.

NegativeInf@lemmy.world on 07 Jun 2024 20:09 collapse

Awesome! But we aren’t there yet. So just let me buy a non self driving electric vehicle until we are.

Infynis@midwest.social on 07 Jun 2024 13:46 next collapse

This is the real reason Elon Musk doesn’t want people tracking his plane. If we know where he is, Wile E. Coyote could catch up to him and trick his car into crashing into a brick wall by painting a tunnel on it

finley@lemm.ee on 07 Jun 2024 14:35 collapse

What a great analogy

EvilBit@lemmy.world on 07 Jun 2024 13:57 next collapse

xkcd.com/1958/

TL;DR: faking out a self-driving system is always going to be possible, and so is faking out humans. But doing so is basically attempted murder, which is why the existence of an exploit like this is neither interesting nor new. You could also cut the brake lines or rig a bomb to the car.

admin@lemmy.my-box.dev on 07 Jun 2024 14:59 next collapse

Awwww. Why did you have to break the circlejerk? People were enjoying it!

Eggyhead@kbin.run on 07 Jun 2024 16:43 collapse

I was so close to finishing, too.
Time to look for another doomsday thread, I guess.

ArbitraryValue@sh.itjust.works on 07 Jun 2024 15:07 next collapse

People seem to hold computers to a higher standard than other people when performing the same task.

phdepressed@sh.itjust.works on 07 Jun 2024 16:38 next collapse

Because humans have more accountability. Also it has implications for military/police use of self-guided stuff.

lolcatnip@reddthat.com on 07 Jun 2024 17:27 collapse

What is the purpose of accountability other than to force people to do better? If the lack of accountability doesn’t stop a computer from outperforming a human, why worry about it?

medgremlin@midwest.social on 07 Jun 2024 17:49 collapse

The lack of accountability means that there is nothing and no one to take responsibility when the robot/computer inevitably kills someone. A human can face legal ramifications for their actions, while the companies that make these computers have so far shown that they are exempt from such consequences.

lolcatnip@reddthat.com on 07 Jun 2024 19:19 next collapse

That is simply not true. The law has, since basically forever, held that manufacturers are liable if their product malfunctions and hurts someone while being operated in accordance with their instructions.

Edit: I hope all y’all who think the rule of law doesn’t exist are gonna vote against the felony party.

Kanzar@sh.itjust.works on 07 Jun 2024 20:18 next collapse

Excuse us for being sceptical that businesses will actually be held accountable. We know legally they are, but will forced arbitration or delayed court proceedings mean people too poor to afford a good lawyer for long will have to fuck off?

medgremlin@midwest.social on 07 Jun 2024 20:55 collapse

The current court cases show that the manufacturers are trying to fob off responsibility onto the owners of the vehicles by way of TOS agreements with lots of fine print. Tesla in particular is getting slammed for false advertising about the capabilities of its self-driving features while it simultaneously tries to force all legal liability onto the drivers who believed that advertising.

Turun@feddit.de on 07 Jun 2024 21:36 collapse

That is true for most current “self driving” systems, because they are all just glorified assist features. Tesla is misleading its customers massively with its advertising, but on paper it’s very clear that the car will only assist in safe conditions; the driver needs to be able to react immediately at all times and is therefore also liable.

However, Mercedes (I think it was them) has started to roll out a feature where they will actually take responsibility for any accidents that happen due to this system. For now it’s restricted to nice weather and a few select roads, but the progress is there!

medgremlin@midwest.social on 08 Jun 2024 00:49 collapse

The driverless robo-taxis are also a concern. When one of them killed someone in San Francisco there was not a clear responsible entity to charge with the crime.

Fedizen@lemmy.world on 08 Jun 2024 17:55 collapse

I think human responses vary too much: could you follow a strategy that reliably makes 50% of human drivers crash? Probably. Could you follow a strategy that reliably makes 100% of autonomous vehicles crash? Almost certainly.

Imgonnatrythis@sh.itjust.works on 07 Jun 2024 17:32 next collapse

Or if it’s a Tesla you could hack someone’s weather app and thus force them to drive in the rain.

Beryl@lemmy.world on 07 Jun 2024 18:02 next collapse

You don’t even have to rig a bomb, a better analogy to the sensor spoofing would be to just shine a sufficiently bright light in the driver’s eyes from the opposite side of the road. Things will go sideways real quick.

EvilBit@lemmy.world on 07 Jun 2024 19:46 collapse

It’s not meant to be a perfect example. It’s a comparable principle. Subverting the self-driving like that is more or less equivalent to any other means of attempting to kill someone with their car.

Beryl@lemmy.world on 07 Jun 2024 19:59 collapse

I don’t disagree, i’m simply trying to present a somewhat less extreme (and therefore i think more appealing) version of your argument

uriel238@lemmy.blahaj.zone on 08 Jun 2024 04:30 collapse

More exciting would be an exploit that renders an unmoving car useless. But exploits like this absolutely will be used in cases where tire-slashing might be used, such as harassing genocidal VIPs or disrupting police services, especially if it’s difficult to trace the drone to its controller.

MeatPilot@lemmy.world on 07 Jun 2024 15:26 next collapse

<img alt="" src="https://lemmy.world/pictrs/image/450db603-f985-4c2c-a1f2-20483d599ad4.jpeg">

NeoNachtwaechter@lemmy.world on 07 Jun 2024 17:25 next collapse

It is old.

I mean, not this certain attack, but the principle is well known.

The solution is also known: any sensor (or at least any critically important sensor) in a robotic system must be able to recognize its own state of “blindness”, and the system must react accordingly. (For example, with the camera behind the windshield, it would activate the wipers and the windshield heating to remove possible rain, snow or dirt.) If several sensors go “blind” at the same time, the system must bring the car to a safe stop.
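The fail-safe logic described here could be sketched roughly like this (a minimal illustration only; the sensor names, health scores, threshold value, and the `clean_windshield`/`safe_stop` reactions are all hypothetical, not from any real AV stack):

```python
# Hypothetical sketch of the "sensor blindness" fail-safe described above.
# Each sensor reports a health score in [0, 1]; below the threshold it
# is considered blind. The threshold value here is made up.

BLIND_THRESHOLD = 0.3

def blind_sensors(health: dict[str, float]) -> list[str]:
    """Return the names of all sensors currently considered blind."""
    return [name for name, score in health.items() if score < BLIND_THRESHOLD]

def react(health: dict[str, float]) -> str:
    """Decide the system's reaction based on which sensors are blind."""
    blind = blind_sensors(health)
    if not blind:
        return "continue"
    if blind == ["camera"]:
        # Only the windshield camera is blind: try wipers + heating first.
        return "clean_windshield"
    # Several sensors blind at once: perform a controlled safe stop.
    return "safe_stop"

print(react({"camera": 0.9, "lidar": 0.8, "radar": 0.95}))  # continue
print(react({"camera": 0.1, "lidar": 0.8, "radar": 0.95}))  # clean_windshield
print(react({"camera": 0.1, "lidar": 0.2, "radar": 0.95}))  # safe_stop
```

The point of the structure is that "blindness" is a first-class state per sensor, and the escalation (mitigate, then stop) depends on how many sensors report it at once.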

CheeseNoodle@lemmy.world on 07 Jun 2024 20:12 next collapse

It’s still a problem. A sheet held across the road on a string would show up as a wall to both cameras and lidar. I for one am buffalo buffalo buffalo buffalo buffalo looking forward to the emerging profession of road pirates robbing automated trucks this way.

NeoNachtwaechter@lemmy.world on 07 Jun 2024 20:42 collapse

road pirates robbing automated trucks

Ok but the problem of road pirates isn’t new either, is it? Let’s watch ‘Herbie’ again :-)

There is just one risk that is kinda new (but actually coming with every automation): systematic errors could bring vulnerabilities that get exploited in large numbers.

Gustephan@lemmy.world on 07 Jun 2024 20:36 collapse

It’s basically chaff, lol. We’ve known chaff is an effective radar countermeasure since the 40s, and it seems like the researchers have found the lidar and optical equivalents of chaff. What really scares me is the idea of this evolving into more sophisticated deception attacks like range or velocity gate pulls. No idea how you’d do that with lidar or optically, but I’d bet money that’s a line item on a black budget somewhere

KISSmyOSFeddit@lemmy.world on 07 Jun 2024 20:43 next collapse

Human-driven cars can be crashed with a brick, or a quart of oil.

simplejack@lemmy.world on 08 Jun 2024 00:30 collapse

Wait until you see what my uncle Jerry can do with a 5th of vodka and his Highlander.

the_doktor@lemmy.zip on 08 Jun 2024 08:00 collapse

Just ban the goddamn things already.