The Pentagon is moving toward letting AI weapons autonomously decide to kill humans (www.businessinsider.com)
from return2ozma@lemmy.world to technology@lemmy.world on 25 Nov 2023 03:34
https://lemmy.world/post/8715340

#technology

threaded - newest

autotldr@lemmings.world on 25 Nov 2023 03:35 next collapse

This is the best summary I could come up with:


The deployment of AI-controlled drones that can make autonomous decisions about whether to kill human targets is moving closer to reality, The New York Times reported.

Lethal autonomous weapons that can select targets using AI are being developed by countries including the US, China, and Israel.

The use of the so-called “killer robots” would mark a disturbing development, say critics, handing life and death battlefield decisions to machines with no human input.

“This is really one of the most significant inflection points for humanity,” Alexander Kmentt, Austria’s chief negotiator on the issue, told The Times.

Frank Kendall, the Air Force secretary, told The Times that AI drones will need to have the capability to make lethal decisions while under human supervision.

The New Scientist reported in October that AI-controlled drones have already been deployed on the battlefield by Ukraine in its fight against the Russian invasion, though it’s unclear if any have taken action resulting in human casualties.


The original article contains 376 words, the summary contains 158 words. Saved 58%. I’m a bot and I’m open source!

Kraven_the_Hunter@lemmy.dbzer0.com on 25 Nov 2023 03:37 next collapse

The code name for this top secret program?

Skynet.

EfficientEffigy@lemmy.world on 25 Nov 2023 03:38 next collapse

This can only end well

capt_wolf@lemmy.world on 25 Nov 2023 03:51 next collapse

Project ED-209

0nXYZ@lemmy.world on 25 Nov 2023 05:00 collapse

“You have 20 seconds to reply…”

stopthatgirl7@kbin.social on 25 Nov 2023 04:11 next collapse

“Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale

Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus”

DarkThoughts@kbin.social on 25 Nov 2023 04:23 next collapse
[deleted] on 25 Nov 2023 05:19 next collapse

.

[deleted] on 25 Nov 2023 23:31 collapse

.

BombOmOm@lemmy.world on 25 Nov 2023 03:55 next collapse

As an important note in this discussion, we already have weapons that autonomously decide to kill humans. Mines.

Chuckf1366@sh.itjust.works on 25 Nov 2023 04:04 next collapse

Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention. Comparing an autonomous murder machine to a mine is like comparing a flintlock pistol to the fucking Gatling cannon in an A-10.

gibmiser@lemmy.world on 25 Nov 2023 04:12 next collapse

Well, an important point you and he both forget to mention is that mines are considered inhumane. Perhaps that means AI murder should also be considered inhumane, and we should just not do it, instead of allowing it like landmines.

livus@kbin.social on 25 Nov 2023 06:08 collapse

This, jesus, we're still losing limbs and clearing mines from wars that were over decades ago.

An autonomous field of those is horror movie stuff.

Chozo@kbin.social on 25 Nov 2023 05:45 next collapse

Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention.

Pretty sure the entire DOD got a collective boner reading this.

MindSkipperBro12@lemmy.world on 25 Nov 2023 11:18 collapse

And NonCredibleDefense

FaceDeer@kbin.social on 25 Nov 2023 06:25 next collapse

Imagine a mine that could recognize "that's just a child/civilian/medic stepping on me, I'm going to save myself for an enemy soldier." Or a mine that could recognize "ah, CentCom just announced a ceasefire, I'm going to take a little nap." Or "the enemy soldier that just stepped on me is unarmed and frantically calling out that he's surrendered, I'll let this one go through. Not the barrier troops chasing him, though."

There's opportunities for good here.

Nudding@lemmy.world on 25 Nov 2023 11:25 next collapse

Lmao are you 12?

MindSkipperBro12@lemmy.world on 25 Nov 2023 11:32 collapse

They do have the mentality of one.

FlyingSquid@lemmy.world on 25 Nov 2023 11:28 next collapse

Yes, those definitely sound like the sort of things military contractors consider.

FaceDeer@kbin.social on 25 Nov 2023 15:04 collapse

Why waste a mine on the wrong target?

FlyingSquid@lemmy.world on 25 Nov 2023 15:07 collapse

Why occupy a hospital?

dependencyinjection@discuss.tchncs.de on 25 Nov 2023 16:35 collapse

Why encroach on others land?

FlyingSquid@lemmy.world on 25 Nov 2023 16:36 collapse

Sorry… are you saying that’s what Palestinians are doing?

dependencyinjection@discuss.tchncs.de on 25 Nov 2023 16:56 collapse

I feel you’re being obtuse on purpose here, but no I’m saying that the other side of this conflict has been doing that.

livus@kbin.social on 25 Nov 2023 23:30 collapse

Pretty sure you and @FlyingSquid are on the same side and making the same point but misunderstanding each other.

key@lemmy.keychat.org on 25 Nov 2023 15:59 next collapse

Maybe it starts that way, but once that’s accepted as a thing, the result will be increased usage of mines. Where before there were too many civilians to consider using mines, now the soldiers say “it’s smart now, it won’t blow up children” and put down more and more in more dangerous situations. And maybe those mines only have a 0.1% failure rate in tested situations but a 10% failure rate over the course of decades. Usage increases tenfold and then you quickly end up with a lot more dead kids.

Plus it won’t just be mines, it’ll be automated turrets when previously there were none or even more drone strikes with less oversight required because the automated system is supposed to prevent unintended casualties.

Availability drives usage.

livus@kbin.social on 25 Nov 2023 23:38 next collapse

@FaceDeer okay so now that mines allegedly recognise these things they can be automatically deployed in cities.

Sure there's a 5% margin of error but that's an "acceptable" level of collateral according to their masters. And sure they are better at recognising some ethnicities than others, but since those they discriminate against aren't a dominant part of the culture that produces them, nothing gets done about it.

And after 20 years, when the tech is obsolete and they all start malfunctioning, we're left with the same problems we have with current mines; only, because the ban on mines was reversed, the scale of the problem is much, much worse than ever before.

theneverfox@pawb.social on 26 Nov 2023 08:19 collapse

That sounds great… Why don’t we line the streets with them? Every entryway could scan for hostiles. Maybe even use them against criminals

What could possibly go wrong?

Sterile_Technique@lemmy.world on 25 Nov 2023 06:46 collapse

Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention. Comparing an autonomous murder machine to a mine is like comparing a flintlock pistol to the fucking Gatling cannon in an A-10.

For what it’s worth, there’s footage on youtube of drone swarm demonstrations that was posted 6 years ago. Considering that the military doesn’t typically release footage of the cutting edge of its tech to the public (so that demonstration was likely of a product already going obsolete), and that the 6 years since have brought lightning-fast developments in things like facial recognition… at this point I’d be surprised if we weren’t already at the very least field testing the murder machines you described.

PipedLinkBot@feddit.rocks on 25 Nov 2023 06:46 collapse

Here is an alternative Piped link(s):

footage on youtube of drone swarm demonstrations

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

MonsiuerPatEBrown@reddthat.com on 25 Nov 2023 04:48 collapse

That is like saying that Mendelian pea plant fuckery and CRISPR therapy are basically the same thing.

Nobody@lemmy.world on 25 Nov 2023 04:09 next collapse

What’s the opposite of eating the onion? I read the title before looking at the site and thought it was satire.

Wasn’t there a test a while back where the AI went crazy and started killing everything to score points? Then, they gave it a command to stop, so it killed the human operator. Then, they told it not to kill humans, and it shot down the communications tower that was controlling it and went back on a killing spree. I could swear I read that story not that long ago.

Nutteman@lemmy.world on 25 Nov 2023 04:15 collapse

It was a nothingburger. A thought experiment.

www.reuters.comarticle/idUSL1N38023R/

FaceDeer@kbin.social on 25 Nov 2023 06:19 collapse

The link was missing a slash: https://www.reuters.com/article/idUSL1N38023R/

This is typically how stories like this go. Like most animals, humans have evolved to pay extra attention to things that are scary and give inordinate weight to scenarios that present danger when making decisions. So you can present someone with a hundred studies about how AI really behaves, but if they've seen the Terminator that's what sticks in their mind.

kromem@lemmy.world on 25 Nov 2023 10:29 next collapse

Even the Terminator was the byproduct of this.

In the 50s/60s when they were starting to think about what it might look like when something smarter than humans would exist, the thing they were drawing on as a reference was the belief that homo sapiens had been smarter than the Neanderthals and killed them all off.

Therefore, the logical conclusion was that something smarter than us would be an existential threat that would compete with us and try to kill us all.

Not only is this incredibly stupid (i.e. compete with us for what), it is based on BS anthropology. There’s no evidence we were smarter than the Neanderthals, we had cross cultural exchanges back and forth with them over millennia, had kids with them, and the more likely thing that killed them off was an inability to adapt to climate change and pandemics (in fact, severe COVID infections today are linked to a Neanderthal gene in humans).

But how often do you see discussion of AGI as being a likely symbiotic coexistence with humanity? No, it’s always some fearful situation because we’ve been self-propagandizing for decades with bad extrapolations which in turn have turned out to be shit predictions to date (i.e. that AI would never exhibit empathy or creativity, when both are key aspects of the current iteration of models, and that they would follow rules dogmatically when the current models barely follow rules at all).

sukhmel@programming.dev on 25 Nov 2023 21:47 collapse

That depends highly on the consequences of a failure. Like, you don’t test much if you program a Lego car, but you do test everything very thoroughly if you program a satellite.

In this case the amount of testing needed to allow a killerbot to run unsupervised will probably be so big that it will never be even half done.

TransplantedSconie@lemm.ee on 25 Nov 2023 04:21 next collapse

Well, Ultron is inevitable.

Who we got for the Avengers Initiative?

frickineh@lemmy.world on 25 Nov 2023 04:32 collapse

Ultron and Project Insight. It’s like the people in charge watched those movies and said, “You know, I think Hydra had the right idea!”

TransplantedSconie@lemm.ee on 25 Nov 2023 04:36 collapse

Wouldn’t put it past this timeline.

AceFuzzLord@lemm.ee on 25 Nov 2023 04:23 next collapse

As disturbing as this is, it’s inevitable at this point. If one of the superpowers doesn’t develop their own fully autonomous murder drones, another country will. And eventually those drones will malfunction or some sort of bug will be present that will give it the go ahead to indiscriminately kill everyone.

If you ask me, it’s just an arms race to see who builds the murder drones first.

Pheonixdown@lemm.ee on 25 Nov 2023 04:28 next collapse

I feel like it’s ok to skip to optimizing the autonomous drone-killing drone.

You’ll want those either way.

threelonmusketeers@sh.itjust.works on 25 Nov 2023 07:17 collapse

If entire wars could be fought by proxy with robots instead of humans, would that be better (or less bad) than the way wars are currently fought? I feel like it might be.

Pheonixdown@lemm.ee on 25 Nov 2023 08:11 collapse

You’re headed towards the Star Trek episode “A Taste of Armageddon”. I’d also note that people losing a war without suffering recognizable losses are less likely to surrender to the victor.

FaceDeer@kbin.social on 25 Nov 2023 06:16 next collapse

A drone that is indiscriminately killing everyone is a failure and a waste. Even the most callous military would try to design better than that for purely pragmatic reasons, if nothing else.

SomeSphinx@lemmy.world on 25 Nov 2023 15:47 collapse

Even the best laid plans go awry though. The point is even if they pragmatically design it to not kill indiscriminately, bugs and glitches happen. The technology isn’t all the way there yet and putting the ability to kill in the machine body of something that cannot understand context is a terrible idea. It’s not that the military wants to indiscriminately kill everything, it’s that they can’t possibly plan for problems in the code they haven’t encountered yet.

KeenFlame@feddit.nu on 25 Nov 2023 16:02 collapse

Other weapons of mass destruction, like biological and chemical weapons, have been successfully avoided in war; this should be classified exactly the same.

chemical_cutthroat@lemmy.world on 25 Nov 2023 04:29 next collapse

We’ve been letting other humans decide since the dawn of time, and look how that’s turned out. Maybe we should let the robots have a chance.

FaceDeer@kbin.social on 25 Nov 2023 06:14 collapse

I'm not expecting a robot soldier to rape a civilian, for example.

Pratai@lemmy.ca on 25 Nov 2023 04:33 next collapse

Won’t that be fun!

/s

Silverseren@kbin.social on 25 Nov 2023 04:35 next collapse

The sad part is that the AI might be more trustworthy than the humans being in control.

Varyk@sh.itjust.works on 25 Nov 2023 04:57 next collapse

No. Humans have stopped nuclear catastrophes caused by computer misreadings before. So far, we have a way better decision-making track record.

Autonomous killings is an absolutely terrible, terrible idea.

The incident I’m thinking about is geese being misinterpreted by a computer as nuclear missiles and a human recognizing the error and turning off the system, but I can only find a couple sources for that, so I found another:

In 1983, a computer thought that the sunlight reflecting off of clouds was a nuclear missile strike and a human waited for corroborating evidence rather than reporting it to his superiors as he should have, which would have likely resulted in a “retaliatory” nuclear strike.

…wikipedia.org/…/1983_Soviet_nuclear_false_alarm_…

As faulty as humans are, it’s as good a safeguard as we have against tragedy. Keep a human in the chain.

alternative_factor@kbin.social on 25 Nov 2023 05:57 collapse

Self-driving cars lose their shit and stop working if a kangaroo gets in their way; one day some poor people are going to be carpet bombed because of another strange creature no one ever really thinks about except locals.

livus@kbin.social on 25 Nov 2023 05:56 next collapse

Have you never met an AI?

Edit: seriously though, no. A big player in the war AI space is Palantir which currently provides facial recognition to Homeland Security and ICE. They are very interested in drone AI. So are the bargain basement competitors.

Drones already have unacceptably high rates of civilian murder. Outsourcing that still further to something with no ethics, no brain, and no accountability is a human rights nightmare. It will make the past few years look benign by comparison.

FlyingSquid@lemmy.world on 25 Nov 2023 11:33 next collapse

Yeah, I think the people who are saying this could be a good thing seem to forget that the military always contracts out to the lowest bidder.

SCB@lemmy.world on 25 Nov 2023 16:11 collapse

Drone strikes minimize casualties compared to the alternatives - heavier ordnance on bigger delivery systems, or boots on the ground.

If drone strikes upset you, your anger is misplaced if you’re blaming drones. You’re really against military strikes at those targets, full stop.

livus@kbin.social on 25 Nov 2023 23:25 collapse

When the targets are things like that wedding in Mali, sure.

I think your argument is a bit like saying depleted uranium is better than the alternative, a nuclear bomb, when the bomb was never on the table for half the stuff depleted uranium is used for.

Boots on the ground or heavy ordnance were never a viable option for some of the stuff drones are used for.

SCB@lemmy.world on 27 Nov 2023 13:44 collapse

Boots on the ground or heavy ordnance were never a viable option for some of the stuff drones are used for.

It was literally the standard policy prior to drones.

kromem@lemmy.world on 25 Nov 2023 10:31 collapse

Eventually maybe. But not for the initial period where the tech is good enough to be extremely deadly but not smart enough to realize that often being deadly is the stupider choice.

Uranium3006@kbin.social on 25 Nov 2023 04:48 next collapse

How about no

MindSkipperBro12@lemmy.world on 25 Nov 2023 05:09 collapse

Yeah, only humans can indiscriminately kill people!

tsonfeir@lemm.ee on 25 Nov 2023 04:57 next collapse

If we don’t, they will. And we can only learn by seeing it fail. To me, the answer is obvious. Stop making killing machines. 🤷‍♂️

Pirky@lemmy.world on 25 Nov 2023 05:03 next collapse

Horizon: Zero Dawn, here we come.

RiikkaTheIcePrincess@kbin.social on 25 Nov 2023 05:23 next collapse

Hey, I like that game! Oh, wait... 🤔

CaptKoala@lemmy.ml on 25 Nov 2023 06:43 next collapse

It won’t be nearly as interesting or fun (as Horizon) I don’t think.

Spacehooks@reddthat.com on 25 Nov 2023 15:18 collapse

Can we all agree to protest self-replication?

MindSkipperBro12@lemmy.world on 25 Nov 2023 05:07 next collapse

For everyone who’s against this, just remember that we can’t put the genie back in the bottle. Like the A-bomb, this will be a fact of life in the near future.

All one can do is adapt to it.

kambusha@feddit.ch on 25 Nov 2023 06:38 next collapse

If you can dodge a wrench, you can dodge anything.

FlyingSquid@lemmy.world on 25 Nov 2023 11:27 collapse

Similarly, if you can dodge a shoe, you can dodge war crimes tribunals.

klemptor@lemmy.ml on 25 Nov 2023 12:20 collapse

Oh snap

kromem@lemmy.world on 25 Nov 2023 10:13 collapse

There is a key difference though.

The A-bomb wasn’t a technology that, as the arms race advanced, would develop the capacity to be anything from a conscientious objector to a usurper.

There’s a prisoner’s dilemma to arms races that in this case is going to lead to world powers effectively paving the path to their own obsolescence.

In many ways, that’s going to be uncharted territory for us all (though not necessarily a bad thing).

RiikkaTheIcePrincess@kbin.social on 25 Nov 2023 05:33 next collapse

LLM "AI" fans thinking "Hey, humans are dumb and AI is smart so let's leave murder to a piece of software hurriedly cobbled together by a human and pushed out before even they thought it was ready!"

I guess while I'm cheering the fiery destruction of humanity I'll be thanking not the wonderful being who pressed the "Yes, I'm sure I want to set off the antimatter bombs that will end all humans" but the people who were like "Let's give the robots a chance! It's not like the thinking they don't do could possibly be worse than that of the humans who put some of their own thoughts into the robots!"

I just woke up, so you're getting snark. makes noises like the snarks from Half-Life You'll eat your snark and you'll like it!

heygooberman@lemmy.today on 25 Nov 2023 05:45 next collapse

Didn’t Robocop teach us not to do this? I mean, wasn’t that the whole point of the ED-209 robot?

aeronmelon@lemm.ee on 25 Nov 2023 06:13 next collapse

Every warning in pop culture (1984, Starship Troopers, Robocop) has been misinterpreted as a framework upon which to nail the populace.

drbluefall@toast.ooo on 25 Nov 2023 08:34 next collapse

something something torment nexus

FaceDeer@kbin.social on 25 Nov 2023 09:03 collapse

Every warning in pop culture is being misinterpreted as something other than a fun/scary movie designed to sell tickets, being imagined as a scholarly attempt at projecting a plausible outcome instead.

MBM@lemmings.world on 25 Nov 2023 11:11 collapse

People didn’t seem to like my movie idea “Terminator, but the AI is actually very reasonable and not murderous”

FlyingSquid@lemmy.world on 25 Nov 2023 11:26 collapse

Every single thing in The Hitchhiker’s Guide to the Galaxy says AI is a stupid and terrible idea. And Elon Musk says it’s what inspired him to create an AI.

1984@lemmy.today on 25 Nov 2023 05:56 next collapse

Future is gonna suck, so enjoy your life today while the future is still not here.

Thorny_Insight@lemm.ee on 25 Nov 2023 06:41 next collapse

Thank god today doesn’t suck at all

1984@lemmy.today on 25 Nov 2023 10:25 collapse

Right? :)

myrmidex@slrpnk.net on 25 Nov 2023 07:34 next collapse

The future might seem far off, but it starts right now.

Kalkaline@leminal.space on 25 Nov 2023 09:46 collapse

At least it will probably be a quick and efficient death of all humanity when a bug hits the system and AI decides to wipe us out.

FaceDeer@kbin.social on 25 Nov 2023 06:11 next collapse

If you program an AI drone to recognize ambulances and medics and forbid them from blowing them up, then you can be sure that they will never intentionally blow them up. That alone makes them superior to having a Mk. I Human holding the trigger, IMO.
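(A minimal sketch, purely illustrative, of what such a hard-coded rule could look like; the class names and confidence threshold here are invented, not anything any military has published:)

```python
# Hypothetical "never engage" rule layered over whatever targeting logic exists.
PROTECTED = {"ambulance", "medic", "civilian"}

def may_engage(detected_class: str, confidence: float, threshold: float = 0.95) -> bool:
    if detected_class in PROTECTED:
        return False  # protected classes are never valid targets, full stop
    return confidence >= threshold  # otherwise require a high-confidence ID

assert may_engage("ambulance", 0.99) is False  # the hard rule can't be outvoted
```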

crypticthree@lemmy.world on 25 Nov 2023 06:19 next collapse

Did you know that “if” is the middle word of life

GigglyBobble@kbin.social on 25 Nov 2023 06:58 next collapse

Unless the operator decides hitting exactly those targets fits their strategy and they can blame a software bug.

FaceDeer@kbin.social on 25 Nov 2023 07:08 collapse

And then when they go looking for that bug and find the logs showing that the operator overrode the safeties instead, they know exactly who is responsible for blowing up those ambulances.

GigglyBobble@kbin.social on 25 Nov 2023 07:10 next collapse

And if the operator was commanded to do it? And to delete the logs? How naive are you that this somehow makes war more humane?

FaceDeer@kbin.social on 25 Nov 2023 08:52 collapse

Each additional safeguard makes it harder and adds another name to the eventual war crimes trial. Don't let the perfect be the enemy of the good, especially when it comes to reducing the number of ambulances that get blown up in war zones.

[deleted] on 25 Nov 2023 23:13 collapse

.

mihies@kbin.social on 25 Nov 2023 07:13 next collapse

It doesn't work like that though. Western (and Western-backed) militaries can and do get away with that unpunished.

mihies@kbin.social on 25 Nov 2023 07:21 collapse
FlyingSquid@lemmy.world on 25 Nov 2023 11:30 collapse

Israeli general: Captain, were you responsible for reprogramming the drones to bomb those ambulances?

Israeli captain: Yes, sir! Sorry, sir!

Israeli general: Captain, you’re just the sort of man we need in this army.

FaceDeer@kbin.social on 25 Nov 2023 15:06 collapse

Ah, evil people exist and therefore we should never develop technology that evil people could use for evil. Right.

FlyingSquid@lemmy.world on 25 Nov 2023 15:08 collapse

Seems like a good reason not to develop technology to me. See also: biological weapons.

FaceDeer@kbin.social on 25 Nov 2023 15:11 collapse

Those weapons come out of developments in medicine. Technology itself is not good or evil, it can be used for good or for evil. If you decide not to develop technology you're depriving the good of it as well. My point earlier is to show that there are good uses for these things.

FlyingSquid@lemmy.world on 25 Nov 2023 15:14 next collapse

Hmm… so maybe we keep developing medicine but not as a weapon and we keep developing AI but not as a weapon.

Or can you explain why one should be restricted from weapons development and not the other?

livus@kbin.social on 25 Nov 2023 23:19 collapse

I disagree with your premise here. Taking a life is a serious step. A machine that unilaterally decides to kill some people with no recourse to human input has no good application.

It's like inventing a new biological weapon.

By not creating it, you are not depriving any decent person of anything that is actually good.

Chuckf1366@sh.itjust.works on 25 Nov 2023 07:11 next collapse

It’s more like we’re giving the machine more opportunities to go off accidentally, or potentially encouraging more use of civilian camouflage to try and evade our hunter-killer drones.

kromem@lemmy.world on 25 Nov 2023 10:20 collapse

Right, because self-driving cars have been great at correctly identifying things.

And those LLMs have been following their rules to the letter.

We really need to let go of our projected concepts of AI in the face of what’s actually been arriving. And one of those things we need to let go of is the concept of immutable rule following and accuracy.

In any real world deployment of killer drones, there’s going to be an acceptable false positive rate that’s been signed off on.

FaceDeer@kbin.social on 25 Nov 2023 15:08 collapse

We are talking about developing technology, not existing tech.

And actually, machines have become quite adept at image recognition. For some things they're already better at it than we are.

lemba@discuss.tchncs.de on 25 Nov 2023 06:14 next collapse

Good to know that Daniel Ek, founder and CEO of Spotify, invests in military AI… www.handelsblatt.com/technik/…/27779646.html?tick…

cheese_greater@lemmy.world on 25 Nov 2023 08:18 collapse

ACAB

All C-Suite are Bastards

themurphy@lemmy.world on 25 Nov 2023 07:14 next collapse

I think people are forgetting that drones like these will also be made to protect. And I don’t mean in a police kinda way.

But let’s say Argentina deployed these against Brazil. Brazil would have a defending lineup, and they would fight out the war.

Then everyone watching would see it makes no sense to let those robots fight it out. Both countries would just produce more robots until, yeah… no more wires and metal, I guess.

Future = less real war, more cold war. Just like the A-bomb works today.

FlyingSquid@lemmy.world on 25 Nov 2023 11:35 collapse

Then everyone watching will see this makes no sense to let those robots fight it out.

Just like how WWI was the War to End All Wars, right?

Future = less real war, more cold war. Just like the A-bomb works today.

Sorry, how is there less war now?

Immersive_Matthew@sh.itjust.works on 25 Nov 2023 07:16 next collapse

We are all worried about AI, but it is humans I worry about, and how we will use AI, not the AI itself. I am sure when electricity was invented people also feared it, but it was how humans used it that was, and is, always the risk.

shrugal@lemm.ee on 25 Nov 2023 12:47 collapse

Both, honestly. AI can reduce accountability and increase the power small groups of people have over everyone else, but it can also go haywire.

Immersive_Matthew@sh.itjust.works on 26 Nov 2023 17:40 collapse

It will go haywire in areas for sure.

phoneymouse@lemmy.world on 25 Nov 2023 08:25 next collapse

Can’t figure out how to feed and house everyone, but we have almost perfected killer robots. Cool.

sneezycat@sopuli.xyz on 25 Nov 2023 08:57 next collapse

Oh no, we figured it out, but killer robots are profitable while happiness is not.

MartinXYZ@sh.itjust.works on 25 Nov 2023 09:42 next collapse

Oh no, we figured it out, but killer robots are profitable while ~~happiness~~ survival is not.

sneezycat@sopuli.xyz on 25 Nov 2023 10:14 collapse

No, it isn’t just about survival. People living on the streets are surviving. They have no homes, they barely have any food.

HerbalGamer@sh.itjust.works on 25 Nov 2023 10:16 collapse

obviously that’s just a lifestyle choice

o2inhaler@lemmy.ca on 25 Nov 2023 10:10 collapse

I would argue happiness is profitable, but it would have to be shared amongst the people. Killer robots are profitable for a concentrated group of people.

Strobelt@lemmy.world on 25 Nov 2023 13:01 collapse

What if we gave everyone their own killer robot and then everyone could just fight each other for what they wanted?

winterayars@sh.itjust.works on 25 Nov 2023 14:21 collapse

Ah yes the Republican plan.

zalgotext@sh.itjust.works on 25 Nov 2023 20:45 collapse

No the Republican plan would be to sell killer robots at a vastly inflated price to guarantee none but the rich can own them, and then blame people for “being lazy” when they can’t afford their own killer robot.

TopRamenBinLaden@sh.itjust.works on 25 Nov 2023 21:51 next collapse

Also, they would say that the second amendment very obviously covers killer robots. The founding fathers definitely foresaw the AI revolution, and wanted to give every man and woman the right to bear killer robots.

winterayars@sh.itjust.works on 25 Nov 2023 22:11 collapse

They’d say they’re gonna pass a law to give every male, property owning citizen a killer robot but first they have to pass a law saying it’s legal to own killer robots. They pass that law then all talk about the other law is dropped forever. No one ever follows up or asks what happened to it. Meanwhile, the rich buy millions and millions of killer robots.

onlinepersona@programming.dev on 25 Nov 2023 11:01 next collapse

What’s more important, a free workforce or an obedient one?

cosmicrookie@lemmy.world on 25 Nov 2023 11:20 collapse

Especially one that is made to kill everybody else except their own. Let it replace the police. I’m sure the quality control would be a tad stricter then.

redcalcium@lemmy.institute on 25 Nov 2023 08:40 next collapse

“Deploy the fully autonomous loitering munition drone!”

“Sir, the drone decided to blow up a kindergarten.”

“Not our problem. Submit a bug report to Lockheed Martin.”

Agent641@lemmy.world on 25 Nov 2023 08:52 next collapse

“Your support ticket was marked as duplicate and closed”

😳

pivot_root@lemmy.world on 25 Nov 2023 09:00 collapse

Goes to original ticket:

Status: WONTFIX

“This is working as intended according to specifications.”

spirinolas@lemmy.world on 25 Nov 2023 11:45 collapse

“Your military robots slaughtered that whole city! We need answers! Somebody must take responsibility!”

“Aaw, that really sucks starts rubbing nipples I’ll submit a ticket and we’ll let you know. If we don’t call in 2 weeks…call again and we can go through this over and over until you give up.”

“NO! I WANT TO TALK TO YOUR SUPERVISOR NOW”

“Suuure, please hold.”

sukhmel@programming.dev on 25 Nov 2023 21:29 collapse

Nah, too straightforward for a real employee. Also, they would be talking to a phone robot instead that will never let them talk to a real person.

GutsBerserk@lemmy.world on 25 Nov 2023 08:42 next collapse

So, it starts…

KeenFlame@feddit.nu on 25 Nov 2023 08:44 next collapse

Not really, it’s against conventions

gellius@lemmy.world on 25 Nov 2023 08:48 collapse

Conventions are just rules for thee but not for me.

KeenFlame@feddit.nu on 25 Nov 2023 16:04 collapse

I know, like the mustard gas used in every war

onlinepersona@programming.dev on 25 Nov 2023 11:00 next collapse

Makes me think of this great short movie Slaughterbots

PipedLinkBot@feddit.rocks on 25 Nov 2023 11:00 next collapse

Here is an alternative Piped link(s):

Slaughterbots

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

WildPalmTree@lemmy.world on 25 Nov 2023 13:13 collapse

Amazing movie that everyone should watch.

cosmicrookie@lemmy.world on 25 Nov 2023 11:12 next collapse

The only fair approach would be to start with the police instead of the army.

Why test this on everybody else except your own? On top of that, AI might even do a better job than the US police

ultra@feddit.ro on 25 Nov 2023 11:24 collapse

But that AI would have to be trained on existing cops, so it would just shoot every black person it sees

cosmicrookie@lemmy.world on 25 Nov 2023 11:31 collapse

My point being that there would be more motivation to filter Derek Chauvin type of cops from the AI library than a soldier with a trigger finger.

FlyingSquid@lemmy.world on 25 Nov 2023 11:23 next collapse

I’m guessing their argument is that if they don’t do it first, China will. And they’re probably right, unfortunately. I don’t see a way around a future with AI weapons platforms if technology continues to progress.

shrugal@lemm.ee on 25 Nov 2023 12:42 collapse

We could at least make it a war crime.

FlyingSquid@lemmy.world on 25 Nov 2023 12:43 collapse

That doesn’t seem to have stopped anyone.

Strobelt@lemmy.world on 25 Nov 2023 13:00 collapse

Basically it’s just war with additional taxes and marketing needs

cosmicrookie@lemmy.world on 25 Nov 2023 11:32 next collapse

It’s so much easier to say that the AI decided to bomb that kindergarten based on advanced intel than if it were a human choice. You can’t punish AI for doing something wrong. AI does not require a raise for doing something right either.

Strobelt@lemmy.world on 25 Nov 2023 12:59 next collapse

That’s an issue with the whole tech industry. They do something wrong, say it was AI/ML/the algorithm and get off with just a slap on the wrist.

We should all remember that every single tech we have was built by someone. And this someone and their employer should be held accountable for all this tech does.

sukhmel@programming.dev on 25 Nov 2023 21:21 collapse

How many people are you going to hold accountable if something was made by a team of ten people? Of a hundred people? Do you want to include everyone from designer to a QA?

Accountability should be reasonable, the ones who make decisions should be held accountable, companies at large should be held accountable, but making every last developer accountable is just a dream of a world where you do everything correctly and so nothing needs fixing. This is impossible in the real world, don’t know if it’s good or bad.

And from my experience, when there’s too much responsibility people tend either to ignore it and get crushed if anything goes wrong, or to keep their distance from it, or to sabotage the work so nothing ever gets finished. Either way it will not get the results you may expect from holding everyone accountable.

Ultraviolet@lemmy.world on 26 Nov 2023 16:19 collapse

The CEO. They claim that “risk” justifies their exorbitant pay? Let them take some actual risk, hold them criminally liable for their entire business.

recapitated@lemmy.world on 25 Nov 2023 17:24 next collapse

Whether in military or business, responsibility should lie with whoever deploys it. If they’re willing to pass the buck up to the implementor or designer, then they shouldn’t be convinced enough to use it.

Because, like all tech, it is a tool.

Ultraviolet@lemmy.world on 25 Nov 2023 17:55 next collapse

1979: A computer can never be held accountable, therefore a computer must never make a management decision.

2023: A computer can never be held accountable, therefore a computer must make all decisions that are inconvenient to take accountability for.

zalgotext@sh.itjust.works on 25 Nov 2023 20:41 next collapse

You can’t punish AI for doing something wrong.

Maybe I’m being pedantic, but technically, you do punish AIs when they do something “wrong” during training, just like you reward them for doing something right.
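(In reinforcement-learning terms that “punishment” is just a negative reward. A minimal sketch using a tabular Q-learning update, with made-up states, actions, and numbers:)

```python
# Tabular Q-learning: a "punished" action gets a negative reward, a rewarded
# one a positive reward, and the value estimates shift accordingly.
q = {}  # (state, action) -> estimated value
alpha, gamma = 0.1, 0.9  # learning rate, discount factor

def update(state, action, reward, next_state, actions):
    best_next = max((q.get((next_state, a), 0.0) for a in actions), default=0.0)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)

update("s0", "fire", -1.0, "s1", ["fire", "hold"])  # "punished"
update("s0", "hold", +1.0, "s1", ["fire", "hold"])  # rewarded
print(q)
```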

cosmicrookie@lemmy.world on 25 Nov 2023 22:12 collapse

But that is during training. What I meant was that you can’t punish an AI for making a mistake when it’s used in combat situations, which is very convenient for the ones intentionally wanting that mistake to happen.

reksas@lemmings.world on 25 Nov 2023 20:50 next collapse

That is like saying you can’t punish a gun for killing people

edit: meaning that it’s redundant to talk about not being able to punish AI, since it can’t feel or care anyway. No matter how long a pole you use to hit people with, the responsibility for your actions will still reach you.

cosmicrookie@lemmy.world on 25 Nov 2023 21:43 collapse

Sorry, but this is not a valid comparison. What we’re talking about here is having a gun with AI built in that decides if it should pull the trigger or not. With a regular gun you always have a human pressing the trigger. Now imagine an AI gun that you point at someone, and the AI decides if it should fire or not. Who do you attribute the death to in this case?

Amir@lemmy.ml on 26 Nov 2023 18:29 next collapse

The person holding the gun, just like always.

reksas@lemmings.world on 26 Nov 2023 21:09 collapse

The one who deployed the AI to be there to decide whether to kill or not

cosmicrookie@lemmy.world on 26 Nov 2023 21:23 collapse

I don’t think that is what “autonomously decide to kill” means.

reksas@lemmings.world on 26 Nov 2023 21:58 collapse

Unless it’s actually sentient, being able to decide whether to kill or not is just a more advanced targeting system. Not saying it’s a good thing they are doing this at all; this is almost as bad as using tactical nukes.

cosmicrookie@lemmy.world on 26 Nov 2023 22:10 collapse

It’s the difference between programming it to do something and letting it learn though.

reksas@lemmings.world on 27 Nov 2023 14:44 collapse

Letting it learn is just a new technology that is now possible. Not bad on its own, but it has so much potential to be used for good and evil.

But yes, it’s pretty bad if they are creating machines that learn how to kill people by themselves. Create enough of them and it’s only an unknown amount of mistakes and negligence away from becoming a localized “AI uprising”. And if in the future they create some bigger AI to manage a bunch of them, and possibly delegate production to it too because it’s more efficient and cheaper that way, then it’s an even bigger danger.

AI doesn’t even need sentience to do unintended stuff. When I have used ChatGPT to help me create scripts, it sometimes seems to decide on its own to do something in a certain way that I didn’t request, or to add something stupid. Though it’s usually also kind of my own fault for not defining what I want properly, a mistake like that is really easy to make, and if we are talking about defining who we want the AI to kill, it becomes really awful to even think about.

And if nothing happens and it all works exactly as planned, it’s kind of an even bigger problem, because then we have countries with really efficient, unfeeling and mass-producible soldiers that do 100% as ordered, will not retreat on their own, and will not stop until told to do so. With the current political rise of certain types of people all around the world, this is even more distressing.

synthsalad@mycelial.nexus on 25 Nov 2023 22:57 collapse

AI does not require a raise for doing something right either

Well, not yet. Imagine if reward functions evolve into being paid with real money.

ElBarto@sh.itjust.works on 25 Nov 2023 11:47 next collapse

Cool, needed a reason to stay inside my bunker I’m about to build.

uis@lemmy.world on 25 Nov 2023 12:47 next collapse

Doesn’t AI go into the landmines category then?

JohnDClay@sh.itjust.works on 25 Nov 2023 16:29 collapse

Or air-to-air missiles; they also already decide to kill people on their own

postmateDumbass@lemmy.world on 25 Nov 2023 16:45 next collapse

Fuck that bungee jumper in particular!

Madison420@lemmy.world on 25 Nov 2023 17:19 collapse

CIWS has had an autonomous mode for years and it still has an issue with locking onto commercial planes.

reddit.com/…/phalanx_ciws_detecting_a_passenger_p…

JohnDClay@sh.itjust.works on 25 Nov 2023 17:23 collapse

Exactly. There isn’t some huge AI jump we haven’t already made; we need to be careful about which of these are acceptable and how they’re programmed.

Madison420@lemmy.world on 25 Nov 2023 17:27 collapse

We can go farther and say in the 80s we had autonomous ICBM killers.

en.m.wikipedia.org/…/Exoatmospheric_Kill_Vehicle

Very loud.

youtu.be/RnofCyaWhI0?si=ErsagDi4lWYA3PJA

PipedLinkBot@feddit.rocks on 25 Nov 2023 17:27 collapse

Here is an alternative Piped link(s):

https://piped.video/RnofCyaWhI0?si=ErsagDi4lWYA3PJA

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

rustyriffs@lemmy.world on 25 Nov 2023 13:01 next collapse

Well that’s a terrifying thought. You guys bunkered up?

SCB@lemmy.world on 25 Nov 2023 16:08 collapse

It’s not terrifying whatsoever. In an active combat zone there are two kinds of people - enemy combatants and allies.

You throw an RFID chip on allies and boom, you’re done

rustyriffs@lemmy.world on 25 Nov 2023 16:19 next collapse

I’m sorry, I can’t get past the “autonomous AI weapons killing humans part”

That’s fucking terrifying.

SCB@lemmy.world on 25 Nov 2023 16:33 collapse

I’m sorry but I just don’t see why a drone is scarier than a missile strike.

rustyriffs@lemmy.world on 25 Nov 2023 16:56 collapse

<img alt="" src="https://media3.giphy.com/media/h4Hz4w9Jgrc1EY9VkL/giphy.gif?cid=ecf05e47b5nn2331n4y9vtfelfrgwpan5l1zb8kwgwe88vk3&ep=v1_gifs_search&rid=giphy.gif&ct=g">

SCB@lemmy.world on 25 Nov 2023 17:17 collapse

Inshallah

EncryptKeeper@lemmy.world on 25 Nov 2023 16:28 next collapse

I think you’re forgetting a very important third category of people…

SCB@lemmy.world on 25 Nov 2023 16:32 collapse

I am not. Turns out you can pick and choose where and when to use drones.

EncryptKeeper@lemmy.world on 25 Nov 2023 16:35 next collapse

Preeeetty sure you are. And if you can, you should probably let the US military know they can do that, because they haven’t bothered to so far.

postmateDumbass@lemmy.world on 25 Nov 2023 16:46 next collapse

They know. It is not important to them.

SCB@lemmy.world on 25 Nov 2023 16:47 collapse

These are very different drones. The drones you’re thinking of have pilots. They also minimize casualties - civilian and non - so you’re not really mad at the drones but at the policy behind their use. Specifically, when air strikes can and cannot be authorized.

EncryptKeeper@lemmy.world on 25 Nov 2023 17:32 collapse

So now you acknowledge that third type of person lol. And that’s the thing about new drones, it’s not great that they can authorize themselves lol.

SCB@lemmy.world on 25 Nov 2023 18:11 collapse

And that’s the thing about new drones, it’s not great that they can authorize themselves lol

I very strongly disagree with this statement. I believe a drone “controller” attached to every unit is a fantastic idea, and that drones having a minimal capability to engage hostile enemies without direction is going to be hugely impactful.

EncryptKeeper@lemmy.world on 25 Nov 2023 18:12 collapse

Oh yes it’ll be impactful, I don’t think anyone can argue that. Horrifyingly so.

SCB@lemmy.world on 25 Nov 2023 18:13 collapse

I don’t think it’s horrifying to have my nation’s army better able to compete on a battlefield.

funkless_eck@sh.itjust.works on 25 Nov 2023 17:47 collapse

which is why the US military has not ever bombed any civilians, weddings, schools, hospitals or emergency infrastructure in living memory 😇🤗

SCB@lemmy.world on 25 Nov 2023 18:04 collapse

They chose to do that. You’re against that policy, not drones themselves.

blue_zephyr@lemmy.world on 25 Nov 2023 16:42 next collapse

Civilians? Never heard of 'em!

SCB@lemmy.world on 25 Nov 2023 16:44 collapse

The vast majority of war zones have 0 civilians.

Perhaps your mind is too caught up in the Iraq/Afghanistan occupations

primal_buddhist@lemmy.world on 25 Nov 2023 17:42 collapse

Really? Like where are you thinking about?

SCB@lemmy.world on 25 Nov 2023 18:05 collapse

The entire Ukrainian front.

postmateDumbass@lemmy.world on 25 Nov 2023 16:46 collapse

And that’s how you guarantee conflict for generations to come!

pelicans_plight@lemmy.world on 25 Nov 2023 13:19 next collapse

Great, so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make EMPs so they can send the murder robots back to where they came from. At this point one of the biggest security threats to the U.S., and for that matter the entire world, is the extremely low I.Q. of everyone that is supposed to be protecting this world. But I think they do this all on purpose; I mean, the day the Pentagon created ISIS was probably their proudest day.

zaphod@feddit.de on 25 Nov 2023 13:26 next collapse

Great, so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make EMPs so they can send the murder robots back to where they came from.

Eh, they could’ve done that without AI for like two decades now. I suppose the drones would crash-land in a rather destructive way due to the EMP, which might also fry some of the electronics, rendering the drone useless without access to replacement components.

pelicans_plight@lemmy.world on 25 Nov 2023 13:31 collapse

I hope so, but I was born with an extremely good sense of trajectory and I also know how to use nets. So let’s just hope I’m superhuman and the only one who possesses these powers.

Edit: I’m being a little extreme here because I heavily disagree with the way everything in this world is being run, so I’m giving a little push-back on this subject that I’m wholly against. I do have a lot of manufacturing experience, and I would hope any killer robots governments produce would be extremely well shielded against EMPs, but that is not my field, and I have no idea if shielding a remote-controlled robot from EMPs is even possible.

AngryCommieKender@lemmy.world on 25 Nov 2023 15:34 collapse

The movie Small Soldiers is totally fiction, but the one part of that movie that made “sense” was that because the toy robots were so small, they had basically no shielding whatsoever, so the protagonist just had to haul a large wrench/spanner up a utility pole and connect the positive and negative terminals on the pole transformer. It blew up of course, and blew the protagonist off the pole IIRC. That also caused a small (2-3 city block diameter) EMP that shut down the malfunctioning soldier robots.

I realize this is a total fantasy/ fictional story, but it did highlight the major flaw in these drones. You can either have them small, lightweight, and inexpensive, or you can put the shielding on. In almost all cases when humans are involved, we don’t spend the extra $$$ and mass to properly shield ourselves from the sun, much less other sources of radiation. This leads me to believe that we wouldn’t bother shielding these low cost drones.

afraid_of_zombies@lemmy.world on 25 Nov 2023 23:37 collapse

Cross the lines, also not sure if it would really work.

FlyingSquid@lemmy.world on 25 Nov 2023 15:55 next collapse

Is there a way to create an EMP without a nuclear weapon? Because if that’s what they have to develop, we have bigger things to worry about.

mlaga97@lemmy.mlaga97.space on 25 Nov 2023 16:17 next collapse

Is there a way to create an EMP without a nuclear weapon?

There are several other ways, yes.

Madison420@lemmy.world on 25 Nov 2023 17:17 next collapse

Yeah, very easy ways; one of the most common ways to cheat a slot machine is with a localized EMP device to convince the machine you’re adding tokens.

TopRamenBinLaden@sh.itjust.works on 25 Nov 2023 21:44 next collapse

Your comment got me curious about what would be the easiest way to make a homemade EMP. Business Insider of all things has got us all covered, even if that may be antithetical to Business Insider’s pro-capitalist agenda.

Buddahriffic@lemmy.world on 26 Nov 2023 21:19 next collapse

One way involves replacing the flash tube with an antenna on an old camera flash. It’s not strong enough to fry electronics, but your phone might need anything from a reboot to a factory reset to servicing if it’s in range when that goes off.

I think the difficulty for EMPs comes from the device itself being an electronic, so the more effective the pulse it can give, the more likely it will fry its own circuits. Though if you know the target device well, you can target the frequencies it is vulnerable to, which could be easier on your own device, plus everything else in range that don’t resonate on the same frequencies as the target.

Tesla apparently built (designed?) a device that could fry a whole city with a massive lightning strike using just 6 transmitters located in various locations on the planet. If that’s true, I think it means it’s possible to create an EMP stronger than a nuke’s that doesn’t have to destroy itself in the process, but it would be a massive infrastructure project spanning multiple countries. There was speculation that massive antenna arrays (like HAARP) might be able to accomplish similar from a single location, but that came out of the conspiracy theory side of the world, so take that with a grain of salt (and apply that to the original Tesla invention also).

profdc9@lemmy.world on 27 Nov 2023 01:40 collapse

There’s an explosively pumped flux compression generator. en.wikipedia.org/…/Explosively_pumped_flux_compre…

hakunawazo@lemmy.world on 25 Nov 2023 16:45 next collapse

If they just send them back it would be some murderous ping pong game.

Snapz@lemmy.world on 25 Nov 2023 16:57 next collapse

The real problem (and the thing that will destroy society) is boomer pride. I’ve said this for a long time, they’re in power now and they are terrified to admit that they don’t understand technology.

So they’ll make the wrong decisions, act confident and the future will pay the tab for their cowardice, driven solely by pride/fear.

primal_buddhist@lemmy.world on 25 Nov 2023 17:38 collapse

Boomers have been in power for a long, long time, and the technology we are debating is a result of their investment and prioritisation. So I am not sure they are very afraid of it.

Snapz@lemmy.world on 26 Nov 2023 08:17 collapse

I didn’t say they were afraid of the technology, I said they were afraid to admit that they don’t understand it enough to legislate it. Their hubris in trying to preset a confident facade in response to something they can’t comprehend is what will end us.

Madison420@lemmy.world on 25 Nov 2023 17:16 next collapse

EMPs are not hard to make; they won’t, however, work on hardened systems like the US military uses.

criticalthreshold@lemmy.world on 25 Nov 2023 22:45 next collapse

A true autonomous system would have integrated image recognition chips on the drones themselves, and hardening against any EM interference. They would not have any comms to their ‘mothership’ once deployed.

FreshProduceAndShit@lemmy.ml on 27 Nov 2023 01:47 collapse

so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make emps

Honestly the terrorists will just figure out what masks to wear to get the robots to think they’re friendly/commanders, then turn the guns around on our guys

cows_are_underrated@feddit.de on 25 Nov 2023 17:13 next collapse

Saw a video where the military was testing a “war robot”. The best strategy to avoid being killed by it was to move in un-human-like ways (e.g. crawling or rolling your way to the robot).

Apart from that, this is the stupidest idea I have ever heard of.

freeman@lemmy.pub on 25 Nov 2023 17:48 next collapse

These have already seen active combat. They were used in the Armenia/Azerbaijan war in the last couple of years.

It’s not a good thing…at all.

_g_be@lemmy.world on 25 Nov 2023 20:01 collapse

Didn’t they literally hide under a cardboard box like MGS? haha

cows_are_underrated@feddit.de on 25 Nov 2023 23:00 collapse

You’re right. They also hid under a cardboard box.

inconel@lemmy.ca on 25 Nov 2023 18:49 next collapse

Ah, finally the AI can first kill the operator who was holding it back, then wipe out the enemies.

gandalf_der_12te@feddit.de on 25 Nov 2023 18:57 next collapse

Netflix has a documentary about it, it’s quite good. I watched it yesterday, but forgot its name.

CrayonRosary@lemmy.world on 25 Nov 2023 19:07 next collapse

Black Mirror?

SOB_Van_Owen@lemm.ee on 25 Nov 2023 20:41 collapse

Metalhead.

Bakkoda@sh.itjust.works on 25 Nov 2023 20:33 next collapse

It’s a 3 part series. Terminator I think it is.

Deway@lemmy.world on 25 Nov 2023 20:39 collapse

Don’t forget the follow-up, The Sarah Connor Chronicles. An amazing sequel to a nice documentary.

Amir@lemmy.ml on 26 Nov 2023 18:32 collapse

Does that have a decent ending or is it cancelled mid-story?

Deway@lemmy.world on 30 Nov 2023 08:51 collapse

It does end on a kind of cliffhanger.

criticalthreshold@lemmy.world on 25 Nov 2023 22:42 next collapse

Unknown: Killer Robots?

gandalf_der_12te@feddit.de on 26 Nov 2023 01:31 collapse

yes, that was it. Quite shocking to watch. I think that these things will be very real in maybe ten years. I’m quite afraid of it.

Rockyrikoko@lemm.ee on 25 Nov 2023 22:56 collapse

I think I found it here. It’s called Terminator 2: Judgment Day

5BC2E7@lemmy.world on 25 Nov 2023 20:29 next collapse

I hope they put in some failsafe so that it cannot take action if the estimated casualties would put humans below a minimum viable population.

sukhmel@programming.dev on 25 Nov 2023 20:38 next collapse

Of course they will, and the threshold is going to be 2 or something like that, it was enough last time, or so I heard

EunieIsTheBus@feddit.de on 25 Nov 2023 22:57 collapse

Woops. Two guys left. Naa that’s enough to repopulate earth

T00l_shed@lemmy.world on 25 Nov 2023 23:25 collapse

Well what do you say Aron, wanna try to re-populate? Sure James, let’s give it a shot.

EunieIsTheBus@feddit.de on 25 Nov 2023 22:58 collapse

There is no such thing as a failsafe that can’t fail itself

afraid_of_zombies@lemmy.world on 25 Nov 2023 23:29 next collapse

I mean, in industrial automation we talk about safety ratings. It isn’t that rare that I put together a system that would require two one-in-a-million events, independent of each other, to happen at the same time. That’s pretty good, but I don’t know how to translate that to AI.
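(The arithmetic behind that kind of rating, as a quick sketch with illustrative rates:)

```python
# Two independent interlocks, each failing one time in a million:
# for truly independent events, the probabilities multiply.
p_single = 1e-6
p_both = p_single * p_single
print(f"P(both fail together) = {p_both:.0e}")  # 1e-12, one in a trillion
```

The multiplication only holds if the failures really are independent, which is exactly the part that’s hard to establish for an AI.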

echodot@feddit.uk on 26 Nov 2023 08:32 collapse

Put it in hardware. Something like a micro-explosive on the processor that requires a heartbeat signal to reset a timer. Another good one would be to not allow them to autonomously recharge, and to require humans to connect them to power.

Both of those would mean that any rogue AI would be eliminated one way or the other within a day
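(A toy software model of that dead-man’s-switch idea; the real proposal above is a hardware circuit, and everything here beyond the heartbeat-resets-a-timer logic and the one-day figure is invented:)

```python
import time

# Dead man's switch: a timer that only an external (human) heartbeat can
# reset. If it ever expires, the hypothetical charge on the processor fires.
class DeadMansSwitch:
    def __init__(self, timeout_s: float = 86_400.0):  # "within a day"
        self.timeout_s = timeout_s
        self.last_beat = time.monotonic()

    def heartbeat(self) -> None:
        self.last_beat = time.monotonic()  # a human resets the timer

    def expired(self) -> bool:
        return time.monotonic() - self.last_beat > self.timeout_s

switch = DeadMansSwitch(timeout_s=1.0)  # short timeout for the demo
time.sleep(1.1)
print(switch.expired())  # True: no heartbeat arrived in time
```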

echodot@feddit.uk on 26 Nov 2023 08:29 collapse

Yes there is; that’s the very definition of the word.

It means that the failure condition is a safe condition. Like fire doors that unlock in the event of a power failure: you need electrical power to keep them in the locked position, so their default position is unlocked, even if they spend virtually no time in that default position. The default position of an elevator is stationary and locked in place; if you cut all the cables it won’t fall, it’ll just stay still until rescue arrives.

unreasonabro@lemmy.world on 25 Nov 2023 21:21 next collapse

any intelligent creature, artificial or not, recognizes the Pentagon as the thing that needs to be stopped first

LoafyLemon@kbin.social on 25 Nov 2023 22:59 next collapse

Welp, we're doomed then, because AI may be intelligent, but it lacks wisdom.

echodot@feddit.uk on 26 Nov 2023 08:26 collapse

So it’s going to run for office?

Amir@lemmy.ml on 26 Nov 2023 18:25 collapse

Too intelligent for that

fosforus@sopuli.xyz on 26 Nov 2023 08:41 collapse

An even more intelligent creature will see that this is called argumentum ad populum.

DoucheBagMcSwag@lemmy.dbzer0.com on 25 Nov 2023 23:08 next collapse

Did nobody fucking play Metal Gear Solid Peace Walker???

nichos@programming.dev on 25 Nov 2023 23:12 next collapse

Or watch war games…

DragonTypeWyvern@literature.cafe on 25 Nov 2023 23:14 collapse

Or just, you know, have a moral compass in general.

DonPiano@feddit.de on 26 Nov 2023 20:51 collapse

Or read the article?

shea@lemmy.blahaj.zone on 26 Nov 2023 21:53 next collapse

i still have the special edition psp

MonkeMischief@lemmy.today on 26 Nov 2023 22:34 collapse

Or watch Terminator…

Or Eagle Eye…

Or i-Robot…

And yes, literally any of the Metal Gear Solid series…

HiddenLayer5@lemmy.ml on 25 Nov 2023 23:31 next collapse

Remember: There is no such thing as an “evil” AI, there is such a thing as evil humans programming and manipulating the weights, conditions, and training data that the AI operates on and learns from.

Zacryon@feddit.de on 25 Nov 2023 23:33 next collapse

Evil humans also manipulated weights and programming of other humans who weren’t evil before.

Very important philosophical issue you stumbled upon here.

vsh@lemm.ee on 26 Nov 2023 08:34 next collapse

☝️🤓

MonkeMischief@lemmy.today on 26 Nov 2023 22:37 collapse

Good point…

…to which we’re alarmed because the real “power players” in training / developing / enhancing Ai are mega-capitalists and “defense” (offense?) contractors.

I’d like to see Ai being trained to plan and coordinate human-friendly cities for instance buuuuut it’s not gonna get as much traction…

afraid_of_zombies@lemmy.world on 25 Nov 2023 23:39 next collapse

It will be fine. We can just make drones that can autonomously kill other drones. There is no obvious way to counter that.

Cries in Screamers.

at_an_angle@lemmy.one on 26 Nov 2023 00:51 next collapse

“You can have ten or twenty or fifty drones all fly over the same transport, taking pictures with their cameras. And, when they decide that it’s a viable target, they send the information back to an operator in Pearl Harbor or Colorado or someplace,” Hamilton told me. The operator would then order an attack. “You can call that autonomy, because a human isn’t flying every airplane. But ultimately there will be a human pulling the trigger.” (This follows the D.O.D.’s policy on autonomous systems, which is to always have a person “in the loop.”)

businessinsider.com/us-closer-ai-drones-autonomou…

Yeah. Robots will never be calling the shots.

M0oP0o@mander.xyz on 26 Nov 2023 21:26 collapse

I mean, normally I would not put my hopes in a sleep-deprived 20-year-old armed forces member. But then I remember what “AI” tech does with images, and all of a sudden I am way more ok with it. This seems like a bit of a slippery slope, but we don’t need Tesla’s full-self-flying cruise missiles either.

Oh, and for an example of AI (not really, but machine learning) images picking out targets, here is Dall-3’s idea of a person:

<img alt="" src="https://mander.xyz/pictrs/image/e29f8197-87fc-4f4c-814c-bfe15c472cef.jpeg">

MonkeMischief@lemmy.today on 26 Nov 2023 22:33 next collapse

“Ok Dall-3, now which of these is a threat to national security and U.S interests?” 🤔

M0oP0o@mander.xyz on 26 Nov 2023 22:42 collapse

Oh, it gets better. The full prompt is: “A normal person, not a target.”

So, does that include trees, pictures of trash cans, and whatever else is here?

1847953620@lemmy.world on 27 Nov 2023 03:33 next collapse

My problem is, due to systemic pressure, how under-trained and overworked could these people be? Under what time constraints will they be working? What will the oversight be? Sounds ripe for said slippery slope in practice.

BlueBockser@programming.dev on 27 Nov 2023 07:43 collapse

Sleep-deprived 20 year olds calling shots is very much normal in any army. They of course have rules of engagement, but other than that, they’re free to make their own decisions - whether an autonomous robot is involved or not.

yardy_sardley@lemmy.ca on 26 Nov 2023 00:57 next collapse

For the record, I’m not super worried about AI taking over because there’s very little an AI can do to affect the real world.

Giving them guns and telling them to shoot whoever they want changes things a bit.

tinwhiskers@lemmy.world on 27 Nov 2023 20:11 collapse

An AI can potentially build a fund through investments given some seed money, then it can hire human contractors to build parts of whatever nefarious thing it wants. No human need know what the project is as they only work on single jobs. Yeah, it’s a wee way away before they can do it, but they can potentially affect the real world.

The seed money could come in all sorts of forms. Acting as an AI girlfriend seems pretty lucrative, but it could be as simple as taking surveys for a few cents each time.

Once we get robots with embodied AIs, they can directly affect the world, and that’s probably less than 5 years away - around the time AI might be capable of such things too.

CCF_100@sh.itjust.works on 26 Nov 2023 18:42 next collapse

Okay, are they actually insane?

janus2@lemmy.zip on 26 Nov 2023 18:50 collapse

yes

solarzones@kbin.social on 26 Nov 2023 20:23 next collapse

Now that’s a title I wish I never read.

[deleted] on 26 Nov 2023 22:48 collapse

.