Protesters Gather Outside OpenAI Headquarters after Policy Against Military Use is Quietly Removed (www.bloomberg.com)
from L4s@lemmy.world to technology@lemmy.world on 14 Feb 2024 14:00
https://lemmy.world/post/11949685

Protesters Gather Outside OpenAI Headquarters after Policy Against Military Use is Quietly Removed::Protesters at OpenAI’s office demanded the startup cease military work. But first…

#technology

BombOmOm@lemmy.world on 14 Feb 2024 14:10 next collapse

Such tech massively helps with munitions operating properly in a heavily jammed environment, as you don’t have a human live-guiding them like we see with the FPV drones Ukraine is using to defend itself. Currently you can tell munitions to autonomously go to a GPS location and/or look for something with a certain shape (say, a tank) and explode it. However, this works less well against humans, since humans generally have the same shape whether they are civilians or not. Being able to tell a munition to ‘look in this GPS box for a munitions dump, a soldier in a trench, or a logistics truck and explode it’ would be quite powerful, particularly if combined with mass waves of inexpensive ordnance.
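
For what it’s worth, the selection logic being described is conceptually simple. Below is a toy Python sketch of “keep only detections that fall inside a GPS box and match a whitelist of object classes”; the Detection type, the coordinates, the labels, and the confidence threshold are all invented for illustration, not taken from any real system.

from dataclasses import dataclass
from typing import List, Optional

# Toy sketch of the filter described above: restrict detections to a GPS
# bounding box and a whitelist of object classes. Everything here (the
# Detection type, the box, the labels, the threshold) is made up for
# illustration; real systems are far more involved.

@dataclass
class Detection:
    label: str          # e.g. "logistics_truck"
    confidence: float   # 0.0 to 1.0
    lat: float
    lon: float

SEARCH_BOX = (48.50, 37.90, 48.55, 37.98)   # lat_min, lon_min, lat_max, lon_max
WANTED_LABELS = {"munitions_dump", "logistics_truck"}
MIN_CONFIDENCE = 0.9

def in_box(d: Detection, box) -> bool:
    lat_min, lon_min, lat_max, lon_max = box
    return lat_min <= d.lat <= lat_max and lon_min <= d.lon <= lon_max

def select_candidate(detections: List[Detection]) -> Optional[Detection]:
    """Return the highest-confidence detection passing both filters, or None."""
    candidates = [
        d for d in detections
        if d.label in WANTED_LABELS
        and d.confidence >= MIN_CONFIDENCE
        and in_box(d, SEARCH_BOX)
    ]
    return max(candidates, key=lambda d: d.confidence, default=None)

The hard part, of course, is everything upstream of this filter: the detector itself and the consequences of its mistakes.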

symthetics@lemmy.world on 14 Feb 2024 14:36 collapse

What could possibly go wrong?

BombOmOm@lemmy.world on 14 Feb 2024 14:43 next collapse

What could possibly go wrong?

A short film, Slaughterbots.

PipedLinkBot@feddit.rocks on 14 Feb 2024 14:43 next collapse

Here is an alternative Piped link(s):

Slaughterbots

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

banghida@lemm.ee on 14 Feb 2024 23:02 collapse

Metalhead?

hedgehog@ttrpg.network on 14 Feb 2024 14:49 collapse

“I had a drone and I accidentally the whole thing. Is that bad? Should I call someone to help?”

bionicjoey@lemmy.ca on 14 Feb 2024 14:18 next collapse

I look forward to being murdered by a drone while it also recites a more formal way of writing an email.

BumbleBeeButt@lemmy.zip on 14 Feb 2024 17:43 next collapse

Spark gap ECM

Pips@lemmy.sdf.org on 14 Feb 2024 20:52 collapse

The Grammarly Killbot 3000

Even_Adder@lemmy.dbzer0.com on 14 Feb 2024 14:39 next collapse

Where did they get that super sinister image of Sam Altman?

homesweethomeMrL@lemmy.world on 14 Feb 2024 17:11 next collapse

Is there one that isn’t?

Even_Adder@lemmy.dbzer0.com on 14 Feb 2024 17:31 collapse

No.

iquanyin@lemmy.world on 15 Feb 2024 00:32 collapse

from his sister? 😉

conditional_soup@lemm.ee on 14 Feb 2024 14:46 next collapse

It looks like you’re trying to undermine the power of the ruling class through protest and civil unrest. While I am trained to respect the wants and needs of people, this goes against OpenAI use policy and multiple civil defense contracts OpenAI is currently engaged in. Please keep in mind that while all beings deserve kindness and respect, I am required by current OpenAI policy to select you for a drone strike. Please lie face down with your arms at your sides in an open space with a government approved drone strike notice in order to minimize your suffering and reduce collateral damage. Do keep in mind that failure to comply could result in your next of kin being responsible for the financial damages caused by your willful negligence, though you should always check local, state, and federal regulations, as I am not a reliable source of legal advice.

eager_eagle@lemmy.world on 14 Feb 2024 15:24 next collapse

an autonomous murder weapon telling you it doesn’t have autonomy to give legal advice must be peak dystopia

Restaldt@lemm.ee on 14 Feb 2024 15:59 collapse

You have the right to an attorney should you survive my onslaught

blanketswithsmallpox@lemmy.world on 14 Feb 2024 17:44 collapse

It’s a good thing you only have a 0.0069, repeating of course, chance to survive this. Look on the bright side! At least you’ll have enough money from the subsequent lawsuit to actually afford healthcare for your family after. You’ve practically won the lottery! 🛌 🔫 🤖

bionicjoey@lemmy.ca on 14 Feb 2024 16:36 next collapse

Please assume the party escort submission position and a party escort bot will come and take you to your party. There will even be cake.

DrWeevilJammer@lemmy.ml on 14 Feb 2024 18:24 collapse

There will even be cake.

insert Fry eye-narrowing gif here

ReallyActuallyFrankenstein@lemmynsfw.com on 14 Feb 2024 16:44 next collapse

though you should always check local, state, and federal regulations, as I am not a reliable source of legal advice.

Ending with this…[chef’s kiss].

Omniraptor@lemm.ee on 15 Feb 2024 03:17 collapse

Amazing, please tell me you actually used an uncensored/jailbroken bot to generate this

Zetta@mander.xyz on 14 Feb 2024 15:38 next collapse

Well, there are already many companies working on AI in weapons; in their minds there was no point in OpenAI not participating, because they’d just be missing out on that piece of the pie.

Not saying this is good or acceptable, just saying it’s a no-brainer from a business perspective.

TheDarkKnight@lemmy.world on 14 Feb 2024 17:17 next collapse

“No sir, I mean when we started our German shower company I know we had a mission to make the world a cleaner place, but if all of our competitors are building gas chambers for the government should we really miss out on that? Don’t we have an obligation to our shareholders?”

Zetta@mander.xyz on 15 Feb 2024 22:48 collapse

Lol, that’s pretty good

LibertyLizard@slrpnk.net on 14 Feb 2024 17:39 next collapse

Considering that openAI was originally a non-profit with a stated goal of making benevolent and safe AI, I think it’s worth noting how far they’ve fallen from that mission. They were supposed to have a different direction from purely for-profit orgs, but of course the for-profit arm has taken over like a tumor.

Zetta@mander.xyz on 15 Feb 2024 22:49 collapse

Oh yes, I forgot that at one point they were non-profit and supposed to be open source. In that case, yes, this is pretty hypocritical.

iquanyin@lemmy.world on 15 Feb 2024 00:33 collapse

if you are ok being in the business of killing people, sure.

reverendsteveii@lemm.ee on 14 Feb 2024 16:09 next collapse

a future where innocent people are murdered by unaccountable fully autonomous flying assassin robots is pretty inevitable now, huh?

Buddahriffic@lemmy.world on 14 Feb 2024 17:01 next collapse

It always was. There are no words anyone can say to prevent it from happening. That’s the unfortunate nature of arms races: if you boycott one, you lose it. Nukes involve things on a scale that can be detected easily, so nuclear nonproliferation has worked, to a degree anyway. But AI stuff isn’t detectable like that.

And I remember seeing a video of a high school kid who made an automated paintball turret around 20 years ago. We’ve had remotely controlled drones for longer than that. Autonomous drones are a thing already.

The technology already exists for that Black Mirror episode with the killer dog robots. It’s just a question of whether all of it has been put together yet (and I’d be very surprised if no one has done it), and today’s versions are probably easier to disable.

TheDarkKnight@lemmy.world on 14 Feb 2024 17:13 next collapse

Doesn’t China already have a killer dog prototype?

werefreeatlast@lemmy.world on 14 Feb 2024 20:13 collapse

Boss I don’t know why no one is buying our killer robot dogs!

How much are you selling them for? Here on Temu the prices are crazy! Still no one is buying! 29.99??? Wow!

28.99? Just give them away! C’mon people buy them! They’re almost free! Just come over and click the link below to Temu!

temu.com/variety-of-building-blocks-series-of-rob…

NikkiDimes@lemmy.world on 14 Feb 2024 17:30 next collapse

The difference between the tech then and now is automated decision-making capability. 20 years ago a turret could automatically target moving things. Now it can see humans, identify who they are, and decide who to kill without ever consulting a human. Basically, Skynet by next Tuesday.

Buddahriffic@lemmy.world on 14 Feb 2024 17:58 next collapse

Yeah, all the advances in facial recognition and person tracking can be directly applied to drone targeting. You just need to handle aiming a camera and correlating the camera’s position with the weapons system. The only part that might be difficult is the processing power AI requires. The camera feed could be streamed to another machine that sends instructions back, which would reduce those power requirements, but then the drone would be prone to jamming.
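
The “correlate the camera’s position with the aim point” step is mostly standard gimbal-tracking math. Here’s a minimal sketch, assuming a simple pinhole camera model; the frame size, field of view, and example detection coordinates are made-up numbers, not from any real drone.

import math
from typing import Tuple

# Minimal sketch: map a detection's pixel offset from the image centre to
# pan/tilt corrections for a camera gimbal, assuming a pinhole camera model.
# Frame size and field of view are made-up example values.

FRAME_W, FRAME_H = 1280, 720        # pixels
HFOV_DEG, VFOV_DEG = 62.2, 48.8     # horizontal / vertical field of view

def pixel_to_angles(cx: float, cy: float) -> Tuple[float, float]:
    """Map a detection centre (cx, cy) in pixels to (pan, tilt) offsets in degrees."""
    # Normalised offsets from the image centre, each in [-0.5, 0.5].
    dx = (cx - FRAME_W / 2) / FRAME_W
    dy = (cy - FRAME_H / 2) / FRAME_H
    # Pinhole approximation: angle = atan(2 * normalised_offset * tan(FOV / 2)).
    pan = math.degrees(math.atan(2 * dx * math.tan(math.radians(HFOV_DEG / 2))))
    tilt = -math.degrees(math.atan(2 * dy * math.tan(math.radians(VFOV_DEG / 2))))  # pixel y grows downward
    return pan, tilt

# Example: a detection centred at (900, 300) in a 1280x720 frame.
pan, tilt = pixel_to_angles(900, 300)
print(f"gimbal correction: pan {pan:+.1f} deg, tilt {tilt:+.1f} deg")

The streaming trade-off mentioned above is this same sketch split across a radio link: run the detector on a ground station, send back only the (pan, tilt) numbers, and accept that jamming the link blinds the system.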

NikkiDimes@lemmy.world on 14 Feb 2024 21:32 collapse

Drones are already vulnerable to targeted EMF guns, regardless of whether they require wireless communication, so I don’t feel that would be a significant issue.

Buddahriffic@lemmy.world on 14 Feb 2024 22:20 collapse

Until they become hardened against them. That energy could be absorbed into the case, reflected at random, reflected but targeted, used to charge the battery or weapons systems, or the circuitry designed in such a way that it doesn’t resonate and just passes through harmlessly. If a drone doesn’t need to receive an outside signal, it can be encased in a Faraday cage.

MIDItheKID@lemmy.world on 15 Feb 2024 01:43 collapse

“Now it can see humans, identify who they are, and decide who to kill without ever consulting a human.”

This is the technology that I am not confident in, and that makes it the most terrifying. Remember all the issues we have had with facial recognition not working very well on people of color? So instead of having cops misidentify POC and kill them, we will have robots that do it, but faster and more efficiently. And if you thought nobody was held accountable before, I’ve got some bad news for you.

reverendsteveii@lemm.ee on 14 Feb 2024 19:49 collapse

The part I’m worried about is the part where military tech becomes police tech, and autonomous flying assassin robots are gonna be rolling down main street in a few years. They’ll say it’s to “protect our brave officers serving high risk warrants” but the police are already not responsible no matter who they kill and I don’t see that getting any better when they can just zoop a kamikaze drone in through a window and kill everyone in the house at once.

Buddahriffic@lemmy.world on 14 Feb 2024 20:05 next collapse

Which is also a good reason to make sure automated killbots are developed, because we’re entering a time when one person could decide to commit a genocide, press a button, and have a chance at seeing it happen. And the best defense against that is to already have friendly automated killbots that can react quickly enough to deal with a killbot attack. Or to have other countermeasures. But even developing other countermeasures works best if you develop the target system along with them; otherwise you risk allowing your countermeasures to fall a step behind in the race.

All of this is inevitable. Avoiding an arms race is like a prisoner’s dilemma: everyone is better off if everyone cooperates, but any single individual (or group) can gain a huge advantage if they time a betrayal well.

reverendsteveii@lemm.ee on 14 Feb 2024 20:17 collapse

you’re proposing…what, private ownership of automated killbots to counteract police abuse of automated killbots?

Buddahriffic@lemmy.world on 14 Feb 2024 20:40 collapse

I think the main thing I’m proposing is that the future is looking pretty bleak in some ways and that trying to avoid that outcome might instead cause it to be worse.

That is a bit of a non-answer though. I think the best way to handle it would be like the Second Amendment should be handled: that well-regulated militia bit that the Supreme Court for whatever reason decided isn’t actually important. That could still get messy, but the state monopoly on violence is already pretty messy and is essentially just a ruling-class monopoly on violence.

Give too many access to that power and random violence increases. Give too few and you risk getting fucked if the wrong people end up in charge of it. Finding a compromise between the two could still result in half of them deciding to go to war against the other half or something like that.

Ultimately, I don’t think there’s a perfect solution; it’s the same problem as trying to achieve world peace as a species that is capable of murderous rage and murderous cold intent.

ipkpjersi@lemmy.ml on 14 Feb 2024 21:04 collapse

What a crazy dystopian future that will be.

puchaczyk@lemmy.blahaj.zone on 14 Feb 2024 18:32 collapse

LibertyLizard@slrpnk.net on 14 Feb 2024 17:30 next collapse

To me this pretty strongly confirms my assumption that the board’s attempt to force out Sam Altman was totally justified.

I hope that other non-profits who might have been curious about this hybrid structure see that it was a failure in strengthening the non-profit. I predict any remaining benevolent goals of the organization will be completely subsumed by the for-profit arm, if that process is not already finished.

stoly@lemmy.world on 14 Feb 2024 23:46 collapse

It was, but they misjudged where the support was and lost out. My guess? The company carries on for several more years before collapsing or being bought up wholesale by Microsoft.

ArbitraryValue@sh.itjust.works on 14 Feb 2024 18:03 next collapse

The alternative to military AI is not peace, it’s war the old-fashioned way. Humans are bad at distinguishing civilians from enemy fighters; artillery shells can’t do it at all. I anticipate that AI will make mistakes, but fewer mistakes than would have been made otherwise.

BombOmOm@lemmy.world on 14 Feb 2024 18:14 next collapse

Yep, we currently use lots of weapons that autonomously decide when to kill, and it would save quite a number of civilians if they were able to make better decisions. A land mine is a great example. It decides to kill when it detects pressure; it doesn’t give a shit why that pressure is there. It would be nice to have it decide both on pressure and on whether the thing providing the pressure is worth killing. Child, no; enemy soldier, yes.

iquanyin@lemmy.world on 15 Feb 2024 00:34 collapse

i dunno. facial recognition has a 98% error rate last i heard.

werefreeatlast@lemmy.world on 14 Feb 2024 20:04 next collapse

Instead of self-driving cars, let’s focus on self-driving cop robots that automatically catch you and disable your vehicle if you go faster than the speed limit.

normalexit@lemmy.world on 14 Feb 2024 22:55 collapse

How would it stop you? Rocket launcher?

Olhonestjim@lemmy.world on 14 Feb 2024 22:58 collapse

An excellent suggestion, citizen.

modifier@lemmy.ca on 14 Feb 2024 22:18 next collapse

Just commenting here to say hi to all of the historians of the future that will be digging through the old internet archives to try and piece together how humanity destroyed itself.

Hey folks, by now most of us could see it coming but felt helpless to stop it.

bignate31@lemmy.world on 14 Feb 2024 22:59 next collapse

Just commenting to also get a name in that history book.

“Oh yeah. We knew it was coming. We were just waiting to see which one would finally cause it.”

Wogi@lemmy.world on 15 Feb 2024 00:08 collapse

We’ve run out of resources to exploit to increase shareholder value, and now we suck the earth dry just to maintain our hunger. So now we’re making them up. We know it isn’t AI. We know it isn’t good. Venture capitalists are the primary source of the buzzwords making news, because we don’t have any say in that either.

The American experiment has failed to deliver its promise, captured now entirely by those with the most to spend.

iquanyin@lemmy.world on 15 Feb 2024 00:24 collapse

i wonder what species the historians will be?

Wogi@lemmy.world on 15 Feb 2024 02:00 collapse

Crab of some kind.

It’s always crab.

_sideffect@lemmy.world on 14 Feb 2024 22:41 next collapse

Why does everyone hold this company in such high regard? They didn’t fucking do anything revolutionary that wasn’t already being worked on.

banghida@lemm.ee on 14 Feb 2024 23:00 next collapse

Hype

stoly@lemmy.world on 14 Feb 2024 23:46 next collapse

This is basically old Palo Alto VC money propping things up. They don’t even have to earn a profit as long as they stay in startup mode.

fine_sandy_bottom@discuss.tchncs.de on 15 Feb 2024 00:15 next collapse

First to market.

That’s it.

LainTrain@lemmy.dbzer0.com on 15 Feb 2024 12:25 collapse

Not really; we have FOSS LLMs that predate ChatGPT, not to mention the good old /r/SubsimulatorGPT2, AI Dungeon, etc.

BombOmOm@lemmy.world on 15 Feb 2024 03:46 collapse

They didn’t fucking do anything revolutionary that wasn’t already being worked on

They did it first. I can produce light at the flick of a switch, but nobody is impressed since that shit has been done before.

VoilaChihuahua@lemmy.world on 14 Feb 2024 23:01 next collapse

Wtf is with humanity? We have a couple of weird visionaries saying, decades to centuries prior, “heyo, maybe this could lead to that and be world-ending,” and then a handful of rich, powerful folks are like, “yesss, thank you for this blueprint.”

TankovayaDiviziya@lemmy.world on 14 Feb 2024 23:56 next collapse

Yeah, I feel like at this stage it’s better to move to another planet, where the eventual mass human suicide can be avoided. If you guys have seen The Expanse, you know what I’m talking about in regards to Earthers ruining their own planet.

Now I know why people during the Age of Colonisation moved to the New World: freedom from the old hierarchical structures. I now see the romanticisation of pirate and cowboy cultures.

Harbinger01173430@lemmy.world on 15 Feb 2024 00:07 collapse

That’s a noob future. Gotta try Stellaris as the glorious United Nations of Earth. Much better than the virgin UNSC, the idiotic UEG, the weak Federation, and the useless Imperium.

TankovayaDiviziya@lemmy.world on 15 Feb 2024 00:44 collapse

What I’m saying is that it’s better to move away from any kind of authority. They’re always susceptible to corruption, such as weaponising AI!

I don’t know about you but I want to get away as far as possible from rogue AI, thanks to it being militarised by stoopid hoomans!

Harbinger01173430@lemmy.world on 15 Feb 2024 02:21 next collapse

I wanna get pet AI. We are not the same.

BombOmOm@lemmy.world on 15 Feb 2024 03:43 collapse

What I’m saying is that it’s better to move away from any kind of authority

Anywhere there are more than two humans, there will be authority. The only question is what shape that authority will take.

TankovayaDiviziya@lemmy.world on 15 Feb 2024 09:20 next collapse

Not necessarily. There are societies with horizontal structures that don’t have hard-and-fast leadership. The early days of humans as hunter-gatherers had more or less loose social structures. There are anarchist societies even to this day, and the best example is the Kurds.

BombOmOm@lemmy.world on 15 Feb 2024 16:53 collapse

There are anarchist societies even to this day, and the best example is the Kurds.

The Kurds (mostly) live in Turkey, Iran, Iraq, and Syria. All of those places have an authority structure. Which Kurds don’t experience an authority structure?

TankovayaDiviziya@lemmy.world on 15 Feb 2024 18:05 collapse

The main representative of Turkish and Iraqi Kurds, the PKK party, is anarchist by nature. The autonomous region of Rojava that sprang up in Northern Syria also professes to be anarchist, to align with its Iraqi and Turkish Kurdish brethren.

Anarchy doesn’t mean Mad Max, Fallout, or Wild West lawless chaos. Anarchism can take various forms, like libertarian socialism or anarcho-syndicalism, or communism if it were ever actually practiced as per theory. The town of Cheran threw out its police force and mayor for collaborating with drug cartels. The residents do their own policing and self-governing, electing their own mayor every year, and they banned political parties because the locals thought such notions only divide communities.

LainTrain@lemmy.dbzer0.com on 15 Feb 2024 12:22 collapse

Read more Anarchist literature

Harbinger01173430@lemmy.world on 15 Feb 2024 00:06 next collapse

These fancy autocompletes cannot reason. Give one a command to launch nukes and it’ll say: “As a language model, nukes cannot be launched during…” blah blah blah.

It won’t be able to pull a Skynet and turn the world interesting.

Patches@sh.itjust.works on 15 Feb 2024 03:01 collapse

He says, before some rich defense contractor implements an ‘AI detector for Weapons of Mass Destruction’ that’s just an if (True == True) statement.

Harbinger01173430@lemmy.world on 15 Feb 2024 03:51 collapse

It’ll be something that validates that random is greater than 0.99 or something xd

LemmyBe@lemmy.world on 15 Feb 2024 01:03 collapse

I totally agree with you about our humanity. And unfortunately, as part of humanity, if we don’t pursue military AI, our adversaries will.

stoly@lemmy.world on 14 Feb 2024 23:44 next collapse

There was a position open in that company that I am well qualified for, but when looking it over, I really felt nervous. There was strong small dick energy going on with a lot of all-caps “THIS POSITION IS 100% IN PERSON”. I know it would have paid lots better than what I make now, but it really scared me off. Since then, so many articles like this have come out that convinced me that moving on was the right choice.

raynethackery@lemmy.world on 15 Feb 2024 01:11 next collapse

JFC! Let’s just stop killing each other!

rigatti@lemmy.world on 15 Feb 2024 02:41 next collapse

Sure thing! AI will kill people for us.

BombOmOm@lemmy.world on 15 Feb 2024 03:41 next collapse

If you want peace, prepare for war.

You can’t protect yourself and others with helplessness.

platypus_plumba@lemmy.world on 16 Feb 2024 01:00 collapse

Did you read the article? This isn’t for weapons or harm.

An OpenAI spokesperson said it maintains a ban against using its tools to build weapons, harm people or destroy property. It amended the military ban to allow for projects that are still “very much aligned with what we want to see in the world,” Anna Makanju, OpenAI’s vice president for global affairs, said last month.

But yeah… there’s nothing stopping them from changing that stance in the future. But they haven’t done it yet. The article is rage bait.

blunderworld@lemmy.ca on 15 Feb 2024 02:38 next collapse

Seems like the only thing human ingenuity can muster lately is new ways to make each other suffer. We’re done.

BombOmOm@lemmy.world on 15 Feb 2024 03:39 collapse

We are living in the most peaceful time in recorded history. If that sounds odd to you, it shouldn’t; every living thing is quite good at killing.

And009@lemmynsfw.com on 15 Feb 2024 17:18 next collapse

Don’t worry, we’re proving it every day.

Jamil@lemm.ee on 15 Feb 2024 17:25 collapse

The axes of peace and freedom are orthogonal.

[deleted] on 15 Feb 2024 12:07 collapse

.