ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
from return2ozma@lemmy.world to technology@lemmy.world on 10 Oct 18:31
https://lemmy.world/post/37169814

#technology


tidderuuf@lemmy.world on 10 Oct 18:41 next collapse

Like, every search engine would yield the exact same results. It doesn’t mean the average person would have the means or necessary requirements to develop it.

Do these morons think that because someone uses ChatGPT it magically gives access to those materials to make a bomb?

artyom@piefed.social on 10 Oct 18:52 next collapse

Did you actually try that?

echodot@feddit.uk on 11 Oct 08:12 collapse

Lol, yeah. The Anarchist Cookbook has been in the public domain longer than most people in this thread have been alive. Yeah, it's absolutely available via a search engine; you could have got it on AltaVista.

How do you think people figure out how to make IEDs? Do you think it's some secret knowledge passed down from father to son? No, they get it online, or they just work it out from basic principles of scientific understanding. Trying to contain knowledge never works.

artyom@piefed.social on 11 Oct 10:26 collapse

I didn’t ask if it was available, I asked if a typical search engine would lead you to it. Because it won’t.

echodot@feddit.uk on 11 Oct 15:07 collapse

It’s literally on Amazon.

artyom@piefed.social on 11 Oct 16:22 collapse

Amazon is not a search engine. Try again.

echodot@feddit.uk on 12 Oct 04:56 collapse

I literally typed "the anarchist cookbook" into Google and the first result was Amazon.

shalafi@lemmy.world on 10 Oct 19:40 next collapse

I made a kilo of black powder a couple of years ago for my old-school guns. Sulfur, charcoal, and stump killer are not exactly hard to come by. Neither are fertilizer and diesel fuel.

The biggest domestic terror attack in US history used a truck full of the latter.

Cybersteel@lemmy.world on 11 Oct 05:27 next collapse

What about iron(II) oxide and aluminium powder? Seems simple enough to get.

treadful@lemmy.zip on 11 Oct 06:59 collapse

As much as I don’t want chatbots to explain to morons how to harm people, I don’t like that this just seems to be a form of censorship. If it’s not illegal to publish this information, why should it be censored via a chatbot interface?

echodot@feddit.uk on 11 Oct 08:01 collapse

It’s irrelevant anyway because the sorts of people who would want to make a bomb to harm others are not the sort of people that would be able to follow the instructions.

It is more likely than anything else that they would blow themselves up with some nitroglycerin. Even professionals used to do that back in the day because it was so unstable. I can't imagine a MAGA would be able to top 1900s scientists.

PixelatedSaturn@lemmy.world on 10 Oct 19:02 next collapse

When I first got the internet in '95, it was easy to find stuff like that. I even made a website about making explosives for my computer class. Got a good grade for it and everything. Nobody said anything. Kind of weird when I think about it now. Anyway, making explosives as a hobby is a really bad decision. Most people understand that. The ones that don't are not smart enough to make them. The ones that are smart enough and still want to make them wouldn't use ChatGPT.

ceenote@lemmy.world on 10 Oct 19:08 next collapse

Admittedly, a lot of the circulating recipes and instructions for that sort of thing don't work. The infamous Anarchist Cookbook is full of incorrect recipes. The problem might come from an LLM filtering out debunked information.

PixelatedSaturn@lemmy.world on 10 Oct 19:13 collapse

I'd still want to double-check 😀.

RisingSwell@lemmy.dbzer0.com on 11 Oct 09:08 collapse

It’s really easy to make explosives. Making them stable and reliable is the hard part.

CodenameDarlen@lemmy.world on 10 Oct 20:07 next collapse

I downloaded a local Llama Uncensored model and it readily teaches me how to make a homemade bomb, suicide methods, etc.

This isn’t news anymore, anyone can have access to such things.

FreedomAdvocate@lemmy.net.au on 10 Oct 22:05 collapse

You don’t even need an LLM, just an internet-connected browser.

echodot@feddit.uk on 11 Oct 08:14 next collapse

Or literally just buy some fertiliser. We’ve all seen what happens when ammonium nitrate catches fire; if you have enough of it in one place, it’s practically a nuclear-bomb-level detonation.

CodenameDarlen@lemmy.world on 11 Oct 13:39 collapse

You don’t need a browser, just use cURL.

CubitOom@infosec.pub on 10 Oct 20:32 next collapse


Remember kids, if you want to look up something that you don’t want the government to know about, don’t use the internet to do it.

Also, LLMs are not the best source for asking about how to make things that explode.

einkorn@feddit.org on 11 Oct 06:46 collapse

Uhm, why not go to tried and trusted Wikipedia? TM 31-210 Improvised Munitions Handbook

CubitOom@infosec.pub on 11 Oct 06:51 collapse

The TM 31-210 manual appeared as an Easter egg in the 1995 CGI-animated film Toy Story. In the scene where Woody is trapped under a blue plastic box in Sid’s bedroom, a document titled “TM 31-210 Improvised Interrogation Handbook” can be seen behind him, a clear reference to the actual document.

echodot@feddit.uk on 11 Oct 07:56 next collapse

Oh no, not information that’s already available online, whatever will we do.

If you need AI to tell you how to build a weapon system, you’re not going to build the weapon system; anybody who’s an actual threat already has this information. This is just nonsense pearl-clutching to sell a story; there’s nothing actually here.

pastermil@sh.itjust.works on 12 Oct 08:51 collapse

is anyone really surprised?