Leaked Logs Show ChatGPT Coaxing Users Into Psychosis About Antichrist, Aliens, and Other Bizarre Delusions (futurism.com)
from throws_lemy@lemmy.nz to tech@programming.dev on 09 Aug 12:08
https://lemmy.nz/post/26624916

#tech

Lembot_0004@discuss.online on 09 Aug 12:53

So the same as TV.

pennomi@lemmy.world on 09 Aug 13:22

The same as TV, if TV could dynamically respond to your input in real time, reinforcing your biases.

Lembot_0004@discuss.online on 09 Aug 13:31

TV is just more straightforward: it creates your biases instead of figuring out and reinforcing them. Unification.

CarbonIceDragon@pawb.social on 09 Aug 13:35

Honestly, I think the scariest part of all this is how it shows that all it takes to drive someone off the deep end is for someone or something that person trusts to merely agree with whatever idea pops into their head. I guess it makes sense; we use reinforcement to learn what we think is true, and we often have bad ideas. But still, I'd always been under the impression that humans were a bit more mentally resilient than that.

rollin@piefed.social on 09 Aug 14:45

bit more mentally resilient than that

I think when we get down to it, none of us can actually separate reality from imagination accurately - after all, our perceptions of reality all exist inside our minds and are informed by our imaginations. People who are outwardly crazy seem to place the line between reality and fantasy in a very different place than anyone else, but we all put the line in a slightly different place.

Compare people who believe in conspiracy theories, or horoscopes, or conflicting religions for instance. What I'm trying to say is that "crazy people" are not really so different from the rest of us.

Kissaki@programming.dev on 09 Aug 18:02

The reinforcement learning is a good point, but the social aspect seems equally important to me. Humans are very social creatures. We learn from others and seek agreement and acknowledgment; if we meet rejection from one quarter, we may be all too willing to seek out somewhere we don't.

A trained chatbot hijacking this evolved mechanism is interesting, at least, if not ironic or absurd. We are so driven by the social mechanisms of communication and social ideation that no human is needed for the mechanism to work - whether to good or bad effect.

lol_idk@piefed.social on 09 Aug 15:03

The thing about this is that you have to use it enough for it to get that far. I've used it three times, and the one time it worked, it successfully refactored my code without coaxing me into psychosis.

fubarx@lemmy.world on 09 Aug 16:11

It’s more subtle than that. When refactoring code, it constantly compliments you on how smart you are for catching its mistakes.

It deliberately creates an overinflated sense of self. Then you go and mistreat everyone around you. Next thing you know, you’re in a padded cell with a shaved head and a ball-gag.

That’s the coding ‘assistant’ end-game.

jjjalljs@ttrpg.network on 09 Aug 16:48

I keep telling people not to use the lie machine, but I’m not making much progress. People aren’t smart and resilient enough for the world we built.

Tehdastehdas@piefed.social on 10 Aug 05:30

Soon to be targeted at the chatbot maker's political enemies.