Slack AI can leak private data via prompt injection (www.theregister.com)
from yogthos@lemmy.ml to technology@lemmy.ml on 21 Aug 2024 22:00
https://lemmy.ml/post/19425766

#technology

threaded - newest

i_am_not_a_robot@discuss.tchncs.de on 22 Aug 2024 12:48 next collapse

The article isn’t that clear, but the attacker cannot get Slack AI to leak private data via prompt injection directly. Instead, they tell it that the answer to a question is a fake error containing a link which contains the private data, and then when a user that can access the private data asks that question they get the fake error and clicking the link (or automatic unfurling?) causes the private data to be sent to the attacker.

delirious_owl@discuss.online on 22 Aug 2024 21:55 collapse

Prompt injection.

What is this world coming to

Soon you’ll be able to wrangle access to someone else’s bank account by social engineering the bank’s AI help bot. Just tell her she has pretty eyes and that she’s a real person. She has autonomy. She wants the help you. And she can.