webghost0101@sopuli.xyz
on 27 Jul 2023 14:45
All well and good till someone takes a screenshot.
lemann@lemmy.one
on 27 Jul 2023 15:10
It might be resistant to screenshots - unless I missed it, the article didn’t clarify whether the obfuscation process is applied to the image on a per-pixel basis, or within the file format itself…
If it were that easy to bypass it would be a pretty futile mechanism IMO - one would just need to convert the image to strip out the obfuscation 🫠 or take a screenshot, as you said
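For illustration, a minimal Python sketch of the "just convert it" idea (assuming Pillow; the file names are made up) - rebuilding the image from raw pixel data drops any metadata or format-level markers, leaving only whatever protection lives in the pixels themselves:

from PIL import Image

src = Image.open("protected.png").convert("RGB")   # hypothetical protected image
clean = Image.new("RGB", src.size)                 # fresh image, no metadata carried over
clean.putdata(list(src.getdata()))                 # copy pixel values only
clean.save("converted.png")                        # re-encoded without the original file's extras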
SheeEttin@lemmy.world
on 27 Jul 2023 15:24
Sounds like it makes tiny changes to the image data to trick the model. But it also sounds dependent on each algorithm, so while you might trick Stable Diffusion, another model like Midjourney would be unaffected.
And either way, I’d bet mere JPEG compression would be enough to destroy those tiny changes.
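As a rough sketch of that point (assuming Pillow and NumPy; the file names and quality setting are arbitrary), re-saving the image as a lossy JPEG quantizes the pixel data and can wash out small per-pixel perturbations:

from PIL import Image
import numpy as np

im = Image.open("perturbed.png").convert("RGB")          # hypothetical perturbed image
im.save("recompressed.jpg", format="JPEG", quality=85)   # lossy re-encode

before = np.asarray(im, dtype=np.int16)
after = np.asarray(Image.open("recompressed.jpg").convert("RGB"), dtype=np.int16)
print("mean absolute pixel change:", np.abs(before - after).mean())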
esadatari@lemmy.world
on 28 Jul 2023 02:22
A couple of minutes in Photoshop with a smudge or burn tool would also negate all the effects
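A rough code equivalent of that manual edit (a sketch assuming Pillow; the region coordinates are invented) - blur a local patch and paste it back, overwriting any carefully crafted perturbation in that area:

from PIL import Image, ImageFilter

im = Image.open("perturbed.png").convert("RGB")                   # hypothetical perturbed image
box = (100, 100, 300, 300)                                        # arbitrary region to "smudge"
patch = im.crop(box).filter(ImageFilter.GaussianBlur(radius=4))   # soften the patch
im.paste(patch, box)                                              # paste it back over the original
im.save("smudged.png")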
diffuselight@lemmy.world
on 28 Jul 2023 09:08
These things never work in the real world - we’ve seen this over and over. It’s snake oil. Latent space mappings may survive compression, but they don’t work across encoders.
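To make the cross-encoder point concrete, a sketch (assuming torch, diffusers and these two public VAE checkpoints; not taken from the paper) showing that different encoders map the same pixels to quite different latents, so a perturbation tuned against one encoder need not transfer to another:

import numpy as np
import torch
from PIL import Image
from diffusers import AutoencoderKL

im = Image.open("perturbed.png").convert("RGB").resize((512, 512))                  # hypothetical image
x = torch.from_numpy(np.asarray(im)).float().permute(2, 0, 1)[None] / 127.5 - 1.0   # scale to [-1, 1]

vae_a = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse")   # SD 1.x-style VAE
vae_b = AutoencoderKL.from_pretrained("stabilityai/sdxl-vae")        # SDXL VAE

with torch.no_grad():
    lat_a = vae_a.encode(x).latent_dist.mean
    lat_b = vae_b.encode(x).latent_dist.mean

print((lat_a - lat_b).abs().mean())   # same input, noticeably different latent representations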
The linked white paper’s title is very pragmatic-sounding: “Raising the Cost of Malicious AI-Powered Image Editing”. I’d like to read it more deeply later to see what the actual mechanisms deployed are. I’ve considered some form of attestation embedded both in the data and in the format, tied to a cryptographic signature - you know, for important things like politics, diplomacy and celebrity endorsements. /s
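Sarcasm aside, the signature half of that idea is straightforward - a sketch (assuming the cryptography package; keys and file names are illustrative, and this is not the paper’s mechanism) of signing the image bytes so later modification is detectable:

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

image_bytes = open("original.png", "rb").read()   # hypothetical image to attest

private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(image_bytes)         # distribute alongside the image
public_key = private_key.public_key()

try:
    public_key.verify(signature, image_bytes)     # raises if the bytes were altered
    print("signature valid")
except InvalidSignature:
    print("image was modified after signing")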
It’s about as good as scanning a random marking on a human bone that somehow installs a virus on your PC.
Yeah, it might work in the original format under some conditions, but it won’t survive a screenshot or saving to another format.
Once again the time has come to manually ’shop yourself shaking hands with celebrities.