The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for AI deepfakes (www.businessinsider.com)
from L4s@lemmy.world to technology@lemmy.world on 11 Feb 2024 06:00
https://lemmy.world/post/11816097

The White House wants to ‘cryptographically verify’ videos of Joe Biden so viewers don’t mistake them for AI deepfakes::Biden’s AI advisor Ben Buchanan said a method of clearly verifying White House releases is “in the works.”

#technology


autotldr@lemmings.world on 11 Feb 2024 06:00 next collapse

This is the best summary I could come up with:


The White House is increasingly aware that the American public needs a way to tell that statements from President Joe Biden and related information are real in the new age of easy-to-use generative AI.

Big Tech players such as Meta, Google, Microsoft, and a range of startups have raced to release consumer-friendly AI tools, leading to a new wave of deepfakes — last month, an AI-generated robocall attempted to undermine voting efforts related to the 2024 presidential election using Biden’s voice.

Yet, there is no end in sight for more sophisticated new generative-AI tools that make it easy for people with little to no technical know-how to create fake images, videos, and calls that seem authentic.

Ben Buchanan, Biden’s Special Advisor for Artificial Intelligence, told Business Insider that the White House is working on a way to verify all of its official communications due to the rise in fake generative-AI content.

While last year’s executive order on AI created an AI Safety Institute at the Department of Commerce tasked with creating standards for watermarking content to show provenance, the effort to verify White House communications is separate.

Ultimately, the goal is to ensure that anyone who sees a video of Biden released by the White House can immediately tell it is authentic and unaltered by a third party.


The original article contains 367 words, the summary contains 218 words. Saved 41%. I’m a bot and I’m open source!

CyberSeeker@discuss.tchncs.de on 11 Feb 2024 06:06 next collapse

Digital signature as a means of non-repudiation is exactly the way this should be done. Any official docs or releases should be signed and easily verifiable by any public official.

otter@lemmy.ca on 11 Feb 2024 06:18 next collapse

Would someone have a high-level overview or ELI5 of what this would look like, especially for the average user? Would we need special apps to verify it? How would it work for stuff posted to social media?

linking an article is also ok :)

pupbiru@aussie.zone on 11 Feb 2024 06:41 next collapse

it would potentially be associated with a law that states that you must not misrepresent a “verified” UI element like a check mark etc, and whilst they could technically add a verified mark wherever they like, the law would prevent that - at least for US companies

it may work in the same way as hardware certifications - i believe that HDMI has a certification standard that cables and devices must be manufactured to certain specifications to bear the HDMI logo, and the HDMI logo is trademarked so using it without permission is illegal… it doesn’t stop cheap knock offs, but it means if you buy things in stores in most US-aligned countries that bear the HDMI mark, they’re going to work

LodeMike@lemmy.today on 11 Feb 2024 07:23 collapse

There’s already some kind of legal structure for what you’re talking about: trademark. It’s called “I’m Joe Biden and I approve this message.”

If you’re talking about HDCP you can break that with an HDMI splitter so IDK.

captain_aggravated@sh.itjust.works on 11 Feb 2024 09:21 next collapse

Relying on trademark law to combat deepfake disinformation campaigns has the same energy as “Murder is already illegal, we don’t need gun control.”

LodeMike@lemmy.today on 11 Feb 2024 09:24 next collapse

Agreed

pupbiru@aussie.zone on 11 Feb 2024 10:47 collapse

kinda… trademark law and copyright is pretty tightly controlled on the big social media platforms, and really that’s the target here

pupbiru@aussie.zone on 11 Feb 2024 10:45 collapse

TLDR: trademark law yes, combined with a cryptographic signature in the video metadata… if a platform sees and verifies the signature, they are required to put the verified logo prominently around the video

i’m not talking about HDCP no. i’m talking about the certification process for HDMI, USB, etc

(random site that i know nothing about): pacroban.com/…/hdmi-certifications-what-they-mean…

you’re right; that’s trademark law. basically you’re only allowed to put the HDMI logo on products that are certified as HDMI compatible, which has specifications on the manufacturing quality of cables etc

in this case, you’d only be able to put the verified logo next to videos that are cryptographically signed in the metadata as originating from the whitehouse (or probably better, some federal election authority who signs any campaign videos as certified/legitimate: in australia we have the AEC - australian electoral commission - a federal body that runs our federal elections and investigates election issues, etc)

now this of course wouldn’t work for sites outside of US control, but it would at least slow the flow of deepfakes on facebook, instagram, tiktok, the platform formerly known as twitter… assuming they implemented it, and assuming the govt enforced it

brbposting@sh.itjust.works on 11 Feb 2024 16:42 next collapse

Once an original video is cryptographically signed, could future uploads be automatically verified based on pixels plus audio? Could allow for commentary to clip the original.

Might need some kind of minimum length restriction to prevent deceptive editing which simply (but carefully) scrambles original footage.

pupbiru@aussie.zone on 11 Feb 2024 17:59 collapse

not really… signing is only possible on exact copies (like byte exact; not even “the same image” but the same image, formatted the same, without being resized, etc)… there are things called perceptual hashes, and ways of checking if images are similar, but cryptography wouldn’t really help there
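The “byte exact” point is easy to demonstrate with a couple of lines of stdlib Python; here zlib compression stands in for whatever transformation a platform applies (re-encode, resize, etc.):

```python
import hashlib
import zlib

data = b"frames of the original video"
reencoded = zlib.compress(data)  # stand-in for a platform re-encoding the file

# The content survives the round trip...
assert zlib.decompress(reencoded) == data
# ...but the digest (and any signature over it) no longer matches.
assert hashlib.sha256(data).hexdigest() != hashlib.sha256(reencoded).hexdigest()
```

Any signature bound to the original bytes breaks the moment the bytes change, even when the content a viewer sees is identical.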

Natanael@slrpnk.net on 12 Feb 2024 22:58 collapse
AbouBenAdhem@lemmy.world on 11 Feb 2024 07:06 next collapse

Depending on the implementation, there are two cryptographic functions that might be used (perhaps in conjunction):

  • Cryptographic hash: An arbitrary amount of data (like a video file) is used to create a “hash”—a shorter, (effectively) unique text string. Anyone can run the file through the same function to see if it produces the same hash; if even a single bit of the file is changed, the hash will be completely different and you’ll know the data was altered.

  • Public key cryptography: A pair of keys are created, one of which can only encrypt data (but can’t decrypt its own output), and the other, “public” key can only decrypt data that was encrypted by the first key. Users (like the White House) can post their public key on their website; then if a subsequent message purporting to come from that user can be decrypted using their public key, it proves it came from them.
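A minimal Python sketch of both ideas. The RSA numbers below are textbook toy values, and real systems sign the hash with proper padding schemes, so treat this purely as an illustration:

```python
import hashlib

video = b"official white house video bytes"

# Cryptographic hash: flipping even one bit changes the digest completely.
assert hashlib.sha256(video).hexdigest() != hashlib.sha256(video + b"\x00").hexdigest()

# Textbook-RSA signing with toy primes (NOT secure; real keys are 2048+ bits
# or elliptic-curve based, with proper padding).
p, q = 61, 53
n = p * q        # public modulus (3233)
e = 17           # public exponent
d = 2753         # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

m = int.from_bytes(hashlib.sha256(video).digest(), "big") % n
signature = pow(m, d, n)          # signer uses the private key
assert pow(signature, e, n) == m  # anyone can verify with the public (n, e)
```

In practice the two are combined: you hash the (large) video once, then sign the (small) hash.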

Serinus@lemmy.world on 11 Feb 2024 08:13 next collapse

a shorter, (effectively) unique text string

A note on this. There are other videos that will hash to the same value as a legitimate video. Finding one that is coherent is extraordinarily difficult. Maybe a state actor could do it?

But for practical purposes, it’ll do the job. Hell, if a doctored video with the same hash comes out, the White House could just say “no, we published this one,” and that alone would be remarkable.

CyberSeeker@discuss.tchncs.de on 11 Feb 2024 13:18 next collapse

There are other videos that will hash to the same value

This concept is known as ‘collision’ in cryptography. While technically true for weaker key sizes, there are entire fields of mathematics dedicated to provably ensuring collisions are cosmically unlikely. MD5 and SHA-1 have a small enough key space for collisions to be intentionally generated in a reasonable timeframe, which is why they have been deprecated for several years.

To my knowledge, SHA-2 with sufficiently large key size (2048) is still okay within the scope of modern computing, but beyond that, you’ll want to use Dilithium or Kyber CRYSTALS for quantum resistance.

Natanael@slrpnk.net on 12 Feb 2024 14:05 collapse

SHA family and MD5 do not have keys. SHA1 and MD5 are insecure due to structural weaknesses in the algorithm.

Also, 2048 bits apply to RSA asymmetric keypairs, but SHA1 is 160 bits with similarly sized internal state and SHA256 is as the name says 256 bits.

ECC is a public key algorithm which can have 256 bit keys.

Dilithium is indeed a post quantum digital signature algorithm, which would replace ECC and RSA. But you’d use it WITH a SHA256 hash (or SHA3).

CyberSeeker@discuss.tchncs.de on 12 Feb 2024 18:54 collapse

Good catch, and appreciate the additional info!

AbouBenAdhem@lemmy.world on 11 Feb 2024 15:51 collapse

Finding one that is coherent is extraordinarily difficult.

You’d need to find one that was not just coherent, but that looked convincing and differed in a way that was useful to you—and that likely wouldn’t be guaranteed, even theoretically.

Natanael@slrpnk.net on 12 Feb 2024 14:01 next collapse

The pigeonhole principle says one is guaranteed for any file substantially longer than the hash value length, but it’s going to be hard to find

ReveredOxygen@sh.itjust.works on 12 Feb 2024 14:48 collapse

Even for a 4096-bit hash (which isn’t used afaik; usually at most 512-bit is used, but this could be outdated), you only need to change 4096 bits on average. For a still 1080p image, that’s 1920x1080 pixels; if you change the least significant bit of each color channel, you get 6,220,800 bits you can change without anyone noticing. By that rough count there are on average about 1,518 identical-looking variations of any image with a given 4096-bit hash. This goes down a lot when you factor in compression: those least significant bits aren’t going to stay the same. But using a video brings it up by orders of magnitude: rather than one image, you can tweak colors in every frame.

The difficulty doesn’t come from existence; it comes because you need to check on the order of 2⁵¹² ≈ 10¹⁵⁴ different images to guarantee you’ll find a match, so you’d have to run a supercomputer for an extremely long time to brute-force a hash collision.

Natanael@slrpnk.net on 12 Feb 2024 22:53 collapse

Most hash functions are 256 bit (they’re symmetric functions, they don’t need more in most cases).

There are arbitrary-length functions (called XOFs instead of hashes) which are built similarly (used when you need to generate longer random-looking outputs).

Other than that, yeah, math shows you don’t need to change more data in the file than the length of the hash function internal state or output length (whichever is less) to create a collision. The reason they’re still secure is because it’s still extremely difficult to reverse the function or bruteforce 2^256 possible inputs.

ReveredOxygen@sh.itjust.works on 13 Feb 2024 01:06 collapse

Yeah I was using a high length at first because even if you overestimate, that’s still a lot. I did 512 for the second because I don’t know a ton about cryptography but that’s the largest SHA output

Natanael@slrpnk.net on 12 Feb 2024 14:01 collapse

Public key cryptography would involve signatures, not encryption, here.

AtHeartEngineer@lemmy.world on 11 Feb 2024 07:59 next collapse

The best way this could be handled is a green check mark near the video: you could click on it and it would give you all the metadata of the video (location, time, source, etc.) with a digital signature (what would look like a random string of text), which you could click on to have your browser show you the chain of trust: where the signature came from, that it’s valid, probably the manufacturer of the equipment it was recorded on, etc.

ulterno@lemmy.kde.social on 11 Feb 2024 08:42 next collapse

Just make sure the check mark is outside the video.

Natanael@slrpnk.net on 12 Feb 2024 14:08 collapse

Browser controlled modal.

wizardbeard@lemmy.dbzer0.com on 11 Feb 2024 15:41 next collapse

The issue is making that green check mark hard to fake for bad actors. Https works because it is verified by the browser itself, outside the display area of the page. Unless all sites begin relying on a media player packed into the browser itself, if the verification even appears to be part of the webpage, it could be faked.

brbposting@sh.itjust.works on 11 Feb 2024 16:35 next collapse

Hope verification gets built in to operating systems as compromised applications present a risk too.

But I’m sure a crook would build a MAGA Verifier since you can’t trust liberal Apple/Microsoft technology.

dejected_warp_core@lemmy.world on 12 Feb 2024 20:06 collapse

The only thing that comes to mind is something that forces interactivity outside the browser display area; out of the reach of Javascript and CSS. Something that would work for both mobile and desktop would be a toolbar icon that is a target for drag-and-drop. Drag the movie or image to the “verify this” target, and you get a dialogue or notification outside the display area. As a bonus, it can double for verifying TLS on hyperlinks while we’re at it.

Edit: a toolbar icon that’s draggable to the image/movie/link should also work the same. Probably easier for mobile users too.

Natanael@slrpnk.net on 12 Feb 2024 22:56 collapse

If you set the download manager icon in the browser as permanently visible, then dragging it there could trigger the verification to also run if the metadata is detected, and to then also show whichever metadata it could verify.

dejected_warp_core@lemmy.world on 13 Feb 2024 00:11 collapse

That’s a tad obscure, but makes it much easier to code up a prototype. I like it.

Natanael@slrpnk.net on 12 Feb 2024 14:07 collapse

Do not show a checkmark by default! This is why cryptographers kept telling browsers to de-emphasize the lock icon on TLS (HTTPS) websites. You want to display the claimed author and if you’re able to verify keypair authenticity too or not.

AtHeartEngineer@lemmy.world on 12 Feb 2024 15:33 collapse

Fair point, I agree with this. There should probably be another icon in the browser that shows if all, some, or none of the media on a page has signatures that can be validated. Though that gets messy as well, because what is “media”? Things can be displayed in a web canvas or SVG that appears to be a regular image, when in reality it’s rendered on the fly.

Security and cryptography UX is hard. Good point, thanks for bringing that up! Btw, this is kind of my field.

Natanael@slrpnk.net on 12 Feb 2024 16:14 collapse

I run /r/crypto at reddit (not so active these days due to needing to keep it locked because of spam bots, but it’s not dead yet), usability issues like this are way too common

AtHeartEngineer@lemmy.world on 13 Feb 2024 12:22 collapse

I ran /r/cryptotechnology for years, and am good friends with the /r/cc mods. Reddit is a mess though, especially in the crypto areas.

PhlubbaDubba@lemm.ee on 11 Feb 2024 09:09 next collapse

Probably you’d notice a bit of extra time posting for the signature to be added, but that’s about it; the responsibility for verifying the signature would fall to the owners of the social media site. In the circumstances where someone asks for a verification, basically imagine it as a libel case on fast forward: you file a claim saying “I never said that”, they check signatures, they shrug and press the delete button, and they erase the post, the crossposts, and (if it’s really good) the screencap posts and those crossposts of the thing you did not say but that is still being attributed falsely to your account or person.

It basically gives absolute control of a person’s own image and voice to themself, unless a piece of media is provable to have been made with that person’s consent, or by that person themself, it can be wiped from the internet no trouble.

Where it comes to second party posters, news agencies and such, it’d be more complicated but more or less the same, with the added step that a news agency may be required to provide some supporting evidence that what they said is not some kind of misrepresentation or such as the offended party filing the takedown might be trying to insist for the sake of their public image.

Of course there could still be a YouTube “Stats for Nerds”-esque addin to the options tab on a given post that allows you to sign-check it against the account it’s attributing something to, and a verified account system could be developed that adds a layer of signing that specifically identifies a published account, like say for prominent news reporters/politicians/cultural leaders/celebrities, that get into their own feed so you can look at them or not depending on how ya be feelin’ that particular scroll session.

General_Effort@lemmy.world on 11 Feb 2024 13:07 next collapse

For the average end-user, it would look like “https”. You would not have to know anything about the technical background. Your browser or other media player would display a little icon showing that the media is verified by some trusted institution and you could learn more with a click.

In practice, I see some challenges. You could already go to the source via https, EG whitehouse.gov, and verify it that way. An additional benefit exists only if you can verify media that have been re-uploaded elsewhere. Now the user needs to check that the media was not just signed by someone (EG whitehouse.gov.ru), but that it was really signed by the right institution.

TheKingBee@lemmy.world on 11 Feb 2024 16:20 collapse

As someone points out above, this just gives them the power to not authenticate real videos that make them look bad…

General_Effort@lemmy.world on 11 Feb 2024 20:13 next collapse

Videos by third parties, like Trump’s pussy grabber clip, would obviously have to be signed by them. After having thought about it, I believe this is a non-starter.

It just won’t be as good as https. Such a signing scheme only makes sense if the media is shared away from the original website. That means you can’t just take a quick look at the address bar to make sure you are not getting phished. That doesn’t work if it could be any news agency. You have to make sure that the signer is really a trusted agency and not some scammy lookalike. That takes too much care for casual use, which defeats the purpose.

Also, news agencies don’t have much of an incentive to allow sharing their media. Any cryptographic signature would only make sense for them if it directs users to their site, where they can make money. Maybe the potential for more clicks - basically a kind of clickable watermark on media - could make this take off.

dejected_warp_core@lemmy.world on 12 Feb 2024 20:14 collapse

I honestly feel strategies like this should be mitigated by technically savvy journalism, or even citizen journalism. 3rd parties can sign and redistribute media in the public domain, vouching for their origin. While that doesn’t cover all the unsigned copies in existence, it provides a foothold for more sophisticated verification mechanisms like a “tineye” style search for media origin.

Starbuck@lemmy.world on 11 Feb 2024 17:08 next collapse

Adobe is actually one of the leading actors in this field, take a look at the Content Authenticity Initiative (contentauthenticity.org)

Like the other person said, it’s based on cryptographic hashing and signing. Basically the standard would embed metadata into the image.

Natanael@slrpnk.net on 12 Feb 2024 14:05 collapse

Not very well apparently

lemmy.blahaj.zone/comment/6377576

dejected_warp_core@lemmy.world on 12 Feb 2024 19:55 next collapse

TL;DR: one day the user will see an overlay or notification that shows an image/movie is verified as from a known source. No extra software required.

Honestly, I can see this working great in future web browsers. Much like the padlock in the URL bar, we could see something on images that are verified. The image could display a padlock in the lower-left corner or something, along with the name of the source, demonstrating that it’s a securely verified asset. “Normal” images would be unaffected. The big problem is how to put something on the page that cannot be faked by other means.

It’s a little more complicated for software like phone apps for X or Facebook, but doable. The problem is that those products must choose to add this feature. Hopefully, losing reputation to being swamped with unverifiable media will be motivation enough to do so.

The underlying verification process is complex, but should be similar to existing technology (e.g. GPG). The key is that images and movies typically contain a “scratch pad” area in the file for miscellaneous stuff (metadata). This is where the image’s author can add a cryptographic signature for the file itself. The user would never even know it’s there.
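A sketch of that metadata idea, using a made-up two-part container and an HMAC as a stdlib-only stand-in for a real public-key signature: the signature covers the payload but not the metadata area, so storing the signature there doesn’t invalidate it.

```python
import hashlib
import hmac

KEY = b"demo-signing-key"  # stand-in for a real private key

def sign(payload: bytes) -> bytes:
    # The signature covers only the payload, never the metadata block,
    # so writing it into the metadata can't invalidate it.
    return hmac.new(KEY, payload, hashlib.sha256).digest()

payload = b"encoded video frames"
container = {"payload": payload, "metadata": {"sig": sign(payload)}}

# A verifier recomputes over the payload alone.
assert container["metadata"]["sig"] == sign(container["payload"])
```

Real formats (and real signing) are more involved, but the split between signed payload and unsigned scratch-pad metadata is the core trick.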

Cocodapuf@lemmy.world on 13 Feb 2024 12:50 collapse

It needs some kind of handler, but we mostly have those in place. A web browser could be the handler for instance. A web browser has the green dot on the upper left, telling you a page is secure, that https is on and valid. This could work like that, the browser can verify the video and display a green or red dot in the corner, the user could just mouse over it/tap on it to see who it’s verified to be from. But it’s up to the user to mouse over it and check if it says whitehouse.gov or dr-evil-mwahahaha.biz

pupbiru@aussie.zone on 11 Feb 2024 06:39 next collapse

i wouldn’t say signature exactly, because that ensures that a video hasn’t been altered in any way: not re-encoded, resized, cropped, trimmed, etc… platforms almost always do some of these things to videos, even if it’s not noticeable to the end-user

there are perceptual hashes, but i’m not sure if they work in a way that covers all those things or if they’re secure hashes. i would assume not

perhaps platforms would read the metadata in a video for a signature and have to serve the video entirely unaltered if it’s there?

AbouBenAdhem@lemmy.world on 11 Feb 2024 06:47 next collapse

Rather than using a hash of the video data, you could just include within the video the timestamp of when it was originally posted, encrypted with the White House’s private key.

Natanael@slrpnk.net on 16 Feb 2024 13:57 collapse

That doesn’t prove that the data outside the timestamp is unmodified

AbouBenAdhem@lemmy.world on 16 Feb 2024 14:08 collapse

It does if you can also verify the date of the file, because the modified file will be newer than the timestamp. An immutable record of when the file was first posted (on, say, YouTube) lets you verify which version is the source.

Natanael@slrpnk.net on 17 Feb 2024 00:47 collapse

No it does not because you can cut out the timestamp and put it into anything if the timestamp doesn’t encode anything about the frame contents.

It is always possible to backdate file edits.

Sure, public digital timestamping services exists, but most people will not check. Also once again, an older timestamp can simply be cut out of one file and posted into another file.

You absolutely must embed something which identifies what the media file is, which can be used to verify ALL of the contents with cryptographic signatures. This may additionally refer to a verifiable timestamp at some timestamping service.

thantik@lemmy.world on 11 Feb 2024 07:01 next collapse

You don’t need to bother with cryptographically verifying downstream videos, only the source video needs to be able to be cryptographically verified. That way you have an unedited, untampered cut that can be verified to be factually accurate to the broadcast.

The White House could serve the video themselves if they so wanted to. Just use something similar to PGP for signature validation and voila. Studios can still do all the editing, cutting, etc - it shouldn’t be up to the end user to do the footwork on this, just for the studios to provide a kind of ‘chain of custody’ - they can point to the original verification video for anyone to compare to; in order to make sure alterations are things such as simple cuts, and not anything more than that.

pupbiru@aussie.zone on 11 Feb 2024 10:53 collapse

you don’t even need to cryptographically verify in that case because you already have a trusted authority: the whitehouse… if the video is on the whitehouse website, it’s trusted with no cryptography needed

the technical solutions only come into play when you’re trying to modify the video and still accurately show that it’s sourced from something verifiable

heck you could even have a standard where if a video adds a signature to itself, editing software will add the signature of the original, a canonical immutable link to the file, and timestamps for any cuts to the video… that way you (and by you i mean anyone; likely hidden from the user) can load up a video and be able to link to the canonical version to verify

in this case, verification using ML would actually be much easier because you (servers) just download the canonical video, cut it as per the metadata, and compare what’s there to what’s in the current video
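For plain cuts, the rebuild-and-compare check doesn’t even need ML. A toy sketch, where byte strings stand in for decoded frames and the cut list comes from the hypothetical metadata described above:

```python
import hashlib

canonical = [f"frame-{i}".encode() for i in range(100)]  # trusted original video
cuts = [(10, 20), (40, 45)]                              # cut list from the clip's metadata

clip = canonical[10:20] + canonical[40:45]               # the re-posted edit

# Rebuild the edit from the canonical source and compare frame digests.
rebuilt = [frame for start, end in cuts for frame in canonical[start:end]]
assert [hashlib.sha256(f).digest() for f in clip] == \
       [hashlib.sha256(f).digest() for f in rebuilt]
```

The hard part in reality is that platforms also re-encode frames, which is where perceptual comparison would have to come back in.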

Natanael@slrpnk.net on 12 Feb 2024 14:16 collapse

Apple’s scrapped on-device CSAM scanning was based on perceptual hashes.

The first collision demo breaking them showed up in hours with images that looked glitched. After just a week the newest demos produced flawless images with collisions against known perceptual hash values.

In theory you could create some ML-ish compact learning algorithm and use the compressed model as a perceptual hash, but I’m not convinced this can be secure enough unless it’s allowed to be large enough, as in some % of the original’s file size.

pupbiru@aussie.zone on 12 Feb 2024 23:10 collapse

you can definitely produce perceptual hashes that collide, but really you’re not just talking about a collision, you’re talking about a collision that’s also useful in subverting an election, AND that’s been generated using ML, which is something that’s still kinda shaky to start with

Natanael@slrpnk.net on 12 Feb 2024 23:17 collapse

Perceptual hash collision generators can take arbitrary images and tweak them in invisible ways to make them collide with whichever hash value you want.

pupbiru@aussie.zone on 15 Feb 2024 08:03 collapse

from the comment above, it seems like it took a week for a single image/frame though… it’s possible sure but so is a collision in a regular hash function… at some point it just becomes too expensive to be worth it, AND the phash here isn’t being used as security because the security is that the original was posted on some source of truth site (eg the whitehouse)

Natanael@slrpnk.net on 15 Feb 2024 20:41 collapse

No, it took a week to refine the attack algorithm, the collision generation itself is fast

The point of perceptual hashes is to let you check if two things are similar enough after transformations like scaling and reencoding, so you can’t rely on that here

pupbiru@aussie.zone on 16 Feb 2024 09:27 collapse

oh yup that’s a very fair point then! you certainly wouldn’t use it for security in that case, however there are a lot of ways to implement this that don’t rely on the security of the hash function, but just uses it (for example) to point to somewhere in a trusted source to manually validate that they’re the same

we already have the trust frameworks; that’s unnecessary… we just need to automatically validate (or at least provide automatic verifyability) that a video posted on some 3rd party - probably friendly or at least cooperative - platform represents reality

Natanael@slrpnk.net on 16 Feb 2024 13:57 collapse

I think the best bet is really video formats with multiple embedded streams carrying complementary frame data (already exists) so you decide video quality based on how many streams you want to merge in playback.

If you then hashed the streams independently and signed the list of hashes, then you have a video file which can be “compressed” without breaking the signature by stripping out some streams.
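A toy version of that scheme, with byte strings standing in for streams and an HMAC standing in for a real public-key signature: the signature covers the list of per-stream hashes, so stripping streams leaves the rest verifiable as long as the full manifest ships with the file.

```python
import hashlib
import hmac

# Complementary streams: a base layer plus enhancement layers.
streams = [b"base-layer", b"enhancement-1", b"enhancement-2"]
stream_hashes = [hashlib.sha256(s).digest() for s in streams]

# Sign the list of per-stream hashes rather than the container bytes.
# (HMAC stands in for a real public-key signature to keep this stdlib-only.)
key = b"demo-signing-key"
manifest = b"\n".join(stream_hashes)
signature = hmac.new(key, manifest, hashlib.sha256).hexdigest()

# A platform strips the top enhancement layer to save bandwidth...
served = streams[:2]

# ...but the signed manifest still verifies, and each remaining stream
# can be matched against its manifest entry.
assert hmac.new(key, manifest, hashlib.sha256).hexdigest() == signature
for s, h in zip(served, stream_hashes):
    assert hashlib.sha256(s).digest() == h
```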

mods_are_assholes@lemmy.world on 11 Feb 2024 07:28 collapse

Maybe deepfakes are enough of a scare that this becomes standard practice, and protects encryption from getting government backdoors.

RVGamer06@sh.itjust.works on 11 Feb 2024 07:59 collapse

<img alt="" src="https://sh.itjust.works/pictrs/image/b6247fb6-ed06-4ea8-b17f-4c4b379a8dac.jpeg">

mods_are_assholes@lemmy.world on 11 Feb 2024 11:21 collapse

Hey, congresscritters didn’t give a shit about robocalls till they were the ones getting robocalled.

We had a do not call list within a year and a half.

That’s the secret, make it affect them personally.

Daft_ish@lemmy.world on 11 Feb 2024 15:57 collapse

Doesn’t that prove that government officials lack empathy? We see it again and again but still we keep putting these unfeeling bastards in charge.

mods_are_assholes@lemmy.world on 12 Feb 2024 03:22 collapse

Well sociopaths are really good at navigating power hierarchies and I’m not sure there is an ethical way of keeping them from holding office.

Natanael@slrpnk.net on 12 Feb 2024 14:00 collapse

It really depends on their motivation. The ones we need to keep out are the ones who enjoy hurting others or don’t care at all.

pineapplelover@lemm.ee on 11 Feb 2024 07:07 next collapse

Huh. They actually do something right for once instead of spending years trying to ban A.I tools. I’m pleasantly surprised.

PhlubbaDubba@lemm.ee on 11 Feb 2024 08:47 next collapse

I mean banning use cases is deffo fair game, generating kiddy porn should be treated as just as heinous as making it the “traditional” way IMO

General_Effort@lemmy.world on 11 Feb 2024 10:25 next collapse

Yikes! The implication is that it does not matter if a child was victimized. It’s “heinous”, not because of a child’s suffering, but because… ?

PhlubbaDubba@lemm.ee on 11 Feb 2024 10:40 collapse

Man imagine trying to make “ethical child rape content” a thing. What were the lolicons not doing it for ya anymore?

As for how it’s exactly as heinous, it’s the sexual objectification of a child, it doesn’t matter if it’s a real child or not, the mere existence of the material itself is an act of normalization and validation of wanting to rape children.

Being around at all contributes to the harm of every child victimised by a viewer of that material.

General_Effort@lemmy.world on 11 Feb 2024 12:41 collapse

I see. Since the suffering of others does not register with you, you must believe that any “bleeding heart liberal” really has some other motive. Well, no. Most (I hope, but at least some) people are really disturbed by the suffering of others.

I take the “normalization” argument seriously. But I note that it is not given much credence in other contexts; violent media, games, … Perhaps the “gateway drug” argument is the closest parallel.

In the very least, it drives pedophiles underground where they cannot be reached by digital streetworkers, who might help them not to cause harm. Instead, they form clandestine communities that are already criminal. I doubt that makes any child safer. But it’s not about children suffering for you, so whatever.

PhlubbaDubba@lemm.ee on 11 Feb 2024 13:11 collapse

Man imagine continuing to try and argue Ethical Child Rape Content should be a thing.

If we want to make sweeping attacks on character, I’d rather be on the “All Child Rape Material is Bad” side of the argument but whatever floats ya boat.

Fly4aShyGuy@lemmy.one on 11 Feb 2024 17:50 collapse

I don’t think he’s arguing that, and I don’t think you believe that either. Doubt any of us would consider that content ethical, but what he’s saying is it’s not nearly the same as actually doing harm (as opposed what you said in your original post).

You implying that anyone who disagrees with you is somehow into those awful things is in extremely poor taste. I’d expect so much more on Lemmy; that is a Reddit/Facebook-level debate tactic. I guess I’m going to get accused of that too now?

I don’t like to give any of your posts any credit here, but I can somewhat see the normalization argument. However, where is the line drawn regarding other content that could be harmful if normalized? What about adult non-consensual porn, violence on TV and in video games, etc.? It’s a sliding scale, and everyone might draw the line somewhere else. There’s good reason why thinking about an awful thing (or writing, drawing, or creating fiction about it) is not the same as doing an awful thing.

I doubt you’ll think much of this, but please really try to be better. It’s 2024; it’s time to leave calling anyone you disagree with a pedo back on Facebook in the 90s.

TheGrandNagus@lemmy.world on 11 Feb 2024 12:50 collapse

Idk, making CP where a child is raped vs making CP where no children are involved seem on very different levels of bad to me.

Both utterly repulsive, but certainly not exactly the same.

One has a non-consenting child being abused, a child that will likely carry the scars of that for a long time, the other doesn’t. One is worse than the other.

E: do the downvoters like… not care about child sexual assault/rape or something? Raping a child and taking pictures of it is very obviously worse than putting parameters into an AI image generator. Both are vile. One is worse. Saying they’re equally bad is attributing zero harm to the actual assaulting children part.

PhlubbaDubba@lemm.ee on 11 Feb 2024 13:13 collapse

Man imagine trying to make the case for Ethical Child Rape Material.

You are not going to get anywhere with this line of discussion, stop now before you say something that deservedly puts you on a watchlist.

TheGrandNagus@lemmy.world on 11 Feb 2024 13:19 collapse

I’m not making the case for that at all, and I find you attempting to make out that I am into child porn a disgusting debate tactic.

“Anybody who disagrees with my take is a paedophile” is such a poor argument and serves only to shut down discussion.

It’s very obviously not what I’m saying, and anybody with any reading comprehension at all can see that plainly.

You’ll notice I called it “utterly repulsive” in my comment - does that sound like the words of a child porn advocate?

The fact that you apparently don’t care at all about the child suffering side of it is quite troubling. If a child is harmed in its creation, then that’s obviously worse than some creepy fuck drawing loli in Inkscape or typing parameters into an AI image generator. I can’t believe this is even a discussion.

CyberSeeker@discuss.tchncs.de on 11 Feb 2024 13:32 collapse

Bingo. If, at the limit, the purpose of a generative AI is to be indistinguishable from human content, then watermarking and AI detection algorithms are absolutely useless.

The ONLY means to do this is to have creators verify their human-generated (or vetted) content at the time of publication (providing positive proof), as opposed to retroactively trying to determine whether content was generated by a human (proving a negative).

Aurenkin@sh.itjust.works on 11 Feb 2024 07:09 next collapse

I think this is a great idea. Hopefully it becomes the standard soon, cryptographically signing clips or parts of clips so there’s no doubt as to the original source.
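A minimal sketch of that sign-and-verify flow, using Python’s standard library. HMAC is a symmetric stand-in here; a real scheme would use an asymmetric signature such as Ed25519 so that verifiers only ever hold a public key. The key and function names below are invented for illustration.

```python
import hashlib
import hmac

# Hypothetical signing key. A real deployment would use an asymmetric key
# pair so verifiers never hold the secret; HMAC only illustrates the flow.
SIGNING_KEY = b"white-house-demo-key"

def sign_clip(clip_bytes: bytes) -> str:
    """Hash the clip, then authenticate the hash with the signing key."""
    digest = hashlib.sha256(clip_bytes).digest()
    return hmac.new(SIGNING_KEY, digest, hashlib.sha256).hexdigest()

def verify_clip(clip_bytes: bytes, signature: str) -> bool:
    """Recompute the tag and compare in constant time; any edit fails."""
    return hmac.compare_digest(sign_clip(clip_bytes), signature)
```

Signing the hash rather than the whole file keeps the signed blob tiny, and altering even one byte of the clip changes the hash and invalidates the signature.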

ryannathans@aussie.zone on 11 Feb 2024 07:20 next collapse

I have said for years all media that needs to be verifiable needs to be signed. Gpg signing lets gooo

NateNate60@lemmy.world on 11 Feb 2024 08:19 next collapse

Very few people understand why a GPG signature is reliable or how to check it. Malicious actors will add a “GPG Signed” watermark to their fake videos and call it a day, and 90% of victims will believe it.

optissima@lemmy.world on 11 Feb 2024 08:28 next collapse

As soon as VLC adds the gpg sig feature, it’s over.

TheKingBee@lemmy.world on 11 Feb 2024 16:18 next collapse

And that will in no way be the first step on the road to VLC deciding which videos it allows you to play…

NateNate60@lemmy.world on 11 Feb 2024 17:28 next collapse

No, it’s not. People don’t use VLC to watch misinformation videos. They see it on Reddit, Facebook, YouTube, or TikTok.

QuaternionsRock@lemmy.world on 12 Feb 2024 05:42 collapse

…how popular do you think VLC is among those who don’t understand cryptographic signatures?

PhlubbaDubba@lemm.ee on 11 Feb 2024 08:47 collapse

Yeah but all it takes is proving it doesn’t have the right signature and you can make the social media corpo take down every piece of media with that signature just for that alone.

What’s even better is that you can attack entities that try to maliciously let people get away with misusing their look and fake being signed for failing to defend their IP, basically declaring you intend to take them to court to Public Domainify literally everything that makes them any money at all.

If billionaires were willing to allow disinformation as a service then they wouldn’t have gone to war against news as a service to make it profitable to begin with.

captain_aggravated@sh.itjust.works on 11 Feb 2024 09:00 next collapse

I just mentioned this in another comment tonight; cryptographic verification has existed for years but basically no one has adopted it for anything. Some people still seem to think pasting an image of your handwriting on a document is “signing” a document somehow.

ryannathans@aussie.zone on 11 Feb 2024 10:17 next collapse

Still trying to get people to sign their emails lol

captain_aggravated@sh.itjust.works on 11 Feb 2024 22:45 collapse

I mean, part of it is PGP is the exact opposite of streamlined and you’ve got to be NSA levels of paranoid to bother with it.

ryannathans@aussie.zone on 11 Feb 2024 23:01 collapse

It’s automated in all mainstream email clients, you don’t even have to think about it if a contact has it set up

NateNate60@lemmy.world on 12 Feb 2024 08:29 collapse

if a contact has it set up

Well, there’s your problem.

The most commonly-used mail client in the world is the Gmail web client which does not support it. Uploading your PGP key to Gmail and having them store it server-side for use in a webmail client is obviously problematic from a security standpoint. Number 2 I would guess is Outlook, which appears also not to support it. For most people, I don’t think they understand the value of cryptographically signing emails and going through the hassle of generating and publishing their PGP keys, especially since Windows has no built-in easy application for generating and managing such keys.

There’s also the case that for most people, signing their emails provides absolutely no immediate benefit to them.

captain_aggravated@sh.itjust.works on 12 Feb 2024 13:26 collapse

Plus that’s email. What about… Literally everything else?

NateNate60@lemmy.world on 12 Feb 2024 17:52 collapse

Yeah, almost nothing has good PGP integration.

Except Git, apparently.

wizardbeard@lemmy.dbzer0.com on 11 Feb 2024 15:32 collapse

It doesn’t help that in a lot of cases, this is actually accepted by a shit ton of important institutions that should be better, but aren’t.

bionicjoey@lemmy.ca on 11 Feb 2024 12:28 collapse

The average Joe won’t know what any of what you just said means. Hell, the Joe in the OP doesn’t know what any of you just said means. There’s no way (IMO) of simultaneously creating a cryptographic assurance and having it be accessible to the layman.

NateNate60@lemmy.world on 12 Feb 2024 08:33 collapse

There is, but only if you can implement a layer of abstraction and get them to trust that layer of abstraction.

Few laymen understand why Bitcoin is secure. They just trust that their wallet software works because they were told by smarter people that it is secure.

Few laymen understand why TLS is secure. They just trust that their browser tells them it is secure.

Few laymen understand why biometric authentication on their phone apps is secure. They just trust that their device tells them it is secure.

bionicjoey@lemmy.ca on 12 Feb 2024 12:14 collapse

Each of those perfectly illustrates the problem with adding in a layer of abstraction though:

Bitcoin is a perfect example of the problem. Since almost nobody understands how it works, they keep their coins in an exchange instead of a wallet and have completely defeated the point of cryptocurrency in the first place by reintroducing blind trust into the system.

Similarly, the TLS ecosystem is problematic. Because even though it is theoretically supposed to verify the identity of the other party, most people aren’t savvy enough to check the name on the cert and instead just trust that if their browser doesn’t warn them, they must be okay. Blind trust once again is introduced alongside the necessary abstraction layers needed to make cryptography palatable to the masses.

Lastly, people have put so much trust in the face scanning biometrics to wake their phone that they don’t realize they may have given their face to a facial recognition company who will use it to help bring about the cyberpunk dystopia that we are all moving toward.

Zehzin@lemmy.world on 11 Feb 2024 07:44 next collapse

Official Joe Biden NFTs confirmed

Gork@lemm.ee on 11 Feb 2024 07:55 collapse

Don’t trust any key you know is malarkey!

PhlubbaDubba@lemm.ee on 11 Feb 2024 09:10 collapse

I can totally see this being a thing, and I kinda wish it would happen, just because I love old people trying to seem like they know tech when they don’t, but in the context of tech stuff that’s still helpful.

long_chicken_boat@sh.itjust.works on 11 Feb 2024 08:16 next collapse

what if I meet Joe and take a selfie of both of us using my phone? how will people know that my selfie is an authentic Joe Biden?

PhlubbaDubba@lemm.ee on 11 Feb 2024 08:56 next collapse

Probably a signed comment from the Double-Cone Crusader himself, basically free PR so I don’t see why he or any other president wouldn’t at least have an intern give you a signed comment fist bump of acknowledgement

fidodo@lemmy.world on 11 Feb 2024 09:40 next collapse

That’s the big question. How will we verify anything as real?

cynar@lemmy.world on 11 Feb 2024 10:03 collapse

Ultimately, reputation-based trust, combined with cryptographic keys, is likely the best we can do. You (semi-automatically) sign the photo, and upload its stamp to a 3rd party. They can verify that they received the stamp from you, and at what time. That proves the image existed at that time, and that it’s linked to your reputation. Anything more is just likely to leak, security-wise.
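A toy version of that third-party stamp service, assuming the “stamp” is just a SHA-256 hash of the image. The class and method names are invented for illustration; a production service would sign its timestamps (see RFC 3161) rather than be trusted blindly.

```python
import hashlib
import time

class StampLog:
    """Toy trusted third party: records when it first saw each content hash."""

    def __init__(self):
        self._first_seen = {}

    def submit(self, content: bytes, now=None) -> str:
        """Register content; returns its hash. The first submission wins."""
        digest = hashlib.sha256(content).hexdigest()
        self._first_seen.setdefault(digest, time.time() if now is None else now)
        return digest

    def existed_by(self, content: bytes, deadline) -> bool:
        """True if this exact content was on record at or before `deadline`."""
        seen = self._first_seen.get(hashlib.sha256(content).hexdigest())
        return seen is not None and seen <= deadline
```

Only the hash ever leaves your machine, so the third party learns nothing about the photo itself, which matches the “anything more is likely to leak” concern above.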

Deello@lemm.ee on 11 Feb 2024 08:32 next collapse

So basically Biden ads on the blockchain.

TheGrandNagus@lemmy.world on 11 Feb 2024 12:56 next collapse

…no

Think of generating an md5sum to verify that the file you downloaded online is what it should be and hasn’t been corrupted during the download process or replaced in a Man in the Middle attack.
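That check looks something like the sketch below in Python’s standard library (SHA-256 shown by default, since MD5 is no longer collision-resistant; the path is whatever file you downloaded):

```python
import hashlib

def file_digest(path: str, algorithm: str = "sha256", chunk_size: int = 1 << 16) -> str:
    """Hash a downloaded file in chunks so large files never sit fully in
    memory, then compare the result against the publisher's posted checksum."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

If the hex digest matches the one published alongside the download, the file was neither corrupted nor swapped in transit (assuming the published checksum itself came over a trusted channel).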

brbposting@sh.itjust.works on 11 Feb 2024 17:36 collapse

generating an md5sum to verify that the file you downloaded online

<img alt="" src="https://sh.itjust.works/pictrs/image/fb788092-0889-4dcb-8500-5f0265f51f96.jpeg">

Muehe@lemmy.ml on 11 Feb 2024 16:37 collapse

Cryptography ⊋ Blockchain

A blockchain is cryptography, but not all cryptography is a blockchain.

circuitfarmer@lemmy.world on 11 Feb 2024 09:19 next collapse

I’m sure they do. AI regulation probably would have helped with that. I feel like Congress was busy with shit that doesn’t affect anything.

ours@lemmy.world on 11 Feb 2024 09:37 next collapse

I salute whoever has the challenge of explaining basic cryptography principles to Congress.

Spendrill@lemm.ee on 11 Feb 2024 11:19 next collapse

Might just as well show a dog a card trick.

wizardbeard@lemmy.dbzer0.com on 11 Feb 2024 15:30 collapse

That’s why I feel like this idea is useless, even for the general population. Even with some sort of visual/audio based hashing, so that the hash is independent of minor changes like video resolution which don’t change the content, and with major video sites implementing a way for the site to verify that hash matches one from a trustworthy keyserver equivalent…

The end result for anyone not downloading the videos and verifying it themselves is the equivalent of those old “✅ safe ecommerce site, we swear” images. Any dedicated misinformation campaign will just fake it, and that will be enough for the people who would have believed the fake to begin with.
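The visual hashing mentioned above usually means a perceptual hash. A toy “average hash” sketch, assuming the frame has already been downscaled to a small grayscale thumbnail (real implementations handle that resampling, which is exactly what makes the hash survive resolution changes):

```python
def average_hash(thumbnail):
    """Toy perceptual hash: `thumbnail` is a small 2D grid of 0-255 grayscale
    values. Each pixel becomes one bit: 1 if at or above the mean, else 0."""
    flat = [p for row in thumbnail for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p >= mean else "0" for p in flat)

def hamming(h1, h2):
    """Bits that differ; a small distance suggests the same underlying content."""
    return sum(a != b for a, b in zip(h1, h2))
```

Unlike a cryptographic hash, a re-encode at a different resolution produces a nearly identical bit string, so near-misses still match. That same fuzziness is why a perceptual hash can identify content but can never prove it is unaltered.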

brbposting@sh.itjust.works on 11 Feb 2024 16:57 collapse

<img alt="" src="https://sh.itjust.works/pictrs/image/b8f00dda-4858-42e2-9abc-069efb65b9fe.webp">

johnyrocket@feddit.ch on 11 Feb 2024 12:32 collapse

Should probably start out with the colour-mixing one. That was very helpful for me to figure out public key cryptography. The difficulty comes in when they feel like you are treating them like toddlers, so they start behaving more like toddlers. (Which they are 99% of the time)

lemmyingly@lemm.ee on 11 Feb 2024 11:33 collapse

I see no difference between creating a fake video/image with AI and Adobe’s packages. So to me this isn’t an AI problem, it’s a problem that should have been resolved a couple of decades ago.

JasSmith@sh.itjust.works on 11 Feb 2024 09:24 next collapse

This doesn’t solve anything. The White House will only authenticate videos which make the President look good. Curated and carefully edited PR. Maybe the occasional press conference. The vast majority of content will not be authenticated. If anything this makes the problem worse, as it will give the President remit to claim videos which make them look bad are not authenticated and should therefore be distrusted.

ours@lemmy.world on 11 Feb 2024 09:40 next collapse

I don’t understand your concern. Either it’ll be signed White House footage or it won’t. They have to sign all their footage otherwise there’s no point to this. If it looks bad, don’t release it.

maynarkh@feddit.nl on 11 Feb 2024 11:35 next collapse

The point is that if someone catches the President shagging kids, of course that footage won’t be authenticated by the WH. We need a tool so that a genuine piece of footage of the Pres shagging kids would be authenticated, but a deepfake of the same would not. The WH is not a good arbiter since they are not independent.

ours@lemmy.world on 11 Feb 2024 17:20 next collapse

But we are talking about official WH videos. Start signing those.

If it’s not from the WH, it isn’t signed. Or perhaps it’s signed by whatever media company is behind its production or maybe they’ve verified the video and its source enough to sign it. So maybe, let’s say the Washington Post can publish some compromising video of the President but it still has certain accountability as opposed to some completely random Internet video.

brbposting@sh.itjust.works on 11 Feb 2024 17:28 collapse

Politicians and anyone at deepfake risk wear a digital pendant at all times. The pendant displays continually rotating time-based codes. People record themselves using video hardware which cryptographically signs its output.

Only a law/Big 4 firm can extract video from the official camera (which has a twin for hot swapping).

Natanael@slrpnk.net on 12 Feb 2024 23:09 collapse

Codes which don’t embed any information about what you’re saying or doing can be copied over to faked images.

In theory you could have such a pendant record your voice, etc., and continuously emit signatures for compressed versions of your speech (or a signed speech-to-text transcript).
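A sketch of that idea: the pendant authenticates the current time window together with a hash of the transcript, so the code is useless on a clip where different words were spoken. All names and the key are invented, and HMAC stands in for the asymmetric signature a real device would use.

```python
import hashlib
import hmac

def pendant_code(pendant_secret: bytes, transcript: bytes, window: int) -> str:
    """Bind a time window AND the speech content into one short code.
    `window` would be e.g. int(time.time() // 30) on a real device."""
    msg = window.to_bytes(8, "big") + hashlib.sha256(transcript).digest()
    return hmac.new(pendant_secret, msg, hashlib.sha256).hexdigest()[:12]
```

Copying the displayed code onto a deepfake fails on both axes: the faked speech hashes differently, and the code expires with its time window.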

JasSmith@sh.itjust.works on 11 Feb 2024 11:37 collapse

Then this exercise is a waste of time. All the hard hitting journalism which presses the President and elicits a negative response will be unsigned, and will be distributed across social media as it is today: without authentication. All the videos for which the White House is concerned about authenticity will continue to circulate without any cause for contention.

cynar@lemmy.world on 11 Feb 2024 09:58 next collapse

It needs to be more general. A video should have multiple signatures. Each signature relies on the signer’s reputation, which works both ways. It won’t help those who don’t care about their reputation, but will for those that do.

A photographer who passes off a fake photo as real will have their reputation hit, if they are caught out. The paper that published it will also take a hit. It’s therefore in the paper’s interest to figure out how trustworthy the supplier is.

I believe Canon recently announced a camera that cryptographically signs photographs at the point of creation. At that point, the photographer can prove the camera, the editor can prove the photographer, the paper can prove the editor, and the reader can prove the newspaper. If done right, the final viewer can also prove the whole chain, semi-independently. It won’t be perfect (far from it) but might be the best we’ll get. Each party wants to protect their reputation, and so has a vested interest in catching fraud.

For this to work, we need a reliable way to sign images multiple times, as well as (optionally) encode an edit history into it. We also need a quick way to match cryptographic keys to a public key.

An option to upload a time-stamped key to a trusted 3rd party would also be of significant benefit. Ironically, blockchain might actually be a good fit for this, in case a trusted 3rd party can’t be established.
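The multi-signature chain described above could look something like this sketch, where each party endorses the content hash plus every endorsement before it. HMAC again stands in for real per-party public-key signatures, and all names and keys are hypothetical.

```python
import hashlib
import hmac

def start_chain(photo: bytes) -> str:
    """The chain begins with a hash of the raw capture."""
    return hashlib.sha256(photo).hexdigest()

def endorse(chain: str, signer: str, key: bytes) -> str:
    """Append one custody link covering everything signed so far, so removing
    or reordering an earlier link invalidates every later one."""
    tag = hmac.new(key, chain.encode(), hashlib.sha256).hexdigest()[:16]
    return f"{chain}|{signer}:{tag}"
```

Camera, photographer, editor, and paper each call `endorse` in turn; a reader who checks the paper’s link is implicitly checking the whole history, because each tag covers all the links before it.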

JasSmith@sh.itjust.works on 11 Feb 2024 11:34 next collapse

Great points and I agree. I also think the signature needs to be built into the stream in a continuous fashion so that snippets can still be authenticated.

cynar@lemmy.world on 11 Feb 2024 13:05 collapse

Agreed. Embed a per-frame signature into every key frame when encoding. Also include the video file time-stamp. This will mean any clip longer than around 1 second will include at least 1 signed frame.

Natanael@slrpnk.net on 12 Feb 2024 14:25 collapse

Merkle tree hashes exist for this purpose

Note that video uses “keyframes”, so you can’t extract arbitrary frames in isolation; you need to pull multiple frames if the frame you’re snapshotting isn’t a keyframe itself
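A minimal Merkle-root sketch over a list of encoded frames: signing just the root commits to every frame, and a short clip could later prove its frames belong with a logarithmic number of sibling hashes (proof generation is omitted here, and the odd-level convention is just one common choice).

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(frames):
    """Hash each frame, then pairwise-combine levels until one root remains.
    Signing the root commits to every frame and its position."""
    level = [_h(f) for f in frames]
    if not level:
        return _h(b"")
    while len(level) > 1:
        if len(level) % 2:            # odd count: carry the last node up
            level.append(level[-1])   # by duplicating it
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]
```

Changing, dropping, or reordering any single frame changes the root, so one signature over 32 bytes protects the entire stream.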

General_Effort@lemmy.world on 11 Feb 2024 12:15 next collapse

I don’t think that’s practical or particularly desirable.

Today, when you buy something, e.g. a phone, the brand guarantees the quality of the product, and the seller guarantees the logistics chain (that it’s unused, not stolen, not faked, not damaged in transport, …). The typical buyer does not care about the parts used, the assembly factory, etc.

When a news source publishes media, they vouch for it. That’s what they are paid for (as it were). If the final viewer is expected to check the chain, they are asked to do the job of skilled professionals for free. Do-your-own-research rarely works out, even for well-educated people. Besides, in important cases, the whole chain will not be public to protect sources.

cynar@lemmy.world on 11 Feb 2024 13:02 collapse

It wouldn’t be intended for day-to-day use. It’s intended as an audit trail/chain of custody. Think of it more akin to a git history. As a user, you generally don’t care; however, it can be excellent for retrospective analysis when someone/something does screw up.

You would obviously be able to strip it out, but having it as a default would be helpful with openness.

LarmyOfLone@lemm.ee on 11 Feb 2024 17:33 next collapse

I’ve thought about this too but I’m not sure this would work. First you could hack the firmware of a cryptographically signed camera. I already read something about a camera like this that was hacked and the private key leaked. You could have an individual key for each camera and then revoke it maybe.

But you could also photograph a monitor, or use something like a specially altered camera lens.

Ultimately you’d probably need something like quantum entangled photon encoding to prove that the photons captured by the sensor were real photons and not fake photons. Like capturing a light field or capturing a spectrum of photons. Not sure if that is even remotely possible but it sounds cool haha.

Natanael@slrpnk.net on 12 Feb 2024 14:24 collapse

Look up transparency logs for that last part, it’s already used for TLS certificates

BrianTheeBiscuiteer@lemmy.world on 11 Feb 2024 17:08 collapse

Anyone can digitally sign anything (maybe not easily or for free). The White House can verify or not verify whatever they choose, but if you, as a journalist let’s say, want to give credence to video you distribute, you’ll want to digitally sign it. If a video switches hands several times without being signed, it might as well have been cooked up by the last person that touched it.

go_go_gadget@lemmy.world on 11 Feb 2024 17:25 collapse

That’s fine?

Signatures aren’t meant to prove authenticity. They prove the source, which you can use to weigh the authenticity.

I think the confusion comes from the fact that cryptographic signatures are mostly used in situations where proving the source is equivalent to proving authenticity. Proving a text message is from me proves the authenticity as there’s no such thing as doctoring my own text message. There’s more nuance when you’re using signatures to prove a source which may or may not be providing trustworthy data. But there is value in at least knowing who provided the data.

DrCake@lemmy.world on 11 Feb 2024 09:29 next collapse

Yeah good luck getting to general public to understand what “cryptographically verified” videos mean

maynarkh@feddit.nl on 11 Feb 2024 11:38 next collapse

Just make it a law that if as a social media company you allow unverified videos to be posted, you don’t get safe harbour protections from libel suits for that. It would clear right up. As long as the source of trust is independent of the government or even big business, it would work and be trustworthy.

General_Effort@lemmy.world on 11 Feb 2024 12:23 next collapse

Back in the day, many rulers allowed only licensed individuals to operate printing presses. It was sometimes even required that an official should read and sign off on any text before it was allowed to be printed.

Freedom of the press originally means that exactly this is not done.

FunderPants@lemmy.ca on 11 Feb 2024 12:25 next collapse

Jesus, how did I get so old only to just now understand that press is not journalism, but literally the printing press in ‘Freedom of the press’.

vithigar@lemmy.ca on 11 Feb 2024 13:25 collapse

You understand that there is a difference between being not permitted to produce/distribute material and being accountable for libel, yes?

“Freedom of the press” doesn’t mean they should be able to print damaging falsehood without repercussion.

General_Effort@lemmy.world on 11 Feb 2024 13:39 collapse

What makes the original comment legally problematic (IMHO), is that it is expected and intended to have a chilling effect pre-publication. Effectively, it would end internet anonymity.

It’s not necessarily unconstitutional. I would have made the argument if I thought so. The point is rather that history teaches us that close control of publications is a terrible mistake.

The original comment wants to make sure that there is always someone who can be sued/punished, with obvious consequences for regime critics, whistleblowers, and the like.

Dark_Arc@social.packetloss.gg on 11 Feb 2024 16:22 next collapse

We need to take history into account but I think we’d be foolish to not acknowledge the world has indeed changed.

Freedom of the press never meant that any old person could just spawn a million press shops and peddle whatever they wanted. At best the rich could, and nobody was anonymous for long at that kind of scale.

Personally I’m for publishing via proxy (i.e. an anonymous tip that a known publisher/person is responsible for) … I’m not crazy about “anybody can write anything on any political topic and nobody can hold them accountable offline.”

vithigar@lemmy.ca on 11 Feb 2024 17:14 collapse

So your suggestion is that libel, defamation, harassment, et al are just automatically dismissed when using online anonymous platforms? We can’t hold the platform responsible, and we can’t identify the actual offender, so whoops, no culpability?

I strongly disagree.

Supermariofan67@programming.dev on 11 Feb 2024 17:57 collapse

That’s not what the commenter said and I think you are knowingly misrepresenting it.

vithigar@lemmy.ca on 11 Feb 2024 21:09 collapse

I am not. And if that’s not what’s implied by their comments then I legitimately have no idea what they’re suggesting and would appreciate an explanation.

bionicjoey@lemmy.ca on 11 Feb 2024 12:26 collapse

As long as the source of trust is independent of the government or even big business, it would work and be trustworthy

That sounds like wishful thinking

FunderPants@lemmy.ca on 11 Feb 2024 12:19 next collapse

Democrats will want cryptographically verified videos; Republicans will be happy with a stamp that has Trump’s face on it.

<img alt="" src="https://lemmy.ca/pictrs/image/8a89f6ea-0959-45f5-a571-17c8f8b6ddef.jpeg">

<img alt="" src="https://lemmy.ca/pictrs/image/e018ce17-b09a-42b2-a7e4-ac687e93dde5.jpeg">

wizardbeard@lemmy.dbzer0.com on 11 Feb 2024 15:14 collapse

I mean, how is anyone going to cryptographically verify a video? You either have an icon in the video itself or displayed near it by the site, which means nothing; fakers just copy that into theirs. Alternatively you have to sign or make file hashes for each permutation of the video file sent out. At that point, how are normal people actually going to verify? At best they’re trusting the video player of whatever site they’re on to be truthful when it says that it’s verified.

Saying they want to do this is one thing, but as far as I’m aware, we don’t have a solution that accounts for the rampant re-use of presidential videos in news and secondary reporting either.

I have a terrible feeling that this would just be wasted effort beyond basic signing of the video file uploaded on the official government website, which really doesn’t solve the problem for anyone who can’t or won’t verify the hash on their end.


Maybe some sort of visual- and audio-based hash, like MusicBrainz IDs for songs, that is independent of the file itself and based instead on the sound of it. Then the government runs a server kind of like a PGP key server, and websites could integrate functionality to verify it. But at the end of the day it still works out to an “I swear we’re legit, guys” stamp for anyone not technical enough to verify independently themselves.


I guess your post just seemed silly when the end result of this for anyone is effectively the equivalent of your “signed by trump” image, unless the public magically gets serious about downloading and verifying everything themselves independently.

Fuck trump, but there are much better ways to shit on king cheeto than pretending the average populace is anything but average based purely on political alignment.

You have to realize that to the average user, any site serving videos seems as trustworthy as youtube. Average internet literacy is absolutely fucking abysmal.

beefontoast@lemmy.world on 11 Feb 2024 15:54 next collapse

In the end people will realise they cannot trust any media served to them. But it’s just going to take time for people to realise… And while they are still blindly consuming it, they will be taken advantage of.

If it goes this road… Social media could be completely undermined. It could become the downfall of these platforms and do everyone a favour by giving them their lives back after endless doom scrolling for years.

technojamin@lemmy.world on 11 Feb 2024 23:03 next collapse

People aren’t going to do it; the platforms that 95% of people use (Facebook, TikTok, YouTube, Instagram) will have to add the functionality to their video players/posts. That’s the only way anything like this could be implemented by the 2024 US election.

Strykker@programming.dev on 12 Feb 2024 02:40 collapse

Do it basically the same way TLS verification works. Sure, the browsers would have to add something to the UI to support it, but claiming you can’t trust that is dumb, because we already use it to trust that the site you’re on is your bank and not some scammer.

Sure, not everyone is going to care to check, but the check being there allows people who care to reply back saying the video is faked due to X.

patatahooligan@lemmy.world on 11 Feb 2024 13:29 next collapse

The general public doesn’t have to understand anything about how it works as long as they get a clear “verified by …” statement in the UI.

kandoh@reddthat.com on 11 Feb 2024 16:14 collapse

The problem is that even if you reveal the video as fake, the feeling it reinforces in the viewer stays with them.

“Sure that was fake, but the fact that it seems believable tells you everything you need to know.”

go_go_gadget@lemmy.world on 11 Feb 2024 17:23 collapse

“Herd immunity” comes into play here. If those people keep getting dismissed by most other people because the video isn’t signed they’ll give up and follow the crowd. Culture is incredibly powerful.

BradleyUffner@lemmy.world on 11 Feb 2024 15:30 next collapse

It could work the same way the padlock icon worked for SSL sites in browsers back in the day. The video player checks the signature and displays the trusted icon.

Natanael@slrpnk.net on 12 Feb 2024 14:19 collapse

It needs to focus on showing who published it, not the icon

makeasnek@lemmy.ml on 12 Feb 2024 02:18 collapse

“Not everybody will use it and it’s not 100% perfect so let’s not try”

NateNate60@lemmy.world on 12 Feb 2024 17:55 collapse

That’s not the point. It’s that malicious actors could easily exploit that lack of knowledge to trick users into giving fake videos more credibility.

If I were a malicious actor, I’d put the words “✅ Verified cryptographically by the White House” at the bottom of my posts and you can probably understand that the people most vulnerable to misinformation would probably believe it.

andrew_bidlaw@sh.itjust.works on 11 Feb 2024 12:50 next collapse

Why not just use official channels of information, e.g. a White House Mastodon instance with politicians’ accounts, government-hosted and auto-mirrored by third parties?

surewhynotlem@lemmy.world on 11 Feb 2024 13:22 next collapse

Fucking finally. We’ve had this answer to digital fraud for ages.

BrianTheeBiscuiteer@lemmy.world on 11 Feb 2024 16:46 collapse

Sounds like a very Biden thing (or for anyone well into their Golden Years) to say, “Use cryptography!”, but it’s not without merit. How do we verify file integrity? How do we digitally sign documents?

The problem we currently have is that anything that looks real tends to be accepted as real (or authentic). We can’t rely on humans to verify authenticity of audio or video anymore. So for anything that really matters we need to digitally sign it so it can be verified by a certificate authority or hashed to verify integrity.

This doesn’t magically fix deep fakes. Not everyone will verify a video before distribution and you can’t verify a video that’s been edited for time or reformatted or broadcast on the TV. It’s a start.

SpaceCowboy@lemmy.ca on 11 Feb 2024 17:15 next collapse

The President’s job isn’t really to be an expert on everything, the job is more about being able to hire people who are experts.

If this was coupled with a regulation requiring social media companies to do the verification and indicate that the content is verified then most people wouldn’t need to do the work to verify content (because we know they won’t).

It obviously wouldn’t solve every problem with deepfakes, but at least it couldn’t be content claiming to be from CNN or whoever. And yes someone editing content from trusted sources would make that content no longer trusted, but that’s actually a good thing. You can edit videos to make someone look bad, you can slow it down to make a person look drunk, etc. This kind of content should not considered trusted either.

Someone doing a reaction video going over news content or whatever could have their stuff be considered trusted, but it would be indicated as content from the person that produced the reaction video, not as content coming from the original news source. So if you see a “news” video whose verified source is “xXX_FlatEarthIsReal420_69_XXx” rather than CNN, AP News, NY Times, etc, you kinda know what’s up.

go_go_gadget@lemmy.world on 11 Feb 2024 17:21 next collapse

We’ve had this discussion a lot in the Bitcoin space. People keep arguing it has to change so that “grandma can understand it”, but I think that’s unrealistic. Every technology has some inherent complexities that cannot be removed and that people have to learn if they want to use it. And people will use it if the motivation is there. Wifi has some inherent complexities people have become comfortable with. People know how to look through lists of networks, find the right one, enter the passkey or go through the sign-on page. Some non-technical people know enough about how Wifi should behave to know the internet connection might be out or the router might need a reboot. None of this knowledge was commonplace 20 years ago. It is now.

The knowledge required to leverage the benefits of cryptographic signatures isn’t beyond the reach of most people. The general rules are pretty simple. The industry just has to decide to make the necessary investments to motivate people.

nxdefiant@startrek.website on 11 Feb 2024 22:51 collapse

The number of 80-year-olds who know what cryptography is AND know that it’s a proper solution here is not large. I’d expect an 80-year-old to say something like “we should only look at pictures sent by certified mail” or “You can’t trust film unless it’s 8mm and the can was sealed shut!”

ZombiFrancis@sh.itjust.works on 11 Feb 2024 16:35 next collapse

It would become quite easy to dismiss anything for not being cryptographically verified simply by not cryptographically verifying.

I can see the benefit of having such verification but I also see how prone it might be to suppressing unpopular/unsanctioned journalism.

Unless the proof is very clear and easy for the public to understand the new method of denial just becomes the old method of denial.

abhibeckert@lemmy.world on 12 Feb 2024 00:50 next collapse

It would be nice if none of this was necessary… but we don’t live in that world. There is a lot of straight up bullshit in the news these days especially when it comes to controversial topics (like the war in Gaza, or Covid).

You could go a really long way by just giving all photographers the ability to sign their own work. If you know who took the photo, then you can make good decisions about whether to trust them or not.

Random account on a social network shares a video of a presidential candidate giving a speech? Yeah maybe don’t trust that. Look for someone else who’s covered the same speech instead, obviously any real speech is going to be covered by every major news network.

That doesn’t stop ordinary people from sharing presidential speeches on social networks. But it would make it much easier to identify fake content.

jabjoe@feddit.uk on 12 Feb 2024 09:11 collapse

Once people get used to cryptographically signed videos, why trust only one source? If a news outlet is found signing a fake video, they will be in trouble. Loss of said trust, if nothing else.

We should get to the point we don’t trust unsigned videos.

FlyingSquid@lemmy.world on 12 Feb 2024 12:30 next collapse

If a news outlet is found signing a fake video, they will be in trouble.

I see you’ve never heard of Fox News before.

en.wikipedia.org/wiki/Fox_News_controversies#Vide…

OsrsNeedsF2P@lemmy.ml on 12 Feb 2024 14:10 collapse

Yes, and now people don’t trust Fox News, to the point it is close to being banned from being used as a source for anything on Wikipedia

FlyingSquid@lemmy.world on 12 Feb 2024 14:12 collapse

I don’t know that ‘about to be banned by Wikipedia’ is a good metric for how much the general American public trusts Fox News. It could be that most of them don’t, but that is not a good way to tell considering there’s no general public input on what Wikipedia accepts as a source.

Also, it should have been banned by Wikipedia years ago.

ZombiFrancis@sh.itjust.works on 12 Feb 2024 15:00 collapse

Not trusting unsigned videos is one thing, but will people be judging the signature or the content itself to determine if it is fake?

Why only one source should be trusted is a salient point. If we are talking trust: it feels entirely plausible that an entity could use its trust (or power) to manufacture a signature.

And for some it is all too relevant that an entity like the White House (or the gamut of others, past or present) has certainly presented false information as true to do things like invade countries.

Trust is a much more flexible concept that is willing to be bent. And so cryptographic verification really has to demonstrate how and why something is fake to the general public. Otherwise it is just a big ‘trust me bro.’

jabjoe@feddit.uk on 12 Feb 2024 15:44 collapse

You’re right that cryptographic verification can only prove someone signed the video. But it will mean that nutters sharing “BBC videos” that don’t have the BBC signature can basically be dismissed straight off. We are already in a soup of misinformation, so provable sourcing is a step forward. Whether you trust those sources is another matter, but at least you’ll know whether it’s the true source or not. If a source abuses the trust it has, it loses that trust.

Blackmist@feddit.uk on 11 Feb 2024 16:47 next collapse

Honestly I’d say that’s on the way for any video or photographic evidence.

You’d need a device private key to sign with, probably internet connectivity for a timestamp from a third party.

Could have lidar included as well so you can verify that it’s not pointing at a video source of something fake.

Is there a cryptographically secure version of GPS too? Not sure if that’s even possible, and it’s the weekend so I’m done thinking.

SpaceCowboy@lemmy.ca on 11 Feb 2024 17:00 next collapse

It’s way more feasible to simply require social media sites to do the verification and display something like a blue check on verified videos.

This is actually a really good idea. Sure there will still be deepfakes out there, but at least a deepfake that claims to be from a trusted source can be removed relatively easily.

Theoretically a social media site could boost content that was verified over content that isn’t, but that would require social media sites to not be bad actors, which I don’t have a lot of hope in.

kautau@lemmy.world on 11 Feb 2024 18:11 collapse

I agree that it’s a good idea. But the people most swayed by deepfakes of Biden are definitely the least concerned with whether their bogeyman, the “deep state” has verified them

captain_aggravated@sh.itjust.works on 12 Feb 2024 00:31 collapse

Yeah it’s not going to change the mind of the folks making the deepfakes.

Natanael@slrpnk.net on 12 Feb 2024 23:11 collapse

Positioning using distance bounded challenge-response protocols with multiple beacons is possible, but none of the positioning satellite networks supports it. And you still can’t prove the photo was taken at the location, only that somebody was there.

helenslunch@feddit.nl on 11 Feb 2024 18:24 next collapse

I mean they could just create a highly-secure official Fediverse server/account?

stockRot@lemmy.world on 11 Feb 2024 19:43 next collapse

What problem would that solve?

helenslunch@feddit.nl on 11 Feb 2024 19:58 next collapse

An official channel to post and review deepfakes for accuracy.

otl@hachyderm.io on 11 Feb 2024 23:17 collapse

A link to the video could be shared via ActivityPub.
The video would be loaded over HTTPS; we can verify that the video is from the white house, and that it hasn't been modified in-transit.

A big issue is that places don't want to share a link to an independently verifiable video, they want you to load a copy of it from their website/app. This way we build trust with the brand (e.g. New York Times), and spend more time looking at ads or subscribe.
@stockRot @technology

stockRot@lemmy.world on 12 Feb 2024 02:18 collapse

A big issue is that places don’t want to share a link to an independently verifiable video, they want you to load a copy of it from their website/app.

Exactly. This “solution” doesn’t take into account how people actually use the Internet. Unless we expect billions of people to change their behavior, this is just a pointless comment.

otl@hachyderm.io on 12 Feb 2024 02:53 collapse

Might be closer than you think. The White House is just using Instagram right now: https://www.whitehouse.gov
(See section “featured media”)

@stockRot @technology

hyperhopper@lemmy.world on 12 Feb 2024 01:56 collapse

Just because you’re writing this on the fediverse doesn’t mean it’s the answer to everything. It’s certainly not the answer to this.

helenslunch@feddit.nl on 12 Feb 2024 02:09 collapse

Sick Strawman bro

drathvedro@lemm.ee on 11 Feb 2024 18:54 next collapse

I’ve been saying for a long time now that camera manufacturers should just put encryption circuits right inside the sensors. Of course that wouldn’t protect against pointing the camera at a screen showing a deepfake, or against someone painstakingly dissolving the top layers and tracing out the private key manually, but it’d be enough of a deterrent against forgery. Media production companies should also put out all their stuff digitally signed. Like, come on, it’s 2024 and we still don’t have a way to find out if something was filmed or rendered, cut or edited, original or freebooted.

GeneralVincent@lemmy.world on 11 Feb 2024 19:36 next collapse

They’re doing something like that I think

techradar.com/…/sony-canon-and-nikon-set-to-comba…

drathvedro@lemm.ee on 11 Feb 2024 21:13 collapse

Oh, they’ve actually been developing that! Thanks for the link, I was totally unaware of the C2PA thing. Looks like the ball has been very slowly rolling ever since 2019, but now that Google is on board (they joined just a couple days ago), it might fairly soon be visible/usable by ordinary users.

Mark my words, though, I’ll bet $100 that everyone’s going to screw it up miserably on their first couple of generations. Camera manufacturers are going to cheap out on electronics, allowing for data substitution somewhere in the pipeline. Every piece of editing software is going to be cracked at least a few times, allowing for fake edits. And production companies will most definitely leak their signing keys. Maybe even Intel/AMD could screw up again big time. But, maybe in a decade or two, given the pace, we’ll get a stable and secure enough solution to become the default, like SSL currently is.

petrol_sniff_king@lemmy.blahaj.zone on 11 Feb 2024 23:09 collapse

You might find this interesting.
www.hackerfactor.com/blog/index.php?/archives/101…

drathvedro@lemm.ee on 12 Feb 2024 08:48 next collapse

Oh, so Adobe already screwed it up miserably. Thanks, had a good laugh at it

Natanael@slrpnk.net on 12 Feb 2024 13:52 collapse

Oof.

They need to implement content addressing for “sidecar” signature files (add a hash) both to prevent malleability and to allow independent caches to serve up the metadata for images of interest.

Also, the whole certificate chain and root of trust issues are still there and completely unaddressed. They really should add recommendations for default use, like not trusting anything by default: only show that a signature exists, but treat it as unvalidated until the keypair owner has been verified. Accepting a signature just because a CA is involved is terrible, and that being a terrible idea is exactly why web browsers dropped support for displaying extended validation certificate metadata (because that extra validation by CAs was still not enough).

And signature verification should be mandatory for every piece; dropping old signatures should not be allowed, and metadata which isn’t correctly signed shouldn’t be displayed. There are even schemes for compressing multiple signatures into one smaller signature blob, so you can do this while saving space!

And one last detail: they really should use timestamping via “transparency logs” when publishing photos like this to support the provenance claims. When trusted sources use timestamping like this before publication, it helps verify “earliest seen” claims.

hyperhopper@lemmy.world on 12 Feb 2024 01:54 collapse

If you’ve been saying this for a long time please stop. This will solve nothing. It will be trivial to bypass for malicious actors and just hampers normal consumers.

drathvedro@lemm.ee on 12 Feb 2024 08:57 next collapse

You must be severely misunderstanding the idea. The idea is not to encrypt it in a way that it’s only unlockable by a secret and hidden key, like DRM or cable TV does, but to do the reverse: to encrypt it with a key that is unlockable by a publicly available and widely shared key, where successful decryption acts as proof of content authenticity. If you don’t care about authenticity, nothing is stopping you from spreading the decrypted version, so it shouldn’t affect consumers one bit. And I wouldn’t describe “Get a bunch of cameras, rip the sensors out, carefully and repeatedly strip the top layers off and scan using an electron microscope until you get to the encryption circuit, repeat enough times to collect enough scans undamaged by the stripping process to then manually piece them together and trace out the entire circuit, then spend a few weeks debugging it in a simulator to work out the encryption key” as “trivial”

hyperhopper@lemmy.world on 12 Feb 2024 10:11 collapse

I think you are misunderstanding things or don’t know shit about cryptography. Why the fuck are you even talking about publicly unlockable encryption? This is a use case for verification, like a MAC signature, not any kind of encryption.

And no, your process is wild. The actual answer is to just replace the sensor input to the same encryption circuits. That is trivial if you own and have control over your own device. For your scheme to work, personal ownership rights would have to be severely hampered.

drathvedro@lemm.ee on 12 Feb 2024 12:54 next collapse

I think you are misunderstanding things or don’t know shit about cryptography. Why the fuck are y even talking about publicly unlockable encryption, this is a use case for verification like a MAC signature, not any kind of encryption.

Calm down. I was just dumbing down public key cryptography for you

The actual answer is just replace the sensor input to the same encryption circuits

This will not work. The encryption circuit has to be right inside the CCD, otherwise it will be bypassed just like TPM before 2.0 - by tampering with unencrypted connection in between the sensor and the encryption chip.

For your scheme to work, personal ownership rights would have to be severely hampered.

You still don’t understand. It does not hamper ownership rights or the right to repair, and you are free to not use it at all. All this achieves is basically camera manufacturers signing every frame with “Yep, this was filmed with one of our cameras”. You are free to view and even edit the footage as long as you don’t care about this signature. It might not be useful for, say, a movie, but when looking for original, uncut and unedited footage, like, for example, a news report, this’ll be a godsend.

Natanael@slrpnk.net on 12 Feb 2024 13:18 collapse

Analog hole, just set up the camera in front of a sufficiently high resolution screen.

You have to trust the person who owns the camera.

drathvedro@lemm.ee on 12 Feb 2024 14:41 collapse

Yes, I’ve mentioned that in the initial comment, and, I gotta confess, I don’t know shit about photography, but to me it sounds like a very non-trivial task to make such a shot appear legitimate.

hyperhopper@lemmy.world on 14 Feb 2024 10:56 collapse

It’s not. Wait till you find out how they made movies before CGI!

Natanael@slrpnk.net on 12 Feb 2024 13:16 collapse

A MAC is symmetric and can thus only be verified by you or somebody you trust not to misuse or leak the key. Regular digital signatures are what’s needed here.

You can still use such a signing circuit but treat it as an attestation by the camera’s owner, not as independent proof of authenticity.

hyperhopper@lemmy.world on 14 Feb 2024 10:55 collapse

A MAC is symmetric and can thus only be verified by you or somebody who you trust to not misuse or leak the key.

You sign them against a known public key, so anybody can verify them.

Regular digital signatures is what’s needed here. You can still use such a signing circuit but treat it as an attestation by the camera’s owner, not as independent proof of authenticity.

If it’s just the cameras owner attesting, then just have them sign it. No need for expensive complicated circuits and regulations forcing these into existence.

Natanael@slrpnk.net on 14 Feb 2024 13:04 collapse

You can’t use a MAC for public key signatures. That’s ECC, RSA, and similar.

Drewelite@lemmynsfw.com on 12 Feb 2024 15:19 collapse

Thank you, lol. This is what people end up with when they go with the first solution that comes to mind: often something that makes life harder for everyone EXCEPT bad actors. This just creates hoops for people following the rules to jump through while giving the impression the problem was solved, when it’s not.

npaladin2000@lemmy.world on 11 Feb 2024 19:19 next collapse

If the White House actually makes the deep fakes, do they count as “fakes?”

FrostKing@lemmy.world on 11 Feb 2024 19:53 next collapse

Can someone try to explain, relatively simply, what cryptographic verification actually entails? I’ve never really looked into it.

0xD@infosec.pub on 11 Feb 2024 20:16 next collapse

I’ll be talking about digital signatures which is the basis for such things. I assume basic understanding of asymmetric cryptography and hashing.

Basically, you hash the content you want to verify with a secure hashing function and encrypt the value with your private key. You can now append this encrypted value to the content or just release it alongside it.

To now verify this content they can use your public key to decrypt your signature and get the original hash value, and compare it to their own. To get that, they just need to hash the content themselves with the same function.

So by signing their videos with the white house private key and publishing their public key somewhere, you can verify the video’s authenticity like that.

For a proper understanding check out DSA :)
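The “encrypt the hash with your private key” description above matches textbook RSA specifically. A toy, stdlib-only sketch of sign/verify (the key here is absurdly small and insecure; it's for illustration only, and real systems use vetted libraries and schemes like Ed25519 or RSA-PSS):

```python
import hashlib
import math

# Toy RSA key (insecure, illustration only).
p, q = 61, 53
n = p * q                                  # public modulus
e = 17                                     # public exponent
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)
d = pow(e, -1, lam)                        # private exponent (Python 3.8+)

def h(message: bytes) -> int:
    # Hash the content, reduced mod n so it fits the toy key.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    return pow(h(message), d, n)           # "encrypt" the hash with the private key

def verify(message: bytes, signature: int) -> bool:
    return pow(signature, e, n) == h(message)  # undo it with the public key

video = b"official video bytes"
sig = sign(video)
assert verify(video, sig)                  # True: authentic
assert not verify(video, (sig + 1) % n)    # False: any altered signature fails
```

So by publishing the public key (e, n) somewhere trusted and shipping the signature alongside each video, anyone can run the verify step; as the reply below notes, only RSA looks like “encrypting the hash”, and other schemes like DSA have their own signing functions.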

Natanael@slrpnk.net on 12 Feb 2024 14:18 collapse

Only RSA uses a function equivalent to encryption when producing signatures, and only when used in one specific scheme. Every other algorithm has a unique signing function.

abhibeckert@lemmy.world on 12 Feb 2024 00:40 collapse

Click the padlock in your browser, and you’ll be able to see that this webpage (if you’re using lemmy.world) was encrypted by a server that has been verified by Google Trust Services to be a server which is controlled by lemmy.world. In addition, your browser will remember that… and if you get a page from the same server that has been verified by another cloud provider, the browser (should) flag that and warn you it might be an impostor.

The idea is you’ll be able to view metadata on an image and see that it comes from a source that has been verified by a third party such as Google Trust Services.

How it works, mathematically… well, look up “asymmetric cryptography and hashing”. It gets pretty complicated and there are a few different mathematical approaches. Basically though, the white house will have a key, that they will not share with anyone, and only that key can be used to authorise the metadata. Even Google Trust Services (or whatever cloud provider you use) does not have the key.

There’s been a lot of effort to detect fake images, but that’s really never going to work reliably. Proving an image is valid, however… that can be done with pretty good reliability. An attack would be at home on Mission Impossible. Maybe you’d break into a Whitehouse photographer’s home at night, put their finger on the fingerprint scanner of their laptop without waking them, then use their laptop to create the fake photo… delete all traces of evidence and GTFO. Oh and everyone would know which photographer supposedly took the photo, ask them how they took that photo of Biden acting out of character, and the real photographer will immediately say they didn’t take the photo.

FrostKing@lemmy.world on 12 Feb 2024 06:28 collapse

Thanks a lot, that helped me understand. Seems like a good idea

VampyreOfNazareth@lemm.ee on 11 Feb 2024 20:13 next collapse

Government also puts backdoor in said math, gets hacked, official fakes released

Squizzy@lemmy.world on 11 Feb 2024 23:34 collapse

Or more likely they will only discredit fake news and not verify actual footage that is a poor reflection. Like a hot mic catching someone calling someone a jackass: White House says no comment.

HawlSera@lemm.ee on 11 Feb 2024 23:07 next collapse

This is sadly necessary

Eezyville@sh.itjust.works on 12 Feb 2024 00:40 next collapse

Maybe the White House should create a hash of the video and add it to a public blockchain. Anyone can then verify if the video is authentic.

hyperhopper@lemmy.world on 12 Feb 2024 01:52 next collapse

  1. Anybody can also verify it if they just host the hash on their own website, or host the video itself.
  2. Getting the general populace to understand block chain implementations or how to interface with it is an unrealistic task
  3. What does a distributed zero-trust model add to something that is inherently centralized and requires trust in only one party?

Blockchain is the opposite of what you want for this problem, I’m not sure why people bring this up now. People need to take an introductory cryptography course before saying to use blockchain everywhere.

makeasnek@lemmy.ml on 12 Feb 2024 02:16 collapse

Putting it on the blockchain ensures you can always go back and say “see, at this date/time, this key verified this file/hash”… If you know the key of the uploader (the white house), you can verify it was signed by that key. Guatemala used a similar scheme to verify votes in elections using Bitcoin. Could the precinct lie and put in the wrong vote count? Of course! But what it prevented was somebody saying “well actually the precinct reported a different number” since anybody could verify that on chain they didn’t. It also prevented the precinct themselves from changing the number in the future if they were put under some kind of pressure.

hyperhopper@lemmy.world on 12 Feb 2024 08:51 next collapse

All of this could be done without blockchain. Once they sign a signature with their private key they can’t unsign it later. Once you attest something you cannot un-attest it.

Just make the public key known and sign things. Please stop shoehorning blockchain where it doesn’t belong, especially when you aren’t even giving any examples of things blockchain does for you, at 100000x the cost and complexity, that normal crypto from the 80s/90s can’t do better.

Natanael@slrpnk.net on 12 Feb 2024 23:14 collapse

Trusted timestamping protocols and transparency logs exist and do that more efficiently

M500@lemmy.ml on 12 Feb 2024 01:57 next collapse

Wouldn’t this be defeated by people re-uploading the video? I think all these sites will re-encode the videos uploaded so the hash will not match, then people will use that as proof that the video is not real.

recapitated@lemmy.world on 12 Feb 2024 02:19 next collapse

There are many unnecessary steps in that.

recapitated@lemmy.world on 12 Feb 2024 14:00 collapse

Guys, it doesn’t need to be on a blockchain. Asymmetric key cryptography is enough to verify authenticity.

dgmib@lemmy.world on 12 Feb 2024 03:23 collapse

Don’t need to involve a blockchain to make cryptographically provable authenticity. Just a digital signature.

The only thing a hash in a blockchain would add is proof the video existed at the time the hash was added to the blockchain. I can think of cases where that would be beneficial too, but it wouldn’t make sense to put a hash of every video on a public blockchain.
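That “proof it existed at the time” property doesn’t need a blockchain either: an append-only hash chain, the core of a transparency log, gives the same tamper-evidence. A hypothetical sketch:

```python
import hashlib
import json

# Minimal append-only log: each entry commits to the previous entry's hash,
# so rewriting any earlier entry invalidates every entry after it.
log = []

def append_entry(media_hash: str, timestamp: float) -> str:
    prev = log[-1]["entry_hash"] if log else "0" * 64
    body = json.dumps({"media": media_hash, "ts": timestamp, "prev": prev}, sort_keys=True)
    entry_hash = hashlib.sha256(body.encode()).hexdigest()
    log.append({"media": media_hash, "ts": timestamp, "prev": prev, "entry_hash": entry_hash})
    return entry_hash

def verify_chain() -> bool:
    prev = "0" * 64
    for entry in log:
        body = json.dumps({"media": entry["media"], "ts": entry["ts"], "prev": prev}, sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != entry["entry_hash"]:
            return False
        prev = entry["entry_hash"]
    return True

append_entry(hashlib.sha256(b"video one").hexdigest(), 1707700000.0)
append_entry(hashlib.sha256(b"video two").hexdigest(), 1707700100.0)
assert verify_chain()

# Tampering with an earlier entry breaks every later link.
log[0]["media"] = hashlib.sha256(b"forged").hexdigest()
assert not verify_chain()
```

A real log would be operated by (or mirrored at) independent parties so the publisher can’t silently rewrite it, which is the one useful thing the blockchain proposal was reaching for.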

Natanael@slrpnk.net on 12 Feb 2024 14:27 collapse

Transparency logs like that are helpful to show when media was first seen / published

recapitated@lemmy.world on 12 Feb 2024 02:18 next collapse

It’s a good idea. And I hope to see more of this in other types of communications.

Snapz@lemmy.world on 12 Feb 2024 02:36 next collapse

We need something akin to the simplicity and ubiquity of Google that does this, government funded and with transparent oversight. We’re past the point of your aunt needing a way to quickly check if something is obvious bullshit.

Call it something like Exx-Ray, the two Xs mean double check - “That sounds very unlikely that they said that Aunt Pat… You need to Exx-Ray shit like that before you talk about it at Thanksgiving”

Or same thing, but with the word Check, CHEXX - “No that sounds like bullshit, I’m gonna CHEXX it… Yup that’s bullshit, Randy.”

csm10495@sh.itjust.works on 12 Feb 2024 06:30 collapse

Man some Chex mix sounds good right now. They have this one that has chocolate pieces now.

cooopsspace@infosec.pub on 12 Feb 2024 03:48 next collapse

You mean to tell me that cryptography isn’t the enemy and that instead of fighting it in the name of “terrorism and child protection” that we should be protecting children by having strong encryption instead??

Thirdborne@lemmy.world on 12 Feb 2024 09:34 next collapse

When it comes to misinformation I always remember when I was a kid in the early 90s, another kid told me confidently that the USSR had landed on Mars, gathered rocks, filmed it and returned to Earth (it now occurs to me that this homeschooled kid was confusing the real moon landing). I remember knowing it was bullshit but not having a way to check the facts. The Internet solved that problem. Now, by God, the Internet has recreated the same problem.

FlyingSquid@lemmy.world on 12 Feb 2024 12:29 next collapse

I don’t blame them for wanting to, but this won’t work. Anyone who would be swayed by such a deepfake won’t believe the verification if it is offered.

tacosplease@lemmy.world on 12 Feb 2024 12:55 next collapse

Agreed and I still think there is value in doing it.

FlyingSquid@lemmy.world on 12 Feb 2024 12:58 collapse

I honestly do not see the value here. Barring maybe a small minority, anyone who would believe a deepfake about Biden would probably also not believe the verification and anyone who wouldn’t would probably believe the administration when they said it was fake.

The value of the technology in general? Sure. I can see it having practical applications. Just not in this case.

Natanael@slrpnk.net on 12 Feb 2024 13:14 next collapse

It helps journalists, etc, when files have digital signatures verifying who is attesting to it. If the WH has their own published public key for signing published media and more then it’s easy to verify if you have originals or not.

FlyingSquid@lemmy.world on 12 Feb 2024 13:15 next collapse

I don’t even think that matters when Trump’s people are watching media that won’t verify it anyway.

EatATaco@lemm.ee on 12 Feb 2024 13:26 collapse

The world is not black and white. There are not just trump supporters and Biden supporters. I know it’s hard to grasp, but there are tons of people in the toss-up category.

You’re right that this probably won’t penetrate the deeply perverted world of trump cultists, but the wh doesn’t expect to win the brainwashed over. They are going for those people who could go one way or another.

FlyingSquid@lemmy.world on 12 Feb 2024 13:28 collapse

I find it hard to believe that there are too many people who truly can’t decide between Trump and Biden at this point. The media really wants a horse race here, but if your mind isn’t made up by this point, I think you’re unlikely to vote in the first place.

I’ll be happy to be proven wrong and have this sway people who might vote for Trump to vote for Biden though.

EatATaco@lemm.ee on 12 Feb 2024 13:50 collapse

So, the race is basically already decided, but there is a conspiracy among the media and polling companies to make it look like the race is actually close and that there are undecideds. Of course, the only way to prove this wrong would be with polls, but we’ve conveniently already just rejected that evidence. Very convenient.

FlyingSquid@lemmy.world on 12 Feb 2024 13:58 collapse

Ah yes, our oh-so-accurate polls.

news.vanderbilt.edu/…/pre-election-polls-in-2020-…

EatATaco@lemm.ee on 12 Feb 2024 14:17 collapse

A couple of things.

First, your link is outdated: fivethirtyeight.com/…/2022-election-polling-accur…

Yes, 2020 was a bad year, but last year was actually a very good year. Basically what you are saying is that “4 years ago polls were bad, so that allows me to just believe whatever I want.”

Second, if you believe you have no metric by which to measure something, the correct thought is “I’m not sure what the answer is” not “what I think is true must be true.”

Plus, don’t think it went unnoticed that you outright ignored the whole part of your post claiming this is some conspiracy, thrown out there with zero evidence, of course.

FlyingSquid@lemmy.world on 12 Feb 2024 14:20 collapse

Basically what you are saying is that “4 years ago polls were bad, so that allows me to just believe whatever I want.”

How on Earth am I saying that?

Second, if you believe you have no metric by which to measure something, the correct thought is “I’m not sure what the answer is” not “what I think is true must be true.”

Have you met the average Fox viewer?

Plus, don’t believe it was missed that you just outright ignored the whole part of your post that this is some conspiracy, of course thrown out there with zero evidence.

Conservative media not giving a shit about the truth isn’t a conspiracy theory, it’s a fact. Hence Fox having to pay a billion dollars to Dominion.

EatATaco@lemm.ee on 12 Feb 2024 14:23 collapse

How on Earth am I saying that?

Sorry I got it wrong. What exactly are you saying with that point?

Have you met the average Fox viewer?

What does the average Fox viewer have to do with you and your point?

Conservative media not giving a shit about the truth isn’t a conspiracy theory, it’s a fact. Hence Fox having to pay a billion dollars to Dominion.

Wait, now we are just talking about conservative media? I thought we were talking about the media wanting you to think there was actually a race?

FlyingSquid@lemmy.world on 12 Feb 2024 14:28 collapse

Yes, I was just talking about conservative media. The media as a whole loves a horse race, but they aren’t generally willing to lie to get it.

That said, polls right now are all over the place, which does put the media in general in a good place because a contentious election means more viewers.

…fivethirtyeight.com/…/president-general/

But that’s not a conspiracy, that’s just capitalism- an exciting election equals more news viewers equals higher advertising rates. Would, say, CBS news lie about the polls to achieve that? I doubt it. Would Fox? Absolutely.

EatATaco@lemm.ee on 12 Feb 2024 15:09 collapse

So how does this tie into your original point that it’s hard for you to believe that anyone isn’t decided? The whole point of bringing up polls in general was to show that this shouldn’t be hard to believe at all. The claim that you were always just talking about the conservative media seems like a massive non-sequitur.

FlyingSquid@lemmy.world on 12 Feb 2024 15:28 next collapse

Yes, when polls are all over the place, it’s hard to believe them. I gave you the link to see for yourself.

skulblaka@startrek.website on 12 Feb 2024 18:44 collapse

It is impossible to escape political propaganda in modern America. It’s on your internet, it’s on your radio, it’s on your cable TV, it’s on your streaming TV, it’s on your super bowl ads, it’s on your gas station pumps, it’s on your news sources, it’s on your social media. “Oh I don’t pay attention to politics” is no longer a reasonable excuse because that is impossible, it’s shoved down the throat of every citizen nonstop from every angle. The two candidates, in this case Trump and Biden, are such polar opposites of each other in every single possible regard that the only way someone can be undecided between the two is if their multiple personalities are arguing over it.

EatATaco@lemm.ee on 12 Feb 2024 19:14 collapse

So what are you saying, exactly? That the polls are made up and there is some conspiracy to mislead? What you are saying sounds potentially reasonable, but at the same time the numbers don’t support it.

skulblaka@startrek.website on 12 Feb 2024 20:30 collapse

Personally, I’ve never been polled. Not once. And neither has anyone else I’ve ever met in my life. I’m not saying they’re made up wholesale, because frankly, I have no idea. But I am saying that, at the very least, they’re not likely to be an accurate representation of the American citizenry as a whole. If nothing else, the percentage of “undecided” voters raises some eyebrows for me for the reasons I just stated. If you’ve lived in America the last 8-16 years and are somehow still a fence sitter, you’ve managed to ignore a veritable deluge of information being sprayed directly into your eyeballs with all the delicacy and care of a fire hose.

I understand the average person is probably pretty dumb, but I have faith in humanity that a significant percentage of us aren’t that dumb. Being on the bell curve means you’re plenty intelligent enough to understand whether you want to vote for red or for blue and for what reasons. I refuse to believe that there are people in America legitimately weighing if they would rather vote for protected freedoms for American citizens or vote for banning books that speak about protected freedoms for American citizens. The two choices are so wildly opposed to each other in structure and in intent that there isn’t a choice to be made, all people will land on one side or the other of this argument and there is no center ground to waffle around.

Twenty years ago, I understood undecided voters, because there still remained some small amount of nuance in the way American politics were carried out. We have now lost that. Our political landscape is now Blue Team vs Anti-Blue Team and the fence that the undecided voters were previously sitting on is now uninhabitable rubble, because there is now no component of our government that can come to a sensible cross-aisle decision. The independent, moderate voter is now a relic of the past in our supercharged, hyper-partisan pre-civil-war violence mockery of a civilized government.

EatATaco@lemm.ee on 12 Feb 2024 21:39 collapse

I feel like this was a whole lot of words to dodge the actual question. I get that you don’t believe that people can still be undecided, and I fully understand the sentiment (although I also recognize that I am a lot more in tune with politics than other people; this isn’t calling them stupid, they’re simply focused on other things).

But the numbers tell a different story. So what are you saying about those numbers? That they’re faked?

jj4211@lemmy.world on 12 Feb 2024 15:17 collapse

Problem is that broadly speaking, you would only sign the stuff you want to sign.

Imagine you had a president that slapped a toddler, and there was a phone video of it from the parents. The white house isn’t about to sign that video, because why would they want to? Should the journalists discard it because it doesn’t carry the official White House blessing?

It would limit someone’s ability to deepfake an official edit of a press briefing. But again, what if he says something damning and the ‘official’ footage edits it out? Would the press discard their own recordings, and therefore their credibility, because they can’t get them signed?

That’s the fundamental challenge in this sort of proposal, it only allows people to endorse what they would have wanted to endorse in the first place, and offers no mechanism to prove/disprove third party sources that are the only ones likely to carry negative impressions.

Natanael@slrpnk.net on 12 Feb 2024 16:12 next collapse

But then the journalists have to check whether the source is trustworthy, as usual. Then they can add their own signature to help other papers check it.

jj4211@lemmy.world on 12 Feb 2024 16:44 collapse

To that extent, we already have that.

I go to ‘cnn.com’, I have cryptographic verification that cnn attests to the videos served there. I go to youtube, and I have assurances that the uploader is authenticated as the name I see in the description.

If I see a repost of material claimed to be from a reliable source, I can go chase that down if I care (and I often do).

AA5B@lemmy.world on 12 Feb 2024 18:10 collapse

It’s not a challenge, because this is only valid for photos and videos distributed by the White House, and they wouldn’t distribute footage like that anyway.

The challenge is that it would have to leave out all the photos and videos taken by journalists and spectators. That’s where the possible baby slapping would come out, and we would still have no idea whether to trust it

throw4w4y5@sh.itjust.works on 12 Feb 2024 13:26 next collapse

If a cryptographic claim/validation is provided, then anyone refuting the claims can be seen to be a bad-faith actor. Voters are one dimension of that problem, but mainstream media being able to validate election videos is super important both domestically and internationally, as the global community needs to see efforts being undertaken to preserve free and fair elections. This is especially true given the consequences if America’s enemies are seen to have been able to steer the election.

Zink@programming.dev on 12 Feb 2024 21:20 collapse

Sure, the grandparents that get all their news via Facebook might see a fake Biden video and eat it up like all the other hearsay they internalize.

But, if they’re like my parents and have the local network news on half the damn time, at least the typical mainstream network news won’t be showing the forged videos. Maybe they’ll even report a fact check on it?!?

And yeah, many of them will just take it as evidence that the mainstream media is part of the conspiracy. That’s a given.

OsrsNeedsF2P@lemmy.ml on 12 Feb 2024 14:04 next collapse

Deepfakes could get better. And if they do, a lot more people will start to get fooled

ilinamorato@lemmy.world on 12 Feb 2024 20:32 collapse

I don’t think that’s what this is for. I think this is for reasonable people, as well as for other governments.

Besides, passwords can be phished or socially engineered, and some people use “abc123.” Does that mean we should get rid of password auth?

recapitated@lemmy.world on 12 Feb 2024 14:04 next collapse

I’ve always thought that bank statements should require cryptographic signatures for ledger balances. Same with individual financial transactions, especially customer payments.

Without this we’re pretty much at the mercy of trust with banks and payment card providers.

I imagine there’s a lot of integrity requirements for financial transactions on the back end, but the consumer has no positive proof except easily forged statements.

[deleted] on 12 Feb 2024 14:37 next collapse

.

phoenixz@lemmy.ca on 12 Feb 2024 15:21 collapse

Yeah but that would require banks to actually invest money to improve customer trust… Not something banks are very interested in, really. It’s easier and cheaper to just have the marketing department come up with some nonsense claim and advertise that instead.

Darkassassin07@lemmy.ca on 12 Feb 2024 14:38 next collapse

I’m more interested in how exactly you’d implement something like this.

It’s not like videos viewed on TikTok display a hash for the file you’re viewing, and users wouldn’t look at that data anyway, especially those that would be swayed by a deepfake…

aodhsishaj@lemmy.world on 12 Feb 2024 15:48 next collapse

Likely it would be a service provided by the White House press corps, and media outlets could then rehost the videos with the White House watermark

AA5B@lemmy.world on 12 Feb 2024 18:39 collapse

Digital signature. A watermark may be useful so that an unauthorized user can’t easily hide their source without noticeably defacing the photo, but it doesn’t prevent anyone from modifying it.

A digital signature is a somewhat similar idea, except that signature verification fails if there are any changes. This is tough to do with a photograph, where some applications may be blindly re-encoding it or modifying the resolution, so those workflows may need to be fixed.

You could argue this is a good use case for blockchain, certainly much better than those stupid monkey images. When Jon Stewart parodies a politician, there should be a verifiable chain of evidence from the White House release to the news bureau to his studio, before they alter the lighting to highlight orange skin tone for yucks.
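
To make the “fails if there are any changes” property concrete: the signature is computed over a cryptographic hash of the file, and even a one-character edit produces a completely different hash. A minimal sketch in Python (the byte strings are placeholders, not real footage):

```python
import hashlib

original = b"official White House video bytes"
tweaked = b"official White House video bytes!"  # a one-character edit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())

# The two digests differ completely, so a signature computed over the
# original file's digest can no longer validate the edited file.
print(hashlib.sha256(original).digest() == hashlib.sha256(tweaked).digest())  # False
```

This is also why blind re-encoding breaks validation: the re-encoded bytes hash to something entirely different even when the picture looks identical.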

Knock_Knock_Lemmy_In@lemmy.world on 12 Feb 2024 19:49 collapse

The question is how does the USER verify the authenticity. They just see a video, not a signature.

AA5B@lemmy.world on 13 Feb 2024 00:55 collapse

They shouldn’t have to actively verify that, but yeah, I don’t know if there is a relevant file format.

I once worked with signed XML, where the signature field is really no different than any other field, just binary data. That data used a private key to sign a checksum of the file. For tools that understand the format, you just verify the trust chain against cert-authority public keys using your local keystore. It just worked, with no action required of the user and no internet required:

  • if you edit the signature, the trust chain will fail validation
  • if you edit other data, the signed checksum would not match and validation would fail
  • if you edit the checksum, the key would no longer match and validation would fail

It’s actually been a lot of years, so I hope I’m remembering it accurately
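
The detached-signature idea described above can be sketched in a few lines. This uses HMAC with a made-up shared key as a stand-in for the asymmetric signature and certificate-authority trust chain a real signed-XML system would use, and the document is invented for illustration:

```python
import hashlib
import hmac

SIGNING_KEY = b"stand-in key"  # a real system uses an asymmetric private key
                               # plus a cert chain, not a shared secret

def sign_document(body: bytes) -> bytes:
    checksum = hashlib.sha256(body).digest()           # signed checksum of the file
    return hmac.new(SIGNING_KEY, checksum, hashlib.sha256).digest()

def validate(body: bytes, signature: bytes) -> bool:
    checksum = hashlib.sha256(body).digest()
    expected = hmac.new(SIGNING_KEY, checksum, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)    # constant-time comparison

doc = b"<press-release>original text</press-release>"
sig = sign_document(doc)

print(validate(doc, sig))                                       # True
print(validate(b"<press-release>edited</press-release>", sig))  # False: checksum no longer matches
print(validate(doc, b"\x00" * len(sig)))                        # False: signature was edited
```

The two failing cases mirror the bullets above: edit the data and the signed checksum no longer matches; edit the signature and validation fails outright.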

cley_faye@lemmy.world on 12 Feb 2024 17:03 collapse

Like you said, the issue is in verification by the end user. It is trivial to provide a digitally signed (and timestamped) file. It is also trivial to provide trusted tools to verify these files. It is immensely difficult to provide a solution users will care about, which is why, more often than not, the most people ask of companies in the data-authenticity business is “can we show a green check on screen? That would be perfect!”.

And we end up with something that nobody checks beyond the “it’s probably ok” phase. If the goal is to teach the masses about trusting their sources, either they have a miracle solution or it just won’t work. (And all that assumes people actually care about checking the authenticity of the stuff they see, which is not the norm as it is…)

nutsack@lemmy.world on 12 Feb 2024 15:13 next collapse

the technology to do this has existed for decades and it’s crazy to me that people aren’t doing it all the time yet

SpezBroughtMeHere@lemmy.world on 12 Feb 2024 16:13 next collapse

Tinfoil hat time. It’s probably because they need to start creating AI videos to show he’s ‘competent and coherent’, and they’ll say their tests prove that it’s a real video, not a fake. And since the government said it’s true, morons will believe it.

hayes_@sh.itjust.works on 12 Feb 2024 16:55 next collapse

I don’t have nearly enough tinfoil for this

[deleted] on 12 Feb 2024 18:24 next collapse

.

skulblaka@startrek.website on 12 Feb 2024 18:51 collapse

If Trump is any indication, no politician will ever need to be ‘competent and coherent’ ever again, constituents would vote in a literal corpse if it had a sign propped on it saying “gays bad”

SpezBroughtMeHere@lemmy.world on 12 Feb 2024 20:51 collapse

There is that. It’s kind of like the people that are proud to vote for Biden again too. The whole thing is a fucking joke.

dejected_warp_core@lemmy.world on 12 Feb 2024 17:26 next collapse

Wait. Did the White House just discover a legitimate use-case for NFTs?

realharo@lemm.ee on 12 Feb 2024 19:55 collapse

No, all you need for this is a digital signature, plus publishing the public key on an official government website. And maybe for platforms like YouTube and TikTok to integrate a signature check into their UI (e.g. flag any footage of candidates that wasn’t signed with the government’s private key as “unverified”).

How would an NFT help in any way?
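
For what it’s worth, the sign-with-private-key, verify-with-published-public-key flow really is that simple in outline. A toy sketch using textbook RSA with small Mersenne primes (unpadded and far too small to be secure; purely to illustrate the idea, since a real deployment would use a vetted library and a scheme like Ed25519):

```python
import hashlib

# Toy textbook RSA. The primes are tiny and there is no padding; this only
# illustrates the publish-the-public-key idea, not a real implementation.
p, q = 2**31 - 1, 2**61 - 1         # small Mersenne primes (illustration only)
n = p * q                           # modulus, published together with e
e = 65537                           # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, kept secret (Python 3.8+)

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)             # only the private-key holder can compute this

def verify(message: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h  # anyone with (n, e) can check

video = b"official press briefing footage"
sig = sign(video)
print(verify(video, sig))               # True
print(verify(video + b" edited", sig))  # False
```

A platform would fetch the public key from the official site once, then flag anything whose signature doesn’t check out, which is exactly the “unverified” label suggested above.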

dejected_warp_core@lemmy.world on 12 Feb 2024 20:20 collapse

I was being glib, but since NFTs are (typically) images whose ownership records are signed onto a blockchain, it meets the criteria of “cryptographically signed image” in a way.

In reality, you are correct.

AA5B@lemmy.world on 12 Feb 2024 18:06 collapse

So should Taylor Swift