A.I. Video Generators Are Now So Good You Can No Longer Trust Your Eyes (www.nytimes.com)
from silence7@slrpnk.net to technology@lemmy.world on 10 Oct 22:59
https://slrpnk.net/post/28684332

#technology


FriendOfDeSoto@startrek.website on 10 Oct 23:34 next collapse

Maybe the NYT’s headline writers’ eyes weren’t that great to begin with?

The tech could represent the end of visual fact — the idea that video could serve as an objective record of reality — as we know it.

We already declared that with the advent of photoshop. I don’t want to downplay the possibility of serious harm being a result of misinformation carried through this medium. People can be dumb. I do want to say the sky isn’t falling. As the slop tsunami hits us we are not required to stand still, throw our hands in the air, and take it. We will develop tools and sensibilities that will help us not to get duped by model mud. We will find ways and institutions to sieve for the nuggets of human content. Not all at once but we will get there.

This is fearmongering masquerading as balanced reporting. And it doesn’t even touch on the precarious financial situation the whole so-called AI bubble economy is in.

silence7@slrpnk.net on 11 Oct 00:12 next collapse

What you end up stuck doing is deciding to trust particular sources. This makes it a lot harder to establish a shared reality.

dontsayaword@piefed.social on 11 Oct 00:24 next collapse

To no longer be able to trust video evidence is a big deal. Sure the sky isn’t falling, but this is a massive step beyond what Photoshop enabled, and a major powerup for disinformation, which was already winning.

IllNess@infosec.pub on 11 Oct 01:08 next collapse

All those tech CEOs meeting up with Trump makes me think this is a major reason for pouring money into this technology. Any time Trump says “fake news”, he can just say it’s AI.

FriendOfDeSoto@startrek.website on 11 Oct 01:58 next collapse

You couldn’t “trust” video before Sora et al. We had all those sightings of aliens and flying saucers, which conveniently stopped having an impact when everybody started carrying cameras around.

There will be a need to verify authenticity and my prediction is that need will be met.

Venator@lemmy.nz on 11 Oct 14:41 collapse

To no longer be able to trust video evidence is a big deal.

except that you still can trust video evidence if you examine the video carefully … for now …

JustTesting@lemmy.hogru.ch on 11 Oct 21:03 collapse

But what if your phone comes with nice AI filters? The fake videos get more and more real and the real videos get more and more fake

tal@olio.cafe on 11 Oct 00:25 next collapse

The tech could represent the end of visual fact — the idea that video could serve as an objective record of reality — as we know it.

We already declared that with the advent of photoshop.

I think that this is “video” as in “moving images”. Photoshop isn’t a fantastic tool for fabricating video (though, given enough time and expense, I suppose that it’d be theoretically possible to do it, frame-by-frame). In the past, the limitations of software have made it much harder to doctor up — not impossible, as Hollywood creates imaginary worlds, but much harder, more expensive, and requiring more expertise — to falsify a video of someone than a single still image of them.

I don’t think that this is the “end of truth”. There was a world before photography and audio recordings. We had ways of dealing with that. Like, we’d have reputable organizations whose role it was to send someone to various events to attest to them, and place their reputation at stake. We can, if need be, return to that.

And it may very well be that we can create new forms of recording that are more-difficult to falsify. A while back, to help deal with widespread printing technology making counterfeiting easier, we rolled out holographic images, for example.

I can imagine an Internet-connected camera — as on a cell phone — that sends a hash of the image to a trusted server and obtains a timestamped, cryptographic signature. That doesn’t stop before-the-fact forgeries, but it does deal with things that are fabricated after-the-fact, stuff like this:

https://en.wikipedia.org/wiki/Tourist_guy
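
A minimal Python sketch of that idea, with a hypothetical attestation server. A real deployment would use a public-key signature (e.g. Ed25519), so anyone could verify without the server’s secret; here an HMAC keyed by the server stands in for the signature, just to show the flow:

```python
import hashlib
import hmac
import json
import time

# Hypothetical secret held only by the trusted timestamping server.
SERVER_SECRET = b"demo-secret-held-only-by-the-server"

def timestamp_attest(video_bytes: bytes) -> dict:
    """Server side: bind (hash, time) together so a clip can't be fabricated after the fact."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    record = {"sha256": digest, "timestamp": int(time.time())}
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(SERVER_SECRET, payload, hashlib.sha256).hexdigest()
    return record

def verify_attestation(video_bytes: bytes, record: dict) -> bool:
    """Check the video still matches the digest and the record itself is untampered."""
    if hashlib.sha256(video_bytes).hexdigest() != record["sha256"]:
        return False
    payload = json.dumps(
        {"sha256": record["sha256"], "timestamp": record["timestamp"]},
        sort_keys=True,
    ).encode()
    expected = hmac.new(SERVER_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])

clip = b"\x00\x01fake raw video frames"
receipt = timestamp_attest(clip)
print(verify_attestation(clip, receipt))            # True: untouched clip
print(verify_attestation(clip + b"edit", receipt))  # False: altered after attestation
```

As noted above, this only proves the footage existed unaltered at the attested time; it says nothing about whether the scene in front of the camera was staged.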

makyo@lemmy.world on 11 Oct 12:24 collapse

The real danger is the failing trust in traditional news sources and the attack on the truth from the right.

People have been believing what they want regardless of if they see it for a long time and AI will fuel that but is not the root of the problem.

fruitycoder@sh.itjust.works on 11 Oct 17:43 collapse

Traditional news sources became aggregators of actual news sources and open source Intel, and have made “embellishing” the norm. Stock/reused visuals, speculating minutes into events, etc etc

It is increasingly faked. The right just pretends that the lies that feel “good” are the truth.

snoons@lemmy.ca on 11 Oct 00:19 next collapse

🤓 Is this marketing from AI companies? 🦋

very_well_lost@lemmy.world on 11 Oct 03:24 collapse

Absolutely.

DeathByBigSad@sh.itjust.works on 11 Oct 03:45 next collapse

Videos now basically have the same weight as words, no longer a “smoking gun”. Videos basically become like eyewitness testimony. Well, slightly better, since they protect against misremembering, and against people with an inadequate lexicon who can’t clearly articulate what they saw. The process will become: get the witness to testify that they had possession of the camera, were recording at the time of the incident, and believe the video being presented in court is genuine and has not been altered; then it’s basically a video version of their eyewitness testimony. The credibility of the video is now tied to the witness/camera-person’s own credibility, and it should not be evaluated as independent evidence; the jury should treat the video as the witness’s own words, meaning they should factor in the possibility that the witness faked it.

A video you see on the internet is now just as good as a bunch of text: both equally unreliable.

We live in a post-truth world now.

Tehdastehdas@piefed.social on 11 Oct 06:55 next collapse

A hacker may have replaced the authentic video in the phone. The edit must be unnoticeable to the eyewitness who shot it.

silence7@slrpnk.net on 11 Oct 13:56 collapse

If there’s an edit that alters a detail that doesn’t matter to the witness, it probably isn’t important. And that kind of replacement is hard to do at scale without getting caught.

vacuumflower@lemmy.sdf.org on 11 Oct 11:08 next collapse

And that’s perfect, that’s the world that made all the due process and similar things evolve.

There’s never been such a thing as independent evidence. The medium has always mattered. And when people started believing this is no more true, we’ve almost gotten ourselves a new planetary fascist empire, I hope we’re still in time to stop that.

Aneb@lemmy.world on 11 Oct 20:29 next collapse

I’m just thinking: people thought Americans faked the moon landing; we’ve always had conspiracy theorists. AI just spins them faster and sloppier. I’d rather go back to humans lying to humans than a computer taught by humans to lie and advertise.

utopiah@lemmy.world on 12 Oct 08:21 collapse

Videos now basically have the same weight as words…

We live in a post-truth world now.

It’s interesting that you start with a bold statement that is IMHO correct (namely, that what was once taken as unquestionable truth now isn’t, and that this is not new, just yet another medium) but still conclude that it’s different.

Arguably we were already in a post-truth world, always have been; it now just extends to a medium we considered too costly to fake. The principle is still the same.

vacuumflower@lemmy.sdf.org on 13 Oct 04:44 collapse

In the Middle Ages people believed in creatures nobody had ever seen. And the legal systems and the concepts of knowledge were not very good.

And still the latter evolved to become better long before people started recording sounds to wax cylinders and shooting photos.

utopiah@lemmy.world on 13 Oct 07:49 collapse

In the Middle Ages people believed in creatures nobody had ever seen

FWIW, even centuries later, in Linnaeus’s time, people were actually looking for unicorns.

vacuumflower@lemmy.sdf.org on 13 Oct 08:40 collapse

Some people are still looking for yetis and aliens and mountain lake dragons.

Crashumbc@lemmy.world on 11 Oct 04:46 next collapse

Meh we’re not there yet. But the day is coming.

“The Running Man” predicted the future!

noretus@sopuli.xyz on 11 Oct 09:22 next collapse

I’m just holding out minor hope that people finally get with the program and realize the value of reputable news organizations and the plain old grapevine again. Leave the internet for nerds.

Cybersteel@lemmy.world on 11 Oct 12:47 next collapse

Nothing is true
Everything is permitted

lightsblinken@lemmy.world on 11 Oct 14:35 next collapse

videos need to be cryptographically signed and able to be verified. all news outlets should do this.

panda_abyss@lemmy.ca on 11 Oct 21:14 next collapse

That’s not really feasible without phones doing this automatically.

Even then didn’t the first Trump admin already argue iPhone video can’t be trusted because it’s modified with AI filters?

lightsblinken@lemmy.world on 12 Oct 07:43 collapse

… so make the phones do it?

i mean, it’s not rocket surgery.

TheBlackLounge@lemmy.zip on 12 Oct 22:49 collapse

Sign every video automatically? Sounds like chatcontrol all over.

Also, I could just generate a video on my computer and film it with my phone. Now it’s signed, even has phone artifacts for added realism.

lightsblinken@lemmy.world on 13 Oct 04:17 collapse

i think the point is to be able to say “this video was released by X, and X signed it, so they must have released it, and you can validate that yourself”. it means if you see a logo that shows CNN, and it’s signed by CNN, then you know for sure that CNN released it. As a news organisation they should have their own due diligence about sources etc, but they can at least be held to account at that point. versus a random AI-generated video with a fake logo and fake attribution going viral and not being able to be discredited in time before it becomes truth.

TheBlackLounge@lemmy.zip on 13 Oct 09:04 collapse

then you know for sure that CNN released it.

Why not link to the original CNN source then, if you want to be trusted? You’d have to do that anyways if you want to use the CNN footage in your own video.

I don’t think people who care about the validity of a news video will be helped much with this, and people who don’t care about the truth can easily ignore it too.

As a news organisation they should have their own due diligence about sources etc

But what if they can’t anymore? News orgs don’t only show video that they recorded. They have videos from freelance reporters, people who were at an event, government orgs, other news orgs in other countries…

lightsblinken@lemmy.world on 13 Oct 11:43 collapse

sure, totally ok to incorporate those video items & publish your (signed) story. i think we’ve seen pretty clearly that people want to publish and be recognised for their publications. building a web of trust has to start somewhere. currently we’re in the “it’s all very difficult, we can’t solve all the tricky things, so we’re not even trying” stage. hopefully we find a way to move forward, even if it’s not perfect.

TheBlackLounge@lemmy.zip on 13 Oct 16:26 collapse

How would that work? Why would you have any reason to trust me in this chain?

sip@programming.dev on 11 Oct 21:35 next collapse

agreed. having a cryptography mark on the file and relying on chain of trust is the way.

danhab99@programming.dev on 11 Oct 22:03 next collapse

NFTs tried to solve this problem already and it didn’t work. You can change the hash/sig of a video file by just changing one pixel in one frame, meaning you’ve just tricked the computer, not the people who use it.
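
To illustrate the pixel point: cryptographic hashes are deliberately sensitive to any change, so flipping a single bit yields a completely unrelated digest and the old signature no longer matches. A quick Python sketch, with a byte string standing in for real video frames:

```python
import hashlib

# Stand-in "video": a small buffer of frame bytes.
original = bytearray(b"frame0:" + bytes(32))
edited = bytearray(original)
edited[10] ^= 0x01  # flip one bit in one "pixel"

# The avalanche effect: the two digests share nothing recognizable.
h1 = hashlib.sha256(bytes(original)).hexdigest()
h2 = hashlib.sha256(bytes(edited)).hexdigest()
print(h1 == h2)  # False: the edited file no longer matches any signature over the original
```

Whether that counts as "tricking the computer" or as working as intended depends on the goal: it cannot recognize perceptually similar footage, but it detects alteration reliably.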

lightsblinken@lemmy.world on 12 Oct 07:38 next collapse

so try again? also: if a pixel changes then it isn’t the original source video, by definition. being able to determine that it has been altered is entirely the point.

TheBlackLounge@lemmy.zip on 12 Oct 22:46 collapse

The point was to sign AI footage so you know what’s fake. NFTs can be used as a decentralized repository of signatures. You could realistically require the companies to participate, but the idea doesn’t work because you can edit footage so it doesn’t match the signature. More robust signatures exist, but none is good enough, especially since the repo would have to be public.

Signing real footage makes even less sense. You’d have to trust everybody and their uncle’s signature.

dragonfly4933@lemmy.dbzer0.com on 13 Oct 21:19 collapse

The signing keys could be published to DNS, for better or worse.

TheBlackLounge@lemmy.zip on 14 Oct 13:53 collapse

What would that solve? NFTs don’t have to be power-hungry proof-of-work; that was just for the monkeys. The public-ledger part of this is not the problem.

dragonfly4933@lemmy.dbzer0.com on 14 Oct 20:31 collapse

How can an organization prove that a given key is theirs using NFTs?

TheBlackLounge@lemmy.zip on 14 Oct 21:36 collapse

A digital signature works with public/private keys and content hashes. This is a solved problem.

In fact, it’s part of secure DNS.

dragonfly4933@lemmy.dbzer0.com on 14 Oct 22:05 collapse

How does that answer my question, how do NFTs help an organization prove that a key belongs to them?

NFTs and blockchains are an entirely virtual construct that can’t affect the real world, or take trusted, non-key inputs from the real world. That’s not 100% true, but it is mostly true.

So really, you need a way to tie or bind a key to an identity or organization. You could perhaps sign some data, such as a domain name with a key on a chain, but that doesn’t prove anything. Anyone could sign anything with any key, so you need to approach the problem from the other direction.

You can install the key directly, or the hash of the key into DNS, verifiers can retrieve the key from DNS, then resolve it to the full key if necessary. You can then use the key to verify signatures of signed data.

Why DNS? Because that is currently the most standard way to identify organizations on the internet. Also, much of the security of the internet is directly bound to DNS. For example, getting certificates for websites often entails changing a DNS record at the request of an issuer to prove that you own the domain in question.

This is not an idea I invented just now, there are multiple DNS record types that have been defined for literally decades at this point which allow an organization to publish keys to DNS. Among the first is this: www.rfc-editor.org/rfc/rfc2535#section-3 Not completely related, but it is a key of some kind published to DNS.

I don’t think NFTs provide any useful functionality in helping organizations prove that a key is theirs, at least nothing much better than a simpler solution which already exists.
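
A rough Python sketch of that binding, with a dict standing in for the DNS zone (the `_videokey` record name and the key bytes are made up for illustration; a real verifier would fetch a TXT/DNSKEY-style record with a DNS library):

```python
import hashlib

# Hypothetical zone data: the domain publishes a fingerprint of its signing key.
FAKE_DNS_ZONE = {
    "_videokey.example-news.org": "sha256:"
    + hashlib.sha256(b"EXAMPLE-PUBKEY-BYTES").hexdigest(),
}

def key_matches_domain(domain: str, candidate_pubkey: bytes) -> bool:
    """Bind a key to an organization: its hash must match the record under their domain."""
    record = FAKE_DNS_ZONE.get("_videokey." + domain)
    if record is None:
        return False
    fingerprint = "sha256:" + hashlib.sha256(candidate_pubkey).hexdigest()
    return fingerprint == record

print(key_matches_domain("example-news.org", b"EXAMPLE-PUBKEY-BYTES"))  # True
print(key_matches_domain("example-news.org", b"attacker-key"))          # False
```

The trust then rests on control of the DNS zone (ideally DNSSEC-signed), which is the same root of trust the rest of the internet already leans on.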

TheBlackLounge@lemmy.zip on 15 Oct 21:00 collapse

It’s kind of beside the point. Yes, they don’t add anything unique, and yes, it was most likely because of hype; NFTs are just what they used to store the signatures on. But the core principle is flawed no matter what you put it on.

Sorry I thought you suggested DNS to solve the core issues.

Kissaki@feddit.org on 12 Oct 22:31 collapse

By changing one pixel it’s no longer signed by the original author. What are you trying to say?

danhab99@programming.dev on 12 Oct 23:51 collapse

Exactly that, if I change a pixel then the cryptographic signature breaks

captain_aggravated@sh.itjust.works on 12 Oct 22:34 next collapse

Cryptographic signatures are something we should have been normalizing for a while now.

I remember during the LTT Linux challenge, at one point they were assigned the task “sign a PDF.” Linus interpreted this as PGP-signing the document, which apparently Okular can do, but he didn’t have any credentials set up. Luke used some online tool to photoshop an image of his handwriting into the document.

[deleted] on 13 Oct 03:04 collapse

.

Dasus@lemmy.world on 11 Oct 17:44 next collapse

Is this going to kill Onlyfans?

Or has the market decided that OnlyFans is about personal creators, and thus more meaningful than porn?

But when short AI videos become so good you can’t tell if you’re being catfished, will it feel the same?

bhamlin@lemmy.world on 11 Oct 18:06 collapse

To be fair, if anyone was going to kill Onlyfans, it was Onlyfans. They haven’t yet managed it.

leastaction@lemmy.ca on 11 Oct 18:21 next collapse

Your eyes are fine. It’s AI that can’t be trusted.

DeathByBigSad@sh.itjust.works on 12 Oct 21:55 collapse

But what if your eyes were secretly replaced by an AI eye that replaces everything you see with real-time generated imagery? 🤔

Darkenfolk@sh.itjust.works on 12 Oct 22:08 next collapse

Then it’s the guy who replaced your eyes that’s untrustworthy.

WALLACE@feddit.uk on 12 Oct 22:13 collapse

But what if he was tricked into replacing them by his own AI eyes. It’s untrustworthy eyes all the way down

HubertManne@piefed.social on 12 Oct 23:02 collapse

until you get to the Techno-Necromancers of Alpha Centauri!

demonsword@lemmy.world on 13 Oct 11:47 collapse

that’s some serious Laughing Man stuff right there

vane@lemmy.world on 11 Oct 23:03 next collapse

Someone doesn’t know what mockumentary or docufiction is. There were lots of fake videos way before AI. This is just amplification because of better accessibility.

WALLACE@feddit.uk on 12 Oct 22:18 next collapse

It’s the accessibility and scale that’s scary now. Anyone will be able to make convincing fakes of anything from their couch during an ad break on TV. The internet will be essentially useless for getting any useful information because the garbage will outnumber everything else by a million to one.

vane@lemmy.world on 13 Oct 02:08 collapse

The commercial web has over the years been slowly transforming from education and news into a video entertainment platform; this is just the next step. I hope AI slop will accelerate the transition towards decentralized, federated trust-ring networks. I also hope it will destroy, or at least largely damage, the current internet/cloud monopolies: google / meta / amazon / microsoft. Maybe public knowledge will be harmed, but people will always find a way to pass information without slop.

captain_aggravated@sh.itjust.works on 12 Oct 22:30 collapse

You see the same panic about 3D-printed guns. It’s not that difficult to make a gun at home; 3D printers just make it somewhat easier.

hotdogcharmer@lemmy.zip on 14 Oct 21:49 collapse

Internet’s dead folks, time to get back to the real world! 🥳