TheGrandNagus@lemmy.world
on 07 Feb 2024 06:51
nextcollapse
I support automatically face swapping everyone’s faces in photos with the gingerbread man from Shrek.
After all, there’s no such thing as a real photo anyway, so Samsung editing them in unusual ways is completely reasonable.
Kolanaki@yiffit.net
on 07 Feb 2024 06:54
nextcollapse
What’s a Polaroid then?
Vorticity@lemmy.world
on 07 Feb 2024 06:59
nextcollapse
A Polaroid is the best representation that can be made of a scene on Polaroid photo film. The lens, the paper, and other factors will always make the representation, to a degree, not real. That was the Samsung exec’s point. It’s a little disingenuous, though. The discussion shouldn’t be about “real” vs “fake”; it should be about “faithful” vs “misleading”.
Excellent to-the-point comment!
Vorticity@lemmy.world
on 07 Feb 2024 08:27
collapse
Thanks!
Deceptichum@kbin.social
on 07 Feb 2024 08:14
collapse
So what’s an eyeball then?
Our perception of reality isn’t real, it’s just light hitting a lens and being decoded by an organic computer.
Or to paraphrase the philosopher Jaden Smith: How Can Cameras Be Real If Our Eyes Aren't Real
Vorticity@lemmy.world
on 07 Feb 2024 08:22
collapse
Add to that the fact that our brains run software that doesn’t even try to faithfully store images and you have part of the reason that photos are, currently, more reliable than eye witnesses. That may be changing though.
Our brains are natural intelligence and perform natural learning. The results are even less reliable, predictable, and repeatable than the results provided by artificial intelligence.
General_Effort@lemmy.world
on 07 Feb 2024 19:56
collapse
The Samsung Boss said:
As soon as you have sensors to capture something, you reproduce [what you’re seeing], and it doesn’t mean anything. There is no real picture.
A Polaroid photograph is a real picture, in the sense that it exists as a single, definitive, physical thing. Whether what it shows is real is a different question, though.
Vorticity@lemmy.world
on 07 Feb 2024 06:55
nextcollapse
The statement that “There is no such thing as a real picture” isn’t wrong. It kind of misses the point, though. It’s true that, even when a photo attempts to make the most faithful representation possible, it can only approximate what it sees. The sensors used all have flaws and idiosyncrasies, and the software that processes the images makes different decisions in different situations to give a good image. Trying to draw a line between a “real” picture and a “fake” picture is like trying to define where the beach ends and where the ocean begins. The line can be drawn in many places for many reasons.
That said, the editing that the S24 is going to allow may be going a little far in the direction of “fake” from the sounds of things. I’m not sure if that is good or bad, but it does scare me that photos can’t really be relied upon to give an accurate representation of a scene anymore. Everyone having access to this kind of AI is going to make it tremendously difficult to distinguish between realistic and misleading images.
thehatfox@lemmy.world
on 07 Feb 2024 07:33
nextcollapse
Capturing any data or making any measurement is an approximation, because every type of sensor has a limited degree of accuracy, with some more sensitive than others.
I think there is a clear enough line, however, between making an approximated record of a value and making a guess at a value, the latter being essentially how these “AI” camera systems work.
Vorticity@lemmy.world
on 07 Feb 2024 08:05
collapse
I disagree. It’s not that easy to draw a line.
First, current cameras that we consider to not use AI still manipulate images beyond just attempting to approximate the scene. They may not allow easy face swapping but they still don’t faithfully represent the scene much of the time.
Also, I don’t even think it is clear where we can draw a line between “normal” algorithms and “AI” algorithms. What level of machine learning is required before we consider an algorithm to be AI?
Simple non-AI algorithms and generative AI are on a spectrum of complexity. They aren’t discrete from one another such that they can be easily categorized.
ArbiterXero@lemmy.world
on 07 Feb 2024 09:07
collapse
I think that’s disingenuous…
There’s a clear difference between a processing mistake and an intentional addition. That’s a fairly clear line.
Grain on a photo is not the same as making you look like a human head on a shark’s body.
Yes, no photo is 100% accurate, just as no statement will ever capture an incident perfectly. That doesn’t mean there’s no such thing as lying.
There is definitely a line.
Is the tech trying to make the photo more accurate, or change the narrative?
Sure, there’s some tech that’s awkwardly in the middle, like skin smoothing, but we don’t NEED that, and it’s not directly changing the narrative of the story, unless you’re selling acne medication.
Vorticity@lemmy.world
on 07 Feb 2024 15:43
collapse
I still disagree that there is a clear line. Yes, it is obvious that photo grain is different from making you look like a human head on a shark’s body. The problem is somewhere in the middle. Determining where that line is drawn is going to be difficult and is only going to become more difficult as this technology advances.
ArbiterXero@lemmy.world
on 07 Feb 2024 16:19
collapse
I think the line (while the details may certainly be difficult) is along “are you making the existing image/story clearer, or are you changing the narrative of the media?”
When the story you get from the image changes, then you’ve crossed the line.
Vorticity@lemmy.world
on 07 Feb 2024 16:32
collapse
I generally agree with you but that is still a fuzzy line to draw that is likely very dependent on circumstances. The devil is in the details.
ArbiterXero@lemmy.world
on 07 Feb 2024 16:37
collapse
I can concede to that… there will be some grey area, but the idea that “there is no true photo” or “there is no truth” feels wrong.
fidodo@lemmy.world
on 07 Feb 2024 07:49
nextcollapse
At least sensors will be relatively consistently flawed, while AI can just completely make details up.
Vorticity@lemmy.world
on 07 Feb 2024 08:10
collapse
It’s not just the sensors though. The software used to convert what the sensors saw into an image makes decisions. Those decisions are sometimes simple and sometimes complex. Sometimes they are the result of machine learning and might already be considered to be AI. This is just another step in the direction of less faithfulness in photos.
fine_sandy_bottom@discuss.tchncs.de
on 07 Feb 2024 07:53
collapse
Whether or not this feature is on Samsung phones it will still be accessible. It already is really. You can’t hold back the tide.
Vorticity@lemmy.world
on 07 Feb 2024 08:13
collapse
Yeah, you’re right. It still scares me somewhat, though. What happens when courts fall behind and continue to rely on photo evidence after photo evidence becomes easy for anyone to fake? What happens when the courts finally do realize that photos are unreliable?
I don’t think this change can or should be stopped. It is just worrisome, and thought should be put into how to mitigate the problems it will inevitably cause.
fine_sandy_bottom@discuss.tchncs.de
on 07 Feb 2024 08:53
collapse
That’s not how courts work. It’s not like there’s a list of acceptable evidence that gets updated once in a while.
Prosecutors and defenders will present evidence to jurors in their contemporary context.
Basically, we all need to acknowledge that images do not convey “truth”, and really they never did.
circuitfarmer@lemmy.world
on 07 Feb 2024 07:26
nextcollapse
Yeah, this is a great example of a true statement that just serves to muddy the water of the actual argument.
A better way to think about it is: an AI-dependent photo is less representative of whatever is in the photo versus a regular photo.
It's not even a true statement. "A real picture of a pipe" has never once in history been interpreted as "my golly - there's an actual goddamn pipe trapped inside this piece of paper". We know it's a freaking representation.
The "real" part refers to how it's a product of mechanically capturing the light that was reflected off an actual pipe at some moment in time. You could have a real picture with adjusted colours, at which point it's real but manipulated. Of course with digital photography it's more complicated as the camera will try to figure out what the colours should be, but it doesn't mean the notion of a real picture is suddenly ready for the scrapyard. Monet's painting is still a painting.
Everyone knows exactly what you mean when you say a real picture. Imposing a 3D model over the moon to make it more detailed, for example, constitutes "not a real picture". Pretending this is some impossible philosophical dilemma is just a corporate exercise in doublespeak.
9point6@lemmy.world
on 07 Feb 2024 10:44
nextcollapse
To play devil’s advocate, even traditional photography involves a lot of subjective/artistic decisions before you get a photo. The type of film used can massively affect the image reproduced, and then once the photos are being developed, there’s a load of choices made there which determine what the photo looks like.
There’s obviously a line where a photo definitely becomes “edited”, but people often believe that an objective photo is something that exists, and I don’t think that’s ever been the case.
Of course - there's a huge difference between a "real photo" and "objective reality", and there always has been. In the same way an impressionistic painting might capture reality accurately while not really looking like it that much.
azertyfun@sh.itjust.works
on 08 Feb 2024 00:08
collapse
It’s actually way worse. Modern smartphones do a LOT of postprocessing that is basically just AI, and have been for years. Noise reduction, upscaling, auto-HDR and bokeh are all achieved through “AI” and are way further removed from reality than a film print or a DSLR picture. Smartphone sensors aren’t nearly as good as a decent DSLR, they just make up for it with compute power and extremely advanced processing pipelines so we can’t tell the difference at a glance.
Zoom into even a simple picture of a landscape, and you can obviously tell whether it was shot on a smartphone. HDR artifacting and weird hallucinogenic blobs in low-light details are telltale signs, and not coincidentally rather similar to the telltale signs of AI-generated photorealistic pictures.
Anyway it’s still important to draw a line in the sand for what constitutes a “doctored” picture, but the line isn’t so obviously placed once you realize just how wildly different a “no filter” smartphone pic is from the raw image straight from the sensor.
<img alt="" src="https://feddit.de/pictrs/image/14f109c0-14e0-4298-a234-ea9fcba134e4.jpeg">
This is fuckin’ brilliant. A picture worth a thousand mutilated words.
Drewelite@lemmynsfw.com
on 07 Feb 2024 16:37
collapse
An AI edited photo might not necessarily be less representative of whatever is in the photo. Imagine an image taken in a very dark room, then an AI enhancement makes it look like the lights are on. You can actually get a much better idea of what’s in the room, but a less good idea of what the lighting was like. So it comes down to opinion, which one is more representative of reality? Because no photo since the beginning of time has been completely representative of what humans actually see with their eyes. It’s always been a trade-off of: what do we change to give humans the image they want with the technology we have.
circuitfarmer@lemmy.world
on 07 Feb 2024 20:40
collapse
…but the lights weren’t on.
Drewelite@lemmynsfw.com
on 07 Feb 2024 21:09
collapse
Do you think night vision produces a ‘fake’ image? Maybe you do, but my point is, that’s your opinion. You might think that accurate representation of the light level is more important than accurate representation of the objects in front of the lens. But someone else might not. Same way a colorized photo can give a more accurate representation of reality with false information.
circuitfarmer@lemmy.world
on 07 Feb 2024 21:22
collapse
I mean, you’re debating the meaning of “accurate representation”. We may as well debate the meaning of perception, too, but I don’t think it changes the point of my original argument.
Drewelite@lemmynsfw.com
on 07 Feb 2024 23:45
collapse
I think it does, because photos have always been an inaccurate representation of what a person sees. You zoom in on my face in a picture and you see a bunch of pixels. That’s not what my face looks like, I’m not made of tiny boxes. If I AI upscale it, it looks a lot closer. My argument here is simply: the statement that an AI dependent image is inherently less representative of reality, is not necessarily true.
pinkdrunkenelephants@lemmy.cafe
on 08 Feb 2024 01:11
collapse
The fact that it’s AI generated and not directly light-into-image makes it untrustworthy.
Like actual film photos are a lot harder to fake and therefore are more trustworthy.
In principle, that image AI software can be programmed to generate whatever it wants. It can even censor your own film footage.
Like if a revolution happens in this country next year, you bet your ass the police and military will exact atrocities on the American people to stop it, and the corporations they’re in bed with can reprogram everyone’s phones to censor out the footage of it, so genocide cannot be proven.
Watch and see it happen.
WallEx@feddit.de
on 07 Feb 2024 07:43
nextcollapse
Altering the photo even further only makes it worse though …
massive_bereavement@kbin.social
on 07 Feb 2024 08:00
collapse
I'm altering the photo, pray I don't alter it any further.
Deceptichum@kbin.social
on 07 Feb 2024 08:12
nextcollapse
This photo is getting worse all the time.
thanks_shakey_snake@lemmy.ca
on 07 Feb 2024 08:17
nextcollapse
“There was a very nice video by Marques Brownlee last year on the moon picture,” Chomet told us. “Everyone was like, ‘Is it fake? Is it not fake?’ There was a debate around what constitutes a real picture. And actually, there is no such thing as a real picture. As soon as you have sensors to capture something, you reproduce [what you’re seeing], and it doesn’t mean anything. There is no real picture. You can try to define a real picture by saying, ‘I took that picture’, but if you used AI to optimize the zoom, the autofocus, the scene – is it real? Or is it all filters? There is no real picture, full stop.”
If your epistemological resolution for determining the fakeness of the moon landing photos is to just assert that all photos are in a sense fake so case closed, then I feel like you aren’t even wrong about the right thing.
astrsk@kbin.social
on 07 Feb 2024 09:07
nextcollapse
The moon photos they’re talking about are specifically the AI-enhanced zoom moon photos of previous Samsung models, which caught controversy because taking a picture of a marginally round object against a black background with their zoom enhancement would produce a moon photo, even if it was in someone’s dark basement and the object was a dimly lit bottle cap.
Exec@pawb.social
on 07 Feb 2024 09:41
nextcollapse
Come to think of it, if a new crater ever gets created on the moon, one way or the other, these AI models won’t ever be updated and will show the “fake old version” forever.
if you used AI to optimize the zoom, the autofocus, the scene – is it real?
To me, using AI to optimize zoom, focus, aperture (or fake aperture effects), framing etc. That's composition. The picture isn't fake, but software helped compose the real image in a better way.
When you change the image (remove objects, distort parts of the image not the whole, airbrush etc) then the image isn't based on reality any more.
That's where I see the line drawn, at least. Yes, drawing a line also makes the image not real any more.
Beyond this, we get to philosophy. In which case, I'll refer to my other comment on another post about this story. Our brain transforms the image our eyes receive (presumably to be able to relay it around the brain efficiently, who knows?). So we can take it to Matrix philosophy. When we don't know if what we're seeing is real, what is real?
Drewelite@lemmynsfw.com
on 07 Feb 2024 16:29
collapse
I think the reality is that there is no reality, there is only perception. Composition does add or remove things from the photo. Light, both the amount and its wavelength, is a thing. Whether the lens picks up the pores on a person’s face is a thing. Whether the background seems close or far is a thing. But I agree that camera makers would toe any philosophical line to help them drive profits.
arken@lemmy.world
on 07 Feb 2024 08:22
nextcollapse
When corporations dabble in philosophy, you know they’re trying to muddy the waters and skirt an ethical issue. It’s not a genuine inquiry going on here; it’s a “whatever argument serves the bottom line” situation.
I guess there’s no such thing as intellectual property either, when you really think about it. Hence nothing wrong with me making and selling pirated samsung phones.
eatthecake@lemmy.world
on 07 Feb 2024 10:23
collapse
Can I copyright my face and get an AI to trawl the internet for any pictures of me and demand people take them down or pay me?
Viking_Hippie@lemmy.world
on 07 Feb 2024 13:01
collapse
If you’re prominent enough and can afford to pay for the lawyers, you can assert ownership of your own likeness, yes.
That either is necessary is a travesty, though.
gapbetweenus@feddit.de
on 07 Feb 2024 09:48
nextcollapse
I don’t really understand what is unethical about AI photo editing to begin with. Photo editing existed before AI, and you could make anything you wanted with Photoshop.
datavoid@lemmy.ml
on 07 Feb 2024 15:38
nextcollapse
Besides potential copyright infringement, there isn’t really any issue with it I’d say. This is still an incredibly stupid take from Samsung however.
pinkdrunkenelephants@lemmy.cafe
on 08 Feb 2024 01:14
collapse
Because it’s not automatic and is outside of the user’s control, to start.
gapbetweenus@feddit.de
on 08 Feb 2024 09:04
collapse
It’s not?
zout@kbin.social
on 07 Feb 2024 09:51
nextcollapse
At this rate, with this mentality, we are going to have photo "evidence" of bigfoot and UFOs before 2030.
Nudding@lemmy.world
on 07 Feb 2024 12:56
nextcollapse
I dunno if you’re unaware but UFOs (uaps) are definitely real and there’s tons of photo and video evidence of em.
I should have specified that I mean the X-Files type of UFOs, containing aliens.
They’re still unidentified lol. We have no idea what’s inside them.
key@lemmy.keychat.org
on 07 Feb 2024 14:24
collapse
There’s a lot of UFOs if you’re bad at birding.
rottingleaf@lemmy.zip
on 07 Feb 2024 13:01
nextcollapse
Early XXI century seems to model late XIX century quite well, if you think about it. A tremendous leap followed by gimping and all kinds of such stuff.
General_Effort@lemmy.world
on 07 Feb 2024 19:52
collapse
We’ve had photo (and film!) “evidence” of both for many decades. “It has to be real, because they did not have CGI back then” is something that people actually argue.
foggy@lemmy.world
on 07 Feb 2024 10:15
nextcollapse
We are truly in a post-truth era
Lie on your resume.
MaxHardwood@lemmy.ca
on 07 Feb 2024 11:36
nextcollapse
Photo manipulation has been a thing since photos have been a thing
Akrenion@programming.dev
on 07 Feb 2024 12:33
nextcollapse
While that is true, it has gotten incredibly easy to alter and spread such photos. We all love interesting ways to take photos with optical illusions and practical effects.
Just like using a gun. It has gotten very easy to inflict disproportional harm compared to just 50 or 100 years ago.
key@lemmy.keychat.org
on 07 Feb 2024 13:28
collapse
The only way to stop a bad guy with a camera is a good guy with a camera?
That’s true, but they didn’t used to sell you a camera claiming it would take a picture when in fact it just invented a picture it thought looked similar to the picture you were trying to take.
XeroxCool@lemmy.world
on 07 Feb 2024 15:47
collapse
“and that’s why cutting and pasting a picture of the moon in the backend of the processing done to your shitty attempt to take a shaky picture of the moon without appreciable amounts of optical zoom to share with absolutely no one that cares is totally fine in a Samsung. You could have been doing that with scissors or MS Paint your whole life”
Hands Monopoly money to the clerk at a Samsung store
“I’ll take the S24, money is a made up concept anyway”
Ha ha, very clever… money is just made up. But wait, so are borders, sovereignty, language, art, moral and commercial value, the law, logic, authority, human rights, culture, and government.
According to Jacques Lacan, experience itself is a fabrication; the worst thing that can happen to a person is to come into contact with the real.
HeavyRaptor@lemmy.zip
on 07 Feb 2024 11:57
nextcollapse
How can the picture be real if your eyes aren’t real?
themeatbridge@lemmy.world
on 07 Feb 2024 13:05
nextcollapse
The universe is a hologram projected from 19 dimensional space to look like it exists in 4 dimensional spacetime.
macarthur_park@lemmy.world
on 07 Feb 2024 23:19
collapse
Settle down, Jayden
Pulptastic@midwest.social
on 07 Feb 2024 13:47
nextcollapse
General_Effort@lemmy.world
on 07 Feb 2024 16:32
nextcollapse
There are certainly purposes for which one wants as much of the raw sensor readings as possible. Other than science, evidence for legal proceedings is the only thing that comes to mind, though.
I’m more disturbed by the naive views so many people have of photographic evidence. Can you think of any historical photograph that proves anything?
Really famous in the US: The marines raising the flag over Iwo Jima. It was staged for the cameras, of course. What does it prove?
A more momentous occasion is illustrated by a photograph of Red Army soldiers raising the soviet flag over the Reichstag. The rubble of Berlin in the background gives it more evidentiary value, but it is manipulated. It was not only staged but actually doctored. Smoke was added in the background and an extra watch on a soldier’s arm (evidence of robbery) removed.
Closer to now: As you are aware, anti-American operatives are trying to destroy the constitutional order of the republic. After the last election, they claimed to have video evidence of fraud during ballot counting. On one short snippet of video, one sees a woman talking to some people and then, after they leave, pull a box out from under a table. It’s quite inconspicuous, but these bad actors invented a story around this video snippet, in which a “suitcase” full of fraudulent ballots is taken out of hiding after observers leave.
As psychologists know, people do not think in strictly rational terms. We do not take in facts and draw logical conclusions. Professional manipulators, such as advertisers, know that we tend to think in “narratives”. If a story is compelling, we like to twist neutral snippets of fact into evidence. We see what we believe.
PipedLinkBot@feddit.rocks
on 07 Feb 2024 16:33
nextcollapse
SocialMediaRefugee@lemmy.world
on 07 Feb 2024 20:11
nextcollapse
The situations that drive me nuts are the conspiracy idiots who zoom in super hard on some heavily compressed image they pulled off of the web. They then proceed to claim that compression artifacts, optical flares, noise, etc are evidence of whatever crap they are pushing.
Taking things out of context is another issue. It has become painfully common online. I would see it all the time from people pushing the “all police are bad!” narrative. They will deliberately edit out the violence that triggered the arrest, then make it look like the arrest was unwarranted and overly physical. People will do this with dashcam videos and show road rage, but edit out the part where they triggered it with their own aggression.
On the one hand, yeah. Actions have consequences. On the other hand, no amount of aggressive driving “deserves” to be responded to in the way some do, and no amount of someone doing something dangerous or illegal justifies police using unnecessary force (or else we wouldn’t call it that). Once they’ve been subdued, it should be done.
Kbobabob@lemmy.world
on 07 Feb 2024 22:18
nextcollapse
Can you think of any historical photograph that proves anything?
Any photos from war zones.
Tiananmen Square images.
Moon landing.
To name a few that do, IMO.
General_Effort@lemmy.world
on 08 Feb 2024 00:33
collapse
Any photos from war zones.
I gave 2 photos from war zones as examples. What do they prove and how?
Mrkawfee@feddit.uk
on 08 Feb 2024 09:37
nextcollapse
The one in Berlin illustrates the inevitable triumph of Communism over capitalist fascism. Obviously.
Simulation6@sopuli.xyz
on 08 Feb 2024 10:15
collapse
The fact the scene was reenacted for the photo does not change all that much. There was a time when most people thought that photos never lie, but that hasn’t been the case for a long time.
General_Effort@lemmy.world
on 08 Feb 2024 12:18
collapse
The fact the scene was reenacted for the photo does not change all that much.
How do you know that they were reenacted? There are AIs that can produce deepfake texts.
pinkdrunkenelephants@lemmy.cafe
on 08 Feb 2024 01:04
collapse
The Moon landings? Hello?
General_Effort@lemmy.world
on 08 Feb 2024 12:04
collapse
Seriously? At least clarify that you mean film and not photographs. The effects of lower gravity and no atmosphere take at least some effort to get right. Photos can be staged anywhere.
Have you ever looked at the arguments of moon hoaxers? A lot of them would be good questions, if they were questions. Why can’t you see the stars? Why are there multiple shadows if the sun is supposed to be the only light source? The boot prints are so perfect, as prints only are in wet sand. How can the flag wave without an atmosphere?
You can easily find answers (and more questions). How do you “prove” these answers? How do you go from that to actually turning photos, and even film, into proof positive of the moon landing?
pinkdrunkenelephants@lemmy.cafe
on 08 Feb 2024 14:12
nextcollapse
Oh my god, this IS just a launchpad for lame conspiracybros 🤦🤦🤦
No, the Moon landings were real and the images/clips you see of it online are real too. If it was faked, the USSR would have screamed to the high heavens about it. Just because you personally haven’t experienced something does not mean it can’t actually happen.
Grow the fuck up and accept reality doesn’t conform to your wishful thinking.
Oh, and the Earth is round, too, and NASA livestreams and photos of the very clearly round Earth are valid too.
Just because the majority of people believe something without thinking about it doesn’t mean it isn’t true or they’re not critical thinkers. People know what’s worthwhile to question and what’s not, and that’s a vital aspect of critical thinking you did not consider because you don’t understand or care about what it is, it’s just an emotional cudgel for you to accuse regular people of bullshit to brainwash and abuse them.
Get off of my feed. Go outside.
Inb4 “Well that’s not his point” – yes it is; he’s just trying to pretend to be reasonable to get his foot in the door. Salesmen do this shit all the time; it’s a common tactic and it’s why we know not to listen to people like him.
General_Effort@lemmy.world
on 08 Feb 2024 14:56
nextcollapse
Well, that’s one straightforward, though rather disturbing, demonstration of what I was talking about. You perceived some snippets of fact and constructed a story around it.
There’s no rational way you can deduce any parts of that story from my posts here. There is nothing that suggests any hidden motives on my part. Occam’s razor would say that you should simply accept my stated motive, as it is a sufficient explanation.
A more linear rational view would find problems with your story. You brought up the moon landing and I responded. This contradicts the idea that I have any particular interest in moon hoax ideas.
A taxonomy of fallacies might identify this as an ad hominem attack or character assassination. You made up lies about me, instead of replying to my arguments. I note that you do not use photos or film to argue for the reality of the moon landings, but refer to the reaction of the Soviet Union. That is something worth thinking about some more. While it is still a narrative, we do glimpse a rational argument.
So, thanks for the example. The way you just conjure a paranoid fantasy tale, instead of engaging rationally, is a very topical demonstration of conspiracy thinking.
prole@sh.itjust.works
on 08 Feb 2024 15:59
collapse
Just stop. Stop trying to repackage stupid, boring conspiracy bullshit by couching it in faux-philosophy and five dollar words.
Nobody with more than an 8th grade education is falling for the “This sounds smart and I see big words, therefore the person who wrote it must be smarter than me and know what they’re talking about” bit.
General_Effort@lemmy.world
on 08 Feb 2024 20:17
collapse
Can I ask why you feel the need to insult me?
prole@sh.itjust.works
on 08 Feb 2024 20:19
collapse
I didn’t insult you
General_Effort@lemmy.world
on 08 Feb 2024 20:21
collapse
Ok, if you feel that way. Can you express why you felt that the content of your replies was reasonable?
prole@sh.itjust.works
on 08 Feb 2024 15:54
collapse
Inb4 “Well that’s not his point” – yes it is; he’s just trying to pretend to be reasonable to get his foot in the door.
Maybe it’s just this particular instance, but lately my Lemmy feed has been full of comments doing exactly this. To the point where it’s ruining my experience. Which seems to be the goal.
prole@sh.itjust.works
on 08 Feb 2024 15:45
collapse
Really didn’t take long for this site (maybe just this instance?) to turn into another reddit. A cesspool of astroturfing and proud ignorance. Either a complete inability to think critically, or just brain rot, in this case.
It seems like there are just certain people who are dead set on ruining any space on the internet that still exists for people without brain rot. Like they know they’re lowering the overall quality of discussion, and instead of doing better, they lean into it. If they can’t enjoy themselves, then nobody can.
Just one more extension of their childish, petulant demeanor. It’s exactly why over 70 million people voted for a traitorous, demented man child; they see themselves in him.
Siegfried@lemmy.world
on 07 Feb 2024 21:32
nextcollapse
Im 14 and this is deep
OutrageousUmpire@lemmy.world
on 07 Feb 2024 23:29
nextcollapse
edits made using this generative AI tech will result in a watermark and metadata changes
The metadata is easy to erase. It’s only a matter of time until we start seeing some open source projects come out that can remove the watermarking the AI players are starting to try.
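To illustrate how little protection metadata alone offers, here is a minimal sketch (the filenames are made up, and this is nothing to do with Samsung’s actual watermark format): re-saving only the pixel data with Pillow silently drops any EXIF/XMP provenance tags.

```python
# A hedged sketch: stripping metadata from a JPEG by copying only the pixels.
# Any provenance stored purely in EXIF/XMP tags does not survive the copy.
from PIL import Image

with Image.open("edited_photo.jpg") as img:
    pixels_only = Image.new(img.mode, img.size)   # fresh image, no metadata attached
    pixels_only.putdata(list(img.getdata()))      # copy pixel values only
    pixels_only.save("no_metadata.jpg")           # EXIF/XMP (and any AI-edit tag stored there) is gone
```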
pinkdrunkenelephants@lemmy.cafe
on 08 Feb 2024 01:12
nextcollapse
So, I’m just going to not buy their garbage.
taanegl@lemmy.world
on 08 Feb 2024 01:38
nextcollapse
There’s no such thing as a real CEO… they’re just target practice what got up and walked.
BeatTakeshi@lemmy.world
on 08 Feb 2024 08:58
nextcollapse
They do have a point when they say AI is here to stay, and what they propose (a ‘watermark’ in the metadata for AI-edited content) is at least a way forward. There should also be some electronic seal/signature for this to be effective, though (md5?); metadata so far is easy to tweak.
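A minimal sketch of the kind of “electronic seal” this comment asks for, assuming nothing about Samsung’s implementation: hash the image bytes themselves (SHA-256 rather than MD5, which is no longer considered collision-resistant), so any later pixel edit changes the fingerprint. The filenames are hypothetical.

```python
# Hedged sketch: fingerprint the image bytes, not just its metadata.
import hashlib

with open("photo.jpg", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()  # changes if any byte of the file changes

with open("photo.jpg.sha256", "w") as f:
    f.write(digest)                                # verify later by re-hashing and comparing
```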
General_Effort@lemmy.world
on 08 Feb 2024 12:47
nextcollapse
FWIW, The Samsung Boss said:
As soon as you have sensors to capture something, you reproduce [what you’re seeing], and it doesn’t mean anything. There is no real picture.
I understand this as talking about a definitive original, as you get with trad analog photography. With a photographic film, you have a thin coat (a film) of a light sensitive substance on top of a strip of plastic. Taking an analog picture means exposing this substance to light. The film is developed, meaning that the substance is chemically altered to no longer be light sensitive. This gives you a physical object that is, by definition, the original. It is the real picture. Anything that happens afterward is manipulation.
An electronic sensor gives you numbers; 1s and 0s that can be copied at will. These numbers are used to control little lights in a display.
As far as I understand him, he is not being philosophical but literal. There is no (physically) real picture, just data.
prole@sh.itjust.works
on 08 Feb 2024 15:40
collapse
I would consider the negatives to be “the original” over the first photo that was printed using them.
General_Effort@lemmy.world
on 08 Feb 2024 17:34
collapse
I agree. The negatives are the developed film. They were physically present at the scene and were physically altered by the conditions at the scene. Digital photography has nothing quite like it.
NotMyOldRedditName@lemmy.world
on 08 Feb 2024 18:09
collapse
Are the raw photos manipulated, or are they just the original 1s and 0s, unedited?
Not sure if there’s any preprocessing before the processing.
Edit: I’m imagining a digital camera that cryptographically signs each raw frame before any processing, with a timestamp and GPS location. That would be the best you could probably do. It could upload its hash to a blockchain for proof of existence as well.
Edit: I guess the GPS system would need some sort of cryptographic handshake with the camera to prove the location was legitimately provided by the satellite as well.
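As a hedged sketch of the signing idea in the comment above (not an existing camera feature): an Ed25519 signature over the raw frame plus a small metadata claim, using the third-party cryptography package. In a real camera the private key would sit in secure hardware; the filename, coordinates, and field names here are made-up assumptions.

```python
# Hedged sketch: bind a raw frame to a claimed timestamp/location with a signature.
import json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()            # stand-in for a hardware-held key

with open("frame.raw", "rb") as f:                    # hypothetical raw sensor dump
    frame = f.read()

claim = json.dumps({"timestamp": time.time(),
                    "gps": [59.33, 18.07]}).encode()  # claimed capture metadata (illustrative)
signature = private_key.sign(frame + claim)           # signature covers pixels and claim together

public_key = private_key.public_key()
public_key.verify(signature, frame + claim)           # raises InvalidSignature if either was altered
```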
General_Effort@lemmy.world
on 08 Feb 2024 20:15
nextcollapse
Are the raw photos manipulated or are they just the original 1s and 0s unedited?
I think your average camera does not have the option to save RAW files. It seems somewhat common even outside professional equipment, though.
Could upload its hash
Yes, exactly. However, this would only prove that the image (and metadata like GPS coordinates) existed at that particular point in time. That would add a lot of credibility to, say, dashcam footage after a collision. It’s curious that misinformation has become a major issue in the public consciousness, at a time when we have far better means of credibly documenting facts, than ever before.
But it would do little to add weight to images from, say, war zones. Knowing that a particular image or video existed at a particular point in time would rarely allow the conclusion if it was real or misinformation. In some cases, one may be able to cross-references with independent, trustworthy sources, like reporters from neutral countries or satellite imagery.
Creating a tamper-proof camera is a fool’s errand. The best you can do is tamper-resistant. That may be enough if the camera can be checked by a trustworthy organization and does not leave its control for long. But in such a scenario you would rarely need that, and it’s not the usual scenario. The price would be very high. Fakes that do pass muster will be given more credibility.
I think your issue starts there. You already have to decide how to build your sensor:
If it’s a CMOS sensor, how strongly do the MOSFETs amplify? That should affect brightness and probably noise.
How quickly do you vertically shift the data rows? The slower you shift, the stronger the rolling shutter effect will be.
What are the thresholds in your ADC? That affects the brightness curve.
How do you lay out the color filter grid? Will you put in twice as many green sensors compared to blue or red, as usual? This should affect the color balance.
How many pixels will you use in the first place? If there are many, each will be noisier, but spatial resolution should be better.
All of these choices will lead to different original 1s and 0s, even before any post-processing.
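An illustrative sketch of that point, with entirely made-up numbers: the same raw Bayer readout “develops” into different RGB images depending on the white-balance gains and tone curve chosen, before any editing happens. Assumes numpy; the develop() helper is hypothetical, not any real camera pipeline.

```python
# Hedged sketch: one raw mosaic, two different "original" pictures.
import numpy as np

raw = np.random.randint(0, 1024, size=(4, 4))         # fake 10-bit RGGB Bayer mosaic

def develop(raw, gains, gamma):
    """Very crude 2x2 demosaic + white balance + tone curve."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2        # average the two green sites
    b = raw[1::2, 1::2]
    rgb = np.stack([r, g, b], axis=-1) / 1023.0        # normalize to [0, 1]
    return np.clip(rgb * gains, 0, 1) ** (1 / gamma)   # per-channel gain, then gamma

daylight = develop(raw, gains=np.array([2.0, 1.0, 1.5]), gamma=2.2)
neutral  = develop(raw, gains=np.array([1.0, 1.0, 1.0]), gamma=1.0)
print(np.abs(daylight - neutral).max())                # same sensor data, different pictures
```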
jmbreuer@lemmy.ml
on 08 Feb 2024 12:50
nextcollapse
I feel most of this is a slippery slope / negative sum spiral.
threaded - newest
I support automatically face swapping everyone’s faces in photos with the gingerbread man from Shrek.
After all, there’s no such thing as a real photo anyway, so Samsung editing them in unusual ways is completely reasonable.
What’s a Polaroid then?
A Polaroid is the best representation that can be made of a scene on Polaroid photo film. The lens, the paper, and other factors will always make the representation, to a degree, not real. That was the Samsung exec’s point. It’s a little disingenuous, though. The discussion shouldn’t be about “real” vs “fake” it should be about “faithful” vs “misleading”.
Excellent to-the-point comment!
Thanks!
So what’s an eyeball then?
Our perception of reality isn’t real, it’s just light hitting a lens and being decoded by an organic computer.
Or to paraphrase the philosopher Jaden Smith: How Can Cameras Be Real If Our Eyes Aren't Real
Add to that the fact that our brains run software that doesn’t even try to faithfully store images and you have part of the reason that photos are, currently, more reliable than eye witnesses. That may be changing though.
Our brains are natural intelligence and perform natural learning. The results are even less reliable, predictable, and repeatable than the results provided by artificial intelligence.
The Samsung Boss said:
A Polaroid photograph is a real picture, in the sense that it exists as a single, definitive, physical thing. Whether what it shows is real is a different question, though.
The statement that “There is no such thing as a real picture” isn’t wrong. It kind of missed the point though. It’s true that, even when a photo attempts to make the most faithful representation possible, it can only approximate what it sees. The sensors used all have flaws and idiosyncracies and software that processes the images makes different decisions in different situations to give a good image. Trying to draw a line between a “real” picture and a “fake” picture is like trying to define where the beach ends and where the ocean begins. The line can be drawn in many places for many reasons.
That said, the editing that the S24 is going to allow may be going a little far in the direction of “fake” from the sounds of things. I’m not sure if that is good or bad but it does scare me that photos can’t really be relied upon to give an accurate representation of a scene anymore. Everyone having access to ti’s kind of AI is going to make it tremendously difficult to distinguish between realistic and misleading images.
Capturing any data or making any measurement is an approximation, because every type of sensor has a limited degree of accuracy - with some more sensitive than others.
I think there is a clear enough line however between making an approximated record of a value, and making a guess at a value, the latter being essentially how these “AI” camera systems work.
I disagree. It’s not that easy to draw a line.
First, current cameras that we consider to not use AI still manipulate images beyond just attempting to approximate the scene. They may not allow easy face swapping but they still don’t faithfully represent the scene much of the time.
Also, I don’t even think it is clear where we can draw a line between “normal” algorithms and “AI” algorithms. What level of machine learning is required before we consider an alrogitm to be AI?
Simple non-AI algorithms and generative AI are on a spectrum of comlexity. They aren’t discrete from one another such that they can be easily categorized.
I think that’s disingenuous…
There’s a clear difference between a processing mistake and an intentional addition. That’s a fairly clear line. Grain on a photo is not the same as making you look like a human head on a shark’s body.
Yes, no photo is 100% accurate, just as no statement will ever capture an incident perfectly. That doesn’t mean there’s no such thing as lying.
There is definitely a line.
Is the tech trying to make the photo more accurate, or change the narrative?
Sure, there’s some tech that’s awkwardly in the middle, like skin smoothing, but we don’t NEED that, and it’s not directly changing the narrative of the story, unless you’re selling acne medication.
I still disagree that there is a clear line. Yes, it is obvious that photo grain is different from making you look like a human head on a shark’s body. The problem is somewhere in the middle. Determining where that line is drawn is going to be difficult and is only going to become more difficult as this technology advances.
I think the line (while the details may be certainly difficult) is along “are you making the existing image/story clearer or are you changing the narrative of the media?
When the story you get from the image changes, then you’ve crossed the line.
I generally agree with you but that is still a fuzzy line to draw that is likely very dependent on circumstances. The devil is in the details.
I can concede to that… there will be some grey area, but the idea that “there is no true photo” or “there is no truth” feels wrong.
At least sensors will be relatively consistently flawed, while AI can just completely make details up.
It’s not just the sensors though. The software used to convert what the sensors saw into an image makes decisions. Those decisions are sometimes simple and sometimes complex. Sometimes they are the result of machine learning and might already be considered to be AI. This is just another step in the direction of less faithfulness in photos.
Whether or not this feature is on Samsung phones it will still be accessible. It already is really. You can’t hold back the tide.
Yeah, you’re right. It still scares me somewhat, though. What happens when courts fall behind and continue to rely on photo evidence after photo evidence becomes easy for anyone to fake. What happens when the courts finally do realize that photos are unreliable?
I don’t think this change can or should be stopped. It is just worrisome and thought should be put into how to mitigate the problems it will inevitably cause
That’s not how courts work. It’s not like there’s a list of acceptable evidence that gets updated once in a while.
Prosecutors and defenders will present evidence to jurors in their contemporary context.
Basically, we all need to acknowledge that images do not convey “truth”, and really they never did.
Yeah, this is a great example of a true statement that just serves to muddy the water of the actual argument.
A better way to think about it is: an AI-dependent photo is less representative of whatever is in the photo versus a regular photo.
It's not even a true statement. "A real picture of a pipe" has never once in history been interpreted as "my golly - there's an actual goddamn pipe trapped inside this piece of paper". We know it's a freaking representation.
The "real" part refers to how it's a product of mechanically capturing the light that was reflected off an actual pipe at some moment in time. You could have a real picture with adjusted colours, at which point it's real but manipulated. Of course with digital photography it's more complicated as the camera will try to figure out what the colours should be, but it doesn't mean the notion of a real picture is suddenly ready for the scrapyard. Monet's painting is still a painting.
Everyone knows exactly what you mean when you say a real picture. Imposing a 3D model over the moon to make it more detailed, for example, constitutes "not a real picture". Pretending this is some impossible philosophical dilemma is just a corporate exercise in doublespeak.
To play devil’s advocate, even traditional photography involves a lot of subjective/artistic decisions before you get a photo. The type of film used can massively affect the image reproduced, and then once the photos are being developed, there’s a load of choices made there which determine what the photo looks like.
There’s obviously a line where a photo definitely becomes “edited”, but people often believe that an objective photo is something that exists, and I don’t think that’s ever been the case.
Of course - there's a huge difference between a "real photo" and "objective reality", and there always has been. In the same way an impressionistic painting might capture reality accurately while not really looking like it that much.
It’s actually way worse. Modern smartphones do a LOT of postprocessing that is basically just AI, and have been for years. Noise reduction, upscaling, auto-HDR and bokeh are all achieved through “AI” and are way further removed from reality than a film print or a DSLR picture. Smartphone sensors aren’t nearly as good as a decent DSLR, they just make up for it with compute power and extremely advanced processing pipelines so we can’t tell the difference at a glance.
Zoom into even a simple picture of a landscape, and you can obviously tell whether it was shot on smartphone. HDR artifacting and weird hallucinogenic blobs in low-light details are telltale signs, and not coincidentally rather similar to telltale sign of AI-generated photorealistic pictures.
Anyway it’s still important to draw a line in the sand for what constitutes a “doctored” picture, but the line isn’t so obviously placed once you realize just how wildly different a “no filter” smartphone pic is from the raw image straight from the sensor.
<img alt="" src="https://feddit.de/pictrs/image/14f109c0-14e0-4298-a234-ea9fcba134e4.jpeg">
This is fuckin’ brilliant. A picture worth a thousand mutilated words.
An AI edited photo might not necessarily be less representative of whatever is in the photo. Imagine an image taken in a very dark room, then an AI enhancement makes it look like the lights are on. You can actually get a much better idea of what’s in the room, but a less good idea of what the lighting was like. So it comes down to opinion, which one is more representative of reality? Because no photo since the beginning of time has been completely representative of what humans actually see with their eyes. It’s always been a trade-off of: what do we change to give humans the image they want with the technology we have.
…but the lights weren’t on.
Do you think night vision produces a ‘fake’ image? Maybe you do, but my point is, that’s your opinion. You might think that accurate representation of the light level is more important than accurate representation of the objects in front of the lens. But someone else might not. Same way a colorized photo can give a more accurate representation of reality with false information.
I mean, you’re debating the meaning of “accurate representation”. We may as well debate the meaning of perception, too, but I don’t think it changes the point of my original argument.
I think it does, because photos have always been an inaccurate representation of what a person sees. You zoom in on my face in a picture and you see a bunch of pixels. That’s not what my face looks like, I’m not made of tiny boxes. If I AI upscale it, it looks a lot closer. My argument here is simply: the statement that an AI dependent image is inherently less representative of reality, is not necessarily true.
The fact that it’s AI generated and not directly light-into-image makes it untrustworthy.
Like actual film photos are a lot harder to fake and therefore are more trustworthy.
In principle, that image AI software can be programmed to generate whatever it wants. It can even censor your own film footage.
Like if a revolution happens in this country next year, you bet your ass the police and military will exact atrocities on the American people to stop it, and the corporations they’re in bed with can reprogram everyone’s phones to censor out the footage of it, so genocide cannot be proven.
Watch and see it happen.
Altering the photo even further only makes it worse though …
I'm altering the photo, pray I don't alter it any further.
This photo is getting worse all the time.
Enshitification of the photo
If your epistemological resolution for determining the fakeness of the moon landing photos is to just assert that all photos are in a sense fake so case closed, then I feel like you aren’t even wrong about the right thing.
The moon photos they’re talking about are specifically the AI enhanced zoom moon photos of previous Samsung models that caught controversy because taking a picture of a marginally round object against a black background with their zoom enhancement would produce a moon photo even if it was in someone’s dark basement and the object was a dimly lit bottle cap.
To think about it, if a new crater gets ever created on the moon one way or the other these AI models won’t be ever updated and will show the “fake old version” forever.
We should nuke the near side of the moon to catch AI off-guard
I’ve always wondered what it would look like from earth it the moon were nuked
Kurzgesagt - What if we nuke the moon?
Here is an alternative Piped link(s):
Kurzgesagt - What if we nuke the moon?
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
Forgot that kurzgesagt has already nuked everything
Kurzgesagt has a video on the subject. Your question is answered at 5:50 - tiny blink of light for few seconds.
Here is an alternative Piped link(s):
video
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
To me, using AI to optimize zoom, focus, aperture (or fake aperture effects), framing etc. That's composition. The picture isn't fake, but software helped compose the real image in a better way.
When you change the image (remove objects, distort parts of the image not the whole, airbrush etc) then the image isn't based on reality any more.
That's where I see the line drawn, at least. Yes, drawing a line also makes the image not real any more.
Beyond this, we get to philosophy. In which case, I'll refer to my other comment on another post about this story. Our brain transforms the image our eyes receive (presumably to be able to relay it around the brain efficiently, who knows?). So we can take it to Matrix philosophy. When we don't know if what we're seeing is real, what is real?
I think the reality is that there is no reality, there is only perception. Composition does add to remove things from the photo. Light, both the amount and its wavelength, is a thing. Whether the lens picks up the pores on a person’s face is a thing. Whether The background seems close or far as a thing. But I agree that camera makers would tow any philosophical line to help them drive profits.
When corporations dabble in philosophy, you know they’re trying to muddy the waters and skirt an ethical issue. It’s not a genuine inquiry going on here; it’s a “whatever argument serves the bottom line” situation.
I guess there’s no such thing as intellectual property either, when you really think about it. Hence nothing wrong with me making and selling pirated samsung phones.
Can i copyright my face and get an ai to trawl the internet for any pictures of me and demand people take them down or pay me?
If you’re prominent enough and can afford to pay for the lawyers, you can assert ownership of your own likeness, yes.
That either is necessary is a travesty, though.
I don’t really understand what is unethical about AI photo edit to begin with? Like photo editing before AI existed and you could make anything you want with a ps.
Besides potential copyright infringement, there isn’t really any issue with it I’d say. This is still an incredibly stupid take from Samsung however.
Because it’s not automatic and is outside of the user’s control, to start.
It’s not?
At this rate with this mentality, we are going to have photo "evidence" of bigfoot and UFO's before 2030.
I dunno if you’re unaware but UFOs (uaps) are definitely real and there’s tons of photo and video evidence of em.
I should have specified that I mean the x-files type of UFO's, containing aliens.
They’re still unidentified lol. We have no idea what’s inside them.
There’s a lot of UFOs if you’re bad at birding.
Early XXI century seems to model late XIX century quite well, if you think about it. A tremendous leap followed by gimping and all kinds of such stuff.
We’ve had photo (and film!) “evidence” of both for many decades. “It has to be real, because they did not have CGI back then” is something that people actually argue.
Here’s a famous Bigfoot film.
Here is an alternative Piped link(s):
a famous Bigfoot film
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
What absolute hogwash.
We are truly in a post-truth era
Lie on your resume.
Photo manipulation has been a thing since photos have been a thing
While that is true, it has gotten incredibly easy to alter and spread such photos. We all love interesting ways to take photos with optical illusions and practical effects.
Just like using a gun. It has gotten very easy to inflict disproportional harm compared to just 50 or 100 years ago.
The only way to stop a bad guy with a camera is a good guy with a camera?
That’s true, but they didn’t used to sell you a camera claiming it would take a picture when in fact it just invented a picture it thought looked similar to the picture you were trying to take.
“and that’s why cutting and pasting a picture of the moon in the backend of the processing done to your shitty attempt to take a shaky picture of the moon without appreciable amounts of optical zoom to share with absolutely no one that cares is totally fine in a Samsung. You could have been doing that with scissors or MS Paint your whole life”
Wait… we weren’t supposed to be doing this for decades already? No one told me.
Relevant:
Reminds me of jREG’s hyper-self rant
Here is an alternative Piped link(s):
jREG’s hyper-self rant
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
Hands Monopoly money to the clerk at a Samsung store
“I’ll take the S24, money is a made up concept anyway”
Ha ha, very clever… money is just made up. But wait, so are borders, sovereignty, language, art, moral and commercial value, the law, logic, authority, human rights, culture, and government.
According to Jacques Lacan, experience itself is a fabrication; the worst thing that can happen to a person is to come into contact with the real.
How can the picture be real if your eyes aren’t real?
The universe is a hologram projected from 19 dimensional space to look like it exists in 4 dimensional spacetime.
Settle down, Jayden
Do you think that’s air you’re breathing?
<img alt="" src="https://sopuli.xyz/pictrs/image/1874bd9e-ca3f-43be-bccd-8c089bf17dae.webp">
There are certainly purposes for which one wants as much of the raw sensor readings as possible. Other than science, evidence for legal proceedings is the only thing that comes to mind, though.
I’m more disturbed by the naive views so many people have of photographic evidence. Can you think of any historical photograph that proves anything?
Really famous in the US: The marines raising the flag over Iwo Jima. It was staged for the cameras, of course. What does it prove?
A more momentous occasion is illustrated by a photograph of Red Army soldiers raising the soviet flag over the Reichstag. The rubble of Berlin in the background gives it more evidentiary value, but it is manipulated. It was not only staged but actually doctored. Smoke was added in the background and an extra watch on a soldier’s arm (evidence of robbery) removed.
Closer to now: As you are aware, anti-American operatives are trying to destroy the constitutional order of the republic. After the last election, they claimed to have video evidence of fraud during ballot counting. On one short snippet of video, one sees a woman talking to some people and then, after they leave, pull a box out from under a table. It’s quite inconspicuous, but these bad actors invented a story around this video snippet, in which a “suitcase” full of fraudulent ballots is taken out of hiding after observers leave.
As psychologists know, people do not think in strictly rational terms. We do not take in facts and draw logical conclusion. Professional manipulators, such as advertisers, know that we tend to think in “narratives”. If a story is compelling, we like to twist neutral snippets of fact into evidence. We see what we believe.
Here is an alternative Piped link(s):
On one short snippet of video, one sees a woman talking to some people and then, after they leave, pull a box out from under a table.
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
The situations that drive me nuts are the conspiracy idiots who zoom in super hard on some heavily compressed image they pulled off of the web. They then proceed to claim that compression artifacts, optical flares, noise, etc are evidence of whatever crap they are pushing.
Taking things out of context is another issue. It has become painfully common online. I would see it all the time when pushing the “all police are bad!” narrative. They will deliberately edit out the violence that triggered the arrest then make it look like the arrest was unwarranted and overly physical. People will do this with dashcam videos and show road rage but edit out the part where they triggered it with their own aggression.
Ok the one hand, yeah. Actions have consequences. On the other hand, no amount of aggressive driving “deserves” to be responded to in the way some do, and no amount of someone doing something dangerous or illegal justifies police using unnecessary force (or else we wouldn’t call it that). Once they’ve been subdued, it should be done.
Any photos from war zones.
Tienanmen Square images.
Moon landing.
To name a few that do, IMO.
I gave 2 photos from war zones as examples. What do they prove and how?
The one in Berlin illustrates the inevitable triumph of Communism over capitalist fascism. Obviously.
The fact that the scene was reenacted for the photo does not change all that much. There was a time when most people thought that photos never lie, but that hasn’t been the case for a long time.
How do you know that they were reenacted? There are AIs that can produce deepfake texts.
The Moon landings? Hello?
Seriously? At least clarify that you mean film and not photographs. The effects of lower gravity and no atmosphere take at least some effort to get right. Photos can be staged anywhere.
Have you ever looked at the arguments of moon hoaxers? A lot of them would be good questions, if they were questions. Why can’t you see the stars? Why are there multiple shadows if the sun is supposed to be the only light source? The boot prints are so perfect, as prints only are in wet sand. How can the flag wave without an atmosphere?
You can easily find answers (and more questions). How do you “prove” these answers? How do you go from that to actually turning photos, and even film, into proof positive of the moon landing?
Oh my god, this IS just a launchpad for lame conspiracybros 🤦🤦🤦
No, the Moon landings were real and the images/clips you see of it online are real too. If it was faked, the USSR would have screamed to the high heavens about it. Just because you personally haven’t experienced something does not mean it can’t actually happen.
Grow the fuck up and accept reality doesn’t conform to your wishful thinking.
Oh, and the Earth is round, too, and NASA livestreams and photos of the very clearly round Earth are valid too.
Just because the majority of people believe something without thinking about it doesn’t mean it isn’t true or that they’re not critical thinkers. People know what’s worthwhile to question and what’s not, and that’s a vital aspect of critical thinking you did not consider, because you don’t understand or care what it is; it’s just an emotional cudgel for you to accuse regular people of bullshit, to brainwash and abuse them.
Get off of my feed. Go outside.
Inb4 “Well that’s not his point” – yes it is; he’s just trying to pretend to be reasonable to get his foot in the door. Salesmen do this shit all the time; it’s a common tactic and it’s why we know not to listen to people like him.
Well, that’s one straightforward, though rather disturbing, demonstration of what I was talking about. You perceived some snippets of fact and constructed a story around it.
There’s no rational way you can deduce any parts of that story from my posts here. There is nothing that suggests any hidden motives on my part. Occam’s razor would say that you should simply accept my stated motive, as it is a sufficient explanation.
A more linear rational view would find problems with your story. You brought up the moon landing and I responded. This contradicts the idea that I have any particular interest in moon hoax ideas.
A taxonomy of fallacies might identify this as an ad hominem attack or character assassination. You made up lies about me, instead of replying to my arguments. I note that you do not use photos or film to argue for the reality of the moon landings, but refer to the reaction of the Soviet Union. That is something worth thinking about some more. While it is still a narrative, we do glimpse a rational argument.
So, thanks for the example. The way you just conjure a paranoid fantasy tale, instead of engaging rationally, is a very topical demonstration of conspiracy thinking.
Just stop. Stop trying to repackage stupid, boring conspiracy bullshit by couching it in faux-philosophy and five dollar words.
Nobody with more than an 8th grade education is falling for the “This sounds smart and I see big words, therefore the person who wrote it must be smarter than me and know what they’re talking about” bit.
Can I ask why you feel the need to insult me?
I didn’t insult you
Ok, if you feel that way. Can you express why you felt that the content of your replies was reasonable?
Maybe it’s just this particular instance, but lately my Lemmy feed has been full of comments doing exactly this. To the point where it’s ruining my experience. Which seems to be the goal.
Really didn’t take long for this site (maybe just this instance?) to turn into another reddit. A cesspool of astroturfing and proud ignorance. Either a complete inability to think critically, or just brain rot, in this case.
It seems like there are just certain people who are dead set on ruining any space on the internet that still exists for people without brain rot. Like they know they’re lowering the overall quality of discussion, and instead of doing better, they lean into it. If they can’t enjoy themselves, then nobody can.
Just one more extension of their childish, petulant demeanor. It’s exactly why over 70 million people voted for a traitorous, demented man child; they see themselves in him.
Im 14 and this is deep
The metadata is easy to erase. It’s only a matter of time until we start seeing open-source projects that can remove the watermarking the AI players are starting to introduce.
So, I’m just going to not buy their garbage.
There’s no such thing as a real CEO… they’re just target practice what got up and walked.
They do have a point when they say AI is here to stay, and what they propose (a “watermark” in the metadata for AI-edited content) is at least a way forward. For this to be effective there should also be some electronic seal/signature (md5?), though; as it stands, metadata is easy to tweak.
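A plain md5 in the metadata would not achieve much on its own, since anyone who edits the file can simply recompute the hash; what is needed is a signature made with a key the editor does not hold. Below is a minimal sketch of that idea in Python, assuming a vendor-held Ed25519 key and the third-party `cryptography` package; the field names and the `sign_edit_record`/`verify_edit_record` helpers are purely illustrative, not anything Samsung has announced.

```python
# Minimal sketch of a signed "AI edited" tag. Assumes the vendor holds the
# private signing key (e.g. inside the camera app) and publishes the matching
# public key. All names here are hypothetical.
import json
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()  # would live with the vendor, not here
public_key = vendor_key.public_key()       # published so anyone can verify

def sign_edit_record(image_bytes: bytes, ai_edited: bool) -> dict:
    """Bind a provenance claim to the exact image bytes."""
    record = {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "ai_edited": ai_edited,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = vendor_key.sign(payload).hex()
    return record

def verify_edit_record(image_bytes: bytes, record: dict) -> bool:
    """Anyone with the public key can check the claim; nobody else can forge it."""
    claimed = {k: v for k, v in record.items() if k != "signature"}
    if claimed["sha256"] != hashlib.sha256(image_bytes).hexdigest():
        return False  # the record was copied onto different image bytes
    payload = json.dumps(claimed, sort_keys=True).encode()
    try:
        public_key.verify(bytes.fromhex(record["signature"]), payload)
        return True
    except Exception:
        return False
```

The limitation is the one already mentioned: the metadata, signature included, can simply be stripped, so the absence of a valid record can only ever mean “unverified”, never “unedited”.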
FWIW, The Samsung Boss said:
I understand this as talking about a definitive original, as you get with traditional analog photography. With photographic film, you have a thin coat (a film) of a light-sensitive substance on top of a strip of plastic. Taking an analog picture means exposing this substance to light. The film is developed, meaning that the substance is chemically altered to no longer be light sensitive. This gives you a physical object that is, by definition, the original. It is the real picture. Anything that happens afterward is manipulation.
An electronic sensor gives you numbers; 1s and 0s that can be copied at will. These numbers are used to control little lights in a display.
As far as I understand him, he is not being philosophical but literal. There is no (physically) real picture, just data.
I would consider the negatives to be “the original” over the first photo that was printed using them.
I agree. The negatives are the developed film. They were physically present at the scene and were physically altered by the conditions at the scene. Digital photography has nothing quite like it.
Are the raw photos manipulated or are they just the original 1s and 0s unedited?
Not sure whether any preprocessing happens before the raw data is even saved.
Edit: I’m imagining a digital camera that cryptographically signs each raw frame, before any processing, with a timestamp and GPS location. That would probably be the best you could do. It could upload the frame’s hash to a blockchain as proof of existence as well.
Edit: I guess the GPS system would need some sort of cryptographic handshake with the camera to prove the location was legitimately provided by the satellite as well.
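As a rough sketch of what that capture-time signing could look like, assuming the camera holds a device-unique Ed25519 key in secure hardware and again using the `cryptography` package (the key handling, field names, and `sign_capture` helper are all hypothetical):

```python
# Rough sketch of the "sign each raw frame at capture" idea. In a real camera
# the private key would be sealed in secure hardware; here it is generated
# inline purely for illustration.
import json
import time
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()

def sign_capture(raw_frame: bytes, lat: float, lon: float) -> dict:
    """Produce a signed record for one raw sensor readout."""
    record = {
        "frame_sha256": hashlib.sha256(raw_frame).hexdigest(),
        "timestamp_utc": int(time.time()),
        "gps": {"lat": lat, "lon": lon},
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = device_key.sign(payload).hex()
    # Only this digest would need to be published (e.g. to a timestamping
    # service or blockchain) to later prove the record existed by that time.
    record["anchor_digest"] = hashlib.sha256(payload).hexdigest()
    return record
```

Publishing `anchor_digest` only proves the record existed by the time it was anchored; the record itself is still only as trustworthy as the camera’s clock and GPS input, which is exactly the handshake problem mentioned above.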
I don’t think your average camera has the option to save RAW files, though it does seem somewhat common even outside professional equipment.
Yes, exactly. However, this would only prove that the image (and metadata like GPS coordinates) existed at that particular point in time. That would add a lot of credibility to, say, dashcam footage after a collision. It’s curious that misinformation has become a major issue in the public consciousness at a time when we have far better means of credibly documenting facts than ever before.
But it would do little to add weight to images from, say, war zones. Knowing that a particular image or video existed at a particular point in time would rarely settle whether it was real or misinformation. In some cases, one may be able to cross-reference with independent, trustworthy sources, like reporters from neutral countries or satellite imagery.
Creating a tamper-proof camera is a fool’s errand. The best you can do is tamper-resistant. That may be enough if the camera can be checked by a trustworthy organization and does not leave its control for long, but you would rarely need it in such a scenario, and it’s not the usual scenario anyway. The price would be very high, and fakes that do pass muster would be given even more credibility.
I think your issue starts there: you already have to decide how to build your sensor, and all of those design choices lead to different “original” 1s and 0s even before any post-processing (see the sketch below).
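To make that concrete, here is a toy illustration in Python/NumPy of just one of those choices: with a common RGGB Bayer filter, each photosite records a single colour channel, so two of the three values in every finished RGB pixel are interpolated. The layout, the numbers, and the `green_at` helper are invented purely for illustration.

```python
# Toy illustration: with an RGGB Bayer filter each photosite records only one
# colour, and the missing channels must be interpolated from neighbours.
# Real pipelines also apply black-level subtraction, white balance and noise
# reduction before (or alongside) this step.
import numpy as np

# Fake 4x4 raw readout (one value per photosite), RGGB layout:
#   R G R G
#   G B G B
#   R G R G
#   G B G B
raw = np.array([
    [2100,  900, 2050,  880],
    [ 950,  400,  940,  410],
    [2080,  910, 2120,  905],
    [ 960,  395,  955,  420],
], dtype=float)

def green_at(y: int, x: int) -> float:
    """Estimate green at a red or blue site by averaging its 4-connected green neighbours."""
    neighbours = []
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < raw.shape[0] and 0 <= nx < raw.shape[1]:
            neighbours.append(raw[ny, nx])
    return float(sum(neighbours) / len(neighbours))

# The "green" value at the top-left red photosite never existed in the data;
# it is an estimate, and a different interpolation gives a different pixel.
print(green_at(0, 0))  # 925.0
```

A different demosaicing algorithm produces different pixels from the very same raw values, and a different colour filter array or bit depth produces different raw values from the very same photons; either way there is no single “original” image.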
I feel most of this is a slippery slope / negative sum spiral.
See e.g. Liv Boeree’s video on beauty filters.
Sorry, but I don’t want to see fake “real” images just because people think it looks good.