Avoiding AI is hard – but our freedom to opt out must be protected (theconversation.com)
from Pro@programming.dev to technology@lemmy.world on 12 May 13:49
https://programming.dev/post/30202089

#technology

threaded - newest

muntedcrocodile@lemm.ee on 12 May 14:24 next collapse

I think it may be more productive to get people to use alternative AI products that are FOSS and/or respect privacy.

But_my_mom_says_im_cool@lemmy.world on 12 May 15:42 collapse

You got downvoted because Lemmy users like knee-jerk reactions and think that you can unmake a technology or idea. You can’t; AI is here and it’s here forever now. The best we can do is find ways to live with it and, like you said, reward those who use it ethically. The Lemmy idea that AI should be banned and not used is so unrealistic.

atomicbocks@sh.itjust.works on 12 May 19:59 collapse

You seem to misunderstand the ire;

AI in its current state has existed for over a decade. Watson used ML algorithms to win Jeopardy! by answering natural-language questions in 2011. But techbros have gotten ahold of it and decided that copyright rules don’t apply to them, and now the cat is out of the bag?!? From the outside it looks like bootlicking for the same bullshit that told us we would be using blockchain to process mortgages in 10 years… 10 years ago. AI isn’t just here to stay; it’s been here for 70 years.

ClamDrinker@lemmy.world on 12 May 23:33 collapse

ML technology has existed for a while, but it’s wild to claim that the technology pre-2020 is the same. A breakthrough happened.

atomicbocks@sh.itjust.works on 13 May 02:36 collapse

Breakthroughs are more or less of a myth. Everything is iterative.

chunes@lemmy.world on 13 May 09:45 next collapse

Agreed. The only thing that has really changed is how much hardware we can throw at it. ML has existed more or less since the 60s.

ClamDrinker@lemmy.world on 13 May 10:15 collapse

Breakthroughs are not a myth; they still happen even when the process is iterative. That page even explains it. The advent of the GAN (2014-2018), which got overtaken around 2017 by the transformer, on which GPTs and diffusion models were later developed. More hardware is what allowed those technologies to work better and bigger, but without those breakthroughs you still wouldn’t have the AI boom of today.

atomicbocks@sh.itjust.works on 13 May 15:58 collapse

I posted it because you claimed none of that happened before 2020.

ClamDrinker@lemmy.world on 13 May 16:24 collapse

I never claimed anything besides that breakthroughs did happen, which is objectively true. You claimed very concretely that AI was the same for over a decade, i.e. that it was the same in at least 2015 if I’m being charitable. All of these things were researched in the last 7-8 years and only became the products as we know them in the last 5 years (i.e. since 2020).

atomicbocks@sh.itjust.works on 13 May 16:27 collapse

You are clearly only reading the parts you want to read. Have fun.

ClamDrinker@lemmy.world on 13 May 16:52 collapse

The absolute irony

spankmonkey@lemmy.world on 12 May 14:31 next collapse

I disagree with the base premise that opting out needs to be a right. That implies that having data harvested for companies to profit from should be the default.

We should have the right to not have our data harvested by default. Requiring companies to have an opt-in process, with no coercion or other methods of making people feel obligated to opt in, is our right.

ItsComplicated@sh.itjust.works on 12 May 14:46 next collapse

being opt out needs to be a right. That implies that having data be harvested for companies to make profits should be the default.

As the years have passed, it has become the accepted consensus that all of your personal information, thoughts, and opinions should be freely available to anyone, at any time, for any reason, so that companies can profit from it.

People keep believing this is normal and companies keep taking more. Unless everyone is willing to stand firm and say enough, I only see it declining further, unfortunately.

sugar_in_your_tea@sh.itjust.works on 13 May 00:06 next collapse

I’m there with you, and I’d join in a protest to get it.

Zenith@lemm.ee on 13 May 04:02 collapse

The death of the private life

General_Effort@lemmy.world on 12 May 16:18 next collapse

We should have the right to not have our data harvested by default.

How would that benefit the average person?

spankmonkey@lemmy.world on 12 May 16:49 next collapse

Send me your name, birthdate, web browsing history, online spending history, real time location, and a list of people you know and I will explain it to you.

sunzu2@thebrainbin.org on 12 May 16:59 next collapse

Less price gouging

General_Effort@lemmy.world on 12 May 19:28 collapse

How do you expect that to result?

sunzu2@thebrainbin.org on 12 May 19:32 collapse

I don't expect... It is already happening. Prime examples are rents and wages.

There is nothing to be done about it. Too late

Dynamic pricing is a more current battle ground.

All of these are fixed based on cohort-specific information, and with dynamic pricing it can literally be individual-level data.

General_Effort@lemmy.world on 12 May 19:38 collapse

The question was how “less price gouging” would result from a right not to have “your data harvested by default”.

sunzu2@thebrainbin.org on 12 May 19:46 collapse

By denying corpos data, their models are less effective, especially if you are salting it whenever possible.

Do you really need Faceerh and Sundar the creep to have access to your tax returns and locations? Also, do you need them to know you like Asian women with large tits? Or that you and your friends enjoy a hobby?

General_Effort@lemmy.world on 12 May 23:15 collapse

Doesn’t that seem awfully roundabout? You make the practice less effective at the price of also making beneficial uses of the data, e.g. for medical research, less effective.

The mega-rich can see my tax returns if I can see theirs. The data of the rich and famous is much more valuable than mine. Let’s not pretend that this helps the little guy. The little guy doesn’t throw around money to get their flight data removed from Twitter.

sunzu2@thebrainbin.org on 13 May 04:05 collapse

🤡

FourWaveforms@lemm.ee on 13 May 21:12 collapse

By giving us the choice of whether someone else should profit by our data.

Same as I don’t want someone looking over my shoulder and copying off my test answers.

General_Effort@lemmy.world on 14 May 09:50 collapse

By giving us the choice of whether someone else should profit by our data.

What benefit do you expect from that?

Same as I don’t want someone looking over my shoulder and copying off my test answers.

Why not?

FourWaveforms@lemm.ee on 14 May 16:50 collapse

I prefer that the benefits of those things accrue to me, or to others, or to no one, in accordance with my choice.

In this way, I would decide who gains the economic or social benefits of these activities of mine; and I also, in the case of personal data, would decide who gets to make my business, their business.

General_Effort@lemmy.world on 14 May 18:15 collapse

Thanks for the answer.

taladar@sh.itjust.works on 12 May 17:42 next collapse

We should have the right to not have our data harvested by default.

I would maybe not go quite that far but at the very least this should apply to commercial interests and living people.

I think there are some causes where it should be acceptable to have your data usable by default, e.g. statistical analysis of health threats (think those studies about the danger of living near a coal power plant or similar things).

spankmonkey@lemmy.world on 12 May 17:44 next collapse

That implies that having data be harvested for companies to make profits should be the default.

I sure hope those studies are not being done by for profit companies!

sugar_in_your_tea@sh.itjust.works on 13 May 00:12 collapse

I disagree. Yes, there are benefits to a lot of invasions of privacy, but that doesn’t make it okay. If an entity wants my information, they can ask me for it.

One potential exception is for dead people; I think it makes sense for a lot of information to be released on death, and preventing that should be opt-in by the estate/survivors, depending on the will.

taladar@sh.itjust.works on 13 May 05:44 collapse

But they literally can’t ask you for it if it is about high volumes of data that only become useful if you have all or close to all of it like statistical analysis of rare events. It would be prohibitively expensive if you had to ask hundreds of thousands of people just to figure out that there is an increase in e.g. cancer or some lung disease near coal power plants.

sugar_in_your_tea@sh.itjust.works on 13 May 14:15 collapse

They don’t need most of the data; they need a statistically significant sample to have high confidence in the result. And that’s a small percentage of the total population.

And you could have something on file where you opt in to such things, just like you can opt in to being an organ donor. Maybe make it opt out if numbers are important. But it cannot be publicly available without a way to say no.
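The sample-size point above can be sanity-checked with the standard formula for estimating a proportion at a given confidence level, plus the finite-population correction. This is a rough sketch; the population figure and margin of error are illustrative, not from the thread:

```python
import math

def sample_size(population: int, p: float = 0.5, z: float = 1.96,
                margin: float = 0.01) -> int:
    """People needed to estimate a proportion within `margin` at 95%
    confidence (z = 1.96). p = 0.5 is the worst case for variance."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# For a population of 10 million, a 1% margin of error needs
# roughly 9,600 respondents, i.e. under 0.1% of the population:
print(sample_size(10_000_000))
```

The required sample barely grows with the population, which is why a health study doesn't need everyone's records by default; it needs a representative sliver of consenting participants.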

Maeve@kbin.earth on 13 May 05:48 next collapse

Actually, it's time for data sales to be illegal. Not even opt-in.

sugar_in_your_tea@sh.itjust.works on 13 May 14:34 collapse

Exactly. The focus should be on data privacy, not on what technologies a service chooses to use.

oxysis@lemmy.blahaj.zone on 12 May 14:31 next collapse

Is it really, though? I haven’t touched it since the very early days of slop AI. That was before I learned how awful it is to real people.

But_my_mom_says_im_cool@lemmy.world on 12 May 15:40 collapse

They don’t mean directly. I guarantee that companies, service providers, etc. that you deal with do indeed use AI. That’s what I took the headline to mean. Some facet of everyone’s life uses AI now.

turtlesareneat@discuss.online on 12 May 17:59 collapse

Hell, AI has been making fully automated kill-chain decisions for 5 years now. Yes, it’s in everything.

DarthObi@feddit.org on 12 May 20:38 next collapse

You don’t need AI. There are enough porn sites with real humans.

ICastFist@programming.dev on 13 May 00:08 collapse

And lots of hentai for stuff that is humanly impossible

fxdave@lemmy.ml on 12 May 21:03 next collapse

The problem is not the tool. It’s the inability to use the tool without a third party provider.

blinx615@lemmy.ml on 12 May 23:45 collapse

Local is a thing. And models are getting smaller with every iteration.

WaitThisIsntReddit@lemmy.world on 12 May 22:48 next collapse

If there were an AI to detect AI, would you use it?

NotASharkInAManSuit@lemmy.world on 13 May 03:48 collapse

Yes. That is actually an ideal function of ethical AI. I’m not against AI for things it is actually beneficial for, where it can be used as a tool for understanding; I just don’t like it being used as a thief’s tool pretending to be a paintbrush or a typewriter. There are good and ethical uses for AI; art is not one of them.

underline960@sh.itjust.works on 13 May 00:17 next collapse

I doubt we’ll ever be offered a real opt-out option.

Instead I’m encouraged by the development of poison pills for the AI that are non-consensually harvesting human art (Glaze and Nightshade) and music (HarmonyCloak).

Loduz_247@lemmy.world on 13 May 01:40 next collapse

But do Glaze, Nightshade, and HarmonyCloak really work to prevent that information from being used? At first they may be effective, but then ways around those barriers will be found, the software will have to be updated, and only the side with the most money will win.

underline960@sh.itjust.works on 13 May 03:05 collapse

AI is a venture capital money pit, and they are struggling to monetize before the hype dies out.

If the poison pills work as intended, investors will stop funding “creative” AI when the new models stop getting better (and sometimes get worse) because they’re running out of clean content to steal.

Loduz_247@lemmy.world on 13 May 03:59 collapse

AI has been around for many years, dating back to the 1960s. It’s had its AI winters and AI summers, but now it seems we’re in an AI spring.

But the amount of poisoned data is minuscule compared to the data that isn’t poisoned. As for data, what data are we referring to: everything in general or just data that a human can understand?

T156@lemmy.world on 13 May 03:19 next collapse

Remind me in 3 days.

Although poison pills are only so effective, since it’s a cat-and-mouse game: they only really work against a specific version of a model, and other models work around them.

Zenith@lemm.ee on 13 May 04:00 collapse

I’ve deleted pretty much all social media; I’m down to only Lemmy. I only use my home PC for gaming, like Civ or Cities: Skylines, or search engines for things like travel plans. I’m trying to be as offline as possible because I don’t believe there’s any other way to opt out, and I don’t believe there ever will be. Opting out of the internet is practically impossible, and AI will get to that point as well.

smarttech@lemmy.world on 13 May 05:07 next collapse

AI is everywhere now, but having the choice to opt out matters. Sometimes using tools like Instant Ink isn’t about AI; it’s just about saving time and making printing easier.

Maeve@kbin.earth on 13 May 05:46 next collapse

It should be opt in

RvTV95XBeo@sh.itjust.works on 13 May 05:55 next collapse

If AI is going to be crammed down our throats, can we at least hold it (aka the companies pushing it) liable for providing blatantly false information? At least then they’d have an incentive to provide accurate information instead of just authoritative-sounding information.

Womble@lemmy.world on 14 May 05:51 collapse

As much as you can hold a computer manufacturer responsible for buggy software.

KeenFlame@feddit.nu on 13 May 08:11 next collapse

Ah yes, the “freedom” the USA has spread all over its own country and other nations… Yes, of course we must protect that freedom, which is of course the freedom of people to avoid getting owned by giant corporations. We must protect the freedom of giant corporations not to give us AI if they want to. I don’t disagree, but I think people are more important.

Irelephant@lemm.ee on 13 May 14:34 next collapse

You can opt-out by deleting your accounts on corporate social networks.

iknowitwheniseeit@lemmynsfw.com on 13 May 14:50 next collapse

Can you? When all businesses start using AI for customer interaction…

Don_alForno@feddit.org on 14 May 06:28 collapse

Haha, no you can’t.

backgroundcow@lemmy.world on 14 May 05:08 next collapse

I very much understand wanting to have a say against our data being freely harvested for AI training. But this article’s call for a general opt-out of interacting with AI seems a bit regressive. Many aspects of this and other discussions about the “AI revolution” remind me of the Mitchell and Webb sketch on the start of the Bronze Age: youtu.be/nyu4u3VZYaQ

FriendBesto@lemmy.ml on 14 May 05:37 next collapse

It is not hard. One just has to be committed enough.

PalimpsestNavigator@midwest.social on 14 May 06:11 next collapse

WUH OH, A COTTON GIN’S TRYNA REPLACE US!!

🥴

<img alt="" src="https://midwest.social/pictrs/image/56f235c6-46ec-40fe-ad2d-14d1f2c8eb84.jpeg">

Shayeta@feddit.org on 14 May 10:06 collapse

Nope, screw opt-out. OPT-IN ONLY; I want it to be disabled by default.