Meta says European regulators are ruining its AI bot (www.theverge.com)
from Stopthatgirl7@lemmy.world to technology@lemmy.world on 15 Jun 01:48
https://lemmy.world/post/16541148

Meta is putting plans for its AI assistant on hold in Europe after receiving objections from Ireland’s privacy regulator, the company announced on Friday.

In a blog post, Meta said the Irish Data Protection Commission (DPC) asked the company to delay training its large language models on content that had been publicly posted to Facebook and Instagram profiles.

Meta said it is “disappointed” by the request, “particularly since we incorporated regulatory feedback and the European [Data Protection Authorities] have been informed since March.” Per the Irish Independent, Meta had recently begun notifying European users that it would collect their data and offered an opt-out option in an attempt to comply with European privacy laws.

#technology

threaded - newest

autotldr@lemmings.world on 15 Jun 01:50 next collapse

This is the best summary I could come up with:


Meta is putting plans for its AI assistant on hold in Europe after receiving objections from Ireland’s privacy regulator, the company announced on Friday.

Meta said it will “continue to work collaboratively with the DPC.” But its blog post says that Google and OpenAI have “already used data from Europeans to train AI” and claims that if regulators don’t let it use users’ information to train its models, Meta can only deliver an inferior product.

“We are pleased that Meta has reflected on the concerns we shared from users of their service in the UK, and responded to our request to pause and review plans to use Facebook and Instagram user data to train generative AI,” Stephen Almond, the executive director of regulatory risk at the UK Information Commissioner’s Office, said in a statement.

The DPC’s request followed a campaign by the advocacy group NOYB — None of Your Business — which filed 11 complaints against Meta in several European countries, Reuters reports.

NOYB founder Max Schrems told the Irish Independent that the complaint hinged on Meta’s legal basis for collecting personal data.

“Meta is basically saying that it can use any data from any source for any purpose and make it available to anyone in the world, as long as it’s done via AI technology,” Schrems said.


The original article contains 354 words, the summary contains 217 words. Saved 39%. I’m a bot and I’m open source!

Hobbes_Dent@lemmy.world on 15 Jun 02:07 next collapse

Off-topic but that red and blue illustration is an eye trip.

can@sh.itjust.works on 15 Jun 04:12 next collapse

And ostensibly a human approved it

littlewonder@lemmy.world on 16 Jun 03:15 collapse

It’s wild to me considering organizations are more aware of web accessibility than ever.

lets_get_off_lemmy@reddthat.com on 15 Jun 02:13 next collapse

Tough shit Zuck

goatsarah@thegoatery.dyndns.org on 15 Jun 02:28 next collapse

@Stopthatgirl7 if they paid attention to the opt-out, they wouldn’t have asked me three times after I opted out the first time.

tal@lemmy.today on 15 Jun 02:39 next collapse

asked the company to delay training its large language models on content that had been publicly posted to Facebook and Instagram profiles.

I think that there are definitely issues with mass data-mining of that data. For generative AIs being trained on image data, I don’t really care – I think the concerns there are hugely overblown. But it’s also possible to do things like build a mass facial recognition database from image data, and I’m pretty sure that text processing is also an issue.

However.

The problem is that this applies to anyone. Like, I am confident that someone has gone out and scraped publicly-available data from Facebook and similar before. I know that someone has dumped Reddit comment and post history; you can download those. And I am very confident that someone is, right now (or will be, if the Threadiverse gets big enough), dumping my comment data here and doing all kinds of processing on it.

That is, I don’t think that Meta is the issue here. Meta would be the issue if processing private data were the issue, because only Meta and a limited set of users have access to that. The problem here is people posting publicly-accessible data that can potentially be used in ways that they might not want, potentially without understanding the implications of doing so. Meta is only responsible there in that it maybe encourages users to do so, to have profile photos or whatnot.

And…I don’t really have a great fix for that. Like, I think – like most people here, obviously, as every single user I’ve seen on the Threadiverse uses a pseudonym – that pseudonymity is at least a partial fix. There isn’t a (direct) link to a real-world identity; someone would have to go to the work of deanonymizing account data. Few if any people on the Threadiverse seem to have a real profile photo. I use a swirl of water. So…that helps, because someone can’t trivially link that data to data elsewhere.

I don’t have any problem with someone training a model on information that I’ve posted publicly and just using it like a “better search engine”, the way people are now. That’s pretty low on my list of concerns.

But lemme give some concerns that might apply…and these aren’t really primarily about generating chatbots. One thing that you can do with text classifiers – which I think a lot of people out there don’t realize – is to search for and find correlations in text. Like, the Federalist Papers, important documents about the US Constitution, were written by a few of the Founding Fathers under pseudonyms. Nearly two centuries later, people went back and did statistical analysis of their word frequencies to attribute the disputed papers to their authors. The same kind of analysis can potentially deanonymize people today.
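
Just to make it concrete, here’s a rough Python sketch of that kind of frequency-based authorship comparison. The candidate-author texts and the “disputed” text are made-up placeholders, and a real study would use far more text and features; it’s only meant to show how little machinery the basic idea needs.

```python
# Toy stylometry sketch: guess which candidate author wrote a "disputed" text
# by comparing function-word frequencies. All texts here are made-up
# placeholders; a real analysis needs far more data and features.
import math
import re
from collections import Counter

FUNCTION_WORDS = ["the", "of", "to", "and", "in", "that", "upon", "while", "whilst"]

def profile(text):
    """Relative frequency of each function word in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(a, b):
    """Cosine similarity between two frequency vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

known = {
    "author_a": "it is upon the whole to be observed that the people of the states",
    "author_b": "while the states retain their powers, and whilst the union endures",
}
disputed = "upon that ground the argument of the states must rest"

scores = {name: cosine(profile(text), profile(disputed)) for name, text in known.items()}
best = max(scores, key=scores.get)
print(f"closest profile: {best}  (similarities: {scores})")
```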

You can also extract a lot of information about someone from their text. Some of it humans can do, like “if someone uses inches, they’re probably from the US”. But you can do that en masse, probably get a pretty good location on someone, and identify regional slang and local spellings and such.
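
As a toy illustration of doing that en masse (the regex patterns, region labels, and sample comments below are all invented for the example, not anyone’s real tooling):

```python
# Toy example: scan comments for crude regional signals. Patterns, labels,
# and sample comments are invented for illustration only.
import re

SIGNALS = {
    "likely_US": [r"\b\d+\s*(?:inches|inch|feet|fahrenheit)\b", r"\bcolor\b", r"\bsidewalk\b"],
    "likely_UK": [r"\bcolour\b", r"\bfortnight\b", r"\bpavement\b"],
    "likely_AU_NZ": [r"\barvo\b", r"\bute\b"],
}

def regional_hints(comment):
    """Count pattern hits per region label for one comment."""
    text = comment.lower()
    return {
        region: sum(len(re.findall(p, text)) for p in patterns)
        for region, patterns in SIGNALS.items()
    }

comments = [
    "The new desk is about 30 inches tall, what color should I paint it?",
    "Spent the arvo fixing the ute, back in a fortnight.",
]
for c in comments:
    print(regional_hints(c), "<-", c)
```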

There’s software that can guess someone’s gender from their comments and give a confidence estimate. You can probably – and I’m sure people have – train those classifiers to look for correlations with a lot of other things, like political views and such.
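
Here’s a minimal sketch of what such a classifier looks like, assuming scikit-learn is available. The labeled examples are invented stand-ins for whatever attribute someone might target, and a handful of comments is nowhere near enough to learn anything real; it only shows the mechanics, including the per-label confidence estimate.

```python
# Minimal sketch of an attribute classifier over comment text, using
# scikit-learn (assumed available). The labels are invented stand-ins and the
# dataset is far too small to learn anything real.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_comments = [
    "loved the match last night, what a goal",
    "the new tax proposal seems reasonable to me",
    "can't wait for the album drop this friday",
    "interest rates are going to crush the housing market",
]
train_labels = ["sports_fan", "politics_follower", "music_fan", "politics_follower"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(train_comments, train_labels)

new_comment = "the election coverage has been exhausting"
probs = clf.predict_proba([new_comment])[0]
for label, p in zip(clf.classes_, probs):
    print(f"{label}: {p:.2f}")  # confidence estimate per label
```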

I had a buddy working in the video game industry who had a game that extracted a bunch of “employability” characteristics. Play for about ten minutes, and it logs a bunch of data about the gameplay. They trained a classifier to look for correlations between gameplay actions and IQ and a whole host of other things – so play the game, and you’re handing over a lot of personal data about yourself. I would imagine that lots of video games could do that, and that selling that information to data brokers could become another revenue stream for games. Probably can do the same thing with comments.

And I’m not so sure that people who are posting material attached to their identity always realize just how much they might actually be revealing. Not necessarily information that Meta in particular might analyze, but information that they’re handing to the world, where any organization that wants to do data-mining on it can.

NeoNachtwaechter@lemmy.world on 15 Jun 04:49 next collapse

Long, but wrong :-)

my list of concerns.

It is not about your concerns, and it is not about concerns at all.

When they try to do forbidden things, someone is going to tell them, and if they do it anyway (as that whole ‘concerns’ attitude seems to suggest they would), then someone is going to give them what they deserve.

General_Effort@lemmy.world on 15 Jun 15:23 next collapse

What about it is wrong?

[deleted] on 15 Jun 21:25 collapse

.

General_Effort@lemmy.world on 15 Jun 15:25 collapse

But it’s also possible to do things like build a mass facial recognition database with image data,

Facebook built one years ago, but ended up destroying it. theverge.com/…/meta-facebook-face-recognition-aut…

tal@lemmy.today on 15 Jun 16:06 collapse

Thanks. That also kind of drives home the “I’m sure that third parties are scraping data and analyzing it too” thing:

Facebook’s decision won’t stop independent companies like Clearview AI — which built huge image databases by scraping photos from social networks, including Facebook — from using facial recognition algorithms trained with that data. US law enforcement agencies (alongside other government divisions) work with Clearview AI and other companies for facial recognition-powered surveillance.

Lost_My_Mind@lemmy.world on 15 Jun 02:46 next collapse

Good.

iAmTheTot@sh.itjust.works on 15 Jun 03:15 next collapse

Good.

LordWiggle@lemmy.world on 15 Jun 04:39 next collapse

Europe says Meta is ruining people’s privacy and rights. Meta complaining is like bank robbers complaining that guards and a locked safe are ruining their heist. That’s what those measures are for: keeping you in line.

NeoNachtwaechter@lemmy.world on 15 Jun 04:41 next collapse

offered an opt-out option in an attempt to comply with European privacy laws.

LMAO have they still not realized that any “opt-out” kind of coercion is forbidden now?

praise_idleness@sh.itjust.works on 15 Jun 05:52 next collapse

Meta is ruining the Internet.

Cosmicomical@lemmy.world on 15 Jun 07:12 collapse

Meta has ruined the internet

krysel@lemmy.ml on 15 Jun 16:39 collapse

Meta has ruined large parts of our society

MalReynolds@slrpnk.net on 15 Jun 05:59 next collapse

Narrator: They were lying about not using the data. They already had.

Eggyhead@kbin.run on 15 Jun 06:38 next collapse

Mafia: cops are ruining our extortion business!

Poor mafia.

riodoro1@lemmy.world on 15 Jun 08:44 next collapse

Yet the „dumb fucks” are still on the platforms.

_sideffect@lemmy.world on 15 Jun 14:00 next collapse

Ruining anything of Meta’s sounds like a positive to me

Hackworth@lemmy.world on 15 Jun 14:02 collapse

As long as no one messes with their open source contributions… (ditto for MS)

rottingleaf@lemmy.zip on 15 Jun 14:10 next collapse

Ruin it deeper, baby

billwashere@lemmy.world on 15 Jun 20:41 next collapse

Good.

Adalast@lemmy.world on 15 Jun 20:58 next collapse

![](https://i.kym-cdn.com/photos/images/newsfeed/001/881/867/6f6.jpg)

Peter_Arbeitslos@discuss.tchncs.de on 15 Jun 21:11 next collapse

Isn’t that the point of regulating shitty things?

barsquid@lemmy.world on 16 Jun 18:33 collapse

They’re used to the US system of regulations where they can just pay Congress a few tens of thousands.

raspberriesareyummy@lemmy.world on 15 Jun 21:28 next collapse

Get fucked, Facebook. Rot in pain, Zuckfuck.

barsquid@lemmy.world on 16 Jun 18:32 collapse

Get fucked, Meta.