Apple to Analyze User Data on Devices to Bolster AI Technology. (machinelearning.apple.com)
from Tea@programming.dev to technology@lemmy.world on 14 Apr 20:52
https://programming.dev/post/28618344

#technology

threaded - newest

hobovision@lemm.ee on 14 Apr 21:31 next collapse

Apple is the best on privacy though right?

huppakee@lemm.ee on 14 Apr 21:49 next collapse

Yes they have said so themselves

Reyali@lemm.ee on 14 Apr 22:05 collapse

Tell me you didn’t read the article without telling me you didn’t read the article.

The entire thing is explaining how they are upholding privacy to do this training.

  1. It’s opt-in only (if you don’t choose to share analytics, nothing is collected).
  2. They use differential privacy (adding noise so they get trends, not individual data).
  3. They developed a new method to train on text patterns without collecting actual messages or emails from devices. (link to research on arXiv)
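The differential privacy in point 2 boils down to perturbing each reported value with calibrated random noise before it leaves the device. A toy sketch using the classic Laplace mechanism (illustrative only; Apple's actual system uses local differential privacy with its own encoding and parameters, not this exact scheme):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via inverse CDF of a uniform draw."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(true_count: int, epsilon: float = 1.0) -> float:
    """Release a count with Laplace noise. A count has sensitivity 1,
    so scale = 1 / epsilon gives epsilon-differential privacy."""
    return true_count + laplace_noise(1.0 / epsilon)

# Each device reports a noisy value, not the true one.
noisy = private_count(10_000, epsilon=1.0)
```

Aggregated over millions of users the noise averages out, so trends survive, but no single user's true contribution can be pinned down from their report.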
MurrayL@lemmy.world on 14 Apr 22:10 next collapse

Right. There’s plenty to criticise Apple for, both in general and for chasing the AI trend, but looking at it purely in terms of user privacy within AI features they’re miles ahead of the competition.

hobovision@lemm.ee on 14 Apr 23:43 next collapse

I had scanned through it, and it looked like the exact same stuff that Google and Microsoft say. Paraphrasing: “we value your privacy” “we’re de-identifying your data” “the processing occurs on-device”…

Apple probably is better on privacy than other big tech corpos, but it’s a race to the bottom, and they’re definitely participating in the race.

dependencyinjection@discuss.tchncs.de on 15 Apr 06:29 collapse

But it’s opt-in. So only people who choose to.

deleted@lemmy.world on 15 Apr 07:53 next collapse

I wanted to know the battery cycle for my iPhone 7 in 2019 and the only way was to enable analytics and diagnostics data collection with Apple.

Thankfully, now it’s in the settings.

Nalivai@lemmy.world on 15 Apr 18:48 collapse

I call bullshit. They might say they’re opt-in, but I bet they have some way to use the personal data that technically doesn’t violate the very specific wording of the rule.

dependencyinjection@discuss.tchncs.de on 15 Apr 19:04 collapse

Rather than betting on conjecture you might do well to search for blog posts of security researchers that test these kinds of claims. Then you could have posted that instead of going on feelings.

Like this.

There’s a lawsuit based on research by Mysk, though I’m not sure of its outcome, or whether it distinguishes between data collection for selling to data brokers vs data collection to improve the user experience. As a developer myself, we collect data to help us understand our software better, with no intention of doing anything else with it.

Another one appears to be about a flaw in how they anonymise data, specifically local differential privacy.

So it does appear that there are claims about opt-in, but I didn’t see anything concrete in my cursory look and I’m not afraid to post articles attacking my own point.

deleted@lemmy.world on 15 Apr 07:51 collapse

To be honest, it’s important to the point that it should be in the title, since privacy is the selling point for Apple.

Reyali@lemm.ee on 15 Apr 21:04 collapse

Yeah, that’s on OP. The article is actually titled, “Understanding Aggregate Trends for Apple Intelligence Using Differential Privacy.”

CompactFlax@discuss.tchncs.de on 14 Apr 22:18 next collapse

Ben Thompson has been saying that they need to collect user data (like Google) for a decade.

It seems the botched Apple Intelligence release changed some minds, a little bit.

Salvo@aussie.zone on 15 Apr 00:34 collapse

That still doesn’t give them the right to mine the data that their users entrusted to them through a paid service.

It doesn’t matter how anonymised their harvesting is, they had an agreement with their subscribers not to invade their privacy like this.

We are better off with an LLM that doesn’t work than with one built by abusing the data entrusted to them by their users.

It won’t be long until the LLM bubble bursts and we all laugh about how stupid we were to think they had any use whatsoever.

CompactFlax@discuss.tchncs.de on 15 Apr 02:05 next collapse

I guess you didn’t see the several points in the article where they make it clear that it is “opt in”?

I do look forward to the bursting of the LLM bubble, but the article isn’t just about LLMs.

discuss.tchncs.de/comment/17767086

Salvo@aussie.zone on 15 Apr 02:25 collapse

Is this the same “Opt-In” as keeping Apple Intelligence disabled between software updates?

Apple are haemorrhaging a lot of hard-earned goodwill every time they try to move forward with their own AI.

Ledericas@lemm.ee on 15 Apr 09:32 collapse

Much like what Google and MS are doing

Salvo@aussie.zone on 15 Apr 10:17 collapse

Anyone who uses gMail knows (or should know) that their data is being used for commercial purposes. Any business that uses Google.Business or MS Office should also be aware that they are giving away all their corporate secrets, regardless of any “Opt-In”/“Opt-Out” broken promises.

Ledericas@lemm.ee on 15 Apr 09:31 collapse

They already admitted they aren’t generating profit from it

Salvo@aussie.zone on 15 Apr 10:15 collapse

If they get Apple Intelligence into a functional form (and not an embarrassing, hilarious punchline in an anecdote), they will be profiting off my data.

They can claim that it is Opt-In only (until a bug in the next software update ‘accidentally’ changes my Opt-out status) and they can anonymise my data, but that still doesn’t change the fact that they implied that they wouldn’t use my data.

At least their user abuse is still less than Mozilla’s, and Google threw out the “Don’t be Evil” motto decades ago…

plz1@lemmy.world on 15 Apr 01:30 next collapse

It would be nice if they actually fixed the stability issues in Apple Intelligence before they start adding more layers of slop to it. Writing tools summarization has been broken off and on since it launched.

LordCrom@lemmy.world on 15 Apr 04:47 next collapse

Holy crap, this is really intrusive. It’s opt in, but who would opt in to this harvesting at all?

Eggyhead@lemmings.world on 15 Apr 08:29 collapse

Opt in means they’re building up the infrastructure to make it opt-out when nobody is looking.

taladar@sh.itjust.works on 15 Apr 11:12 collapse

And then “accidentally” lose the opt-out setting every other update or so.

fubarx@lemmy.world on 15 Apr 05:08 next collapse

Was working on a simulator and needed random interaction data. Statistical randomness didn’t capture likely scenarios (bell curves and all that). Switched to LLM synthetic data generation. Seemed better, but wait… seemed off 🤔. Checked it for clustering and entropy vs human data. JFC. Waaaaaay off.

Lesson: synthetic data for training is a Bad Idea. There are no shortcuts. Humans are lovely and messy.
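The kind of check described above can be as simple as comparing the empirical Shannon entropy of real vs generated event streams (a hypothetical sketch; the commenter’s actual simulator and data are unknown, so the event names here are made up):

```python
import math
from collections import Counter

def shannon_entropy(events: list[str]) -> float:
    """Shannon entropy (bits) of the empirical event distribution."""
    counts = Counter(events)
    total = len(events)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical interaction logs: humans are messy, a naive generator is not.
human = ["tap", "scroll", "tap", "back", "scroll", "tap", "search", "back"]
synthetic = ["tap", "scroll", "tap", "scroll", "tap", "scroll", "tap", "scroll"]

print(shannon_entropy(human))      # higher: varied behaviour
print(shannon_entropy(synthetic))  # lower: suspiciously regular
```

A big entropy gap (or overly tight clustering) is a quick red flag that the synthetic data is too regular to stand in for real users.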

Viri4thus@feddit.org on 15 Apr 06:28 collapse

Oh look, it’s the second shoe dropping.