Problems with Orion model forces OpenAI to change strategies (www.techzine.eu)
from Joker@sh.itjust.works to technology@lemmy.world on 11 Nov 2024 13:05
https://sh.itjust.works/post/27935815

#technology

threaded - newest

Imgonnatrythis@sh.itjust.works on 11 Nov 2024 13:16 next collapse

Training AI on “synthetic” data generated from other AIs sounds genius! Seems like a bulletproof way to make AI infinitely smarter just by recursively feeding itself! Great success is on the horizon!

itsathursday@lemmy.world on 11 Nov 2024 13:34 next collapse

I like to call it “saving the red JPEG”. One more save will surely make it better.

FaceDeer@fedia.io on 11 Nov 2024 15:12 next collapse

That's not how synthetic data generation generally works. It uses AI to process existing data sources, producing well-formed training data from material that isn't useful directly, not generating it entirely from its own imagination.

The comments assuming otherwise are ironic, since that's misinformation people keep repeating to each other.

Voroxpete@sh.itjust.works on 11 Nov 2024 15:18 collapse

It’s been shown that even small amounts of synthetic data injected into a training set quickly lead to a phenomenon termed “model collapse”, though I prefer the term “Habsburg AI” (not mine).

Basically, this is the kind of thing you announce you’re doing because it will hopefully get you one more round of investment funding while Sam Altman finishes working out how to fake his death.
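The degradation these commenters describe can be illustrated with a toy sketch (an assumed Gaussian-resampling example, not anything OpenAI actually does): each "generation" fits a distribution to samples drawn from the previous generation's fit, and the estimated variance steadily collapses toward zero.

```python
import numpy as np

def collapse_demo(generations: int = 500, n: int = 100, seed: int = 0) -> list:
    """Toy 'Habsburg AI' loop: each generation refits a Gaussian to
    samples drawn from the previous generation's fitted Gaussian."""
    rng = np.random.default_rng(seed)
    mu, var = 0.0, 1.0            # generation 0: the real data distribution
    variances = [var]
    for _ in range(generations):
        # draw "synthetic" data from the current model
        samples = rng.normal(mu, np.sqrt(var), size=n)
        # refit on that synthetic data (MLE estimates)
        mu, var = samples.mean(), samples.var()
        variances.append(var)
    return variances

variances = collapse_demo()
print(f"gen 0 variance: {variances[0]:.3f}")
print(f"gen 500 variance: {variances[-1]:.6f}")
```

The shrinkage comes from the biased MLE variance estimate compounding across generations: each refit loses a little spread on average, and there is no fresh real data to restore it.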


_sideffect@lemmy.world on 11 Nov 2024 14:02 collapse

Like photocopying a copy over and over again