Propagandists are using AI too—and companies need to be open about it (www.technologyreview.com)
from BodaciousMunchkin@links.hackliberty.org to technology@lemmy.world on 18 Jun 2024 16:34
https://links.hackliberty.org/post/1871071

#technology

autotldr@lemmings.world on 18 Jun 2024 16:35

This is the best summary I could come up with:

OpenAI had caught five networks of covert propagandists, including players from Russia, China, Iran, and Israel, using its generative AI tools for deceptive tactics that ranged from creating large volumes of social media comments in multiple languages to turning news articles into Facebook posts.

The transparent disclosure that this has begun to happen—and that OpenAI has prioritized detecting it and shutting down accounts to mitigate its impact—shows that at least one large AI company has learned something from the struggles of social media platforms in the years following Russia’s interference in the 2016 US election.

Perhaps most important, Meta's report included a direct set of “recommendations for stronger industry response” that called on governments, researchers, and other technology companies to collaboratively share threat intelligence to help disrupt the ongoing Russian campaign.

The Meta report’s call for threat sharing and collaboration, although specific to a Russian adversary, highlights a broader path forward for social media platforms, AI companies, and academic researchers alike.

While OpenAI’s first report offered high-level summaries and select examples, an important next step is expanding data-sharing relationships with researchers to give them more visibility into adversarial content and behaviors.

In our own research, we’ve seen communities of Facebook users proactively call out AI-generated image content created by spammers and scammers, helping those who are less aware of the technology avoid falling prey to deception.

The original article contains 1,573 words; the summary contains 223 words, saving 86%. I’m a bot and I’m open source!