Generative AI Is the Newest Tool in the Dictator's Handbook (gizmodo.com)
from FlyingSquid@lemmy.world to technology@lemmy.world on 04 Oct 2023 10:41
https://lemmy.world/post/6276101

#technology


autotldr@lemmings.world on 04 Oct 2023 10:50

This is the best summary I could come up with:


When protests in Pakistan earlier this year escalated into clashes between pro-government forces and supporters of former Prime Minister Imran Khan, the now-imprisoned leader turned to social media to bolster his message.

“This use of AI also masks the role of the state in censorship and may ease the so-called digital dictator’s dilemma, in which undemocratic leaders must weigh the benefits of imposing online controls against the costs of public anger at such restrictions,” the report adds.

In other cases, state actors are reportedly turning to private “AI for hire” companies that specialize in creating AI-generated propaganda intended to mimic real newscasters.

The Freedom House researchers see these novel efforts to generate deepfake newscasters as a technical and tactical evolution of governments forcing or paying news stations to push propaganda.

“These uses of deepfakes are consistent with the ways in which unscrupulous political actors have long employed manipulated news content and social media bots to spread false or misleading information,” the report notes.

Though the majority of political manipulation and disinformation efforts discovered by Freedom House in the past year still primarily rely on lower tech deployment of bots and paid trolls, that equation could flip as generative AI tools continue to become more convincing and drop in price.


The original article contains 848 words, the summary contains 209 words. Saved 75%. I’m a bot and I’m open source!

anticommon@sh.itjust.works on 04 Oct 2023 17:58

I read this as "director." I was immensely confused as I read further.

bionicjoey@lemmy.ca on 04 Oct 2023 22:50

I think the SGA will make sure that isn’t the case