Monsters: AI conspiracies flood TikTok

WASHINGTON - From vampires and wendigos to killer asteroids, TikTok users are pumping out outlandish end-of-the-world conspiracy theories, researchers say, in yet another misinformation trend on a platform whose fate in the United States hangs in the balance. In the trend reported by the nonprofit Media Matters, TikTok users seek to monetize viral videos that make unfounded claims about the US government secretly capturing or preserving mythical monsters that include -- wait for it -- King Kong. It is the latest illustration of misinformation swirling on the platform -- a stubborn issue that has been largely absent in recent policy debates as US lawmakers mull banning the Chinese-owned app on grounds of national security.

Often accompanied by spooky background music, the videos -- many of which garner millions of views -- feature imperious AI-generated voices, sometimes mimicking celebrities. “We are all probably going to die in the next few years. Did you hear about this?” said a voice impersonating podcaster Joe Rogan in one viral video. “There’s this asteroid that is on a collision course with Earth,” the voice claims, citing information leaked by a government official who stumbled upon a folder titled “Keep secret from the public.” At least one account peddling that video appeared to be deactivated after AFP reached TikTok for comment.

Conspiracy theory videos, often posted by anonymous accounts, typically had the tell-tale signs of AI-generated images such as extra fingers and distortions, said TikTok misinformation researcher Abbie Richards. According to Richards, peddling such theories can be financially rewarding, with TikTok’s “Creativity Program” designed to pay creators for content generated on the platform. It has spawned what she called a cottage industry of conspiracy theory videos powered by artificial intelligence tools, including text-to-speech applications that are widely -- and freely -- available online.

A TikTok spokeswoman insisted that “conspiracy theories are not eligible to earn money or be recommended” in user feeds. “Harmful misinformation is prohibited, with our safety teams removing 95 percent of it proactively before it’s reported,” she told AFP. Still, tutorials on platforms such as YouTube show users how to create “viral conspiracy theory videos” and profit off TikTok’s Creativity Program. One such tutorial openly instructed users to start by making up “something outrageous” such as “scientists just got caught hiding a saber-toothed tiger.”

“Financially incentivizing content that is both highly engaging and cheap to manufacture creates an environment for conspiracy theories to thrive,” Richards wrote in the Media Matters report. Such concerns, driven by rapid advancements in AI, are particularly high in a year of major elections around the world.