OpenAI said on Thursday that it had identified and disrupted five online campaigns that used its generative artificial intelligence technologies to deceptively manipulate public opinion around the world and influence geopolitics.
The efforts were run by state actors and private companies in Russia, China, Iran and Israel, OpenAI said in a report about covert influence campaigns. The operations used OpenAI’s technology to generate social media posts, translate and edit articles, write headlines and debug computer programs, typically to win support for political campaigns or to swing public opinion in geopolitical conflicts. ...
Like Google, Meta and Microsoft, OpenAI offers online chatbots and other A.I. tools that can write social media posts, generate photorealistic images and write computer programs. In its report, the company said its tools had been used in influence campaigns that researchers had tracked for years, including a Russian campaign called Doppelganger and a Chinese campaign called Spamouflage.
The Doppelganger campaign used OpenAI’s technology to generate anti-Ukraine comments that were posted on X in English, French, German, Italian and Polish, OpenAI said. The company’s tools were also used to translate and edit articles that supported Russia in the war in Ukraine into English and French, and to convert anti-Ukraine news articles into Facebook posts. ...
See the full story here: https://www.nytimes.com/2024/05/30/technology/openai-influence-campaigns-report.html