Entering the age of artificial truth
... AI researchers Maggie Harrison and Jathan Sadowski have each drawn attention to what the latter cleverly termed “Habsburg AI,” which appears when AI-generated information is fed back into another AI program on a loop. What results is a sort of information “inbreeding” that drives the AI mad, causing it to spew abominations of data. Yet even absent these conditions, human influence on the information filtering process creates opportunities for additional forms of distortion. ...
Earlier this month, I published a study describing how disinformation made its way into trusted sources and shaped the consensus to invade Iraq in 2003. If available at the time, AI-powered news filters could have further reinforced that narrative and stifled or altogether silenced opposition. Such a predicament emerged during the COVID-19 pandemic and the 2020 presidential election, as social media platforms banned what they considered suspect reports that wound up being true. Society’s insatiable demand for rapid and continuous information access has also become a lucrative market that large language models are perfectly suited to exploit. ...
If these practices are not curbed, they could produce a Tower of Babel effect by creating an online ecosystem of self-replicating fictions. Americans read fewer books, have less faith in the news, view higher education as less important and rely more than ever on TikTok for their news, all of which makes the modern world fertile ground for algorithmic manipulation. Making matters worse, traditional checks on specious information — such as expert knowledge, reputable publishing agencies and hard news sources — have lost much of their influence. ...
AI’s threat to society therefore looks less like James Cameron’s vision of a cyborg Armageddon and more like a hopelessly polluted information environment in which everything is disputed and meaningful communication is impossible. ...
If Washington and Silicon Valley wade into the age of artificial truth without a clear strategy for managing its risks, America could end up drowning in a sea of incoherence.
Capt. Michael P. Ferguson, U.S. Army, is a Ph.D. student in the Department of History at the University of North Carolina at Chapel Hill. He is coauthor of “The Military Legacy of Alexander the Great: Lessons for the Information Age.”
See the full story here: https://thehill.com/opinion/technology/4172906-entering-the-age-of-artificial-truth/