The Vatican released a new document on artificial intelligence (AI) on Tuesday, warning that the technology could erode the trust that underpins societal structures and reduce individuals to mere “cogs in a machine.”
Earlier this month, Pope Francis, who has frequently been a target of misinformation and manipulated images, expressed concerns that AI could be “misused to manipulate minds.” The latest Vatican document expands on these concerns.
The text states, “AI-generated false media can gradually undermine the foundations of society.”
It was produced by two departments within the Vatican and received approval from the 88-year-old pontiff.
The document further asserts that as deepfakes prompt individuals to question the authenticity of all information, and as AI-generated misinformation diminishes trust in the content individuals encounter, societal polarization and conflict are likely to increase.
The implications of such widespread deception are significant: it fundamentally challenges the essence of humanity by dismantling the essential trust upon which cohesive societies are built.
The release comes three weeks after Meta, the parent company of Facebook, announced it would end its third-party fact-checking program in the United States and instead adopt a crowd-sourced approach to combating misinformation, similar to the method used by Elon Musk’s X (formerly Twitter).
During the G7 summit held in Italy last June, the Pope addressed the issue of technology, emphasizing that individuals should not allow algorithms to determine their future.
The document, entitled “Antiqua et Nova” (Ancient and New), examines the implications of artificial intelligence across various sectors, including the labor market, healthcare, and education.
“As in all areas where humans are called to make decisions, the shadow of evil also looms here,” it said.
“The moral evaluation of this technology will need to take into account how it is directed and used.”