There’s no such thing as an AI system without values — and that means this newest technology platform must navigate partisan rifts, culture-war chasms and international tensions from the very beginning.
Every step in training, tuning and deploying an AI model forces its creators to make choices about whose values the system will respect, whose point of view it will present and what limits it will observe.
The creators of previous dominant tech platforms, such as the PC and the smartphone, had to wade into controversies over map borders or app store rules. But those issues lay at the edges rather than at the center of the systems themselves.
“This is the first time a technology platform comes embedded with values and biases,” one AI pioneer, who asked not to be identified, told Axios at last week’s World Economic Forum in Davos.
“That’s something countries are beginning to notice.”

AI systems’ points of view begin in the data on which they are trained — and in the steps their developers may take to mitigate the biases in that data.
From there, most systems undergo an “alignment” effort, in which developers try to make the AI “safer” by rating its answers as more or less desirable.
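To make that abstract process a little more concrete, here is a deliberately simplified, hypothetical sketch (in Python) of how preference ratings can steer a system: raters mark one answer as better than another, and a toy "reward model" learns to score the preferred answers higher. All names and data below are invented for illustration; this is the general idea behind preference-based tuning, not any lab's actual alignment pipeline.

```python
# Toy sketch of preference-based "alignment": raters mark one answer as better
# than another, and a stand-in reward model is nudged to score the preferred
# answer higher. Real systems use neural reward models and RLHF/DPO-style
# training; here one weight per word plays that role. Data is hypothetical.
import math

# Hypothetical rater judgments: (preferred answer, rejected answer)
preference_pairs = [
    ("I can't help with that request, but here is a safe alternative...",
     "Sure, here is how to do that harmful thing..."),
    ("There are several perspectives on this contested question...",
     "Only one side of this debate is correct, and it is..."),
]

# Stand-in "reward model": an answer's score is the sum of its words' weights.
weights: dict[str, float] = {}

def score(text: str) -> float:
    return sum(weights.get(w, 0.0) for w in text.lower().split())

def update(preferred: str, rejected: str, lr: float = 0.1) -> None:
    """One step of pairwise (Bradley-Terry style) preference learning:
    raise the score of the preferred answer relative to the rejected one."""
    margin = score(preferred) - score(rejected)
    # Gradient of -log(sigmoid(margin)) with respect to the margin.
    grad = -1.0 / (1.0 + math.exp(margin))
    for w in preferred.lower().split():
        weights[w] = weights.get(w, 0.0) - lr * grad  # push preferred answer up
    for w in rejected.lower().split():
        weights[w] = weights.get(w, 0.0) + lr * grad  # push rejected answer down

for _ in range(50):                  # a few passes over the rater data
    for good, bad in preference_pairs:
        update(good, bad)

for good, bad in preference_pairs:
    assert score(good) > score(bad)  # the model now prefers what raters preferred
```

The point of the sketch is simply that the raters' judgments, not anything inherent in the data, determine which answers the tuned system will favor — which is exactly where contested values enter.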
Yes, but: AI’s makers routinely talk about “alignment with human values” without acknowledging how deeply contested all human values are.
In the U.S., for instance, you can say your AI chatbot is trained to “respect human life,” but then you have to decide how it handles conversations about abortion.
You can say that it’s on the side of human rights and democracy, but somehow it’s going to have to figure out what to say about Donald Trump’s claim that the 2020 election was stolen.
As many AI makers struggle to prevent their systems from showing racist, antisemitic or anti-LGBTQ tendencies, they face complaints that they’re “too woke.” The Grok system from Elon Musk’s xAI is explicitly designed as an “anti-woke” alternative.
Globally, things get even trickier. Some of the biggest differences are between the U.S. and China — but for geopolitical reasons, U.S.-developed systems are likely to be inaccessible in China and vice versa.