AI and Democracy
Just a quick thought prompted by my reading of a talk given by Allison Stanger at the Santa Fe Institute’s 2019 Fall Symposium. First she gives us a nice quote from Hannah Arendt’s “The Human Condition,” in which Arendt says the question is not …
“whether we are the masters or slaves of our machines, but whether machines still serve the world and its things or if, on the contrary, they and the automatic motion of their processes have begun to rule and even destroy the world and its things.”
This quote obviously resonates with those of us struggling to understand the economy as we hurtle into the digital age.
The central issue as Stanger describes it is that the algorithms built on big data, those that increasingly drive chunks of how we deal with each other, rest on a sort of collectivism. They rely on groups. They rely on large-scale analysis and the identification of structures detected in seas of data. They abstract away the individual.
Contrast that with the underlying assumption of western liberal democracy which is intensely individual. The people who gave us the intellectual tradition upon which liberal democracy is based tried to recapture from the ancient past the notion of an individual, a citizen, from which to construct the political realm. This was in order to overturn the prior existing oppression — or what was perceived as oppression — exerted by the traditional aristocratic, religious, and military elites. That oppression was a consequence of notions of group privileges. So the reaction had to bring the individual to the fore.
So the rights of the individual became paramount.
Let’s set aside that this is a decidedly truncated version of history. The principle is what matters: there is a clash between the underlying values of big data and of liberal democracy.
Perhaps this is why people like those who run Facebook have such a hard time dealing with privacy. It is simply something that doesn’t, cannot, exist in their realm. They have had to abolish it in order to establish their business model. Their machines trample over us as a consequence.
And we have no way to resist, because our value system is protected only by the institutions we designed for that purpose, in particular the nation state and its panoply of laws, and those institutions cannot reach a global firm. The rule of law and the rule of big data have very different foundational principles. Facebook thinks of itself as apart from the rule of law established in our republic. It stands outside the law of any single state. It is a global organization answerable to no one. All that matters is its profitability. And to produce profits it has to ignore the tenets of liberal democracy. It has, necessarily, an illiberal worldview.
Our sometimes ambiguous relationship with machinery has always been based on a conflict of values. Machines embody past information in order to preserve it and make it useful into the future. They do this reliably and repetitively. In so doing they replace, or improve upon, the human element in production. This magnifies our productivity and, ultimately, our prosperity. But the intelligence embodied in the older machines always existed before the machine did.
This is not quite true with artificial intelligence. Nowadays our machines seek to create intelligence. They have a realm of their own. And that realm is based on a worldview at odds with democracy and its basic individual values.
As Stanger points out, this is why authoritarian regimes have taken to big data like fish to water. They begin unencumbered by concerns for the individual. So the Chinese are quite content building all sorts of surveillance machines, whereas we in the West are much more ambivalent.
Yet because of the degradation of our public sphere and our decades-long surrender of socio-economic decision making to the corporate world, we find ourselves under surveillance with our institutions unable to protect us.
I don’t want us to get bogged down in a pedantic discussion of this observation. The basic principle as expressed by Stanger seems to have great importance: the individualistic basis of liberal democracy appears to be in conflict with the group/big-data basis of artificial intelligence.
How do we deal with this?
Addendum:
The Stanger talk can be found in:
Complexity Economics: Proceedings of the Santa Fe Institute’s 2019 Fall Symposium (Dialogues of the Applied Complexity Network), edited by W. Brian Arthur, Eric Beinhocker, and Allison Stanger.
Hannah Arendt’s “The Human Condition” was published in 1958 so it predates the arrival of big data and its disruption of the privacy of the individual. Her discussion of the emergence and separation of the public and private spheres of life, however, seems remarkably contemporary given that arrival.