The origin of the panopticon can be traced back to the famous English philosopher Jeremy Bentham, who, in the late 18th century, propounded the idea of a centralised arrangement as a principle in the design of prisons, factories, schools and hospitals. It allowed a single person to watch all the inmates, without the inmates being able to tell whether they were being watched at any given moment. In his book Discipline and Punish (1975), Michel Foucault – the French philosopher, historian and social theorist – furthered the concept of the panopticon and used it as a metaphor to illustrate the practice of mass surveillance, through which disciplinary societies control and exercise asymmetrical power over their citizens. He commented, ‘…the major effect of the Panopticon is to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power’.
AI surveillance technology is spreading at a faster rate to a wider range of countries than experts have commonly understood.
Cut to 2020. Digital and data surveillance is being enforced en masse by governments the world over, seemingly in the interest of protecting national security as well as thwarting terrorism, crime and social unrest. The issue came to sharp prominence in 2013, when Edward Snowden leaked thousands of American, Australian, British and Canadian intelligence files that revealed the scale of global surveillance carried out by the members of the UKUSA alliance. The very fact that Snowden was charged with espionage and theft of government property is indicative of the governments’ belief in their total ownership of the surveillance data. Since then, surveillance has only intensified and become all-pervasive. A report by the Carnegie Endowment for International Peace states that ‘at least seventy-five out of 176 countries globally are actively using AI technologies for surveillance purposes, including smart city/safe city platforms (fifty-six countries), facial recognition systems (sixty-four countries), and smart policing (fifty-two countries)’.
Our civic spaces are threatened like never before.
Article 19’s website defines civic space as ‘the place where individuals realise their rights. It is the freedom to speak and to access the means to do so: to receive information, participate in public decision-making, organise, associate, and assemble.’ It adds that ‘a robust and protected civic space forms the cornerstone of accountable, responsive, democratic governance and stable open societies.’ These public spaces can be physical – for example, parks, streets and squares – as well as digital, including social media platforms, messaging apps and the internet. Let us go back to Foucault’s expansion of the concept of the panopticon to understand how surveillance is carried out in civic spaces.
“They are like so many cages, so many small theatres, in which each actor is alone, perfectly individualised and constantly visible. The panoptic mechanism arranges spatial unities that make it possible to see constantly and to recognize immediately. In short, it reverses the principle of the dungeon; or rather of its three functions — to enclose, to deprive of light and to hide — it preserves only the first and eliminates the other two. Full lighting and the eye of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap” – Foucault, 1975
While access to civic spaces creates an illusion of freedom of speech and association, our activities are constantly monitored. The technologies deployed for these purposes include mass surveillance, IMSI catchers, remote hacking, mobile phone extraction, social media monitoring, facial recognition cameras and predictive policing. The increasing sophistication of surveillance technologies and AI algorithms has augmented the capabilities of police and intelligence agencies, making them simultaneously invisible and omnipresent as they track social media posts, web search histories and physical movements without the knowledge or consent of the people involved.
Violation of the rights to privacy, expression and association as a form of biopower.
Not only do these surveillance mechanisms violate the fundamental right to privacy by thwarting people’s freedom of expression, dialogue and redressal; they have other far-reaching implications as well. Increased awareness of these technologies breeds fear and apprehension about potential ramifications, and people begin to consciously self-censor their words, actions and associations. This is a manifestation of biopower, whereby governments employ ‘an explosion of numerous and diverse techniques for achieving the subjugation of bodies and the control of populations’ (Foucault 1976). Freedom House’s research report highlights a ‘sharp global increase in the abuse of civil liberties and shrinking online space for civic activism’. Forty-seven of the 65 countries assessed in the report have arrested ‘users for political, social, or religious speech’. Minorities are at particular risk of persecution or harassment. An article in The New York Times revealed how China is investing billions of dollars every year in Xinjiang, home to many Muslim ethnic groups, to test the deployment and efficacy of increasingly intrusive policing systems.
Advanced democracies are at risk too.
Mass surveillance defeats the purpose of civic spaces as avenues to raise and rally around important public issues. We must not forget that democracies pride themselves on the right to equal participation in political and public affairs. Carnegie Endowment research highlights that, contrary to popular perception, ‘51 percent of advanced democracies and 41 percent of electoral democracies/illiberal democracies have adopted AI surveillance systems’. The Guardian reported that the Indian government has been using automated facial recognition systems to identify and exclude protesters rallying against the redefinition of Indian identity. The Trump administration directed technology companies to help employ artificial intelligence for the extreme vetting of prospective immigrants as potential terrorist threats, but later dropped the plan in the face of widespread criticism.
Greater transparency and adequate legal safeguards are needed to protect access to and participation in civic spaces.
Policy advocacy organisations such as ARTICLE 19, Privacy International and the Centre for the Internet and Society have been vociferous in their demands for transparency, accountability, legal scrutiny, effective remedy measures and regular audits in nations’ security and surveillance apparatuses. With the development of new technologies, including advanced biometrics and 5G mobile networks, it is imperative to put the necessary filters, checks and scrutiny on the central tower of the Panopticon. On the impact on civic spaces specifically, ARTICLE 19 and its partners are emphasising that ‘the right to participate publicly in decision-making, engage in open debate, criticise, protest, and dissent, in physical and online space, are widely recognised in legislation, policy, and practice’. There is an urgent need to address the heightened risk and violation of rights that result from governments having ‘direct and unrestricted access’ to the data of citizens and organisations.