Citizens or Subjects?
The Individual in the Age of Platformisation and AI

Panel Discussion on the theme "Citizens or Subjects?" at the Goethe-Institut / Max Mueller Bhavan New Delhi. | © Goethe-Institut / Max Mueller Bhavan

The smallest of online interactions generate vast amounts of user data, which is often extracted and commodified by private companies and governments. The rise of artificial intelligence and the push for digitisation of welfare services have further incentivised data collection. This presents serious challenges to the autonomy and privacy of individuals, who are becoming data subjects rather than mere beneficiaries of digital services.

By Akshit Chawla

In this age of platformisation, digitisation and AI, what kind of power imbalances exist? Should the government partner with private entities? Does AI leave scope for a just digital future? In this conversation, three experts discuss these questions, offering insights into navigating the modern digital age.

Power Imbalances in the Digital Age

Whether citizens want to access welfare facilities or communicate on the Internet, they have to comply with the data collection terms presented by the state and private platforms. Citizens have little to no bargaining power in negotiating whether they want to share their data. 

Shashank Mohan: A power imbalance exists between the state and its citizens. Take the example of Salta, Argentina, where the government reached out to Microsoft to solve the problem of teenage pregnancies. The proposed solution from Microsoft involved collecting the data of 200,000 residents, including 12,000 girls under the age of 19. The project proceeded with minimal transparency about how the data would be used and what outcomes could be expected.

The residents had no choice but to share their data as they lacked adequate negotiating power. Many of them were dependent on the state for food, water, and health services, and came from displaced communities who spoke different languages. Eventually, the programme was shut down following intervention by feminist movements. In this case, neither the citizens nor the state derived any benefit, but Microsoft got access to a lot of data.

Before implementing technological solutions, one should question if they are really needed. For instance, is facial recognition needed to enter airports? Is an iris and a face scan essential to obtain free ration? Do girls need to be profiled instead of being offered more protection and agency?

Shivangi Narayan: States cannot function without their citizens’ information. Hence, it is fair for the state to collect data to provide certain services. But the problem arises when the state starts collecting data disproportionately and exercising its powers beyond certain limits.

The first to be affected by a surveillance and AI apocalypse are marginalised and vulnerable people. For instance, facial recognition databases have been used against individuals who were protesting in Haryana. Those who conform to norms like heterosexuality and do not criticise the government are less likely to be affected. However, individuals who deviate from what is considered ‘normal’ may be penalised.

Uthara Ganesh: The state and platforms have a complex and contentious relationship. Governments have become more adept at controlling, storing and processing data, whereas platforms have taken a central role in citizens’ lives. This ubiquitous presence of platforms makes the state feel somewhat threatened, and it understandably responds by trying to regulate them. But this can also have negative consequences.

For example, in the case of encryption, governments around the world have tried to create backdoors and pressured platforms into sharing data. This not only affects innovation but also harms users.

Another example can be found in India’s Digital Personal Data Protection Act, 2023. While trying to safeguard children’s interests, the law also ends up restricting their access to the internet.

Government’s Reliance on Private Companies

The Indian government has been partnering with private companies to implement new technological interventions in various fields such as policing.

Shivangi Narayan: It's perfectly fine for a government to partner with a private organisation. But it becomes problematic when private organisations work simply to get access to certain data and do not perform their duties effectively.

Now, technology has become a part of policing in India. Policing has historically been an institution to control people and has not become abusive just because of its integration with technology. Technology just enables this to be done on a larger scale. If earlier something was being done for a neighbourhood, it can now be done for an entire city or a country.

Using technology in policing also presents challenges regarding accountability. The police now have facial recognition and predictive policing capabilities. If the police make a mistake, they can simply blame these technologies instead of taking responsibility.

The Problem with Consent

While platforms and governments do obtain consent before collecting users’ data, it often holds little value because people generally don’t read the complicated legalese presented to them. 

Shivangi Narayan: The concept of consent has become moot in India. This is because people often rely on the state for their livelihood, and thus they don’t have the option of saying no to the state. When it comes to agreeing to terms on the internet, people don’t know what they’re consenting to. The problem in India is bigger than consent. That is why lawyers should make room for social scientists in AI regulation. Merely obtaining consent from users would not solve any problems.

Shashank Mohan: Shifting the burden onto people (by obtaining consent) is not the right way to go. Corporations know that people are not reading the legalese. Consent is dead. ChatGPT is scraping data off the internet, Apple is scraping Apple devices and Google is scraping Google devices. This is no (proper) consent. I think the way forward is to educate, agitate, and organise.

Even India’s data protection law has certain issues. While it proposes a consent framework, it also empowers the state to exempt itself and certain private companies from following it.

Can AI co-exist with digital rights?

Companies developing AI-based technologies have been accused of unethical practices like scraping data off the internet. In a future where strong digital rights are put in place to mitigate harm, can AI technologies continue to exist and thrive?

Uthara Ganesh: I’m a bit of a tech optimist. There is a general resistance that people have when a new technology emerges. In the 1800s, fiction was believed to be corrupting the minds of people but now we know that fiction is good for people. We should allow technologies and their potential to be realised — with the correct guardrails in place.

Platforms can work on adopting a better design language that is in the interest of users. But this requires a push from the policymakers. They need to have a strong understanding of technology and a consultative approach. They can bring in proportional laws that try to solve specific problems.

Shivangi Narayan: Artificial intelligence can be used with proper rights and accountability in place. AI is doing wonderful things in the life sciences and medicine, where it is being used with all the checks and balances. It starts to get difficult when AI, which takes a mathematical approach, is applied to societal issues, which are unpredictable. One cannot really predict how societies are going to react.

For example, whether a person is a criminal or not is in the hands of the police. It is discretionary. Such decisions cannot be made by mathematical concepts.

It also depends on the context in which AI is being used. For instance, robotic dog technology is available in both Germany and the US. In Germany, robotic dogs were used to assist homeless people, whereas in the US the homeless might not be treated the same way. In a country where rights are protected, artificial intelligence can do wonderful things.

Way forward

With the platformisation of digital services and the rapid rise of AI, what can be done to mitigate the existing risks that users face?

Uthara Ganesh: My contention is that tech is a net good. Laws should not put the onus on users as they negotiate their relationships with platforms and the government. Robust regulation should be encouraged.

Shivangi Narayan: The rise of AI, as pushed by big tech, is not a natural phenomenon. For instance, private sector companies are pushing governments to install CCTV cameras in the name of women’s safety while they are primarily interested in collecting data. We need to organise and talk about these issues, especially about the imbalances in power structures.

Shashank Mohan: Our current data protection law places minimal liability on the state and private companies. So, citizens’ engagement with the political process can be helpful. We need to keep asking questions of the state and corporations, and educate ourselves about how technology is impacting us. We need to come together. I really believe in the power of mass movements to bring about change.

The event was moderated by Disha Verma, Associate Policy Counsel at Internet Freedom Foundation.

The experts

Uthara Ganesh
Uthara is the Head of Public Policy, India and South Asia at Snap Inc. She focuses on developing Snap's relationships with key government agencies in South Asia and shaping regulatory outcomes in areas such as content regulation, data protection and privacy, emerging tech regulation, and competition. Prior to this role, Uthara led Amazon Web Services' (AWS) and Amazon's public policy efforts, supporting advocacy on content regulation, taxation, data governance, and competition issues across Prime Video, Kindle, Alexa, and Cloud businesses. She also served as Public Policy Lead to Rajeev Chandrasekhar from 2014 to 2017 and has worked at the Centre for Policy Research, UNAIDS, and GIZ. Uthara was a LAMP Fellow from 2010 to 2011.

Dr Shivangi Narayan
Shivangi is the author of 'Predictive Policing and the Construction of the Criminal: An Ethnographic Study of Delhi Police', published by Palgrave Macmillan in August 2023. She is a former researcher with the project Algorithmic Governance and Cultures of Policing: A Comparative Perspective from Norway, Russia, Brazil, India and South Africa (AGOPOL), funded by Oslo Metropolitan University and the Norwegian Research Council. She is one of the three recipients of the Surveillance and Society Network (SSN) Early Career Researcher Awards for her paper 'CCTV and the Criminal City', published in the SSN journal in January 2024. Narayan was awarded her PhD in Sociology by the Centre for the Study of Social Systems, Jawaharlal Nehru University, New Delhi in 2021.

Shashank Mohan
Shashank is a Programme Manager at the Centre for Communication Governance at National Law University Delhi. He is a lawyer and policy professional who has been studying and observing technology laws and policies from India and around the world for over six years. Shashank has worked extensively on subjects of data governance, platform design and regulation, impact of AI technologies on vulnerable groups, media freedom, decisional autonomy, gender diversity, Global South solidarity, and community building in civil society. Shashank believes in open access to knowledge, data justice, gender plurality, and Global South identity. When not working, he enjoys reading crime-fiction.

Moderator

Disha Verma
Disha Verma is an Associate Policy Counsel at Internet Freedom Foundation. A lawyer by training, Disha worked in health policy with expertise in community health and disease response before transitioning to tech policy. At IFF, she engages with tech deployment and digitalisation in the public sector with a focus on welfare distribution and social security, digital public infrastructure, governance AI, surveillance, policing, and digital transparency.
