Conversation with Rainer Rehak
"The government seems to want to demonstrate its ability to act—at the expense of our freedom."

Rainer Rehak
© CC BY-SA European Climate Foundation (The image was enhanced using AI.)

In the digital age, threats to privacy from surveillance and data misuse are becoming more prevalent. Recent developments like the use of police data and the proposed EU-level chat control raise critical questions. In this interview, Rainer Rehak from the Weizenbaum Institute discusses the risks of these developments and speaks about the urgent need for measures to protect digital freedom.

What new threats to digital privacy do you see emerging from the use of mass data analysis and artificial intelligence?

Continuous technological advances bring benefits but also growing dangers for individuals. The immense improvements in mass data analysis, especially through artificial intelligence methods, stand out in this regard. In addition, the growing volume of stored information about individuals, such as movement or behavioural data from online platforms and digital devices, contributes to these risks.

Which government surveillance measures do you believe pose the greatest threat to digital privacy?

There is a fundamental distinction between economic and governmental threats. The backdrop to the current threats to data protection and digital privacy from state actors is a perceived deterioration of security. In Germany, however, this perception is misleading, as almost all types of crime are declining. Despite this, individual cases amplified by the media are repeatedly used to justify granting security authorities new surveillance powers.

The federal government's so-called security package, intended to increase safety following the terrorist attack in Solingen, is currently being hotly debated. Among other things, it aims to allow security authorities to search through arbitrary images from the internet and use this data for biometric analyses. Furthermore, images from surveillance cameras in public spaces are to be retrospectively analysed biometrically. This massive surveillance push towards facial recognition, however, is not justified.

How do you assess the current developments in the use of police data and the EU-level chat control regarding digital privacy?

The idea is to have software automatically analyse police data and to use it for testing and training AI applications. This implies an unprecedented merging of police databases, which is currently not permitted. It is also highly doubtful that these initiatives would have prevented the attack in Solingen, but that is no longer the focus of the debate. The government seems to want to demonstrate its ability to act—at the expense of our freedom. Free public spaces, whether on the internet or in the physical world, are shrinking unnecessarily.

Another current surveillance proposal concerns the so-called chat control at the EU level. In short, it aims to obligate smartphone messengers such as WhatsApp or Signal to automatically scan messages for illegal images in the name of child protection and potentially forward such messages to an EU police body. Not only does this break end-to-end encryption, which is crucial for privacy and for society, but the detection of such images is also inherently error-prone. Moreover, the medium itself is not primarily used for these kinds of crimes against children.

What threats to digital privacy do you see in the private sector, particularly regarding the use of user data?

In the private sector, digital rights, data protection, and privacy are increasingly under threat. More and more user data is accumulated and used for detailed profiling, which drives profit-oriented advertising and other attempts to change users' behaviour. While this may seem merely annoying when it comes to simple purchasing decisions, it becomes a serious issue when it concerns political influence and the spread of misinformation. Furthermore, many companies sell the accumulated data, or have it stolen by criminals because of inadequate IT security. All of this creates significant risks without providing much real benefit.

How are civil society organisations and grassroots movements contributing to strengthening digital privacy protection in political decision-making processes?

Civil society organisations and grassroots movements play a crucial role in protecting individuals in the digital age in three ways.

First, they act as "watchdogs," independent experts who monitor political developments and draw attention to relevant issues through statements, demonstrations, campaigns, or artistic contributions, encouraging media and public discourse.

Second, NGOs, both large and small, provide essential input during formal hearings by courts or parliaments, offering well-founded criticism and concrete suggestions. These contributions ensure that digital rights are not sacrificed amidst complex political decision-making processes involving many competing interests.

Third, civil society organisations and grassroots movements often take direct action. Some take legal action against state or private actors when they violate data protection laws. Others develop technical alternatives that protect privacy or serve as prototypes demonstrating what is technically possible. Still others engage in educational outreach in schools, bars, or pedestrian zones to inform people or help them directly, for example through cryptoparties.

How do you envision a sustainable digital future that balances technological progress with the protection of personal data?

We must first ask ourselves what the terms "progress" and "innovation" really mean and what they are supposed to achieve. Do we want a future where we consume ever more resources and create social inequality, leading us to spend our time staring at precisely personalized advertising screens that exploit human weaknesses? Or do we want a future with intact nature, fundamental rights for all, meaningful human interactions, and fulfilling work? If we choose the latter, there is no contradiction, as progress and innovation should focus on developments that genuinely improve life for everyone, and this requires far less data.

About Rainer Rehak

Rainer Rehak is part of the research group “Digitalization, Sustainability, and Participation” at the Weizenbaum Institute for the Networked Society. He is an associated researcher at the Berlin Social Science Center (WZB) and is currently pursuing his PhD on systemic IT security and societal data protection at TU Berlin.

He studied computer science and philosophy in Berlin and Hong Kong and has been working on the implications of the computerization of society for over 15 years. His research fields include data protection, IT security, state hacking, computer science and ethics, fictions of technology, digitization and sustainability, convivial and democratic digital technology, and the implications and limits of automation through AI systems.
