Generation A
"Artificial Intelligence does not affect everyone equally"
A participatory design workshop at Goethe-Institut in Nicosia explores the futures of AI in Cyprus and new formats for Goethe-Institut’s educational programme.
It is a mild day in late February. Outside the picturesque villa of the Goethe-Institut Nicosia, participants sit in pairs facing each other. Equipped with pens and paper, they are asked to portray each other in this surprisingly analog setup for a workshop on Artificial Intelligence. With no further instructions, they sketch the first portraits. The process is then repeated with increasing limitations: one portrait without looking at one’s paper, another using only one’s non-dominant hand, and one drawn in a single continuous line. The result is a series of pictures, some abstract, some figurative, each depicting the same person in a different way. The portraits are a creative introduction, but also a first reflection on the themes of the workshop. Do you see yourself in this portrait? Which aspects of you are in the picture? Which aspects are missing? How have the limitations affected the pictures? And what did you notice when representing another person?
Placed side by side, the portraits display their biases in funny and intriguing ways. And they point to a problem that all representations share. Whether drawn by hand or by machine-learning algorithms, representations cannot be neutral. They exaggerate or downplay, they distort and accentuate. This is why the questions raised by the drawing exercise keep coming back over the next two days of the workshop: representations matter whenever the potentials and risks of Artificial Intelligence are discussed, or the question of whether data can ever be fair.
This exercise shows how participatory design workshops like the one at the Goethe-Institut Nicosia address issues of Artificial Intelligence in unexpected ways. Conceived by Berlin-based digital arts curator Jasmin Grimm and designer Nushin Yazdani, the workshop is both a testing ground for new educational formats and part of the Goethe-Institut’s participatory strategy for future formats. It is the first in a series of workshops at Goethe-Instituts throughout Europe aimed at co-designing and evaluating new events, exhibitions, and other formats.
After the drawing exercise, the participants interview each other about their interests in AI and how it could change life in Cyprus. "With all the corruption I see, I wish there was a bias-free AI that makes better political decisions than our politicians. I really believe a neutral AI could do better," one student says, triggering a discussion about corruption, but also about whether an AI system could ever really be neutral, or whether it in fact needs biases to make fair decisions.
What is the impact of machine learning on society? How can cities become smarter? How is personal data used? How can algorithms be used for democratic and common purposes? The questions raised in the workshop come from the everyday experiences of the participants but often touch on the philosophical. Nobody explicitly mentions the Green Line between the northern and the southern parts of the island, but the political tensions between the sides linger in the discussions. One student from the northern part is annoyed that he has to wait several months for electronic parts for a project, as many imports first have to pass through mainland Turkey. Others voice their concerns about the CCTV surveillance being implemented in the northern part of the island by dubious companies. A student from the southern part of the island is angry about the erratic public transport in Nicosia, likely in part a result of the city's division. And with some fatalism, the group discusses scenarios that predict Cyprus will be underwater in 20 years because of climate change and rising sea levels.
It is remarkable that the Goethe-Institut in Nicosia has managed to assemble such a diverse group, bringing together participants from both the Greek and Turkish Cypriot communities. The institute is in a unique position for such exchange, being located in what is known as the Green Line, a demilitarized zone under the control of the United Nations peacekeeping mission. The Green Line divides Cyprus and the city of Nicosia into a northern and a southern part. Located between the checkpoints of both sides, the institute is one of the few places that members of the Greek and Turkish Cypriot communities can visit without going through checkpoint control.
"90 percent of the bias in systems comes from the data and not the algorithm" Jahna Otterbacher
The discussions are catalyzed by input from local AI experts. Jahna Otterbacher is an Assistant Professor at the Open University of Cyprus and conducts research on AI and discrimination. Throughout her talk it becomes clear that the biases of AI systems often go unnoticed. Otterbacher has a vivid local example: searching for "Cyprus" on Google Image Search returns different results depending on the language one uses. Entering "Cyprus" in Greek shows pictures of political maps of the divided island. Searching for "Cyprus" in Russian brings up beach resorts, while searching in Chinese shows pictures of real estate opportunities. Such invisible biases can foster discrimination, for instance when the search term "nurses" by default returns mostly female nurses, while the search term "surgeon" mostly shows male surgeons. Otterbacher’s research aims to document such problematic discrimination in search algorithms. She says: "Of course people stereotype each other all the time. Technology can, however, increase and incentivize stereotyping or act against it."
"AI is in its teenage years" Loizos Michael
A second input comes from Loizos Michael, associate professor at the Open University of Cyprus and director of the Computational Cognition Lab. Michael’s talk gives an overview of the technological history of Artificial Intelligence and the trends in current AI research. In particular, he explains how AI could become more transparent and trustworthy in the future. Such trustworthy AI systems would not only make automated decisions but also generate explanations of why they made them. Regarding the discussion of biases and AI, Michael says: "Artificial Intelligence and Machine Learning depend on biases. Without biases, a system would not be able to recognize any patterns." And yet he stresses that it matters to know and be conscious of biases: one’s own, those in the data, and those in the systems we use.
The workshop ends with a co-design session in which participants give feedback on and develop new format proposals for the Goethe-Institut regarding digital technology and AI. The ideas range from a robot traveling to different Goethe-Instituts in Europe to acquire local dance moves, to ‘Vantage Point’, a programme that tackles disinformation and biased news reporting in Cyprus. A few formats propose to raise awareness of the ethics of AI and the biases of digital technologies. Throughout the workshop, it has become clear that there will likely never be technology free of biases. And yet one demand matters: whoever designs something should be in touch with all the people affected by it.