Intimacy and Artificial Intelligence (AI)
Algorithmic Culture

AI © Marvin Luvualu

AI is profoundly shaping culture. Rather than fearing machines that surpass human intelligence, we might consider how human beings are being asked to act more like machines. How might a reconnection to the body, to landscape, and to experiences of diaspora suggest new possibilities?

My ancestors left Ireland for Canada in the middle of the 19th century, after the blight Phytophthora infestans destroyed the lumper potato, the primary foodstuff for most Irish people. Historically, a failure in one crop would lead to an increased reliance on others. However, the Irish didn't own their land, and the British landlords were unwilling to sacrifice grain destined for export. Over one million people died, and many others fled to North America, England, and Australia to start over. The Irish population has still not returned to its pre-famine level.

The potato moved in the opposite direction, coming from the Andes, where it grew in spotty patches that bumped up against wild potatoes and a colourful range of domesticated varieties. A blight might attack plants here and there, but the surrounding diversity would allow others to thrive.

The same applies to our culture. The expanding importance of artificial intelligence across every sector is narrowing the range of ideas we can draw upon to solve the problems ahead.

Growth and Efficiency

Algorithmic culture refers to how the logic of big data and computation changes how the world is perceived and experienced. The dominant logic of big data is all about growth and efficiency. The lumper potato grew easily and cheaply. But this was a cloned crop, planted as cut pieces from previous generations. This led to extreme vulnerability for an essential foodstuff and all those who relied upon it. 

An emphasis on growth and efficiency leaves little room for values such as beauty, compassion, or autonomy. Just as the lumper potato crowded out other food crops, AI systems focus exclusively on approaches with the potential to scale. Working at scale produces systems that perceive the world as full of objects requiring optimization and control. For AI, this applies to human beings as easily as to cars or cargo ships.

Automating Us

The fact that humans are being asked by AI systems to behave more like machines is troubling. Algorithmic culture asks us to behave as if we do not grow and adapt, decay and radically alter the course of our lives. The system tolerates small changes in our tastes or movements as long as we don't step outside of the model used to guarantee what happens next. In service of creating a better future, AI prefers that human beings remain stable and suspended, fixed in our responses, reduced in our agency. AI struggles to perceive the parts of life that make us vast: community, spirituality, a connection to nature, physical ecstasy, intimacy, or art.

Rather than creating algorithms that think like humans, the goal, it increasingly appears, is to automate us. I spent most of 2019 meeting with artists, philosophers, policy makers, data scientists, and researchers who were passionate about the possibilities of AI and worried about its current direction. They expressed concern about how corporations and authoritarian governments were dictating the way AI systems were being developed and deployed.

We should avoid blaming the nature of algorithms for the pull towards human automation in AI systems. Technology is inevitably an expression of the objectives of the broader culture. The work we assign our algorithms ultimately reflects the things we value.

Body, Landscape, and Diaspora

Those thinking about AI too often see human bodies, landscapes, and diasporas as problems to be solved through the analysis of data and automated decision making. Technologies are not developed to support a collective inhabitation of a shared world but rather a locally optimized zone of interest that facilitates prediction and control.

Intimacy relies on our perceiving one another as embodied subjects, yet, by AI's standards, intimacy is inefficient and scales poorly. Algorithmic culture too often selects against intimacy in the decisions it makes. The ubiquity of swiping right or left suggests that the slow, community-mediated process of deepening a relationship is being replaced by another kind of online shopping. How might AI be explored through the lens of intimacy? How might intimate practices complicate our experiences of AI?

Just as we inhabit our physical bodies, we inhabit landscapes and live within and from them. The tragedy of the Irish famine demonstrated the cost of failing to acknowledge our entanglement with natural systems. From stone tablets to silicon chips, an intelligent machine is no further from the land than an engraved piece of stone. We are organized by our landscapes, and so we ought to ask: how could land inform the development of, and responses to, AI? How might we shift the focus of surveillance from monitoring and controlling visible behaviour to understanding the invisible relationships among people and environments?

Finally, each of us is situated in a historical, ecological, and social setting. Many are removed from the physical origins of these settings because of migration or displacement, real and digital. So, whose ethics are we talking about when we talk about AI ethics? Algorithmic culture encourages us to see strangers as data, without real agency. AI systems are frequently deployed at borders. How might various forms of migration and displacement inform algorithmic culture? How might these experiences allow us to refine and humanize AI systems, and then return to ourselves with the gifts of that process?

We can begin to approach these questions through multiple voices and through laughter. Intimacy with other people, a sense of stewardship over the world around us, and stories of migration and struggle require a state of being outside the machine, so that we can better learn how to direct AI systems with a greater ethical commitment. Without this effort, algorithmic culture will nudge us closer to fitting the needs of a system that has not been designed for us but threatens to consume us, leaving us little capacity to live and act as ethical agents.

About the author

Jerrold McGrath (@jerroldmcgrath) is the program lead for the Goethe-Institut Toronto's Algorithmic Culture series. Jerrold is a former program director at the Banff Centre for Arts and Creativity and Artscape Launchpad. He currently serves as managing director at UKAI Projects and founder at Ferment AI. Jerrold produces cross-sector collaborations around issues of broad social concern such as artificial intelligence, equity in response to COVID-19, hope and hopelessness, and the digital diaspora. He is a BMW Foundation Responsible Leader and an Ambassador for Berlin's STATE Festival. Jerrold is currently writing his first book, In Praise of Disorder.

The Goethe-Institut Toronto Algorithmic Culture Series

This year, the Goethe-Institut Toronto is committed to engaging with multiple voices to determine new pathways for exploring AI ethics.

The program on Algorithmic Cultures will start with workshops and the production of 'zines in both Berlin and Toronto, a local and idiosyncratic process that engages, educates, and organizes our friends, families, and communities. Spring 2021 will also see the launch of "The Computer is Your Friend", a podcast documenting a group of artists, data scientists, and ethicists as they navigate a darkly humorous game world ruled by an erratic tyrant, where no one is quite sure of the rules.

We’re looking forward to working with many outstanding artists and partners to explore how the challenges of AI are tangled up with ideas of the body and intimacy, with our landscapes, and through different experiences of diaspora. 

We believe the possibility of developing AI that makes sense for life in Germany or in Canada, in Berlin or in Iqaluit, rests in reconnecting to lives, landscapes, and cultures and proposing directions for AI that reflect the experiences of being human.