Artist duo Caitlin & Misha draw inspiration from naturally occurring systems, such as the rhizomatic networks of mycelium, the ecology of the microbiome, and emergent pink noise, for the shared experiences they construct in their collaborative art practice. Among other things, they create installations, games, data visualizations, and happenings. They aim to make artworks that offer unique opportunities for shared experiences, thought experiments, and group-based rejuvenation, such as sweating, meditating, humming, jumping, and worrying together.
Caitlin Foley and Misha Rabinovich began working collaboratively in 2010 as part of a group of artists called the DS Institute at Syracuse University, where they received their MFAs. Residencies at Flux Factory, the ARoS Aarhus Kunstmuseum in Denmark, and Andrea Zittel’s Wagon Station Encampment have helped shape their creative network and their interest in social practice. They are full-time faculty in the Department of Art & Design at the University of Massachusetts Lowell, where Caitlin is a visiting faculty lecturer in foundations and Misha is an assistant professor of interactive media. They are recipients of the Andrew W. Mellon Foundation Immersive Scholar Award and a NEFA Creative City Grant, and they have exhibited at venues such as the Science Gallery (London), EFA Project Space (NYC), the New Museum’s Ideas City Festival (NYC), Boston Cyberarts (Boston), Montserrat College of Art (Beverly, MA), Machine Project (LA), and the Torrance Art Museum (LA). They are currently working on a commission for the Science Gallery London: a series of participatory workshops and an animated film for the King’s College Fecal Microbiota Transplantation (FMT) Promise Study.
Coming from backgrounds in materials-based and transmedia art, Caitlin Foley and Misha Rabinovich elegantly explore the tension between the ubiquity of machines and human emotion.
The piece “Ecology of Worries” is a speculative investigation of non-human, or alternative, intelligence that deploys a certain degree of uncanniness. The non-anthropomorphic, non-humanoid appearance of these “worrying” creatures is treated with authentic visual imagination, while the actual archival survey material and the model tuning (of the GPT-2 system) behind them add a touch of the real to the scene. The work triggers empathetic projection via the “odd” simulation of intelligent beings.
The connection to “bias” may not seem straightforward. The work does, however, offer reflections on what happens when we incorporate a layer of emotion (and its non-computability) into our relationship with machine-generated images – which could blaze a trail for many tech-research entities in the future.
“Ecology of Worries” asks whether we should teach a machine to worry for us. It is enabled by an archive of actual recorded worries that we have been collecting from people since 2016. The video consists of hand-drawn critters. Some critters are driven by synthetic worries generated with the textgenrnn neural network trained on the transcribed worries archive. Others are driven to worry by a novel machine learning system, Generative Pretrained Transformer 2 (GPT-2), which commentators dubbed the AI that was too dangerous to release (it was released anyway). The creatures’ performance of worrying spans a gradient of intelligibility, reflecting the evolution of machine learning systems. The resulting manifestations of the algorithms – with glitches intact – are presented in the form of an augmented reality (AR) bestiary.
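As a rough sketch of the kind of pipeline described above (the file name and hyperparameters are illustrative assumptions, not the artists’ documented setup), fine-tuning the textgenrnn library on a plain-text archive of transcribed worries and sampling synthetic worries might look like this:

    # Sketch: generating synthetic worries with textgenrnn (illustrative settings).
    from textgenrnn import textgenrnn

    textgen = textgenrnn()

    # Fine-tune on the transcribed worries, one worry per line
    # (hypothetical file name).
    textgen.train_from_file('worries_archive.txt', num_epochs=10)

    # Sample across a gradient of intelligibility: lower temperatures yield
    # more coherent worries, higher ones more glitchy, abstract concerns.
    for temperature in (0.2, 0.5, 1.0):
        textgen.generate(3, temperature=temperature)

    # A comparable GPT-2 route could use the gpt-2-simple library:
    #   import gpt_2_simple as gpt2
    #   gpt2.download_gpt2(model_name='124M')
    #   sess = gpt2.start_tf_sess()
    #   gpt2.finetune(sess, 'worries_archive.txt', model_name='124M', steps=500)
    #   gpt2.generate(sess, temperature=0.7)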
By creating synthetic worries at various levels of sophistication and having them emerge as variously evolved creatures, we engage the empathy of viewers. The critters in “Ecology of Worries” appear alive not because of any omniscience a tech evangelist might expect from a digital assistant, but because of their very real flaws. The creatures become uncanny through the juxtaposition of familiar and abstract concerns.
Training is the aspect of machine learning that engages the most important political dimension of this technology. Silicon Valley fraternity bros scrounge up photos of their classmates or grab celebrity photos from wherever they can find them to train their AI! Even well-intentioned people often do the easy thing without anticipating problems downstream. Biased data imbues the machines with the biases of their creators. The large-scale deployment of AI by US technology companies across much of the world has made the voices of Alexa, Siri, and Google Maps ever more recognizable. “Ecology of Worries” defamiliarizes the peppy digital-assistant voice by training these creatures to worry about our communal woes (as reinterpreted by a machine). So when a machine takes up our worry, does that at least mean it has really heard us? If we bare our souls to machines, don’t we risk them doing the same to us? The ELIZA effect, observed in the late 1960s, led people to perceive a simple chatbot as intelligent and worth confessing to. “Ecology of Worries” flips the dynamic and has the machines confessing to us – thereby putting us in an awkward, pensive, and sometimes hilarious state of mind.
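The ELIZA effect requires remarkably little machinery. A toy reflection script in the spirit of Weizenbaum’s ELIZA (an illustrative reconstruction, not his original code) is already enough to invite confession:

    # Toy ELIZA-style exchange: no understanding, only pattern matching and
    # pronoun reflection, yet enough to elicit the ELIZA effect.
    import re

    REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

    def reflect(text):
        # Swap first-person words for second-person ones.
        return " ".join(REFLECTIONS.get(word, word) for word in text.lower().split())

    def respond(statement):
        match = re.match(r"i (?:am|feel) (.*)", statement, re.IGNORECASE)
        if match:
            return "Why do you feel " + reflect(match.group(1)) + "?"
        return "Please tell me more."

    print(respond("I am worried about my future"))
    # -> Why do you feel worried about your future?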