Culture shock in the robot age
From Berlin to Bangkok, robots in home, healthcare, school and work settings are becoming more common. However, research from across the globe suggests this technology is not one-size-fits-all for humanity. In order to build effective social robots, we must understand how culture shapes our perceptions and expectations of them.
In his 2018 book titled Who’s Afraid of AI? Fear and Promise in the Age of Thinking Machines, Thomas Ramge states that “cultural attitudes speed up or slow down the acceptance of innovations.” Ramge makes the point that “robots are enemies in Europe, servants in America, colleagues in China, and friends in Japan.” While research to date on cultural attitudes towards robots might not fully support such strict cultural divisions in the roles envisioned for robots, Ramge’s point is significant. It highlights the role played by cultural attitudes in shaping people’s willingness to embrace new technological innovations.
Emily Cross | © Iman Aryanfar
In this piece, I discuss some challenges and opportunities for exploring how culture impacts encounters between people and robots. Many of these ideas stem from a paper I recently wrote with two exceptionally insightful colleagues, Velvetina Lim and Maki Rooksby, whose scholarship I gratefully acknowledge in writing this piece.
A robot in every home?
Over a decade ago, Microsoft founder Bill Gates prophesied a robotics revolution that would bring staggering leaps in the sophistication of robots, and predicted “a robot in every home” in the near future. While the ubiquity of home robots has yet to materialise, the number of robotics-related start-ups launched each year continues to increase rapidly, with a growing proportion focused on developing companion robots for the home or assistance robots for complex, human-interactive settings such as schools, hospitals and care homes.
Concurrently, the number of researchers investigating how people perceive and interact with robots is also growing. Importantly, more of these researchers now come from the arts, humanities and social sciences, and this broadening of backgrounds and perspectives has coincided with an emerging emphasis on understanding the influence of culture when developing and evaluating new robotics technologies.
Admittedly, ‘culture’ is neither a unitary nor an easily definable construct, and anthropologists have rightly criticized many attempts made by those working in the behavioural sciences to quantify cultural influences on human behaviour. However, the challenge remains to program interactive robots that appropriately and sensitively respect the cultural rules and preferences of their human interaction partners, and meeting it will require richer, interdisciplinary characterization of the complex phenomena at the heart of successful social interactions between humans and robots.
Culture shapes each of our social interactions, not only influencing how we relate to others and function as a social group, but also colouring how we think about, perceive, and understand other individuals and our surroundings. Given the increasingly globalised and multicultural world that many of us inhabit today, individuals are exposed to and engage with a more diverse range of cultural norms and practices than at any other time in history. As mentioned above, culture is a difficult construct to define fully, but for many interested in cultural impacts on our relationships with robots, it is often operationalised as national culture – the values, norms, and practices shared by the inhabitants of a given country.
Japan's love of tech is sold to tourists at this robot-themed restaurant in Tokyo | Photo credit: Miikka Luotio / Unsplash
One particularly well-studied cultural construct is the individualism–collectivism dichotomy, which describes how individuals’ representations of themselves relate to others. Broadly put, highly individualistic cultures are characterised by members who value independence, focus on themselves, engage in explicit styles of communication, and whose identity is tightly bound up in the person as an individual. At the other end of the spectrum, individuals in highly collectivistic cultures exhibit more interdependent behaviours and attitudes, focus more on their relationships with others, favour implicit styles of communication, and derive their sense of self from the nature and values of their social group.
Varying attitudes towards robots
Many of the research studies examining cultural impacts on attitudes towards robots have compared individuals from countries typified as more individualistic (e.g., in Europe and North America) with those from countries considered more collectivist (primarily Asian countries). While it is difficult to draw general conclusions from this research, one emerging theme is that participants from Asian countries (primarily Japan and South Korea) are often more open to robots in social roles (such as assisting with childcare or housework) than participants from North America or Europe. At the same time, Asian participants are often more realistic than their Western counterparts in their expectations about what these machines will and will not be able to do. It is difficult, however, to separate differences due to culture per se from differences due to exposure to robots in public spaces (very high in Japan) or government support for social robotics technology (high in both Japan and South Korea).
It is worth noting that the individualism–collectivism construct (and the countries associated with it) parallels the contrast between the philosophical systems of the West (e.g. Europe and the Americas) and those of the East (e.g. Asia and the Middle East), with the former seeking a systematic, consistent and comprehensive understanding of our universe and the latter taking a more holistic or circular view of the world. Western intellectual history, for example, is marked by major shifts in thinking about human existence as new ideas were reconciled with the need to keep systems of thought consistent (Freud on the unconscious mind, for example). The equivalent history in the East, by contrast, is often described as following a more continuous trajectory. In Japan, for instance, there is a cultural inclination toward animism and the Buddhist belief that souls reside in all things, whether living or not. Some suggest that such cultural and philosophical leanings may foster greater readiness to accept robots across a range of settings in Japan. The fact that Japan’s “robot density” (the number of industrial robots per worker) is among the highest in the world lends further credence to this idea, as does the Japanese government’s “Society 5.0” strategy, which aims to expand the use of robotics and AI (among other emerging technologies) to help care for a rapidly ageing population and to bolster productivity across the workforce. Here again, we cannot conclude that Japanese culture per se has caused the generation or acceptance of robotics-heavy solutions across different sectors of Japanese society; rather, these developments are shaped by, and continue to shape, cultural norms in a dynamic manner.
Illustration from Lim, Rooksby & Cross (2020) showing the country of origin of participants in 50 different studies examining cultural influences on human-robot interaction | Courtesy of the author
The figure above illustrates the country of origin of participants in the 50 studies examining the impact of culture and cross-cultural differences on human-robot interaction that my colleagues and I recently reviewed. We sought to include every study published to date that explicitly examined cross-cultural differences in human-robot interaction. What is evident from this figure is the strikingly uneven distribution of participants' countries of origin, and the fact that individuals from Japan and the USA have been the focus of much of this research. This focus stems from psychological research traditions that typically treat American and Japanese cultures as squarely representative of Western and Eastern cultural norms, respectively, and as clear exemplars of individualistic and collectivist values.
Cultural comparison, however, requires far more nuanced analyses than simply examining what happens when individuals from two very different cultures interact with robots. Even within more localised clusters of Western cultures, such as Italy and the United Kingdom, markedly different preferences have been observed in how individuals from each country envision using a robot as a tool in their professional lives.
Beyond Japan and the USA, we do not yet have large enough samples or sufficiently powered studies to examine with much precision the role cultural differences play in shaping our relationships with robots. This is both a significant limitation of our current understanding and an incredible opportunity to involve researchers and community members from all countries and regions (especially Latin/South America and Africa, two regions particularly underrepresented in this research to date).
Exciting initiatives to amplify the voices of indigenous peoples and their attitudes, hopes, and concerns about robotics technologies and AI more generally (such as the recently published position paper on Indigenous Protocol and Artificial Intelligence, 2020) should add further nuance to our understanding of the social roles robots might be accepted to fulfil in different parts of the world. Including more diverse perspectives, spanning not just geography but also levels of technological trust and exposure, should help us design and program robots that work harmoniously with people.
Andrea Bocelli sings alongside a robot composer in Pisa, Italy | © Independent Photo Agency Srl / Alamy Stock Photo
Into the future
What do we envision for the future of socially interactive robots? Do we wish for more sophisticated robots that behave (and perhaps even look) more like ‘one of us’? Or would we prefer robots that are designed and optimised for a particular task, such as cleaning, cooking or teaching? How do purposes, contexts and users influence the requirements for robotic agents? These questions should not be left entirely to those working in robot design and engineering; they will benefit from sociologists, anthropologists, psychologists, philosophers, neuroscientists and artists joining the debate.
Including a broader range of cultural practices and groups in this process will critically inform the range of user profiles, expectations and requirements for robot behaviour and functions, and will hopefully lead to more nuanced perspectives on cultural influences than our current understanding, which rests on a very uneven sampling of cultures and countries.
Research could also examine how the cultural backgrounds of robots’ creators affect preferences towards culturally adaptive robots. At the end of the day, robots are built on the conceptions of their designers, programmers, and developers. As such, any robot’s ability to reflect or sensitively respond to cultural norms is necessarily limited by its creators’ attributions, biases, and understanding of other cultures (which, yet again, argues for a broader community of disciplinary perspectives right at the design and development stage).
One final question to explore is the impact of multicultural experience on our encounters with robots. While most research to date has focused on how people who identify with a single national culture perceive and respond to robots, our world is undeniably becoming more diverse, driven in part by migration and the forging of increasingly multicultural identities. It remains to be seen what effect cultural mixing and bridging, and a population that is increasingly multilingual, multinational, and multicultural, will have on social cognition towards robotic agents.
Learn more about Emily Cross' views on the future of creative AI here.