Why robots can be culturally insensitive and how scientists are trying to fix it.

April 15, 2024

A robot chats with an elderly British man in his bedroom. The robot has a cheerful manner and a pleasant, high-pitched voice.

The robot – perhaps because of the man’s age – begins to ask him about his memories of World War II: “Please tell me, what was the hardest thing you and your family had to go through?” The old man explains that his father was in the Royal Air Force and that the family didn’t see him for almost four years.

So why was a robot openly asking him about one of the most traumatic experiences of his life? The robot’s behavior was a product of the CARESSES project (Culture-Aware Robots and Environmental Sensor Systems for Elderly Support).

This project fits into the new field of “cultural robotics,” which aims to design robots that can take into account the cultural background of the person they are talking to and adjust their behavior accordingly. That is why the robot brought up the war: the man was British, so it guessed he would be interested.

In the future, we can expect robots to be increasingly used in our personal and social lives. There is currently active research in a wide range of areas, including delivery robots for supermarkets, entertainment robots, service robots for healthcare, fetch robots for warehouses, robots for dementia support, robots for people on the autism spectrum, and care robots for the elderly.

There are even robot priests that can deliver blessings in five languages and robot monks that can teach people about Buddhism.

Cultural stereotypes

Cultural robotics is part of a broader movement to make artificial intelligence and robotics more culturally inclusive.

Similar concerns have been raised before. For example, large language models (LLMs), such as those underlying OpenAI’s ChatGPT, are trained on huge amounts of text. But because the internet is still predominantly in English, LLMs are trained primarily on English text – along with the cultural assumptions and biases that text contains.

Likewise, the move to make robots and AI more culturally sensitive is well-intentioned, but we are concerned about where it might lead.

For example, one study compared the cultural preferences of China, Germany, and Korea, drawing conclusions about what people in those countries want their robots to look like.

Drawing on previous studies of cultural preferences, the authors suggested that more “masculine” societies tend to find things that are “big and fast” beautiful, while more “feminine” societies find things that are “small and slow” beautiful. Citing studies that claim Korean culture is of “medium masculinity” while German culture is of “high masculinity,” they hypothesized that Koreans would be more likely to find service robots (which tend to be small or medium-sized and slow) pleasant.

Another study compared the personal space preferences of Germans and “Arabs.” But these categories are not comparable. “Arab” is a potentially offensive term for many people, and it can describe people of many different cultural and national origins. It is not on the same level as a category like “German,” a non-offensive term for people of a single nationality.

It is also becoming increasingly clear that people respond differently to robots depending on their own cultural background. For example, different cultures have different expectations of personal space, and this affects how far away people prefer robots to stand.

Different cultures also interpret facial expressions differently. One study found that people understand a robot better if it communicates using facial expressions they are familiar with.

A different way?

If we want to avoid designing robots based on broad, crude generalizations and stereotypes, then we will need a more nuanced approach to culture in robotics.

Culture is an ambiguous and nuanced concept that is open to many interpretations. One survey lists more than 300 potential definitions of culture.

In our recent research, we argued that culture is “conceptually fragmented.” In short, our view is that there are so many different ways to understand culture, and so many different kinds of robots, that we should not expect a one-size-fits-all approach.

We think that different applications in robotics will require radically different approaches to culture. For example, imagine an entertainment robot tasked with dancing for the audience in a theater.

The best way to approach culture for this job might be to focus on the kind of entertainment people in the local area prefer. This might involve asking which dance styles are common locally and shaping the robot’s design accordingly.

Other applications may call for a different approach to culture. For example, for a robot expected to interact with the same small group of people over a long period (such as a service robot in a care home), it may be more important for the robot to change its behavior over time, adapting more and more to the evolving preferences of the people it helps.

In this case, it may be more accurate to think of culture as something that emerges slowly and dynamically from the interactions between the parties involved.

This means that the approach to culture in robotics is likely to be complex, multifaceted, and specific to each situation.

If we design robots based on relatively crude stereotypes and sweeping generalizations about different cultures, then we risk perpetuating those stereotypes.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The authors do not work for, consult, own shares in, or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic duties.
