Human language uses abstractions and symbols that are interpreted according to culture. Discrepancies in the interpretation of those symbols are what make communication challenging. Take, for example, the sentence: I see you. This could be interpreted as a quotidian greeting or as a solemn recognition of the divine in the other. Imagine that you go to Mars and find a drawing of the sun on a cave wall. Is this a tribute to the star that provides life to all things, or a record of the destruction of a civilization through a cosmic cataclysm? Imagine now that Earth is going through an alien invasion. The aliens show us pictures of what look to be white flags. In our culture, we might conclude that they come in peace. However, this symbol may just as well be their banner of war.
In these examples, the words and the images are the same, but the meaning is determined by our shared value system. Culture is, then, the operating system that interprets the information we receive through our senses. Since we all run a different OS, shaped by our ancestry and life experiences, our interpretations of shared symbols may vary.
This brings up another important point: the assumption that symbols are universal and that translation is always possible. Perhaps the closest realization of this idea is mathematics. The ideas of a set, a group, or a function are universal; we humans have all agreed on the meaning of these symbols. That is not the case with human languages. Take as a reference the words for colours in different languages. English has about 11 main colour categories, the words we use to describe different hues. However, such categories are not universal. In Japanese, for example, there are about 16. Colours like matcha or mizu are very specific hues that Japanese speakers easily recognize, whereas English speakers have no specific word for them.
The recognition of culture is currently a limiting factor for technologies such as ChatGPT. The other day, I asked it to write a song in Spanish about my brother Fredy, who loves Chocoramo (a popular Colombian snack). The AI generated an answer in a couple of minutes that established the reasonable concept of people loving chocolate and the association between sugar, energy, and happiness. However, the song made no reference to what growing up with Chocoramo in Colombia means: it is one of those foods you would have at school during the break, or later at home at snack time; it is something you would give someone you wanted as your boyfriend or girlfriend, or share with friends. In short, Chocoramo is part of the Colombian identity. The ChatGPT song referred to none of these meanings. It also lacked any semblance of a rhythmic pattern in Spanish, and the song would have flopped commercially had I decided to quit science and pursue this as an artistic endeavour.
I hope to see a future where AI is multicultural, with a cultural OS that changes depending on the user and the context. To get there, we humans need to lead the way. Alluding to An Advanced Readers' Picture Book of Comparative Cognition by Ken Liu, the journey to the focal point of our solar system to listen and try to communicate with other species is worth making. Learning from other cultures is how we improve our communication, and it is how we should train the technology of the future to talk to us.