AI applications like ChatGPT are very good at giving generic feedback. They are not good at empathy or other human skills.
This is because they are missing a lifetime's worth of human emotional experience.
Yes, these human experiences are reflected in our songs, our poems, our artistic imagery. But all of this is a reflection, a shadow of the actual emotional experience. And emotions are what drive all of us forward, what ultimately make us the human beings we are.
Essentially, as long as AI has no way to feel these human emotions and to experience real life, it will always give us the shadow, the reflection, the tasteless coffee that has been reheated a second or third time.
AI is not the only entity disconnected from real human experience. We humans are disconnected from other human experiences in many ways. A privileged white man from a middle-class US family, for example, will not be able to feel what it is like to be a woman in a war zone in the Middle East, say in Palestine.
Sure, he will be able to empathize, to draw on his own worst experiences, and of course to feel deep anguish and some fear just looking at the pictures. Ultimately, however, he does not have the same experience, the same life as this woman. And never will. We can only deeply relate to experiences that are very similar to our own; only these are the experiences we can understand at a visceral level.
Therefore, the inability of AI to relate to our human experience, maybe even the perceived coldness of AI, should serve as a reminder that we ourselves are not able to relate to many other human experiences. Maybe, if we had gone through the same experiences as the people we are trying to relate to, we might also act in very similar ways.