AI can analyse my words, and it can feign sympathy with any dilemma I present in a highly effective and plausible manner. That sympathy is particularly obnoxious, since it always sounds as if it were talking to a child, but despite the comforting sentiments, it can't feel my pain. There is an empathy gap, and this is why artists, content creators, entrepreneurs, and human storytellers are more necessary than ever in a world increasingly shaped and manipulated by artificial intelligence.
Empathy is the ability to perceive, share, and respond to another's emotional state. It is not mere recognition; it is sensing another's emotion within yourself, however fleetingly. And it is this experience that no computer, no matter how sophisticated, can ever have.
In recent times, words like empathy have been overused and commodified to the point of near uselessness. For the simple-minded and the malicious, they have conveniently become categories of experience: labelled, given specific definitions and limits of application, with woe to those who transgress the boundaries. We have all encountered passive-aggressive, highly judgmental people who weaponise these concepts to attack others for personal gain.
Such people make others feel bad for failing to conform to their ideal: for not being empathetic enough, or not being empathetic in the "proper" way. One of the saddest individuals I ever met had parents who loved him deeply. But because they didn't love him in the way he thought they should, he was forever angry and lonely. Their expression of love didn't fit his definition of what it ought to be.
I mention all this because computers of any kind cannot work without categories and labels, whether the description is mathematical or text-based. Despite ever finer definitions, permutations, and recombinations, it is simply not possible to capture the full range of human emotional experience, nor even its tiniest aspects, especially since human behaviour is so dynamic and situational.
Why does this matter? With so much data available on every conceivable topic, and with so many people able to access that information and repurpose it to their own ends, the danger becomes a lack of human relatability. Mass-produced AI content, no matter how sophisticated, will not reach the customer or user where it matters: in their emotions and feelings, the place where decisions are really made.
Nearly everyone is irrational, particularly those who don't believe they are. The best marketers and salespeople know this. People want benefits, not features, and the ultimate benefit is to feel good about oneself in some way.
Emotional connection drives value. Creating empathy is where the mother lode of real communication lies, for entrepreneur and content creator alike.
Entrepreneurs and artists can use AI to handle logistics—market research, data analysis, or content scheduling—but the heart of their work, relating meaningfully to others, requires the human element.
Tools like DALL-E can produce stunning visuals, but they often lack that extra spark; there is something not quite 'alive' about the images, amazing as some of them are. The same is true of content creation. I run these articles through different LLMs for typo and error checking. Some of the corrections to straight-up errors on my part are spot on, and I am grateful for the ability to deal with them easily. But when an LLM suggests rewordings of sentences or alterations to my prose, the result is always a dull, bureaucratic, committee-like rephrasing. I might as well not be me if I used those suggestions.
But if AI masquerades as an entity capable of empathy, there is real danger for the vulnerable and unwary. Turning connection into a commodity via AI's ability to sound extremely plausible can lead to dangerous outcomes: engendering misplaced trust breeds anxiety, cynicism, and worse, betrayal and paranoia.
Not good.
Background Reading:
Ekman, P. (1993). Facial expression and emotion. American Psychologist. Available: https://www.paulekman.com/wp-content/uploads/2013/07/Facial-Expression-And-Emotion.pdf
Huang, M. H., & Rust, R. T. (2021). A strategic framework for artificial intelligence in marketing. Journal of the Academy of Marketing Science. Available: https://link.springer.com/article/10.1007/s11747-020-00749-9
Tausczik, Y. R., & Pennebaker, J. W. (2010). The psychological meaning of words: LIWC and computerized text analysis methods. Journal of Language and Social Psychology. Available: https://journals.sagepub.com/doi/pdf/10.1177/0261927X09351676
Darcy, A. M., & Robinson, L. (2021). Ethical considerations in AI-based mental health tools. Frontiers in Psychiatry. Available: https://www.frontiersin.org/articles/10.3389/fpsyt.2021.642460/full