Why Emotional Intelligence will matter in an AI world

Whether you’re excited about it, uneasy, or ambivalent, AI is now evolving publicly at a rocket pace and will continue to become part of our everyday lives and work, in ways we don’t even know yet.

We know it’s already being used and experimented with in academia, marketing, journalism, commerce and social media. Mark Zuckerberg recently announced that Meta is experimenting with AI-powered chat on WhatsApp and Messenger, though they won’t be sharing this ‘futurist’ experience with users just yet. It seems that if AI hasn’t touched your world yet, it will soon.

Having worked in the field of people and leadership development for nearly two decades, I have a strong concern that we are running headfirst into uncharted territory without the proper emotional tools in our tool belt.

One of the foundational issues is the over-simplification of the human brain and our own ‘intelligence’. Elizabeth Weil, in her recent article in the Intelligencer, explains that it’s our tendency to describe the human brain as a ‘computer’, and to conflate the computer with a human brain. She quotes researchers Alexis T. Baria and Keith Cross (from their study on the negative social consequences of using the term ‘artificial intelligence’), who say that this oversimplified notion affords:

“…the human mind less complexity than is owed, and the computer more wisdom than is due.”

I think we have a lot of development to do on our Emotional Intelligence whilst we race to develop our Artificial Intelligence.

What is Emotional Intelligence?

In the 1990s, researchers started to show that being able to understand and manage our emotions was key to living successfully; up until then, much emphasis had been put on developing the intellect (IQ). The term Emotional Intelligence (EI) was first coined by researchers John Mayer and Peter Salovey, who defined it as the ability to understand and manage your emotions and to understand and respond appropriately to the emotions of those around you. In the decades since, we have come to understand that Emotional Intelligence is a spectrum of factors that goes beyond managing your emotions and understanding the emotions of others. Psychologist Daniel Goleman PhD has gone further in popularising the term and expanding its definitions, culminating in his seminal book ‘Emotional Intelligence: Why It Can Matter More Than IQ’, first published in 1995 and updated most recently in 2020.

Goleman describes EI as having 4 key pillars –

1.     Self Awareness (Emotional Self Awareness)

2.     Self Management (Emotional Self Control, Achievement Orientation, Positive Outlook, Adaptability, Emotional Agility)

3.     Social Awareness (Empathy, Organisational Awareness)

4.     Relationship Management (Influence, Coach & Mentor, Conflict Management, Teamwork, Inspirational Leadership)

As an aside, I believe there is also a fifth pillar, which I am calling ‘Embodied Awareness’. By this I mean having a good understanding of our embodied self (our physical perception and our lived experience in our body) and how it connects with, interacts with and impacts the other four pillars of Emotional Intelligence. You can read more about this in my recent article What Daniel Goleman’s Emotional Intelligence is missing.

I define Embodied Awareness as the following -

5.     Embodied Awareness (Energy Management, Phenomenological Awareness, Groundedness, Heart Intelligence)

As you read this you may already be starting to have an intuitive sense of why Emotional Intelligence really matters in the age of AI.

Despite the excited claims from those connected to and invested in the industry about AI’s impact on efficiency and its potential in health care, education, writing and so on (Bill Gates says AI like ChatGPT is the most important innovation right now), a couple of decades of having the internet and social media so integrated into our lives lets us posit that the proliferation of AI will likely have both revolutionary positive impacts and devastatingly negative impacts on our lives and society as we know it.

Here are 9 reasons why Emotional Intelligence will matter in an AI world:

Building strong, healthy human relationships. Our relationships with each other are going to be increasingly vital to our health and fulfilment, and we need to learn how to be better at them. What makes us human is seeking to be understood, to feel seen and heard, to know that we matter, that we belong and that we contribute. Our ability to express, understand and relate to each other in context and respond in healthy ways is going to be key to developing strong bonds and healthy relationships in and out of work in an increasingly technology-driven world. Not only is Emotional Intelligence about recognising and understanding someone’s communication, emotional state and body language, it’s also about understanding the nuance in how we relate. Knowing the difference between sympathy, empathy and compassion is one example. In the Dec 2022 Wired article Emotional AI is no substitute for Empathy, Pragya Agarwal writes:

‘Emotional AI algorithms, even when trained on large and diverse data sets, reduce facial and tonal expressions to an emotion without considering the social and cultural context of the person and the situation… While, for instance, algorithms can recognize and report that a person is crying, it is not always possible to accurately deduce the reason and meaning behind the tears.’

Interacting with chatbots will be fun, weird and, I expect, useful in some contexts, but it won’t be a match for the depth of relating that will be required in an AI world.

When MIT scientist Joseph Weizenbaum created one of the earliest natural language processing programs, ELIZA, in the 1960s, he was spooked by how quickly his assistant, who was testing ELIZA’s capabilities, got into a personal conversation sharing intimate details as she interacted with the computer system (and subsequently asked him to leave the room so she could continue the interaction in private!). It seems there could be a place for humans interacting with and sharing emotional information with a seemingly ‘listening’ or ‘supportive’ chatbot, but my concern and unease is that any initial perceived feelings of connection, relief from angst or dissipated loneliness that come from sharing our most intimate worries and fears with a chatbot will be a thinly veiled, artificial substitute, like a Big Mac & fries or all the Zoom meetings during the pandemic – it meets a need for a while but doesn’t provide lasting, healthy depth of connection, relating and nourishment.

Leadership that connects and inspires. People are going to be looking for even more humanity, vision and connection from the people who lead them in organisations. When knowledge is outsourced to AI and is no longer a differentiator, people will look for humanity, and for those who are able to bring people together alongside the technology, embrace difference, use their power courageously and act with calm and integrity. In a world with more speed, ‘efficiency’ and uncertainty than ever before, with technology changing the fabric of society before our eyes, people will be looking to leaders who can remain grounded in the uncertainty, listen well and think systemically, with awareness of their responsibility and impact across the interconnecting social, political, economic and organisational landscapes. Self-awareness, systemic awareness and the ability to connect and inspire will be key.

“Leadership is about empathy. It is about having the ability to relate and to connect with people for the purpose of inspiring and empowering their lives.”

– Daniel Pink, A Whole New Mind 

Creative collaboration and the magic of problem solving. As I was reading HBR’s Jan 2022 article Can AI Teach Us How to Become More Emotionally Intelligent? I found myself wincing a little. The writers enthusiastically talk about AI for customer service teams that tracks and analyses customers’ emotional states and responses, feeding back to the teams what language to use and what to say to better meet customer needs. Aren’t we just creating more robots? If the AI can teach us how to listen well, what to listen for, to listen with our whole body, to listen to understand, to know our triggers, and to practise and develop our phenomenological awareness, I am all for it, but relying on technology to tell us what to say is going to create less emotional intelligence, not more, no?

We know how it feels when we are navigating a problem at work with another person or a group of people: you’re in the mess together, it’s not always easy, there might be conflict and debate, but you get to solutions through a shared feeling of care and commitment to working through it together. There is a magic that happens in creative collaboration, an energy that exists between people being in the stickiness together and finally reaching a potential solution. I know the joy I’ve experienced when I’ve been on the phone with someone in customer service, dealing with an issue and getting to a resolution together! I know the difference between someone who is repeating a script and working through instructed motions and someone who is using their creativity and humanity to get to a solution alongside me.

In Daniel Pink’s book, A Whole New Mind, he writes:

 “IDEO is one of the world’s most respected design firms—the creator of everything from those fat-handled toothbrushes for kids to Apple Computer’s first mouse to the Palm V. How do they do it? The secret would make an MBA squirm: Empathy. In the IDEO universe, great design doesn’t begin with a cool drawing or a nifty gadget. It begins with a deep and empathic understanding of people.”

Discernment and knowing how it feels to interact with AI versus a human. Writer Alexander Beiner, in his recent Substack piece AI and Animism: Is Microsoft’s Bing Alive?, urges the need to be able to navigate how it feels to interact with AI versus a human, especially as it gets increasingly difficult to differentiate between what’s ‘real’ and what is AI generated. He says:

“When we’re overwhelmed, either by our own minds or the collective mind of the Internet, we need to draw on our own discernment; to remember that online, nothing is quite as it appears. When we do, we can see through the distractions to the deeper levels of our complex virtual lives and begin to use our technology more wisely.”

Discernment is having the ability to judge well, to differentiate, to pause, to reflect before taking action, to notice our felt sense of something and respond accordingly.

We will need to develop a very good sense of who we are in our ‘real-world’ lives, our sense of self - our strengths and blind spots - our embodied awareness, our emotional agility and an expanded emotional language. In his book The Body Keeps the Score, psychiatrist Bessel van der Kolk, one of the leading voices on trauma research and neurobiology, describes a common response in his patients and their lack of emotional language when asked how something felt to them:

‘“How did that feel?” People will say “good” or “bad” – they are judgements. Instead ask, “Did you notice any specific feelings that came up for you doing that?” As a culture we are trained to cut ourselves off from the truth of what we’re feeling,’ he says.

Developing our emotional intelligence, particularly our emotional language, helps develop our discernment. Being able to feel and understand our experience of something means we can articulate it and then take appropriate, intentional action. To give you a sense of why this will be important, we only need to read New York Times journalist Kevin Roose’s report on his unsettling online conversation with Microsoft’s new OpenAI-powered chatbot, Bing:

"It unsettled me so deeply that I had trouble sleeping afterward. And I no longer believe that the biggest problem with these A.I. models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous acts.”

Learning to deal with difficult emotions. Understanding how to deal with our difficult emotions is key to navigating our lives – both online and offline. Anger, anxiety, depression, insecurity, loss, conflict, loneliness, self-esteem issues, even boredom, can be amplified or numbed online. We know that not having the tools to deal with difficult emotions and to have honest, intimate conversations can lead to everything from illness, to relationship issues, to causing harm to ourselves and others. In AI and Animism, Alexander Beiner references the work of Sherry Turkle, a clinical psychologist at MIT who has spent three decades studying our relationship with technology. In her book Alone Together: Why We Expect More from Technology and Less from Each Other, she argues that the internet has been intentionally designed by tech designers and tech giants to create a ‘friction-free emotional life’, and that our devices are making it more difficult to have intimate, honest conversations. Many people are retreating into online virtual fantasy worlds because the real world is getting increasingly difficult and complex to navigate. The online and offline worlds will continue to become more complex, and developing these emotional intelligence tools and practices is going to be essential in navigating that complexity and difficulty.

Engaging with our own systemic intelligence beyond the rational. Despite decades of emphasis on the brain as the seat of our human intelligence, we now know that we have tens of thousands of neurons in the heart and millions in the gut, in constant communication with the brain and throughout our body via the vagus nerve. This vast system of nerves and sensitivity gives us access to intelligence and resources way beyond our capacity for cognitive rationality and analysis. Founder of the Strozzi Institute and leadership and embodiment expert Richard Strozzi-Heckler explains it like this:

“It’s the intelligence that allows me to feel that the car is going too fast around a slippery curve without looking at the speedometer, to notice that the mood of a teammate is off without him saying anything, to soften under the gaze of a baby, to wince when a loved one is in pain, to lift up in glee, to take up space with divinity, to stiffen when a stranger enters uninvited into my space.”

- Richard Strozzi-Heckler, The Leadership Dojo

We are more than our brains, and practising engaging with this whole-system intelligence allows us to make sense of the world with more presence, awareness and possibility. We draw on deeper wisdom and gain insight from both our left and right brain, our head, heart and gut. It’s what drives our humanity and connects us to our intrinsic values, ethics and compassion for others. Our brain can rationalise anything; it’s our whole-system intelligence – our head, heart and gut in coherence – that puts the brakes on.

Maintaining good mental, emotional and physical health and managing overwhelm. Mental and physical health are already creaking under the pressures and pace of modern life. With new technology, it’s only going to get faster. Many of the talked-about benefits of AI centre on efficiency. Instacart’s chief architect, JJ Zhuang, shares how their integrated AI system will take grocery shopping and make it ‘fun’ by taking on the mental load. This does sound appealing - we do spend an inordinate amount of time thinking about what to cook, what to eat and how to get the groceries - but it also reminds me of how, in the ’60s and ’70s, the arrival of new household technology like washing machines and electric vacuum cleaners came with the promise of more free space and leisure time for housewives, yet actually led to raised expectations of cleanliness and housekeeping, with the freed-up time and space filled with more things to ‘get done’ and more perfection to be attained in other areas of home and family life.

With the promise of AI writing our essays, speeches, resumes and presentations, doing our research, analysing our data and messaging our friends (and we’re yet to discover what else), will this lead to more free space and leisure time, or will higher expectations and more to get done fill the gap? Emotional Intelligence includes knowing how to pause, rest and connect with the things that calm and nourish us mentally, emotionally, physically and spiritually. It also means knowing the practices that help regulate our emotions and our nervous systems through self-regulation (e.g. sleep, nutrition, movement, breathwork, dance, strength training, journaling), co-regulation (being in company with others – human and animal – and social and community connection) and eco-regulation (being in nature, connected to the natural environment around us).

Embracing difference and holding multiple perspectives. Part of the beauty and the pain of being human is our ability to understand nuance and paradox – we can hold multiple truths at once and understand complexity in context. Having a healthy emotional intelligence gifts us this. We know that something can be true but that doesn’t mean it’s right, or that something can be legal but still cause harm. We can be in deep grief and experience joy at the same time. We can understand both/and, not just a binary either/or. Emotional Intelligence helps us hold the tension and the difficulty that this creates in our lived experience, and to draw wisdom from it.

Quietening the hungry ghost. Buddhist cosmology describes addiction as ‘the land of the hungry ghosts’, where people have huge appetites that can never be quenched or satisfied. Buddhism describes this craving as a ‘false refuge’, a place to try to hide and escape from being present with both the positive and the negative of life. It is said that we all have a case of the hungry ghost within us, a sense of never feeling quite whole, of wanting more, different, new. It doesn’t always turn into addiction. As I read and listen to some of the commentary on AI, especially the excitement from the tech giants and AI investors, I sense a bit of the collective hungry ghost: the need to keep developing our technology, to see how far we can go, the continuous craving for more, better, faster; to see if we really can build the intelligent machines from the movies of our childhood, to try to find answers to our biggest universal mysteries. (I’ve often wondered if the desire to create ‘artificial intelligence’ is an attempt to somehow soothe our deepest fear that we may be alone in this vast universe.) To quote Dr Gabor Maté from his book ‘In the Realm of Hungry Ghosts’:

“Boredom, rooted in a fundamental discomfort with the self, is one of the least tolerable mental states.”  

As we either clamber or are pushed aboard this AI rocket, let’s be aware of our hungry ghosts and develop the ability to get more comfortable with ourselves and with each other.

A mutuality - developing our emotional intelligence alongside artificial intelligence.

As I write this, I feel very aware of my own limitations, of how much I just don’t know, of how we don’t yet know how this will all evolve or what an AI world will look like. I know how important Emotional Intelligence is, and I know how much more we can learn and grow in this space, in relationship with ourselves, with others, and in our organisations and leadership.

I will finish with a brief story from James Bridle, author of Ways of Being: Animals, Plants, Machines: The Search for a Planetary Intelligence. To understand time and the natural world’s intelligence, he set up a time-lapse camera in his living room to capture the activity of his house plants over time. He describes the time-lapse camera as the machine that enabled this expansive knowledge exchange between human and plant life, a mediator. I like the framing of this symbiotic, mutual exchange, and of technology as a tool that expands our awareness of the wider intelligence in and all around us that already exists and continues to evolve every day.
