
WILL COMPUTERS EVER LEARN EMOTIONAL INTELLIGENCE?

An emerging trend in artificial intelligence is to get computers to detect how we are feeling and respond on an emotional level.

Emotional computers might even help us develop more compassion for one another. The idea of creating robots that can understand, compute and respond to human emotions has been explored in movies for decades, from I, Robot to Ex Machina and Morgan.

However, it is a common misconception that the challenge of creating emotionally intelligent computing systems is too great to be met anytime soon. There are a number of definitions of emotional intelligence, but they all share a common core: the ability to be aware of your own and others' emotions, the ability to manage your own and others' emotions, and the ability to use emotions to get things done, make better decisions and deal effectively with stress.

Artificial intelligence is shaping a future in which society will communicate more naturally with robots. However, scientists still face the problem of translating intrinsic human intuition, and the complex attributes of what it means to be human, into computer code and numbers. Even so, computers are already demonstrating that they can augment, or even replace, human emotional intelligence.

The human brain produces a wide range of complex emotions, shaped by different family backgrounds and different ways of relating to those emotions, and that variety is part of what makes life interesting. It also makes it extremely difficult to develop intelligent technology that can interact smoothly with humans.

Perhaps surprisingly, the very lack of emotion in computing systems may place them in a good position to be emotionally intelligent. Humans, by contrast, are not always particularly good at reading others, and are prone to missing emotional signals or being fooled by lies.

Raphael Arar, an award-winning designer, artist and researcher, thinks we can teach computers to make sense of human emotion through art and nostalgia scores.

But there are challenges to face. Think about how you feel when you hear your favourite song for the first time in a long while. Can you describe the full context of how it made you feel? It is surprisingly hard to put into words.

"Part of us feels so simple, but under the surface, there’s really a ton of complexity," Arar said at a TED talk. "And translating that complexity into machines is what makes the modern day’s moon shots". There are loads of human emotions; qualities which our fingers obviously cannot access.

But how can we make AI that people actually want to interact with? Arar uses art as a way of designing better experiences, which can in turn be used to develop AI systems that communicate better with humans.

"I view art as the gateway, to help us bridge this gap between humans and machines," Arar said. "Through arts, we are tackling some of the hardest questions like: what does it really mean to feel? Or how do we engage and know how to be pleasant with each other? And how does intuition affect the way that we interact?"

AI technology has already been developed to make sense of some basic human emotions, such as fear, sadness and joy, by converting those characteristics into maths. The bigger challenge lies in the complex emotions we have a hard time describing to each other, especially nostalgia.

Arar worked with data scientists to explore these complex human emotions and to build a model that could convert such feelings into something mathematical. He created a piece of art: an experience that asked people to share a memory, which was then rated by what he referred to as a "nostalgia score".

The shared stories were then analysed by a computer for their simpler emotions and checked for words associated with nostalgia; the result was a nostalgia score for each contribution.
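The article does not describe how the scoring actually works, but a word-matching approach like the one outlined above could look something like the following minimal Python sketch. The word list, function name and weighting here are hypothetical illustrations, not Arar's actual model.

```python
import re

# Hypothetical lexicon of nostalgia-associated words; Arar's real word list is not published here.
NOSTALGIA_WORDS = {
    "childhood", "remember", "grandmother", "summer", "home",
    "old", "school", "first", "back", "again",
}

def nostalgia_score(story: str) -> float:
    """Return the fraction of words in the story that appear in the nostalgia lexicon."""
    words = re.findall(r"[a-z']+", story.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word in NOSTALGIA_WORDS)
    return hits / len(words)

# Example: a short memory yields a score between 0 and 1.
print(nostalgia_score("I remember the old summer days back at home with my grandmother."))
```

A real system would likely combine a sentiment model with a far richer lexicon and weighting, but the basic idea of turning a remembered story into a single number is the same.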

The results were used to build a light-based nostalgia sculpture that serves as the physical embodiment of the person's contribution. "The higher the nostalgia score, the rosier the hue," and the score "describes how the experience made you feel."
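As another purely illustrative sketch (the sculpture's actual colour mapping is not described beyond "the higher the score, the rosier the hue"), a score in the range 0 to 1 could be mapped to a light colour roughly like this:

```python
def score_to_rgb(score: float) -> tuple[int, int, int]:
    """Map a nostalgia score in [0, 1] to an RGB colour: low scores stay near white,
    high scores shift toward a rosy pink. This mapping is an assumption for illustration."""
    s = max(0.0, min(1.0, score))   # clamp to the valid range
    red = 255                       # keep the red channel fully on
    green = int(255 - 100 * s)      # drain green as the score rises
    blue = int(255 - 60 * s)        # drain blue a little less, giving a pink tint
    return red, green, blue

print(score_to_rgb(0.45))  # e.g. (255, 210, 228): a light rose
```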

Breaking down the steps involved in having a conversation with an AI may help us understand how much more needs to be taught before AI can truly interact with people. Beyond that, we should teach computers how to make sense of the emotions we have a hard time explaining to each other.

Although it might be creepy to think that in the future we will be monitored by machines that can detect our emotions, computing systems with emotional intelligence are already surpassing human capabilities. Far from being stuck in the realm of science fiction, they could soon be a reality in our homes, cars and even our office space.

Be sure to watch Raphael Arar's TED talk below. 

