AI Emotional Intelligence and Emotion Recognition: The Case of AI Self-Driving Cars

By Lance Eliot, the AI Trends Insider

President Abraham Lincoln was considered an emotional person. I realize this seems counter-intuitive to most people's perception of Lincoln's persona and reputation. He undoubtedly dealt with a lot of people who were highly emotional and had to provide a calming influence on them. We tend to think of Lincoln as someone who remained calm amidst an enormous cultural, political, and societal storm in our country.

Meanwhile, he struggled to rein in his own emotions. In an 1841 letter to a congressman, Lincoln wrote that if how he felt were distributed equally to the whole human family, there would not be one cheerful face on earth.

I suspect that most of us wrestle with our emotions. It's usually an ongoing tussle. Many in the high-tech field admire the Star Trek characters Mr. Spock and Data for their ability to keep their emotions in check. Indeed, many high-tech people try to do the same themselves, for which others sometimes accuse them of being staid, uncaring, and unmoving. I dare say some of them would take that as a great compliment.

When referring to emotions, there are lots of varied definitions of what kinds of emotions exist. Some suggest that, just as colors have a base set that you can mix-and-match to render additional colors, the same applies to emotions: there are some fundamental emotions, and we mix-and-match those to get the others. But there is much disagreement about which emotions are core or fundamental, and it remains a generally unsettled debate.

One viewpoint has been that there are six core emotions:

  •         Anger
  •         Disgust
  •         Fear
  •         Happiness
  •         Sadness
  •         Surprise

I'm guessing that if you closely consider those six, you may right away start to question how those six are the core. Aren't there other emotions that could also be considered core? How would those six be combined to make all of the other emotions we seemingly have? And so on. This highlights my point about there being quite a debate on this matter.

Some claim that these emotions are also to be considered core:

  •         Amusement
  •         Awe
  •         Contentment
  •         Desire
  •         Embarrassment
  •         Pain
  •         Relief
  •         Sympathy

Some further claim these are also considered core:

  •         Boredom
  •         Confusion
  •         Interest
  •         Pride
  •         Shame
  •         Contempt
  •         Relief
  •         Triumph

Say what? All of those are to be considered core? You might begin to think it is just a random list of words, or maybe an unending one. I'm sure that if you turn to a colleague and start going through the list with them, you'll end up in a bitter battle over which words should be considered emotions and which should not, and also whether they belong in any "core" list. Please don't come to fisticuffs over this.

Anyway, you'll find that there are various attempts at trying to pin down the lists of emotions, including charts, matrices, and even a wheel of emotions. This brings up another salient point, namely whether emotions are universal in that all humans have them, or whether they are culturally induced and thus differ from culture to culture. Not only might there be a difference by culture, you could also potentially suggest that there are differences over time: did the primitive caveman and cavewoman have the same laundry list of emotions that we have now?

Maybe our emotions are another example of nature versus nurture. Perhaps some emotions are innate and come with our DNA. Other emotions are possibly learned. You could try to take an extreme position and suggest that all emotions are innate, or likewise the extreme posture that all emotions are learned (and none are innate). Good luck on that. It’s an open question and there are lots of researchers poking and prodding at trying to solve foundational questions about emotions.

Are humans uniquely the only creatures that have emotions? At one point in time, some argued that animals cannot have emotions since they are presumably unthinking, and that only humans can feel emotions. Others say that emotions are not in our heads but in our hearts, a claim the more scientifically minded find hard to credit, since the heart is merely an organ for pumping blood. In that strict view, emotions must presumably arise via our minds. In any case, animals do have brains, and so most would ultimately agree that they seem to exhibit emotions too.

Humans Feel Emotion Internally and Express Emotion Externally

Speaking of the notion of exhibiting emotions, this brings up another important point about emotions. Some suggest that we have internal emotions and that we also express emotions externally. You might be feeling unhappy on the inside (let's consider unhappiness to be an emotion, for purposes of discussion herein), and yet to the rest of us you appear to be expressing a happy emotion. Therefore, others that look at you might not know what your "true" emotional state is.

You might be purposely hiding or burying your true emotional state, seeking to keep it hidden from view. If you ever watch a poker tournament, you'll see the players oftentimes wearing hoodies and sunglasses, since they are worried that a subtle look in their eyes or a movement of their forehead might give away their inner emotion. They might have all aces and be eager to take the pot of money, but if you suspect they are ecstatic then you might not be lured into upping the ante.

This also points out that the hiding of emotions might involve the display of emotions as a false portrayal. You might really be unhappy yet intentionally display a smiling, happy look. The smiling happy look didn't occur by happenstance. Instead, you are trying to make yourself appear to have a certain emotion, even though on the inside you don't believe you embody it. Some might say that we wear emotional masks on our faces, which at times differ from our inner emotional state.

Emotion is often characterized as occurring in a state-based manner. From moment to moment, your emotions might change. I think we've all encountered the emotional roller coaster of a person who at first was laughing uproariously and then a few moments later cried a torrent. It is believed that we might have an established emotional state that serves as our base, and that it then varies depending upon the circumstances and situation.

Many might consider you a calm person that appears emotionless, and so that's the base state you seem to be working from. We don't know if it's innate or learned, but in any case it's what you've seemingly established as your base state. If you suddenly have a boat anchor dropped on your foot, you could possibly leap outside of your calm base state and become irritated and upset, yelling and screaming. Those that know you would likely say that it was uncharacteristic of you. The situation, though, would certainly be one that we all might agree made sense to drag you out of your base state into a more emotionally laden one.

Your base emotional state might change over time. Maybe you had one base state as a child, another as a teenager, and yet another as an adult. It can change for a variety of reasons, too. A person who is highly emotional and seems to wear their emotions on their sleeves might undergo some kind of therapy or treatment to keep better control of their emotions. They then become a less emotionally driven person, at least as far as the rest of us can tell.

The advent of smartphones and the taking of selfies has led to interest in being able to discern computationally someone’s emotions from how they look. Social media sites are each clamoring to add computer-based capabilities that try to discern the emotional state of someone as depicted in a picture or a video.

This kind of sentiment analysis can be very handy. If you are running a video ad for the latest new laundry detergent, and you can capture the facial reaction of someone watching the ad, it might be computationally possible to guess at how the person reacted to it. A smiling face might imply that they are interested in buying the soap, and thus you could next offer them a discount coupon to get them to act on those emotions.

There is a grand race right now to create a really good emotion detector. The most obvious way to do this involves analyzing a picture of someone's face. Are they smiling? Are they sad? Are they yelling? Are they tight-lipped? These are all relatively easy aspects to scan for inside a picture of someone's face. You might need to deal with difficulties such as something obscuring the face; perhaps they have a hoodie on and are wearing sunglasses, or maybe someone else was standing in front of them. This makes things a bit harder, and so it isn't always a slam dunk to find the emotions.

It’s also important to not jump to conclusions simply due to a facial expression. You might have a facial expression that lasted for a split second, and an instant later you had a different facial expression. Does it make sense to scan the first instance and conclude something about your emotional state? Maybe not. It might be that you’d need to look at the face over a series of time shots, such as by inspecting video, rather than looking solely at a brief instant in time.
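The idea of aggregating over a series of time shots rather than trusting one instant can be sketched simply. In this illustrative Python sketch, assume a hypothetical upstream classifier has already emitted one emotion label per video frame (the labels and window size here are invented for demonstration); a rolling majority vote then keeps a split-second expression from dominating the conclusion:

```python
from collections import Counter, deque

def smooth_emotions(frame_labels, window=5):
    """Rolling majority vote over per-frame emotion labels.

    A fleeting expression that appears in only one frame is
    outvoted by the surrounding frames, so the smoothed stream
    reflects the sustained emotional state rather than an instant.
    """
    recent = deque(maxlen=window)
    smoothed = []
    for label in frame_labels:
        recent.append(label)
        # Most common label in the current window wins.
        smoothed.append(Counter(recent).most_common(1)[0][0])
    return smoothed

# A single "surprise" frame amid happy frames gets smoothed away.
frames = ["happy", "happy", "surprise", "happy", "happy", "happy"]
print(smooth_emotions(frames))  # → ['happy', 'happy', 'happy', 'happy', 'happy', 'happy']
```

A real system would of course feed this from an actual frame-by-frame classifier, but the temporal point stands regardless of how the per-frame labels are produced.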

Even if you do get a clear facial expression to use, there are a variety of twists and turns in terms of interpreting emotion from a facial expression. In some cultures, the mouth that looks like it is shaped into a happy mood might actually be considered a sad look. Or, maybe the person was trying to fool the camera and so put on a fake look for a brief moment, meanwhile they really were in a quite different emotional state.

There are other telltale aspects about your emotion beyond just your facial expression. Your gestures such as waving your arms and the movement of your hands and fingers can be another vital signal of your emotional state. Your manner of speech can also express your emotions. Does your voice sound strained or is it highly confident? There are a slew of ways to try and gauge the emotional state of a human, and so you would need to tie together a variety of elements if you presumably want to get a full sense of the emotions of a person at any given point in time.

One newer avenue of keen interest involves your breathing rate. Recent studies show that detecting how you are breathing can aid in revealing your emotions. Early studies examined babies to see if it was feasible to readily measure a baby's breathing and deduce its heart rate. It has now become a focus to study the micro-motions of the human chest and other body areas to see if it is possible to detect the status of the human via breathing.

It is asserted that the physiological interplay of the heart, the breathing, the body, and the emotions makes for a potent means of figuring out the emotional state of a person. Some studies suggest that there is a statistical correlation between the variations in your heartbeat intervals and your emotional state, and that the heartbeat intervals can potentially be divined by using your breathing. You should be cautioned, though, that the morphology of the human body comes into play here, and that chest displacement and heart displacement can involve a great deal of interference and noise (a poor signal-to-interference-plus-noise ratio, or SINR).
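To make the breathing-rate idea concrete, here is a minimal sketch assuming the chest-displacement samples have already been extracted (whether from video micro-motions or RF reflections, which is the hard part this sketch glosses over). Each full breath crosses the signal's mean twice, so counting zero crossings of the mean-centered signal gives a rough breaths-per-minute estimate:

```python
import math

def breaths_per_minute(displacement, sample_rate_hz):
    """Estimate breathing rate from a chest-displacement signal.

    Each breath cycle crosses the signal mean twice (inhale and
    exhale), so breaths = zero crossings / 2, scaled to a
    per-minute rate. Real signals would first need denoising.
    """
    mean = sum(displacement) / len(displacement)
    centered = [x - mean for x in displacement]
    # Count sign changes between consecutive samples.
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a * b < 0)
    duration_min = len(displacement) / sample_rate_hz / 60.0
    return (crossings / 2) / duration_min

# Synthetic 0.25 Hz breathing (15 breaths/min), 60 s sampled at 10 Hz.
signal = [math.sin(2 * math.pi * 0.25 * (t / 10) + 0.5) for t in range(600)]
print(round(breaths_per_minute(signal, 10)))  # → 15
```

Heartbeat intervals would demand far finer-grained displacement data and filtering, but the same displacement-to-rate principle applies.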

At first glance, you might be thinking that using the breathing of a human to try and detect emotions is simply another example of looking at their face or gestures. What makes this added technique quite interesting is that rather than trying to figure out the breathing via a visual or audio means, it is shown that it is possible to do so via RF signals.

This means, for example, that you could potentially use a WiFi device to figure out someone's emotional state. The WiFi device sends out signals, which can bounce off objects such as a human and their chest. The reflections back to the WiFi device could then be used to try to ascertain the movement of your chest and therefore aspects of your breathing. This is a clever means of turning WiFi into a kind of radar device.

Some might view this as a scary proposition, since it implies that wherever there is a WiFi setup, it could potentially "spy" on you by trying to determine your emotional state. Well, yes, it does seem possible that this can occur. The upside is that when you come home after a long day at the office and enter your house, the WiFi might be able to figure out that you've had a tough day and are emotionally distraught, and need some emotional soothing from the music being played in the house, the lighting, etc.

What does this have to do with AI self-driving cars?

At the Cybernetic AI Self-Driving Car Institute, we are developing AI systems for self-driving cars. One aspect of self-driving cars is that they are likely to have humans inside them, whether because a human is co-driving with the AI or because the human occupants are being driven entirely by the AI.

Discerning the Emotional State of Humans In, Around the Self-Driving Car

It would be advantageous, potentially, for the AI to try and discern the emotional state of the humans that are in or around the AI self-driving car.

I realize this seems kind of spooky like a science fiction story. Nonetheless, when you consider this logically, it makes sense that the AI might be able to do a better job of the driving of the self-driving car if it could discern the emotions of the humans involved.

Suppose a human gets into an AI self-driving car and yells at the AI to take them to the nearest pier and drive off the end of it. Should the AI blindly obey this command? I think we’d all agree that the AI would be “unwise” to abide by such a command.
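The drive-off-the-pier example boils down to a command filter that sits between the NLP output and the driving system. Here is a minimal, hypothetical sketch (the keyword list, function name, and emotion labels are all invented for illustration; a real system would use far richer semantic and safety checks):

```python
# Illustrative list of phrases a real system would derive from a
# proper safety model, not a hand-typed set.
UNSAFE_PHRASES = {"off the pier", "off the cliff", "into the lake"}

def evaluate_command(utterance, emotion=None):
    """Decide whether a spoken driving command should be obeyed.

    The AI should not blindly execute a command: an unsafe request
    is refused outright, and a detected agitated emotional state
    prompts the system to seek confirmation before acting.
    """
    text = utterance.lower()
    if any(phrase in text for phrase in UNSAFE_PHRASES):
        return "refuse"   # never comply, regardless of tone
    if emotion in {"anger", "fear"}:
        return "confirm"  # double-check with the occupant first
    return "comply"

print(evaluate_command("Take me to the nearest pier and drive off the pier!"))  # → refuse
print(evaluate_command("Take me home", emotion="anger"))                        # → confirm
print(evaluate_command("Take me home"))                                         # → comply
```

The point is the gating structure, not the keyword matching: the emotional-state estimate becomes one more input into whether the AI complies, asks, or refuses.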

The AI needs to have Natural Language Processing (NLP) capabilities to parse the words being uttered by the human. The NLP needs to then be able to interpret the words and try to figure out what the human has said. In the effort of understanding the words themselves, it is helpful to also try and comprehend how the words were expressed.

I might say to you that I think Joe is a really smart guy. Did I really mean that he is smart? Or was I perhaps saying this in a sardonic or cynical manner? It could be that I'm trying to express that Joe is really as dumb as a tree stump. But via my words I've said something quite different. That's the trick of our human languages and interaction. The words alone are not sufficient to determine our meaning.

I hope you can see that if we could detect the emotions of a human, it has the possibility of enriching the conversational capabilities of the AI. I'm not suggesting that the AI will always be able to detect emotions, and as I've already mentioned, there is the chance that someone is outwardly expressing a different set of emotions than what they really feel inside. This whole detection of emotions is a tricky affair, and the AI cannot draw definitive conclusions about someone's emotional state. Instead, it would be used as an indicator and as part of the overall "package" of interpreting human communication, such as spoken speech and gestures.
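That "package" of indicators can be sketched as a weighted fusion of per-modality emotion estimates. In this illustrative Python sketch, the modality names, emotion labels, and weights are all assumptions chosen for demonstration; a real system would learn the weights and calibrate the scores:

```python
def fuse_emotion_scores(modality_scores, weights):
    """Weighted fusion of per-modality emotion estimates.

    modality_scores maps a modality (face, voice, breathing, ...)
    to a dict of {emotion: score}; weights maps each modality to
    its relative trust. Returns fused scores normalized to sum to 1,
    so no single modality's reading is treated as conclusive.
    """
    fused = {}
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 0.0)
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * score
    total = sum(fused.values()) or 1.0
    return {e: s / total for e, s in fused.items()}

# The face looks happy, the voice sounds sad, breathing is neutral.
scores = {
    "face": {"happiness": 0.9, "sadness": 0.1},
    "voice": {"happiness": 0.2, "sadness": 0.8},
    "breathing": {"happiness": 0.5, "sadness": 0.5},
}
weights = {"face": 0.5, "voice": 0.3, "breathing": 0.2}
fused = fuse_emotion_scores(scores, weights)
print(max(fused, key=fused.get))  # → happiness
```

Note that the fused output is still only an indicator: the conflicting face and voice readings here illustrate exactly the inner-versus-outer emotion problem discussed above.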

For more about AI and human conversations and AI self-driving cars, see my article: https://aitrends.com/features/socio-behavioral-computing-for-ai-self-driving-cars/

For voice NLP and AI self-driving cars, see my article: https://aitrends.com/selfdrivingcars/car-voice-commands-nlp-self-driving-cars/

For key safety aspects, see my article: https://www.aitrends.com/selfdrivingcars/safety-and-ai-self-driving-cars-world-safety-summit-on-autonomous-tech/

For my article about key trends, see: https://www.aitrends.com/ai-insider/top-10-ai-trends-insider-predictions-about-ai-and-ai-self-driving-cars-for-2019/

There are various levels of self-driving cars. The topmost level is considered Level 5, and refers to an AI self-driving car in which the AI drives the car without any need for human intervention. In fact, a Level 5 self-driving car will often have no provision for human driving, the gas and brake pedals and the steering wheel having been removed. For self-driving cars less than Level 5, a human driver is required to be in the self-driving car, and that human driver must be attentive and responsible at all times for the driving of the car. This means that even if the AI is co-sharing the driving task, the human driver is still on-the-hook. I mention this because I've written many times about the dangers of this co-sharing arrangement.

For my article about the levels of self-driving cars, see: https://aitrends.com/selfdrivingcars/richter-scale-levels-self-driving-cars/

For my overall framework of AI self-driving cars, see: https://aitrends.com/selfdrivingcars/framework-ai-self-driving-driverless-cars-big-picture/

For my article about the dangers of co-sharing the driving task, see: https://aitrends.com/selfdrivingcars/human-back-up-drivers-for-ai-self-driving-cars/

When a self-driving car is less than a Level 5, the question arises as to whether the human driver is actually paying attention to the roadway and the driving task. If the human driver becomes lulled into believing that the AI is doing just fine, it can create an untoward circumstance if the AI suddenly opts to hand the driving back to the human driver. As such, there are various means to try to detect whether the human driver is staying active in the driving task. For example, the steering wheel might detect whether the driver's hands are present on it, and an inward-facing camera might detect whether the driver's face is facing forward and whether their eyes are on-the-road.
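Those attention checks amount to combining several cabin-sensor signals into a single verdict before the AI hands control back. This sketch is purely illustrative: the function name, inputs, and the 0.6 drowsiness threshold are invented stand-ins for a capacitive steering-wheel sensor, an inward-facing camera's head-pose estimate, and an eye-openness measure:

```python
def driver_attentive(hands_on_wheel, face_forward, eyes_open_ratio):
    """Combine cabin-sensor signals into an attentiveness verdict.

    All three inputs are hypothetical stand-ins for real sensor
    feeds. The AI should only consider a handover safe when every
    check passes, erring on the side of caution.
    """
    if not hands_on_wheel:
        return False  # no hands detected on the steering wheel
    if not face_forward:
        return False  # head pose indicates the driver is looking away
    # Drowsiness proxy: eyes open in fewer than 60% of recent frames.
    return eyes_open_ratio >= 0.6

print(driver_attentive(True, True, 0.85))   # → True (attentive)
print(driver_attentive(True, False, 0.85))  # → False (looking away)
```

Adding an emotional-state estimate, as discussed next, would simply become a fourth input to this kind of check.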

Some believe that we should add the emotional state detection into the matter of trying to discern whether the human driver is paying attention to the driving task. This could be done via the visual, facial recognition of trying to detect emotions. It could be done by the audio detection of emotions by the nature of the words and sounds that the human driver might utter. And, we now know that it might also be possible via the breathing of the human driver, using RF signals to try and detect this.

The odds are high that most AI self-driving cars will have some form of WiFi included into them. This is likely done for the purposes of having human occupants that want to use WiFi, such as those traveling on a family trip and all have brought their laptops and smartphones with them. The WiFi might also be used for the OTA (Over The Air) updating of the AI self-driving car, which is a feature involving connecting the AI to a cloud capability to then share aspects up into the cloud and get updates pushed down into the self-driving car.

For my article about steering wheels and detecting human drivers attention, see: https://aitrends.com/selfdrivingcars/steering-wheel-gets-self-driving-car-attention-ai/

For my article about OTA, see: https://aitrends.com/selfdrivingcars/air-ota-updating-ai-self-driving-cars/

In terms of family road trips and AI self-driving cars, see my article: https://aitrends.com/ai-insider/family-road-trip-and-ai-self-driving-cars/

You might be saying to yourself that for a less than Level 5 self-driving car it makes sense to try to gauge the emotion of the human driver, since it potentially ties to the driving task, but why would a less than Level 5 self-driving car need to know the emotional state of the occupants?

Admittedly, knowing the emotional state of the occupants does not necessarily have as ironclad a rationale as does the case of the human driver. But you could say that knowing the emotional state of the occupants could be quite handy. As mentioned earlier, the example of a human occupant who says to drive to the end of the pier might be interpreted differently if the AI is also examining the emotional status of the occupant.

There are other ways that the emotional state of the human occupants might come into play. Some good, some maybe bad, depending upon your viewpoint of the world. Let's assume that the advent of AI self-driving cars produces the ridesharing-as-a-service mania that most predict will happen. We're going to have ridesharing cars aplenty, with zillions of AI self-driving cars constantly roaming and at the ready to take us humans for a ride.

Inside a ridesharing AI self-driving car, there are likely to be ads, either pictures, video, streaming, or whatever. This will be a means to get some extra dough for the owner of the AI self-driving car, and perhaps a means to have a lessened cost to the human passenger for the ride. Similar to how I earlier provided the example of detecting an emotional reaction to a laundry soap ad, imagine how beneficial it might be to use the inward facing cameras of the AI self-driving car and perhaps the RF signal capabilities to determine the human occupant’s emotional reaction to the in-car ads. Could be quite handy.

There are lots of other handy reasons to detect the human emotion involved. Maybe the AI could calm down a passenger that is overly distraught. Maybe the AI should contact someone such as emergency services if the person seems to be emotionally on the verge of some untoward act. And so on.

We must also ask though whether this is a violation of privacy. You might say that once you get inside the AI self-driving car, you’ve given up some amount of privacy by the act of getting into the AI self-driving car. We might also see regulation appear that will cover the privacy aspects of human occupants inside AI self-driving cars. Some point out that today when you get into a human driven ridesharing service you are already giving up various privacy aspects, and so they don’t see this as any different than if the AI was driving. Anyway, it all needs to be sorted out.

For more about the advent of ridesharing as a service due to AI self-driving cars, see my article: https://aitrends.com/selfdrivingcars/ridesharing-services-and-ai-self-driving-cars-notably-uber-in-or-uber-out/

For my article about privacy aspects and AI self-driving cars, see: https://aitrends.com/selfdrivingcars/privacy-ai-self-driving-cars/

In addition to detecting the emotions of humans inside an AI self-driving car, we might also consider the possibilities of detecting emotions of humans outside the AI self-driving car. A person steps up to the AI self-driving car and raises their hands in a threatening manner, as though they are going to pound on the hood of the self-driving car. Let’s assume that the sensors of the AI self-driving car detect this person doing this, which is generally feasible to detect.

Should the AI self-driving car ignore the apparent actions of the person? Maybe this is a pedestrian aiming to do harm to the self-driving car and its occupants, if any. One might argue that if the AI could also detect the emotional state of the person, it might have a better chance of figuring out the person's intent. We are, though, starting to veer again toward some challenging privacy issues. Suppose the AI self-driving car is driving down the street and meanwhile detecting the emotional state of every person that is walking, biking, sitting, crawling, or in any other way being seen by the AI system.

For my article about defensive driving by AI systems, see: https://aitrends.com/selfdrivingcars/art-defensive-driving-key-self-driving-car-success/

For my article about dealing with pedestrians, see: https://aitrends.com/selfdrivingcars/avoiding-pedestrian-roadkill-self-driving-cars/

It is said that we humans have a sense of Emotional Intelligence (EI). Indeed, there are those that purport to be able to measure your degree of emotional capabilities via your Emotional Quotient (EQ) or your Emotional Intelligence Quotient (EIQ). These measures are about our ability to detect the emotion in others, along with our own capability related to emotions.

With advances in computational capabilities and AI, we are getting closer to automated systems that can detect emotions and presumably react to them in an “intelligent” manner. AI self-driving cars will certainly benefit from this kind of capability, though we also need to recognize that it creates numerous ethical and privacy concerns too. I’d say that you should be ready for an emotional roller coaster ride on the way to figuring out how much emotional recognition we want in our AI self-driving cars.

Copyright 2018 Dr. Lance Eliot

This content is originally posted on AI Trends.