Computer scientist Rana el Kaliouby is on a mission to humanize technology with artificial emotional intelligence, or what she calls “Emotion AI.” Born in Cairo and raised in the Middle East, Rana is the co-founder and CEO of the Boston-based software company Affectiva. There, she’s developing a “deep learning” platform that combines facial expression with tone of voice to infer how a person is feeling.
Profile: Rana el Kaliouby
Published: September 26, 2018
Rana el Kaliouby: Our devices know who we are, where we are, what we’re doing. They have a sense of our calendar. But they have no idea what we’re feeling.
They’re completely oblivious to whether you’re having a good day or a bad day. Are you stressed? Are you upset? Are you lonely?
Talithia Williams: In other words, our machines have no emotional intelligence. And that’s important. Our host, Rana el Kaliouby, would know. She’s devoted her career to solving just that.
It all started back when she was a grad student from Egypt at the University of Cambridge.
el Kaliouby: There was one day where I was at the computer lab, and I was actually, literally in tears because I was that homesick. And I was chatting with my husband at the time and the only way I could tell him that I was really upset was to basically type, “I’m crying.” And that was when I realized, all of these emotions that we have as humans, they’re basically lost in cyberspace and I thought we could do better.
Williams: But to do better, how? Rana’s next stop was MIT, where she continued work on a new algorithm. One that could pick up on the important features of human behavior. Could tell whether you’re feeling happy, sad, angry, scared. You name it.
el Kaliouby: It’s in your facial expressions. It’s in your tone of voice. It’s in your very nuanced kind of gestural cues.
Williams: Because she thinks this could transform the way we interact with technology. Our cars could alert us if we got sleepy. Our phones could tell us whether that text really was a joke. Our computers could tell if those web ads are wasting their time.
But where to start? She decided to go with the most emotive part of the human body.
el Kaliouby: The way our face works is basically we have about 45 facial muscles. So, for example, the zygomaticus muscle is the one we use to smile. Take all these muscle combinations and you map them to an expression of emotion like anger or disgust or excitement. The way you then train an algorithm to do that is you feed it tens of thousands of examples of people doing these expressions.
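The idea described above can be sketched in code. This is a minimal, hypothetical illustration (not Affectiva's actual system): each face is represented as the set of facial "action units" (muscle movements) detected in it, and an expression label is assigned by finding the labeled example whose action units best match. The action-unit codes and labels below are standard FACS-style examples, but the tiny training set and the matching rule are invented for illustration.

```python
# Illustrative sketch only: map combinations of facial "action units"
# (muscle movements) to emotion labels by matching against labeled examples.

# Hypothetical training data: each entry pairs a set of action-unit codes
# (e.g. AU6 = cheek raiser, AU12 = lip-corner puller) with an emotion label.
TRAINING = [
    (frozenset({"AU6", "AU12"}), "joy"),          # smile: cheeks + lip corners
    (frozenset({"AU4", "AU5", "AU7"}), "anger"),  # brows lowered, lids tightened
    (frozenset({"AU9", "AU10"}), "disgust"),      # nose wrinkle, upper lip raised
    (frozenset({"AU1", "AU4", "AU15"}), "sadness"),
]

def classify(observed_aus):
    """Return the label of the training example whose action units
    best match the observed ones (overlap minus mismatch)."""
    observed = frozenset(observed_aus)
    def score(example):
        aus, _label = example
        return len(aus & observed) - len(aus ^ observed)
    _best_aus, best_label = max(TRAINING, key=score)
    return best_label

print(classify({"AU6", "AU12"}))  # a smile matches the "joy" example
```

A real system would learn from tens of thousands of labeled video frames, as described above, rather than a handful of hand-written rules; this sketch only shows the shape of the mapping from muscle combinations to emotion labels.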
Williams: First, her algorithm could only recognize three expressions. But it was enough to push her to take a leap.
el Kaliouby: And I remember very clearly, my dad was like, “What? You’re leaving MIT to run a company? Why would you ever do that?” In fact, the first couple of years, I kept the startup a secret from my family.
Williams: Eventually, Rana would convince her parents. But convincing investors was a whole other story.
el Kaliouby: It is very unusual, especially for women coming from the Middle East, to be in technology and to be leaders. I remember this one time when I was supposed to be presenting to an audience. And I walked into the room and people assumed I was the coffee lady.
Williams: And investors were not the hardest to convince.
el Kaliouby: All these doubts in my mind, like, are probably shaped by my upbringing, right? Where, women don’t lead companies and maybe I should be back home with my husband. I think I’ve learned over the years to have a voice and to use my voice and to believe in myself.
Williams: And once she did that…
el Kaliouby: We have this golden opportunity to reimagine how we connect with machines and therefore as humans how we connect with one another.
Williams: Today, Rana’s company, called Affectiva, has raised millions and has a learning algorithm that can recognize 20 different facial expressions. Many of her clients are marketing companies who want to know whether their ads are working. And she’s also developing software for automotive safety. But an application she’s especially proud of is this.
Ned Sahin: Most autistic children struggle with the basic communication skills that you and I take for granted.
Williams: A collaboration with neuroscientist Ned Sahin and his company, Brain Power, that allows autistic children to read the emotions in people’s faces.
el Kaliouby: Imagine that we have technology that can sense and understand emotion, and that becomes like an emotion hearing aid that can help these individuals understand in real-time how people are feeling. I think that is a great example of how A.I.—emotion A.I. in particular—can really transform these people’s lives in a way that wasn’t possible before this kind of technology.
NOVA Wonders Can We Build a Brain?
Anna Lee Strachan
Digital Producer: Michael Rivera
© WGBH Educational Foundation 2018