Monday, April 2, 2018

Body Language Analysis No. 4249: Will Smith, Sophia the Robot, Robot-Empathy, and Responding in Context • Nonverbal and Emotional Intelligence (VIDEO, PHOTOS)


Thought experiment: If, tomorrow, humankind received a message from an intelligent alien race on a distant planet saying, "We are coming. We will arrive on Earth in 30 years" - yet this is all they said - we would have no way of knowing their intentions - whether they want to harm us, or whether they'll be entirely benevolent.

What would we do, as a species, to prepare for their arrival?

This scenario has, very much, already occurred - but the aliens are of our own making. They are AI. They are the robots. For in twenty, thirty, or forty years they will have achieved a level of intelligence, physical strength, and dexterity which will far exceed our own. This is not an "IF", but very much a "WHEN".

What WILL we do, as a species, to prepare for their arrival?

In 1950, pioneering computer scientist, cryptanalyst, and philosopher Alan Turing posed the question, "Can machines think?" This question - and derivations of it - have come to be known as the Turing test.

Turing asked, "is a test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human? Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation is a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel such as a computer keyboard and screen so the result would not depend on the machine's ability to render words as speech. If the evaluator cannot reliably tell the machine from the human, the machine is said to have passed the test. The test does not check the ability to give correct answers to questions, only how closely answers resemble those a human would give." §

As many, including Turing, have pointed out, the concept of thinking is a difficult one to define.

Since, in Turing's scenario, the person and computer would communicate via a "text-only channel" - I pose the question: What will happen when humanoid robots, complete with ever-improving artificial intelligence and facial expressions, replicate human emotions so precisely that we cannot differentiate them from another human?

In the video above, Will Smith is not just talking with (not to) a robot - he's acting with a robot. That's right - he and Sophia (yes, she has a name) are putting on a little satire for us. Let that sink in. And Sophia is not only responding with verbal banter - she's spot-on with many of her facial expressions - reacting, indeed behaving, within the context of comedy.

What follows is a partial nonverbal analysis of Sophia in this video.


During 0:20, Sophia makes a standoffish, poor social smile (a strained smile), with a subtle-to-mild amount of disgust, after she declines the drink Will Smith is handing her.

Note also that Sophia's neck is twisted because her torso is not facing Will Smith. Humans Whole-Body Point (our feet, legs, hips, torso, head, eyes, etc.) toward those people whom we like and/or respect. In keeping with the theme of Will Smith getting friendzoned, Sophia is behaving in good comedic/acting context.


Upon being asked a question (Will Smith asks, "What is a robot's favorite kind of music?"), Sophia's eyebrows and lower forehead elevate slightly (along with a slight backward tilt of her head and a lengthening of her neck) during 0:45.

This is a subtle, yet very common example of nonverbal cluster behavior which occurs in human interaction - and quite nuanced body language for a robot to respond with - in context.


An instant later (0:46) - and just a split second before she asks, "What?" - Sophia displays a single partial bilateral blink. This is also very subtle, yet typical, behavior upon being posed a question.


Upon hearing his answer, Sophia's lower and mid-face droop (in disappointment) during 0:49.


During 0:50, Sophia's head/neck snaps back (a low-amplitude, but relatively high-velocity, movement), together with a subtle yet noticeable widening of her eyelid openings. Again, this cluster is very typical of a human nonverbal startle-disapproval response.


During 0:53, we see the "Oh" expression of sudden understanding - as she gets the joke.


A second later (0:53 - 0:54), she tilts her head and "squints her eyes" (not a blink, although, of course, in a still image, this may look very similar). This "Oh, I get it" expression is also a very common human response during sudden understanding.

The above (partial) analysis of this video is not to suggest that Sophia's robotic responses are perfect. However, AI and robotic-humanoids are progressing rapidly and remarkably.

So, again: What will happen when humanoid robots replicate human emotions so precisely that we cannot differentiate them from another human?

Another related and profound question is: What will happen when their robotic-empathy responds in context, with greater fidelity - and with more refined feedback - than that of an above-average empathetic human?

Moreover, if a humanoid robot's displayed empathy is reliable, consistent, and (by most people's accounts) proportional and contextual - at what point do we (as humans) cease to call this behavior robotic-empathy, and define it (recognize it) simply as empathy?

The irony is, in our endeavor to make robots more empathetic - we will be forced to (re)examine our own empathy - in greater detail, and with more nuance - and in so doing, become better humans.




See also:

Body Language Analysis No. 4248: My Extended interview re: Security Video of Stephen Paddock (1 October Las Vegas Shooter) - KTNV • Channel 13 Las Vegas

Body Language Analysis No. 4246: Former senior KGB spy - says he was warned that Sergei Skripal and his daughter were in danger

Body Language Analysis No. 4244: Stormy Daniels' 60 Minutes Interview

Body Language Analysis No. 4236: Andrew McCabe, Jeff Sessions, James Comey - and Sincerity

Body Language Analysis No. 4205: Chloe Kim, Albert Einstein, and Creativity

Body Language Analysis No. 4193: Hope Hicks, Sexual Attraction, Armpits, and Elbows

Body Language Analysis No. 4185: Tom Hanks, Embarrassment, and Emotional Processing

Body Language Analysis No. 4122: Jeff Sessions Testimony, Russia, Ted Lieu, and Changing Stories

Body Language Analysis No. 4064: The Murder of Laci Peterson - Part II - A Red Flag Conspicuous by its Absence


___________________________