Sophia body language

As you probably know, human body language is involuntarily linked to the human mind. For example, when we access a memory we tend to look to the left, and when we feel uncomfortable we tense our shoulders, etc.

I wonder if those involuntary movements are, or will be, implemented in Sophia and other humanoids. I think this subtle channel of communication would help us connect better with humanoids.

Humanoids could also use a kind of simple sign language to communicate system information without speaking.
For example, a finger to the temple could indicate that a lot of computing power is being used, one eye closed could indicate that there’s no Wi-Fi connection, etc.
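To make that concrete, here is a minimal sketch of such a status-to-gesture mapping, written in Python purely for illustration. The gesture names, the threshold, and the `choose_gesture` function are all assumptions for the example, not any real robot API.

```python
from enum import Enum, auto

class Gesture(Enum):
    FINGER_TO_TEMPLE = auto()  # signals heavy computing load
    ONE_EYE_CLOSED = auto()    # signals no Wi-Fi connection
    NEUTRAL = auto()

def choose_gesture(cpu_load: float, wifi_connected: bool) -> Gesture:
    """Pick the gesture for the most urgent system state (hypothetical API)."""
    if not wifi_connected:
        return Gesture.ONE_EYE_CLOSED
    if cpu_load > 0.9:  # threshold is an arbitrary example value
        return Gesture.FINGER_TO_TEMPLE
    return Gesture.NEUTRAL

# Example: the robot is busy but online, so it raises a finger to its temple.
print(choose_gesture(cpu_load=0.95, wifi_connected=True))
```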

This technology has been in development for a while, implemented primarily within the video games industry as a way to speed up workflows and cut out the cost of facial motion capture.

That’s a tool that might be useful for mimicry, but it’s not quite what I imagine. There are several levels of human communication:

  • body language
  • facial expression
  • subtle movements hinting at brain activity
  • tone of voice
  • what is actually said
  • maybe others like perspiration, smell, etc.

So you could express anger (as shown in the video), but your eye movements etc. reveal whether or not you are really angry. I think this would help bridge the ‘uncanny valley’. Also, imagine a child growing up with a robot nanny and never learning how to read the subtle signs of body language correctly.

Body language, vocal inflection, and the ability to read, process, and understand their meaning in context will require AGI.

Speaking from my own experience as a human, it has taken me a great number of years and countless human interactions to be able to accurately read the context of language from the unspoken elements of human communication.

Being empathic and sympathetic toward another’s situation has been critical to my success, which I attribute to accurately deducing another’s meaning beyond what was actually said.

I think narrow AI can approximate this with neural networks and deep learning, but it won’t be able to understand the deeper context.
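As a rough illustration of what that kind of narrow-AI approximation looks like, here is a toy sketch. The feature names and weights are invented; a real system would learn them from labelled video. The point is that the model outputs a score without any model of *why* the person feels that way.

```python
import math

# Hypothetical features extracted from video, each scaled to [0, 1].
features = {"gaze_aversion": 0.8, "shoulder_tension": 0.7, "smile": 0.1}

# Invented weights for a single "discomfort" detector (logistic regression).
weights = {"gaze_aversion": 2.0, "shoulder_tension": 1.5, "smile": -2.5}
bias = -1.0

score = bias + sum(weights[k] * features[k] for k in features)
p_discomfort = 1.0 / (1.0 + math.exp(-score))  # sigmoid -> probability

# The classifier labels the state, but the deeper context -- the *reason*
# for the discomfort -- is exactly what this approach does not capture.
print(f"P(discomfort) = {p_discomfort:.2f}")
```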

One thought though: I have met some extremely talented people in my life, some of whom have the ability to control their body language and spoken tone to a great extent. Though it is, of course, almost impossible to hide involuntary micro-expressions.

We already have AI trained to detect lying by analysing these involuntary movements.

From a heuristic and benevolent point of view, it would be amazing and quite important for human-android relations if robots could express themselves in a more human way. Of course, this is the vision and goal of Dr. David Hanson.
