Skin deep: How scientists are studying expression

How well do we understand facial expressions? Emotional expressions are much more nuanced and variable than people have assumed. And do our faces reliably express our feelings? What emotion is this person feeling? What about this one? We can use facial expressions to communicate how we're feeling, but they can also reflect different personality traits. Is this person distraught or elated? Furious or celebratory?

The interest in facial expressions and the modern science of emotion really began in the 1960s with a scientist named Silvan Tomkins and two of his postdoctoral fellows, Paul Ekman and Carroll Izard. They were very interested in testing the hypothesis that everyone
around the world scowls when they’re angry, smiles when
they’re happy, and correspondingly that everyone around the world recognises
those facial movements as expressions of emotion.

These ideas stood largely unchallenged for a generation, but many researchers now believe that the picture is a lot more nuanced. What we're dealing with is highly variable, highly context-sensitive, temporally changing patterns. And these contexts are one of the key
factors researchers are investigating as they try to understand facial
expressions. While people are moving their faces they're also speaking; there are acoustical changes in the vocalizations; there are body postures. The person carries around with them a whole internal context of their body, and then there's the external context: who else is present, what kind of situation they're in, what they're doing. Then there's a temporal context, and all of that contextual information really matters.

So whereas scientists used to focus primarily on the face, now they're focusing more on faces in context. Research into expressions has typically relied on showing subjects posed photographs of faces representing different emotions, but photographs like this are often just faces in isolation, with no context at all. Also, because they are static, they lack the temporal context that could be read in a moving face. What's more, researchers have to start by assigning emotions to expressions, which can introduce bias and subjectivity. To get around this, researchers are turning to technology to find more neutral ways of studying emotion.

The approach that we take is more of a bottom-up, objective approach. Rachael Jack's research uses a computer which randomly generates dynamic facial expressions. We've used this to look at the main differences between cultures in terms of the six classic emotions: happy, surprise, fear, disgust, anger and sad. We found that East Asian facial expressions of these emotions tend to be signaled primarily with the eye region, so the eyebrows and differences in the eyes, whereas with Westerners the face movements tended to vary more in the mouth region. These differences in the location of expressivity on the face are also mirrored very nicely in cultural emoticons. So in Western emoticons what you find is that there are two eyes, and it's the mouth that tends to vary to express sadness, happiness and so on (as in :-( or :-D), and it's the opposite pattern with East Asian emoticons (as in T_T or ^_^), in that the mouth is either minimized or completely absent and it's the variance of the eyes which communicates the different emotion messages.
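To make this "bottom-up" approach concrete, here is a minimal sketch of a reverse-correlation-style experiment in Python. Everything in it is an assumption for illustration: the action-unit names, the trial count and the toy observer are all invented, and the actual research platform is far more sophisticated than this simple tally.

```python
import random
from collections import Counter, defaultdict

# Hypothetical facial "action units" -- the names and granularity are
# invented for this sketch, not taken from the actual research.
ACTION_UNITS = [
    "inner_brow_raiser", "brow_lowerer", "upper_lid_raiser",
    "cheek_raiser", "nose_wrinkler", "lip_corner_puller",
    "lip_corner_depressor", "jaw_drop",
]

def random_expression(rng, max_units=4):
    """Sample a random combination of action units. No emotion label is
    assumed when the stimulus is generated -- that is the bottom-up step."""
    k = rng.randint(1, max_units)
    return tuple(sorted(rng.sample(ACTION_UNITS, k)))

def run_trials(observer, n_trials=5000, seed=0):
    """Show random expressions to an observer and tally which action
    units co-occur with each emotion label the observer chooses."""
    rng = random.Random(seed)
    tallies = defaultdict(Counter)  # emotion label -> Counter of action units
    for _ in range(n_trials):
        expression = random_expression(rng)
        label = observer(expression)  # the observer categorizes the stimulus
        for unit in expression:
            tallies[label][unit] += 1
    return tallies

# Toy observer standing in for a human participant: it calls anything
# containing a lip-corner pull "happy" and everything else "other".
def toy_observer(expression):
    return "happy" if "lip_corner_puller" in expression else "other"

tallies = run_trials(toy_observer)
print(tallies["happy"].most_common(3))  # action units most linked to "happy"
```

Aggregating over thousands of random trials recovers which facial movements an observer, or a culture of observers, associates with each emotion, without the experimenter having to assign emotions to expressions up front.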
Cultural differences like these are leading some researchers to believe that expressions are not as universal as once thought, and that some behaviours might be learnt in childhood. What's happening in early childhood is that emotion knowledge is being bootstrapped into the brain, into the brain's wiring. This is also what happens when you change cultures: when you move from one culture to another, you have to learn what the nod of a head, the raise of a lip or the shake of a head actually means in that culture.

Despite these complexities,
the idea that facial expressions could be used as a way to read emotions has
excited big business. Leading tech companies have already started
developing algorithms to detect expressions, and one estimate suggests the industry could be worth fifty-six billion dollars by 2024. Systems are already being trialed and marketed in a wide variety of fields: for hiring purposes, in educational settings, in security settings, and in certain places even in legal settings. But despite the predictions, many scientists think the claim that software can read emotions may be premature. We often make facial expressions that convey an emotion message, but that doesn't necessarily mean we have the emotion inside. Oftentimes we smile out of politeness; we want to convey to the other person that they're welcome, and so on, but this doesn't necessarily mean that we're experiencing an emotion. The other challenge is that it could be very easy to trick automatic facial expression detection systems. We may want the system to believe that we are feeling sad or happy, and it would be quite easy for a human to generate realistic facial expressions that could trick such a system.
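To make that last point concrete, here is a toy Python sketch (the features and numbers are invented, and no real product's method is implied) of why such systems are easy to fool: an expression classifier scores only the surface geometry of a face, so a deliberately posed smile looks identical to a felt one.

```python
import numpy as np

# Toy feature vectors: [mouth_curvature, eye_openness, brow_height].
# The features and values are invented for illustration; a real system
# would extract far richer geometry from video frames.
TRAIN = {
    "happy": np.array([[0.9, 0.5, 0.5], [0.8, 0.6, 0.5]]),
    "sad":   np.array([[-0.7, 0.4, 0.3], [-0.8, 0.3, 0.2]]),
}

# One centroid per label: the average face geometry seen for that label.
CENTROIDS = {label: feats.mean(axis=0) for label, feats in TRAIN.items()}

def classify(features):
    """Nearest-centroid 'emotion detector': it compares surface geometry
    only, with no access to what the person actually feels."""
    return min(CENTROIDS, key=lambda lbl: np.linalg.norm(features - CENTROIDS[lbl]))

# A posed, polite smile has the same geometry as a felt one, so the
# detector reports "happy" either way -- the weakness described above.
posed_smile = np.array([0.85, 0.55, 0.5])
print(classify(posed_smile))  # -> happy
```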
