Face Expression Analysis

People learn to read and interpret other people's emotions from early childhood through social interaction. With the fast-paced development of neural networks and deep learning, scientists have created a new type of AI that can read human emotions by analyzing faces with specialized algorithms. Continue reading to find out about the main features of facial expression analysis.
Table of Contents
What Are Facial Expressions?
Types of Facial Expressions
What Emotions Can Our Faces Express?
What Are Facial Expressions Analysis Methods?
Facial Action Coding System (FACS)
Electromyography Method
Automatic Face Recognition
How Facial Expression Analysis Tech Works
Face Detection
Face Registration
Masking
Feature Detection
Feature Classification
How Algorithms Are Trained for Face Expression Analysis
Where Is Analysis of Facial Expression Used Today?
Neuromarketing
Advertisement and Customer Satisfaction
Research in Psychology
Plastic Surgery
Virtual Assistants for Emotional Communication
HR
Bright Future of Facial Expression Analysis and Recognition
Challenges of Face Analysis
Landmark Detection
Onset and Offset Detection
Deep Learning
Cross-Database Recognition
Emotion Class
Type of Features
Summary
Face Analysis FAQ
What is an example of facial expression?
How do you analyze facial expressions?
What is facial analysis?
What is the use of facial expression analysis?
What Are Facial Expressions?
The human face is the most sophisticated, highly developed, and effective means of communication, perfected by millions of years of evolution. Facial expressions are movements of the facial musculature that reflect a variety of human mental states. In other words, they are expressive movements of the facial muscles that manifest certain feelings.
Types of Facial Expressions
Paul Ekman and Wallace V. Friesen originally developed the Facial Action Coding System (FACS) in 1978 to classify human facial expressions. Building on this work, facial expressions are commonly divided into three types.
- Macro-expressions are everyday expressions, usually obvious to everyone involved in a conversation. They last between 0.5 and 4 seconds;
- Micro-expressions are short, less than 0.5 seconds, involuntary facial expressions appearing when a person is trying to hide or suppress the emotion. Micro-expressions cannot be consciously controlled;
- Subtle expressions are emotional responses to an event, environment, or another living being. Subtle expressions are not intensified and often mark the moment when a person starts feeling an emotion.
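The duration ranges above suggest a simple heuristic for telling expression types apart. As an illustrative sketch (the thresholds come from the figures quoted above; real systems use far more signals than timing alone):

```python
def classify_by_duration(duration_s: float) -> str:
    """Label an expression type from how long it lasts, in seconds."""
    if duration_s < 0.5:
        return "micro-expression"   # involuntary, under half a second
    if duration_s <= 4.0:
        return "macro-expression"   # ordinary, clearly visible expression
    return "sustained expression"   # longer than the typical macro range
```

For example, `classify_by_duration(0.2)` yields `"micro-expression"`, while a two-second smile falls into the macro range.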
What Emotions Can Our Faces Express?
The human mental state is a very deep and complex notion. It directly affects our facial expressions. Although those reactions can be affected by particular personality traits, some basic classification can be related to any person, regardless of gender, age, race, education, or social status.
Each emotion triggers various muscular movement patterns. It leads to us having different facial expressions while we are, for example, sad, happy, surprised, or angry.
- Anger
- Happiness
- Fear
- Surprise
- Disgust
- Sadness
Although scientists do not yet fully agree, these six are the most widely accepted contemporary classification of universal emotions.
What Are Facial Expressions Analysis Methods?
The article Measuring Facial Expression of Emotion, published in the journal Dialogues in Clinical Neuroscience, summarizes three facial expression analysis methods. They are:
- Facial Action Coding System or Emotion Facial Action Coding System;
- Electromyography method;
- Automatic face recognition.
Each method has its distinctive features, pros, and cons, depending on the purposes they are used for.
Facial Action Coding System (FACS)
Due to its scientific objectivity, FACS has become the most widespread and popular such system worldwide. It is a detailed, roughly 500-page manual on how to read faces. It contains a detailed analysis of possible facial muscle movements, their combinations, and the nature of their performance. It aims to train a coder to recognize different combinations at different speeds and with different degrees of manifestation (down to barely noticeable and very fast ones). The manual provides photo and video examples and practical exercises.
Electromyography Method
This method uses surface electrodes to record the bioelectric potentials produced when muscle fibers contract. During EMG, the changing potentials at the neuromuscular junctions are picked up and recorded.
EMG was designed to reveal human emotions and mental states that cannot be tracked visually on our faces. Even when a person tries to suppress or conceal an emotion, the device will still record the subtle, involuntary muscle activity.
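Raw EMG traces are noisy, so a standard first processing step is to rectify the signal and smooth it into an activation envelope. A minimal sketch of that idea (the window size is an arbitrary illustrative choice, not a value from the source):

```python
def emg_envelope(samples, window=3):
    """Rectify a raw EMG trace and smooth it with a trailing moving
    average, turning noisy muscle potentials into an activation envelope."""
    rectified = [abs(s) for s in samples]  # polarity carries no activation info
    envelope = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)        # trailing window, clipped at start
        envelope.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return envelope
```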
Automatic Face Recognition
Automatic face recognition is a practical application of image recognition. Its purpose is to automatically localize the face in the photo or video and identify the person by face. The photo identification feature is already actively used in photo album management software and mobile devices’ unlocking systems.
This method is based on a special algorithm that creates face signatures. They are generated using different physical features of the human face. This information is then transformed into a mathematical formula and run through face databases or compared with the given examples.
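The signature-matching step described above can be sketched as comparing feature vectors with a similarity measure. A minimal illustration using cosine similarity (the threshold and the identities in the example are hypothetical; production systems use learned embeddings and tuned thresholds):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two face-signature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(signature, database, threshold=0.9):
    """Return the identity whose stored signature is most similar to the
    query, or None if nothing clears the threshold."""
    best_id, best_score = None, threshold
    for identity, stored in database.items():
        score = cosine_similarity(signature, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id
```

Running a query signature against a small database would look like `best_match([0.98, 0.05, 0.1], {"alice": [1.0, 0.0, 0.1], "bob": [0.0, 1.0, 0.0]})`, which matches `"alice"`.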
How Facial Expression Analysis Tech Works
Facial expression recognition is a sub-area of image processing technology. It is based on a specific algorithm that has several steps.
Face Detection
Recognition systems powered by emotion AI read facial expressions using an optical sensor, such as a regular webcam or a smartphone camera. They can identify faces in real-time video, in recordings, or in images.
Computer vision algorithms capture the human face’s basic points: eyes, nose tip, eyebrows, mouth corners – and track their movements to decipher emotions. By matching the collected data with the samples taken from the existing databases, the recognition software can define a person’s feelings.
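The tracking step above amounts to measuring how much each captured point moves between frames. A minimal sketch, assuming landmarks arrive as named (x, y) pixel coordinates (the landmark names here are illustrative):

```python
def landmark_motion(prev, curr):
    """Euclidean displacement of each named landmark between two frames.

    `prev` and `curr` map landmark names to (x, y) pixel coordinates."""
    motion = {}
    for name, (x0, y0) in prev.items():
        x1, y1 = curr[name]
        motion[name] = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return motion
```

A mouth corner moving from (10, 20) to (13, 24) gives a displacement of 5.0 pixels; thresholding such displacements is one simple way to flag expressive movement.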
Face Registration
Image registration is the process of converting different datasets into a single coordinate system. The data can vary from just several photos to data received from various sensors. It can vary in time, depths, and viewpoints. In this case, image registration is necessary to compare or integrate the data obtained from these different measurements.
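One common way to bring faces into a single coordinate system is a similarity transform (rotation, uniform scale, and translation) estimated from two stable landmarks such as the eye centers. A minimal sketch using the complex-number form of the transform (the point pairs are illustrative):

```python
def similarity_transform(p_src, p_dst):
    """Solve z' = a*z + b, the complex form of a 2-D similarity transform,
    from two corresponding point pairs (e.g., detected vs. canonical eyes)."""
    z0, z1 = complex(*p_src[0]), complex(*p_src[1])
    w0, w1 = complex(*p_dst[0]), complex(*p_dst[1])
    a = (w1 - w0) / (z1 - z0)   # encodes rotation and scale
    b = w0 - a * z0             # encodes translation
    return a, b

def apply_transform(a, b, point):
    """Map a single (x, y) point into the registered coordinate system."""
    z = a * complex(*point) + b
    return (z.real, z.imag)
```

With the eyes mapped to fixed canonical positions, every other landmark lands in a comparable place regardless of the original pose or camera distance.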
Masking
Masking is used to clear the images of noise resulting from unwanted facial movements. Such noise can distort facial expression analysis. In the face detection stage, it is possible to separate reliable data from noise. For example, rigid head rotations and movements can shift the face against the background, while mouth and eye motions produce non-rigid deformations; these regions are usually masked because they tend to create noise.
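Mechanically, masking means zeroing out pixels in regions flagged as unreliable so they cannot influence later stages. A toy sketch on plain 2-D lists (real pipelines do this with array libraries on full-resolution frames):

```python
def apply_mask(image, mask):
    """Zero out pixels wherever the mask is 0, keeping only reliable regions.

    `image` and `mask` are equally sized 2-D lists; mask entries are 0 or 1."""
    return [
        [pix * keep for pix, keep in zip(img_row, mask_row)]
        for img_row, mask_row in zip(image, mask)
    ]
```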
Feature Detection
Part of face detection is creating a facial map using features such as the eyebrows, the corners of the mouth and eyes, the nose tip, etc. This face model follows the movements of the actual individual's face to check whether they match the map.
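Checking whether observed landmarks match the face model can be sketched as a tolerance test on each feature's position (the landmark names and the pixel tolerance are illustrative assumptions):

```python
def matches_model(model, observed, tol=5.0):
    """True if every observed landmark lies within `tol` pixels of the
    model's expected position for that feature."""
    for name, (mx, my) in model.items():
        ox, oy = observed[name]
        if ((ox - mx) ** 2 + (oy - my) ** 2) ** 0.5 > tol:
            return False
    return True
```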
Feature Classification
Once the face model is ready, its key features are fed into classification algorithms. Those algorithms use the features to infer action unit codes, emotional states, and other important parameters.
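One of the simplest classifiers that could sit at this stage is nearest-centroid: each emotion is represented by an average feature vector, and a new face is assigned to whichever centroid is closest. A minimal sketch (the two-dimensional features and centroid values are purely illustrative; real systems use high-dimensional learned features):

```python
def classify(features, centroids):
    """Assign the emotion label whose centroid is nearest in Euclidean distance.

    `centroids` maps emotion labels to reference feature vectors."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(features, centroids[label]))
```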
How Algorithms Are Trained for Face Expression Analysis
Recognizing human emotions is one of the basic skills that people learn very early. Neural networks are the basis of any AI. They are designed to replicate the cognitive activities of the human brain. Essentially, deep neural networks can be trained to detect, register, and classify human emotions. Face expression analysis uses machine learning systems. Usually, it’s based on the processing of huge amounts of data.
Computer vision is one of the AI areas that automates photo and video analysis using machine learning. Data scientists collect data samples and manually mark the change in an emotional state to teach a neural network. The program learns the patterns and understands what signs of emotion are associated with them.
In some methods, action units are used to identify emotions. These can be lowered eyebrows, a raised chin, a wrinkled nose, etc. In this case, neural networks work out on their own which action units determine a human emotion, which yields a ready-to-use algorithm.
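The action-unit route can also be made explicit with a rule table. The mappings below follow commonly cited EMFACS-style combinations, but the exact sets vary between sources, so treat this as a hypothetical sketch rather than the canonical rules:

```python
# Illustrative EMFACS-style action-unit combinations (sources differ on exact sets).
EMOTION_RULES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "surprise": {1, 2, 5, 26},   # brow raisers + upper lid raiser + jaw drop
    "sadness": {1, 4, 15},       # inner brow raiser + brow lowerer + lip corner depressor
}

def emotions_from_aus(active_aus):
    """Return every emotion whose full action-unit rule set is present
    in the detected action units."""
    active = set(active_aus)
    return [emo for emo, aus in EMOTION_RULES.items() if aus <= active]
```

For instance, detecting AU6 and AU12 together (with any extra units such as AU25) triggers the happiness rule.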
The most popular tool in automatic facial expression recognition is the CNN engine. A CNN, or convolutional neural network, is a special neural network architecture and the main tool for speech recognition and for classifying and recognizing objects and people in images. The CNN workflow includes a training stage and a deployment stage; the latter is often implemented with the help of embedded vision processors.
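To make the operation at the heart of a CNN concrete, here is a pure-Python sketch of a single "valid" cross-correlation pass, the building block a convolutional layer applies many times over an image (real layers stack many learned kernels and run on optimized hardware, so this is only a didactic toy):

```python
def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a small kernel --
    the core computation inside a CNN layer."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):          # every position the kernel fully fits
        row = []
        for j in range(iw - kw + 1):
            acc = 0.0
            for di in range(kh):          # multiply-accumulate over the window
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out
```

During training, the kernel values are adjusted by gradient descent so that useful patterns, such as edges around the eyes or mouth, produce strong responses.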
Where Is Analysis of Facial Expression Used Today?
Facial expression analysis can be used in marketing, medicine, robotics, empirical academic research, and any other field where a deep understanding of human emotional responses to a certain action is required.
Neuromarketing
Companies use facial expression analysis to optimize their products, assess market segments, identify their customer profiles, and target their audience.
Advertisement and Customer Satisfaction
The technology makes it possible to set up special cameras that change the advertisement depending on the mood, gender, and age of passersby. Businesses can count unique visitors, build their socio-demographic profile, and calculate a satisfaction index.
Research in Psychology
Clinical psychologists widely use the technology in their systematic studies to research how people respond to specific stimuli and social environment and how their responses are affected by personality traits.
Plastic Surgery
Tumors, diseases, infections, and trauma can cause facial nerve paralysis, which can reduce quality of life. Such patients need complex treatments, including plastic surgery, and facial analysis enables the planning of those treatments.
Virtual Assistants for Emotional Communication
Well-known virtual assistants Siri and Alexa were created to behave like people, learn, and conduct real conversations. Potentially, such assistants will learn how to show emotions in the near future.
HR
Facial expression analysis can be successfully used in HR practices to evaluate candidates. It also helps determine an employee's condition, notice fatigue or dissatisfaction, and redistribute tasks more efficiently. However, some specialists have legal and privacy concerns about using the technology in the workplace.
Bright Future of Facial Expression Analysis and Recognition
Interacting with people is fundamental to any business. That means any business can profit from using facial expression detection technologies to better understand people's emotions and reactions.
Some sources estimate that by 2024, more than half of the Internet ads will be powered by emotional AI. Computer vision will be one of the most important technologies in the next five years.
Challenges of Face Analysis
Facial expression analysis is still a developing field with a lot to discover. Like many other innovative technologies, it has some challenges scientists and researchers still face.
Landmark Detection
Although facial landmark detection algorithms have improved greatly, they do not always process data accurately and consistently. Dynamic facial expressions cause a lot of noise, which complicates identifying the correct micro-expressions.
Onset and Offset Detection
For successful further research, it is vital to precisely detect the onset and offset frames of an expression. This may be challenging for automatic face analysis because, in real-time interactions, expressions change continuously.
Deep Learning
Deep learning requires significant volumes of data, and obtaining enough for productive results may be challenging. Another problem is that samples are not always evenly distributed across classes.
Cross-Database Recognition
Most researchers agree that much work remains to develop robust algorithms that perform well across different domains. Matching training and test samples from different environments is a challenging task.
Emotion Class
Different systems use different emotion classifications, which can lead to fuzzy and inconclusive results. Recent works support the idea of using objective classes instead of the emotion classifications provided by the databases.
Type of Features
As with the emotion class challenge, the extracted features may carry conflicting information depending on the parameters used: geometric parameters, appearance features, motion, etc. Establishing a systematic approach remains problematic.
Summary
Facial expression analysis uses several methods and algorithms for the automated detection, collection, and analysis of facial muscle movements that either reflect human emotions or represent responses to internal and external stimuli.
Over the past decade, research in the field has produced dramatic results, and facial expression analysis technology is now used in many areas of human activity.
Face Analysis FAQ
What is an example of facial expression?
Examples of facial expressions can be some usual facial muscle movements such as smiling to show happiness, lowering eyebrows to express frustration, or raising eyebrows to emphasize a word.
How do you analyze facial expressions?
Scientists collect and analyze related data in three main ways.
- They track facial electromyographic activity, recording nerve-muscle potentials;
- They observe and manually code real-time facial activity;
- They run computer vision algorithms and use deep neural networks for facial expression analysis.
Often, for better results, they use a combination of these methods.
What is facial analysis?
Facial expression analysis is a process of automatic detection, collection, and analysis of facial muscle movement and facial feature changes, which reflect particular human mental states and conditions.
What is the use of facial expression analysis?
Face expression analysis algorithms are used in many human activity fields: psychological and medical research, health care, security, IT innovations, marketing, business, HR, and many others.