Obtaining the weight of each of the configured emotions in the MacroFacialExpressionDetector script

Almxdena
Protege

Dear colleagues,

I am working with face tracking on the new Meta Quest Pro. I find it great that it works with the FACS system, as there is enough literature on the internet to configure emotions. However, I am encountering some problems that I don't know how to solve.

Firstly, I would like to know if it's possible to obtain the weight of each of the configured emotions using the MacroFacialExpressionDetector script. Let me give you an example to make it easier to understand. Suppose I have configured the emotions "Happiness," "Anger," "Sadness," and "Surprise" using the FACS system (https://imotions.com/blog/learning/research-fundamentals/facial-action-coding-system/). Normally, I can trigger an action when one of the emotions occurs, but I have noticed that I need to force my face a bit for it to register. Additionally, I have seen that other algorithms or software provide a weight for each of the configured emotions, i.e., a result such as: the subject is 81% happy and 19% surprised. Is it not possible to obtain a similar result?

On the other hand, I have tried to obtain the weight (between 0 and 1) of each of the 63 microgestures in a CSV, but then I don't know how to translate this information. Any ideas?
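
To make the setup concrete, a minimal logger along these lines writes one row of weights per frame (a sketch, not my exact code: the file name is just an example, and the exact enum members may vary between SDK versions):

using System.IO;
using System.Text;
using UnityEngine;

public class FaceWeightLogger : MonoBehaviour
{
    public OVRFaceExpressions faceExpression;
    private StreamWriter writer;

    void Start()
    {
        // One CSV column per microgesture, using the enum names as headers.
        writer = new StreamWriter(Path.Combine(Application.persistentDataPath, "face_weights.csv"));
        var header = new StringBuilder("time");
        for (var e = (OVRFaceExpressions.FaceExpression)0; e < OVRFaceExpressions.FaceExpression.Max; e++)
            header.Append(',').Append(e);
        writer.WriteLine(header);
    }

    void Update()
    {
        // Skip frames where the tracker has no valid data.
        if (!faceExpression.ValidExpressions) return;

        var row = new StringBuilder(Time.time.ToString());
        for (var e = (OVRFaceExpressions.FaceExpression)0; e < OVRFaceExpressions.FaceExpression.Max; e++)
            row.Append(',').Append(faceExpression[e]);
        writer.WriteLine(row);
    }

    void OnDestroy() => writer?.Close();
}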

8 REPLIES

jaume.gallego
Explorer

Hello,

Please, can you share how you configured the emotions?
Moreover, did you solve how to get the weighted emotions?

Thank you

Yes, I solved this, and I can certainly share it with you.

Can you share how you configured the emotions?

The MacroFacialExpressionDetector script is where you have to define the name of each emotion. There's a line of code similar to this one. Each of the names corresponds to an emotion, but it is just the name. I added Sadness, Anger, and Surprise.

public enum MacroExpressionType {
   Happiness = 0, Sadness, Anger, Surprise
}

If you save the script, you will be able to add the different microgestures for each emotion in the Unity editor. Just look up the FACS system to understand which configuration corresponds to each emotion.
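
If it helps to see the idea in code, here is a minimal hand-rolled detector along the same lines. To be clear, this is not the actual MacroFacialExpressionDetector API, just an illustration: a "Happiness" check that fires when the FACS smile microgestures (AU6 cheek raiser + AU12 lip corner puller) pass a threshold:

using UnityEngine;
using UnityEngine.Events;

public class SimpleEmotionDetector : MonoBehaviour
{
    public OVRFaceExpressions faceExpression;

    [Range(0f, 1f)]
    public float threshold = 0.5f;

    // Invoked once each time the emotion starts being detected.
    public UnityEvent onHappiness;

    private bool active;

    void Update()
    {
        bool detected =
            faceExpression[OVRFaceExpressions.FaceExpression.LipCornerPullerL] > threshold &&
            faceExpression[OVRFaceExpressions.FaceExpression.LipCornerPullerR] > threshold &&
            faceExpression[OVRFaceExpressions.FaceExpression.CheekRaiserL] > threshold &&
            faceExpression[OVRFaceExpressions.FaceExpression.CheekRaiserR] > threshold;

        // Fire the event on the rising edge only.
        if (detected && !active) onHappiness.Invoke();
        active = detected;
    }
}

A hard threshold like this is probably also why you need to "force" the face a bit: the action only fires once every configured microgesture passes it.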


Moreover, did you solve how to get the weighted emotions?

Well, I got this, but outside of Unity. On the other hand, I was able to obtain the weight for each microgesture. I attach the script.

using UnityEngine;
using TMPro;

public class ExpressionThreshold : MonoBehaviour
{
    [Header("Emotion displayed")]
    public TextMeshProUGUI _text1;
    public TextMeshProUGUI _text2;

    // The OVRFaceExpressions component exposes the per-microgesture
    // weights coming from the face tracker.
    public OVRFaceExpressions faceExpression;

    #region expressions and weights
    // Add as many expressions and weights as you need.
    public OVRFaceExpressions.FaceExpression leftEyeBlink;
    public OVRFaceExpressions.FaceExpression rightEyeBlink;
    public float weightL;
    public float weightR;
    #endregion

    void Update()
    {
        // The indexer returns the current weight (0-1) of a microgesture.
        weightL = faceExpression[leftEyeBlink];
        weightR = faceExpression[rightEyeBlink];
    }

    // We can display the weight of any expression using this method.
    public void WeightExpression(OVRFaceExpressions.FaceExpression expression)
    {
        float expressionWeight = faceExpression[expression];

        _text1.text = expressionWeight.ToString();
        _text2.text = expression.ToString();

        Debug.Log(expressionWeight);
    }
}
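
About getting something like "81% happy and 19% surprised": as far as I know the SDK does not provide that directly, but you can approximate it by averaging the microgesture weights you assigned to each emotion and then normalizing across emotions. A rough sketch (the microgesture-to-emotion mapping here is my own FACS-inspired choice, not something the SDK defines):

using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public class EmotionWeightEstimator : MonoBehaviour
{
    public OVRFaceExpressions faceExpression;

    // Illustrative mapping; replace it with the microgestures you configured.
    private static readonly Dictionary<string, OVRFaceExpressions.FaceExpression[]> emotions =
        new Dictionary<string, OVRFaceExpressions.FaceExpression[]>
        {
            { "Happiness", new[] { OVRFaceExpressions.FaceExpression.LipCornerPullerL,
                                   OVRFaceExpressions.FaceExpression.LipCornerPullerR,
                                   OVRFaceExpressions.FaceExpression.CheekRaiserL,
                                   OVRFaceExpressions.FaceExpression.CheekRaiserR } },
            { "Surprise",  new[] { OVRFaceExpressions.FaceExpression.InnerBrowRaiserL,
                                   OVRFaceExpressions.FaceExpression.InnerBrowRaiserR,
                                   OVRFaceExpressions.FaceExpression.UpperLidRaiserL,
                                   OVRFaceExpressions.FaceExpression.UpperLidRaiserR,
                                   OVRFaceExpressions.FaceExpression.JawDrop } },
        };

    void Update()
    {
        // Average the weights of the microgestures belonging to each emotion...
        var raw = emotions.ToDictionary(
            e => e.Key,
            e => e.Value.Average(expr => faceExpression[expr]));

        // ...then normalize so the emotion weights sum to 1 (percentages).
        float total = raw.Values.Sum();
        if (total <= 0f) return;

        foreach (var e in raw)
            Debug.Log($"{e.Key}: {e.Value / total:P0}");
    }
}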

Ggizzle1k
Expert Protege

Great information 

Hey 👋 the only thing you have to do is reset everything

Hello Almxdena,

Thank you for this great answer. Sorry, I cannot find the MacroFacialExpressionDetector script. Please, can you tell me in which library I can find this script?

Thank you again.

I am not sure if the script is included in the Meta Interaction SDK or in the example project that you need to import into Unity in order to use face tracking: https://github.com/oculus-samples/Unity-Movement

jaume.gallego
Explorer

Thank you again for your fast response. I will check both libraries to see if I can find where the script is, and I'll write here exactly where it is.

jaume.gallego
Explorer

Great, I found it in the GitHub repository you specified:
https://github.com/oculus-samples/Unity-Movement

The path is:
Unity-Movement/Runtime/Scripts/Effects/ExpressionDetection/MacroFacialExpressionDetector.cs