Obtaining the weight of each of the configured emotions in the MacroFacialExpressionDetector script
Dear colleagues,

I am working with face tracking on the new Meta Quest Pro. I find it great that it works with the FACS system, as there is plenty of literature online for configuring emotions. However, I am running into some problems that I don't know how to solve.

First, I would like to know whether it is possible to obtain the weight of each configured emotion from the MacroFacialExpressionDetector script. Let me give an example to make it easier to understand. Suppose I have configured the emotions "Happiness," "Anger," "Sadness," and "Surprise" using the FACS system (https://imotions.com/blog/learning/research-fundamentals/facial-action-coding-system/). Normally, I can trigger an action when one of these emotions occurs, but I have noticed that I need to exaggerate my facial expression somewhat for the detector to fire. I have also seen that other algorithms and software report a weight for each configured emotion, i.e., a result such as: the subject is 81% happy and 19% surprised. Is it not possible to obtain a similar result?

Second, I have managed to export the 0–1 weight of each of the 63 microgestures to a CSV, but I cannot interpret that information afterwards. Any ideas?
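In case it helps frame the question: one way to get a result like "81% happy, 19% surprised" from the raw per-expression weights is to aggregate them yourself. This is a minimal Python sketch, assuming you already have the 0–1 weights per frame (e.g. from the exported CSV). The expression names and values below are placeholders, not the SDK's exact identifiers, and the AU groupings follow common FACS emotion prototypes (e.g. happiness ≈ AU6 + AU12), not anything specific to the Meta SDK.

```python
# Hypothetical frame of tracked expression weights in [0, 1].
# Names are placeholders standing in for whatever your CSV columns contain.
weights = {
    "CheekRaiserL": 0.7,       # ~AU6 (cheek raiser)
    "CheekRaiserR": 0.7,
    "LipCornerPullerL": 0.9,   # ~AU12 (lip corner puller)
    "LipCornerPullerR": 0.9,
    "UpperLidRaiserL": 0.2,    # ~AU5 (upper lid raiser)
    "UpperLidRaiserR": 0.2,
    "JawDrop": 0.1,            # ~AU26 (jaw drop)
}

# FACS-style prototypes: which tracked expressions contribute to each emotion.
# Extend with "Anger", "Sadness", etc. using the AU tables from the literature.
EMOTIONS = {
    "Happiness": ["CheekRaiserL", "CheekRaiserR",
                  "LipCornerPullerL", "LipCornerPullerR"],
    "Surprise":  ["UpperLidRaiserL", "UpperLidRaiserR", "JawDrop"],
}

def emotion_percentages(weights, emotions):
    """Score each emotion as the mean of its constituent expression
    weights, then normalize the scores so they sum to 100%."""
    raw = {name: sum(weights.get(k, 0.0) for k in keys) / len(keys)
           for name, keys in emotions.items()}
    total = sum(raw.values())
    if total == 0.0:
        return {name: 0.0 for name in raw}
    return {name: 100.0 * score / total for name, score in raw.items()}

print(emotion_percentages(weights, EMOTIONS))
```

With the placeholder values above, happiness dominates because its AUs are strongly activated, so a soft threshold on the percentage (rather than on each raw expression) might avoid having to force the face as much. This is only one possible scoring scheme; a weighted sum or a trained classifier over the 63 weights would be alternatives.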