
Update index.html
itironal authored Apr 17, 2024
1 parent af14850 commit 7ccf692
Showing 1 changed file with 1 addition and 1 deletion.
index.html (2 changes: 1 addition & 1 deletion)
@@ -73,7 +73,7 @@ <h3><b>Abstract</b></h3>
 <div class="row" style="margin-bottom:5px">
 </div>
 <p class="text-left">
-PyAFAR is a Python-based, open-source facial action unit detection library for use with adults and infants. Convolutional Neural Networks were trained on BP4D+ for adults, and Histogram of Gradients (HoG) features from MIAMI and CLOCK databases for infants were used with Light Gradient Boosted Machines. In adults, Action Unit occurrence and intensity detection are enabled for 12 AUs. The 12 AUs were selected on the criterion that they occurred more than 5% of the time in the training data that included BP4D. Because 5% baseline was the minimum for which reliability could be measured with confidence, AU with prevalence lower than that were not included. Action unit intensity estimation is enabled for 5 of these AU. In infants, AU occurrence is enabled for 9 action units that are involved in expression of positive and negative affect. For both adults and infants, facial landmark and head pose tracking are enabled as well. For adults, multiple persons within a video may be tracked. The library is developed for ease of use. The models are available for fine-tuning and further training. PyAFAR may be easily incorporated into user Python code.</p>
+PyAFAR is a Python-based, open-source facial action unit detection library for use with adults and infants. Convolutional Neural Networks were trained on BP4D+ for adults, and Histogram of Gradients (HoG) features from MIAMI and CLOCK databases for infants were used with Light Gradient Boosting Machines. In adults, Action Unit occurrence and intensity detection are enabled for 12 AUs. The 12 AUs were selected on the criterion that they occurred more than 5% of the time in the training data that included BP4D. Because 5% baseline was the minimum for which reliability could be measured with confidence, AU with prevalence lower than that were not included. Action unit intensity estimation is enabled for 5 of these AU. In infants, AU occurrence is enabled for 9 action units that are involved in expression of positive and negative affect. For both adults and infants, facial landmark and head pose tracking are enabled as well. For adults, multiple persons within a video may be tracked. The library is developed for ease of use. The models are available for fine-tuning and further training. PyAFAR may be easily incorporated into user Python code.</p>
 <div class="row" style="margin-bottom:5px">
 <div class="col" style="text-align:center">
 <img class="thumbnail" src="./images/pyafar_pipeline_updated.jpg" style="width:70%; margin-bottom:0px">
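The abstract in this diff describes what PyAFAR can do but not its programming interface, which the commit does not show. As a rough illustration of how such a library might be driven from user Python code, here is a minimal sketch; the module path pyafar.adult, the process_video function, its keyword arguments, and the per-frame result keys are all hypothetical assumptions for illustration, not PyAFAR's confirmed API.

# Illustrative sketch only. None of these names are confirmed by the diff:
# the pyafar.adult module, process_video(), its keyword arguments, and the
# result keys below are hypothetical assumptions.
from pyafar import adult  # hypothetical import path

# Run the adult pipeline the abstract describes: AU occurrence for 12 AUs,
# intensity estimation for 5 of them, plus facial landmark and head-pose
# tracking, with multi-person tracking (adult-only per the abstract).
results = adult.process_video(
    "session.mp4",
    au_occurrence=True,
    au_intensity=True,
    landmarks=True,
    head_pose=True,
    multi_person=True,
)

# Hypothetical per-frame output layout.
for frame in results:
    print(frame["frame"], frame["person_id"], frame["aus"])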
