Publication | Open Access
Facial Action Unit Intensity Estimation via Semantic Correspondence Learning with Dynamic Graph Convolution
Citations: 49
References: 23
Year: 2020
Engineering · Machine Learning · Biometrics · Intensity Estimation · Facial Action Units · Face Detection · Facial Recognition System · Image Analysis · Data Science · Pattern Recognition · Affective Computing · Semantic Correspondence Learning · Video Transformer · Machine Vision · Feature Learning · Semantic Correspondence Convolution · Computer Science · Deep Learning · Computer Vision · Facial Expression Recognition · Facial Animation · Dynamic Graph Convolution
The intensity estimation of facial action units (AUs) is challenging due to the subtle changes they induce in a person's facial appearance. Previous approaches mainly rely on probabilistic models or predefined rules to model co-occurrence relationships among AUs, leading to limited generalization. In contrast, we present a new learning framework that automatically learns the latent relationships of AUs by establishing semantic correspondences between feature maps. In a heatmap regression-based network, feature maps preserve rich semantic information associated with AU intensities and locations. Moreover, an AU co-occurrence pattern can be reflected by activating a set of feature channels, where each channel encodes a specific visual pattern of an AU. This motivates us to model the correlation among feature channels, which implicitly represents the co-occurrence relationships of AU intensity levels. Specifically, we introduce a semantic correspondence convolution (SCC) module that dynamically computes correspondences from deep, low-resolution feature maps, thereby enhancing the discriminability of features. The experimental results demonstrate the effectiveness and superior performance of our method on two benchmark datasets.
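The abstract gives no implementation details, but the core idea of correlating feature channels via a dynamically built graph can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's SCC module: the function name `scc_layer`, the cosine-similarity graph, and the mean-aggregation update are all illustrative choices in the general spirit of dynamic graph convolution (each channel is a node, edges link the k most similar channels, and each channel is updated from its neighbors).

```python
import numpy as np

def scc_layer(feat, k=3):
    """Toy semantic-correspondence convolution over feature channels.

    feat: array of shape (C, H, W). Each channel is treated as a graph
    node; edges connect the k most cosine-similar channels, and each
    channel is updated by mixing in the mean of its neighbors. All
    design choices here are illustrative assumptions.
    """
    C, H, W = feat.shape
    x = feat.reshape(C, -1)                          # one vector per channel
    unit = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    sim = unit @ unit.T                              # (C, C) cosine similarity
    np.fill_diagonal(sim, -np.inf)                   # exclude self-loops
    nbrs = np.argsort(-sim, axis=1)[:, :k]           # k most similar channels
    agg = x[nbrs].mean(axis=1)                       # aggregate neighbor features
    out = 0.5 * (x + agg)                            # residual-style update
    return out.reshape(C, H, W)
```

Because the neighbor graph is recomputed from the current features on every call, the channel "correspondences" adapt to the input rather than being fixed in advance, which is what distinguishes a dynamic graph convolution from one with a predefined adjacency.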