Facial Action Units-Based Image Retrieval for Facial Expression Recognition

Trinh Thi Doan Pham, Sesong Kim, Yucheng Lu, Seung Won Jung, Chee Sun Won

Research output: Contribution to journal › Article › peer-review

38 Scopus citations

Abstract

Facial expression recognition (FER) is a challenging problem in computer vision. Although extensive research has been conducted to improve FER performance in recent years, there is still room for improvement. A common goal of FER is to classify a given face image into one of seven emotion categories: angry, disgust, fear, happy, neutral, sad, and surprise. In this paper, we propose to use a simple multi-layer perceptron (MLP) classifier that determines whether the current classification result is reliable. If the current classification result is determined to be unreliable, we use the given face image as a query to search for similar images. In particular, facial action units are used to retrieve images with a similar facial expression. Then, another MLP is trained to predict the final emotion category by aggregating the classification output vectors of the query image and its retrieved similar images. Experimental results on the FER2013 dataset demonstrate that the performance of state-of-the-art networks can be further improved by our proposed method.
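The pipeline described in the abstract can be summarized in code. The following is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the hidden-layer sizes, the number of retrieved images k, and the function retrieve_similar_by_aus (AU-based retrieval) are hypothetical placeholders, and the backbone CNN is assumed to output 7-class logits.

```python
# Minimal sketch of the reliability-gated FER pipeline (illustrative only).
import torch
import torch.nn as nn

NUM_CLASSES = 7  # angry, disgust, fear, happy, neutral, sad, surprise


class ReliabilityMLP(nn.Module):
    """Decides whether a CNN's classification output vector is reliable."""
    def __init__(self, in_dim=NUM_CLASSES, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, p):
        return self.net(p)  # probability that the prediction p is reliable


class AggregationMLP(nn.Module):
    """Predicts the final emotion from the output vectors of the query and its retrieved images."""
    def __init__(self, k=5, in_dim=NUM_CLASSES, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear((k + 1) * in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, NUM_CLASSES))

    def forward(self, stacked):
        return self.net(stacked)


def classify(query_img, cnn, rel_mlp, agg_mlp, retrieve_similar_by_aus, k=5):
    """retrieve_similar_by_aus is a hypothetical function returning the k images
    whose facial action units are closest to those of the query."""
    p_query = torch.softmax(cnn(query_img), dim=-1)           # shape (1, 7)
    if rel_mlp(p_query).item() > 0.5:                          # reliable: keep the CNN result
        return p_query.argmax(dim=-1)
    neighbors = retrieve_similar_by_aus(query_img, k)           # k similar face images
    p_neighbors = [torch.softmax(cnn(img), dim=-1) for img in neighbors]
    stacked = torch.cat([p_query] + p_neighbors, dim=-1)        # shape (1, (k+1)*7)
    return agg_mlp(stacked).argmax(dim=-1)                      # aggregated final prediction
```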

Original language: English
Article number: 8599142
Pages (from-to): 5200-5207
Number of pages: 8
Journal: IEEE Access
Volume: 7
DOIs
State: Published - 2019

Keywords

  • Convolutional neural networks
  • facial action units
  • facial expression recognition
  • image retrieval
