Real environment facial expression recognition method based on three-dimensional face feature reconstruction and image deep learning

A three-dimensional face feature reconstruction technology applied in the field of image processing, intended to improve facial expression recognition in real environments and increase recognition accuracy

Pending Publication Date: 2022-01-28
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

However, arbitrary lighting, occlusion, and pose changes in facial expression images captured in real environments still pose a huge challenge to existing facial expression recognition methods.




Embodiment Construction

[0020] The technical solution of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0021] Step 1: Input the facial expression image captured in the real environment into the three-dimensional dense face reconstruction network, and output the facial appearance features and the facial geometric features. The facial appearance features, as shown in Figure 2, are saved by the present invention as a smooth two-dimensional image; the facial geometric features, as shown in Figure 3, consist of the 3D coordinate information of 68 facial key points. The facial appearance feature is the reconstructed appearance feature obtained after the multi-pose face image is transformed into a frontal face; to facilitate feature extraction, the present invention processes it as a smooth two-dimensional image. The facial geometric feature is the three-dimensional coordinate information of several key points of the ...
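For illustration only, the following is a minimal sketch (not the patent's exact construction) of how the two outputs of Step 1 might be represented in code: the facial appearance feature as a smooth two-dimensional image array, and the facial geometric feature as the 3D coordinates of 68 facial key points, together with a simple k-nearest-neighbour adjacency of the kind a graph network could consume. The function name build_knn_adjacency and the array sizes are hypothetical.

```python
# Hypothetical sketch: packaging the reconstructed appearance feature (smooth
# 2D image) and geometric feature (68 facial key points in 3D), plus a simple
# k-nearest-neighbour adjacency over the key points.
import numpy as np

def build_knn_adjacency(landmarks_3d: np.ndarray, k: int = 4) -> np.ndarray:
    """Connect each facial key point to its k nearest neighbours in 3D space."""
    n = landmarks_3d.shape[0]                                 # n = 68 key points
    # Pairwise Euclidean distances between all key points.
    diff = landmarks_3d[:, None, :] - landmarks_3d[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    adj = np.zeros((n, n), dtype=np.float32)
    for i in range(n):
        # Skip index 0 of the sorted distances (the point itself), keep k nearest.
        neighbours = np.argsort(dist[i])[1:k + 1]
        adj[i, neighbours] = 1.0
    # Symmetrise so the facial topological graph is undirected.
    return np.maximum(adj, adj.T)

# Example shapes for the two reconstructed features (illustrative sizes only).
appearance_feature = np.zeros((224, 224, 3), dtype=np.float32)  # smooth 2D image
geometric_feature = np.random.rand(68, 3).astype(np.float32)    # 68 x (x, y, z)
adjacency = build_knn_adjacency(geometric_feature, k=4)
print(appearance_feature.shape, geometric_feature.shape, adjacency.shape)
```

Here the adjacency matrix simply stands in for whatever facial topology the method actually defines over the 68 key points; it is only meant to show the data layout a graph branch would expect.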



Abstract

The invention discloses a real environment facial expression recognition method based on three-dimensional face feature reconstruction and graph deep learning. The method comprises the following steps: through an established three-dimensional dense face reconstruction network and an end-to-end trainable three-dimensional face feature reconstruction and learning network model (3DF-RLN), facial appearance features and facial geometric features are obtained by reconstruction from a single 2D face image, so that facial expression information is effectively represented. A facial topological graph based on the face key points is obtained from the facial geometric features; it reflects the correlation between the facial geometric features and is of great significance to facial expression recognition and related research. A CNN network effectively extracts the expression information contained in the facial appearance features, and a GCN network effectively extracts the information contained in the facial geometric features. A fusion recognition module formed by channel attention and softmax effectively fuses the complementary information contained in the facial appearance features and the facial geometric features, improving expression recognition accuracy. Overall, the method improves the accuracy of facial expression recognition and the facial expression recognition effect in real environments.
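As a rough illustration of the fusion idea described in the abstract, the sketch below assumes a CNN branch that encodes the facial appearance features and a GCN branch that encodes the facial geometric features, and fuses the two feature vectors with channel attention followed by softmax. The module name FusionRecognition, the feature dimensions, and the seven expression classes are assumptions, not the patent's exact architecture.

```python
# Hypothetical PyTorch sketch of a channel-attention + softmax fusion module
# that combines a CNN appearance feature with a GCN geometric feature.
import torch
import torch.nn as nn

class FusionRecognition(nn.Module):
    def __init__(self, appearance_dim=512, geometry_dim=128, num_classes=7):
        super().__init__()
        fused_dim = appearance_dim + geometry_dim
        # Channel attention: squeeze the concatenated feature, then re-weight it.
        self.channel_attention = nn.Sequential(
            nn.Linear(fused_dim, fused_dim // 8),
            nn.ReLU(inplace=True),
            nn.Linear(fused_dim // 8, fused_dim),
            nn.Sigmoid(),
        )
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, appearance_feat, geometry_feat):
        # appearance_feat: (B, appearance_dim) from the CNN branch
        # geometry_feat:   (B, geometry_dim)   from the GCN branch
        fused = torch.cat([appearance_feat, geometry_feat], dim=1)
        weights = self.channel_attention(fused)   # per-channel attention weights
        fused = fused * weights                   # emphasise complementary channels
        logits = self.classifier(fused)
        return torch.softmax(logits, dim=1)       # expression class probabilities

# Example usage with dummy branch outputs.
module = FusionRecognition()
probs = module(torch.randn(2, 512), torch.randn(2, 128))
print(probs.shape)  # torch.Size([2, 7])
```

The attention weights let the classifier emphasise whichever channels of the appearance or geometric representation carry the more discriminative expression information for a given input, which is the complementary-fusion behaviour the abstract attributes to the fusion recognition module.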

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a real-environment facial expression recognition method based on three-dimensional face feature reconstruction and graph deep learning.

Background technique

[0002] In the prior art, facial expression recognition, as one of the important branches of computer vision, has been successfully applied to telemedicine, fatigue-driving monitoring, and many human-computer interaction systems. Many existing expression recognition methods have achieved remarkable success on expression databases collected in laboratory environments. However, arbitrary lighting, occlusion, and pose changes in facial expression images captured in real environments still pose a huge challenge to existing facial expression recognition methods.

[0003] In recent years, with the advancement of face 3D reconstruction technology, the conversion from a single 2D facial i...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06V40/16; G06V10/80; G06V10/82; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F18/253
Inventor: 孙宁, 陶江龙, 季丰达
Owner: NANJING UNIV OF POSTS & TELECOMM