
Facial expression recognition method, convolutional neural network model training method, devices and electronic apparatus

A technology involving convolutional neural networks and expression recognition, applied in the field of convolutional neural network model training methods, devices, and electronic apparatus, which can solve problems such as long training time, the high cost of expression recognition, and a fragmented, complicated training process.

Status: Inactive · Publication Date: 2018-06-29
SENSETIME GRP LTD
Cites: 7 · Cited by: 49

AI Technical Summary

Problems solved by technology

However: first, facial feature extraction requires manual design and extraction, which demands expertise in a specific field; second, classic hand-crafted features such as Gabor filters and SIFT are weak compared with deep features (feature maps); third, traditional machine learning methods have difficulty exploiting ever-larger amounts of training data, their training time is long, and the training process is fragmented and complicated.
[0004] As a result, existing expression recognition is costly, and its accuracy is low.

Method used



Examples


Embodiment 1

[0069] Referring to FIG. 1, a flowchart of the steps of an expression recognition method according to Embodiment 1 of the present invention is shown.

[0070] The facial expression recognition method of the present embodiment comprises the following steps:

[0071] Step S102: Using the convolutional layer part of the convolutional neural network model and the acquired face key points in the face image to be detected, perform facial expression feature extraction on the face image to be detected to obtain a face expression feature map.

[0072] The trained convolutional neural network model has a facial expression recognition function and includes at least an input layer part, a convolutional layer part, a pooling layer part, and a fully connected layer part. Among them, the input layer part is used to input the image; the convolutional layer part performs feature extraction; the pooling layer part performs pooling processing on the output of the convolutional layer part...
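The core idea of the step above — face key points locating regions on the convolutional feature map, which are then pooled — can be sketched in plain Python. The fixed window size, the coordinate convention, and the choice of max pooling are illustrative assumptions, not the patent's exact architecture:

```python
# Illustrative sketch: given a 2D feature map produced by the convolutional
# layers and face key points already scaled to feature-map coordinates,
# take a fixed window around each key point as its ROI and max-pool it.
# The window half-size `half` and max pooling are assumptions.

def roi_max_pool(feature_map, key_points, half=1):
    """feature_map: 2D list of floats; key_points: list of (row, col)."""
    h, w = len(feature_map), len(feature_map[0])
    pooled = []
    for r, c in key_points:
        # Clamp the ROI window to the feature-map bounds.
        r0, r1 = max(0, r - half), min(h, r + half + 1)
        c0, c1 = max(0, c - half), min(w, c + half + 1)
        pooled.append(max(feature_map[i][j]
                          for i in range(r0, r1)
                          for j in range(c0, c1)))
    return pooled

fmap = [[0.9, 0.2, 0.0],
        [0.4, 0.1, 0.3],
        [0.2, 0.5, 0.1]]
print(roi_max_pool(fmap, [(0, 0), (2, 2)]))  # [0.9, 0.5]
```

The pooled per-key-point values would then feed the fully connected layers for classification, per the layer ordering described above.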

Embodiment 2

[0083] Referring to FIG. 2, a flowchart of the steps of an expression recognition method according to Embodiment 2 of the present invention is shown.

[0084] In this embodiment, a convolutional neural network model with a facial expression recognition function is trained first, and then facial expression recognition of images is performed based on the model. However, those skilled in the art should understand that in actual use, the convolutional neural network model trained by a third party can also be used for facial expression recognition.

[0085] The facial expression recognition method of the present embodiment comprises the following steps:

[0086] Step S202: Obtain sample images for training, and use the sample images to train a convolutional neural network model.

[0087] The sample image may be a static image or a sample image from a sequence of video frames. The sample image contains the key-point information of the human face and the annotation inform...
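A training sample as described above carries the image, its face key points, and an expression annotation. A minimal sketch of such a record, with a hypothetical field layout (the patent does not specify one):

```python
# Hypothetical training-sample record: the patent states only that a sample
# carries face key-point information and an expression annotation.
from dataclasses import dataclass

@dataclass
class TrainingSample:
    image_path: str    # static image or one frame of a video sequence
    key_points: list   # [(x, y), ...] face key points in image coordinates
    expression: str    # annotated expression label

s = TrainingSample("face_001.png", [(30, 42), (70, 41)], "happy")
print(s.expression)  # happy
```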

Embodiment 3

[0107] Referring to FIG. 3, a flowchart of the steps of an expression recognition method according to Embodiment 3 of the present invention is shown.

[0108] This embodiment describes the facial expression recognition method of the embodiment of the present invention in the form of a specific example. The facial expression recognition method of this embodiment includes both a convolutional neural network model training part and a facial expression recognition part that uses the trained convolutional neural network model.

[0109] The facial expression recognition method of the present embodiment comprises the following steps:

[0110] Step S302: Collect facial expression images and perform expression labeling to form a sample image set for training.

[0111] For example, ten expressions are manually labeled: angry, calm, confused, disgusted, happy, sad, scared, surprised, squinting, and screaming.
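For a softmax classifier head, the ten annotated expressions would typically be mapped to class indices. A minimal sketch; the index order is an assumption, as the patent does not fix one:

```python
# The ten expressions from the example above, mapped to class indices.
# The ordering is an assumption for illustration only.
EXPRESSIONS = ["angry", "calm", "confused", "disgusted", "happy",
               "sad", "scared", "surprised", "squinting", "screaming"]
LABEL_TO_INDEX = {name: i for i, name in enumerate(EXPRESSIONS)}

def one_hot(label):
    """Encode an expression label as a one-hot vector for training."""
    vec = [0] * len(EXPRESSIONS)
    vec[LABEL_TO_INDEX[label]] = 1
    return vec

print(one_hot("happy"))  # [0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
```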

[0112] Step S304: Use the face detection algorithm to detect the faces and their key ...



Abstract

The embodiments of the present invention provide a facial expression recognition method, a convolutional neural network model training method, a facial expression recognition device, a convolutional neural network model training device, and an electronic apparatus. The facial expression recognition method includes the following steps: facial expression features are extracted from a face image to be detected by means of the convolutional layer portion of a convolutional neural network model and acquired face key points in the face image, so that a facial expression feature map is obtained; ROIs (regions of interest) corresponding to the face key points in the facial expression feature map are determined; pooling processing is performed on the determined ROIs by the pooling layer of the convolutional neural network model, so that a pooled ROI feature map is obtained; and the facial expression recognition result of the face image is obtained at least according to the ROI feature map. With the facial expression recognition method provided by the embodiments of the present invention, subtle facial expression changes can be effectively captured, and differences caused by different facial poses can be better handled; the detailed information of changes in multiple regions of a face is fully utilized, so that subtle facial expression changes and faces in different poses can be recognized more accurately.

Description

Technical Field

[0001] The embodiments of the present invention relate to the technical field of artificial intelligence, and in particular to an expression recognition method, device, and electronic apparatus, and a convolutional neural network model training method, device, and electronic apparatus.

Background Technique

[0002] Facial expression recognition technology refers to assigning an expression category to a given face image, including: anger, disgust, happiness, sadness, fear, surprise, etc. At present, facial expression recognition technology has gradually shown broad application prospects in the fields of human-computer interaction, clinical diagnosis, distance education, and investigative interrogation, and is a popular research direction in computer vision and artificial intelligence.

[0003] An existing facial expression recognition technology is a recognition technology based on a traditional machine learning framework. Expression recognition using this trad...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00; G06K 9/62; G06K 9/32; G06N 3/08
CPC: G06N 3/084; G06V 40/175; G06V 40/171; G06V 40/172; G06V 10/25; G06F 18/24; G06F 18/214
Inventors: 金啸, 胡晨晨, 旷章辉, 张伟
Owner: SENSETIME GRP LTD