
Generation method of convolutional neural networks and expression recognition method

A convolutional neural network and expression recognition technology, applied in the field of image processing. It addresses problems such as the difficulty of recognizing expressions in real time, low computational efficiency, and factors that interfere with expression judgment, and achieves reduced operation and maintenance costs, improved accuracy of results, and improved user experience.

Active Publication Date: 2018-06-22
XIAMEN MEITUZHIJIA TECH

AI Technical Summary

Problems solved by technology

However, predicting real expressions from human faces is difficult: factors such as makeup, plastic surgery, physical fitness, and living environment can significantly affect the judgment of expressions.
[0003] At present, facial expression recognition methods fall into two main categories. The first extracts facial features, such as SIFT (Scale-Invariant Feature Transform) descriptors, and applies clustering or similar processing to recognize expressions; its accuracy reaches only about 50% to 60%, far from meeting the needs of real products. The second uses CNN (Convolutional Neural Network) deep learning and can exceed 95% accuracy, but to reach that accuracy the CNN-based expression recognition model must be relatively large, often hundreds of megabytes, and its computational efficiency is low. As a result it is difficult to recognize expressions in real time, and application on mobile terminals is extremely limited.
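The first category above is typically a bag-of-visual-words pipeline: extract local descriptors, cluster them into a visual vocabulary, and represent each face image as a histogram over that vocabulary. The sketch below is a minimal illustration of that classical approach, not the patent's method; random 128-dimensional vectors stand in for real SIFT descriptors, and the vocabulary size of 8 is an arbitrary toy value.

```python
import numpy as np

def kmeans(descriptors, k, iters=20, seed=0):
    """Plain k-means over feature descriptors; the centers form
    the 'visual vocabulary'."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        # Assign each descriptor to its nearest center.
        d = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned descriptors.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = descriptors[labels == j].mean(axis=0)
    return centers

def bow_histogram(descriptors, centers):
    """Quantize one image's descriptors against the vocabulary and
    return a normalized word-count histogram (the image's feature vector)."""
    d = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
    words = d.argmin(axis=1)
    hist = np.bincount(words, minlength=len(centers)).astype(float)
    return hist / hist.sum()

# Stand-in for SIFT output: random descriptors of dimension 128.
rng = np.random.default_rng(1)
train_desc = rng.normal(size=(1000, 128))       # descriptors pooled from training images
vocab = kmeans(train_desc, k=8)                 # toy vocabulary of 8 visual words
img_vec = bow_histogram(rng.normal(size=(200, 128)), vocab)
```

In a real system the histogram would feed a classifier such as an SVM; as the passage notes, such hand-crafted pipelines plateau around 50-60% accuracy on expression recognition, which motivates the CNN approach.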



Embodiment Construction

[0029] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided for a more thorough understanding of the present disclosure and to fully convey its scope to those skilled in the art.

[0030] Figure 1 is a block diagram of an example computing device 100. In a basic configuration 102, computing device 100 typically includes system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.

[0031] Depending on the desired configuration, the processor 104 may be any type of processor, including but not limited to a microprocess...



Abstract

The invention discloses a method for generating convolutional neural networks for recognizing the expression of a human face in an image, an expression recognition method, a computing device, and a mobile terminal. The generation method comprises: establishing a first convolutional neural network comprising a first number of processing modules, a first global average pooling layer, and a first classifier connected in sequence; training the first convolutional neural network on a pre-acquired facial image data set, which comprises multiple pieces of facial image information, so that the first classifier outputs an indication of the expression corresponding to the human face; establishing a second convolutional neural network comprising a second number of processing modules, a second global average pooling layer, and a second classifier connected in sequence; and jointly training the trained first convolutional neural network together with the second convolutional neural network on the facial image data set, so that the second classifier outputs an indication of the expression corresponding to the human face.
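The abstract does not disclose the exact losses or layer shapes, but the structure it describes (processing modules, then global average pooling, then a classifier, with a trained first network guiding a second) can be sketched with plain numpy. This is a hedged illustration under one common reading of such joint training, a teacher-student setup; the feature-map shapes, the seven-class assumption, the mixing weight `alpha`, and the `joint_loss` form are all assumptions, not values from the patent.

```python
import numpy as np

def global_average_pool(fmap):
    """Collapse an (H, W, C) feature map to a length-C vector by averaging
    over the spatial dimensions (the 'global average pooling' layer)."""
    return fmap.mean(axis=(0, 1))

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(fmap, weights, bias):
    """GAP followed by a linear softmax classifier over expression classes."""
    return softmax(global_average_pool(fmap) @ weights + bias)

def joint_loss(student_probs, teacher_probs, onehot, alpha=0.5):
    """One possible joint-training objective (an assumption, not the patent's):
    cross-entropy to the ground-truth label plus cross-entropy to the
    first (larger) network's output distribution."""
    ce_label = -np.sum(onehot * np.log(student_probs + 1e-12))
    ce_teacher = -np.sum(teacher_probs * np.log(student_probs + 1e-12))
    return alpha * ce_label + (1 - alpha) * ce_teacher

rng = np.random.default_rng(0)
n_classes = 7                         # e.g. seven basic expressions (assumption)
fmap = rng.normal(size=(4, 4, 16))    # toy feature map from the processing modules
W_t, b_t = rng.normal(size=(16, n_classes)), np.zeros(n_classes)
W_s, b_s = rng.normal(size=(16, n_classes)), np.zeros(n_classes)
teacher = classify(fmap, W_t, b_t)    # first (large) network's prediction
student = classify(fmap, W_s, b_s)    # second (small) network's prediction
label = np.eye(n_classes)[3]          # one-hot ground-truth expression
loss = joint_loss(student, teacher, label)
```

Because global average pooling replaces large fully connected layers, the second network can stay small enough for real-time recognition on a mobile terminal, which is exactly the limitation the background section identifies in large CNN models.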

Description

technical field [0001] The present invention relates to the technical field of image processing, and in particular to a method for generating a convolutional neural network for recognizing facial expressions in an image, a facial expression recognition method, a computing device, and a mobile terminal. Background technique [0002] Expression recognition plays an important role in many real-world scenarios. For example, in online chatting, recognizing the expressions of the chat participants and sending corresponding expression interactions to the other party enhances the fun of language interaction. However, predicting real expressions from human faces is difficult: factors such as makeup, plastic surgery, one's own physical fitness, and living environment can significantly affect the judgment of expressions. [0003] At present, facial expression recognition methods are mainly divided into two categories: one is through the extrac...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00, G06N 3/04, G06N 3/08
CPC: G06N 3/08, G06V 40/174, G06N 3/045
Inventors: 李启东, 李志阳, 张伟, 许清泉, 傅松林
Owner: XIAMEN MEITUZHIJIA TECH