
Micro-expression detection method based on a deep network with self-adaptive transition frame removal

A deep-network-based micro-expression detection technology, applied to neural learning methods, biological neural network models, instruments, etc. It addresses the problems of low detection accuracy and narrow applicability in existing micro-expression detection methods, achieving wide applicability and high detection accuracy.

Active Publication Date: 2020-06-23
HANGZHOU DIANZI UNIV
Cites: 5 | Cited by: 0

AI Technical Summary

Problems solved by technology

A common problem of existing micro-expression detection methods is that their detection accuracy is low or their scope of application is too narrow.
Commonly used databases for micro-expression detection include CASME II, SMIC-E-HS, and CAS(ME)²; no previous micro-expression detection method has been verified on all three databases at the same time.



Examples


Embodiment Construction

[0042] The present invention will be described in detail below in conjunction with the accompanying drawings. It should be noted that the described embodiments are only intended to facilitate the understanding of the present invention, rather than limiting it in any way.

[0043] Figure 1 takes the video numbered 20_EP15_03f in the CASME II database as an example to describe the MesNet training process. As shown in formula (1), Input denotes the micro-expression frame and neutral frame samples fed into the MesNet network, and f(Input) represents the shape and texture features extracted from the image by the pre-trained model:

[0044] Features = f(Input). (1)

[0045] To further extract micro-expression features, as shown in formula (2), the function f_1(Features, N) takes Features as input and connects a fully connected layer containing N neurons after the pre-trained model:

[0046] FC = f_1(Features, N). (2)

[0047] Then take FC as input...
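The following is a minimal sketch of the structure implied by formulas (1) and (2): a pre-trained backbone produces Features = f(Input), and a fully connected layer with N neurons produces FC = f_1(Features, N), followed by a two-class head that separates micro-expression frames from neutral frames. PyTorch, the ResNet-18 backbone, the choice N = 256, and the final classification layer are illustrative assumptions, since the patent text does not name a framework or a particular pre-trained model and is truncated after formula (2).

import torch
import torch.nn as nn
import torchvision.models as models

class MesNetSketch(nn.Module):
    def __init__(self, n_neurons: int = 256, num_classes: int = 2):
        super().__init__()
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        # f(Input): shape and texture features from the pre-trained model, formula (1)
        self.feature_extractor = nn.Sequential(*list(backbone.children())[:-1])
        # f_1(Features, N): fully connected layer with N neurons after the backbone, formula (2)
        self.fc = nn.Linear(backbone.fc.in_features, n_neurons)
        # Assumed two-class head: micro-expression frame vs. neutral frame
        self.classifier = nn.Linear(n_neurons, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.feature_extractor(x).flatten(1)   # Features = f(Input)
        fc = torch.relu(self.fc(features))                # FC = f_1(Features, N)
        return self.classifier(fc)                        # class scores for the two frame types

# Example: a batch of four face frames produces four two-class score vectors.
# model = MesNetSketch(); print(model(torch.randn(4, 3, 224, 224)).shape)  # torch.Size([4, 2])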



Abstract

The invention discloses a micro-expression detection method based on a deep network with self-adaptive transition frame removal. The method comprises the steps of network construction, network training, and micro-expression detection. In network training, data preprocessing is first carried out on the original video; transition frames are then removed by a self-adaptive transition frame removal method; and finally, the micro-expression frame and neutral frame samples with transition frames removed are input into the MesNet network for training. The MesNet constructed by the disclosed method is essentially a binary classification network. Since detection of micro-expression frames does not depend on the temporal relationship between frames, MesNet can detect micro-expression frames from a complete video of a micro-expression database, can detect them from any given set of frames, and can also judge whether a given single frame is a micro-expression frame.
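Because MesNet classifies each frame independently, a trained frame-level classifier of this kind can be applied to a complete video, to an arbitrary set of frames, or to a single frame without using any temporal information. Below is a minimal sketch of such frame-order-independent detection, assuming PyTorch and the hypothetical MesNetSketch model outlined in the embodiment sketch above (not the patented network itself), with label 1 assumed to denote a micro-expression frame.

import torch

@torch.no_grad()
def detect_micro_expression_frames(model, frames: torch.Tensor) -> torch.Tensor:
    # frames: tensor of shape (num_frames, 3, H, W); the order of frames is
    # irrelevant because no temporal relationship between frames is used.
    model.eval()
    logits = model(frames)                  # (num_frames, 2) class scores
    predictions = logits.argmax(dim=1)      # assumed convention: 1 = micro-expression frame
    return torch.nonzero(predictions == 1).flatten()

# A single frame can be checked by passing a batch of size one:
# is_micro = detect_micro_expression_frames(model, frame.unsqueeze(0)).numel() > 0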

Description

Technical field

[0001] The invention belongs to the technical field of computer image processing and relates to a micro-expression detection method based on a deep network that adaptively removes transition frames.

Background technique

[0002] Unlike traditional facial expressions, which last 0.5 s to 4 s, facial micro-expressions last only 1/25 s to 1/5 s and are instantaneous, unconscious responses that reveal people's true emotions. Owing to potential applications in fields such as emotion monitoring, lie detection, clinical diagnosis, and business negotiation, micro-expression recognition has attracted increasing attention from researchers over the past ten years.

[0003] Micro-expressions are difficult to induce, their data are difficult to collect, sample sizes are small, and they are hard for the human eye to recognize. Early recognition of micro-expressions was mainly carried out manually by professionals such as psychologists. In recent years, the advancemen...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/176, G06N3/045, G06F18/214, G06F18/24323, Y02D10/00
Inventors: 付晓峰, 牛力, 柳永翔, 赵伟华, 计忠平
Owner: HANGZHOU DIANZI UNIV