
Micro-expression recognition method based on adaptive motion amplification and convolutional neural network

A micro-expression recognition technology based on a convolutional neural network and motion amplification, applied in the field of image processing; it addresses the problems of indiscriminately amplified samples, excessive blurring, and noisy amplification results, and achieves reliable recognition accuracy, a good amplification effect, and good recognition performance.

Pending Publication Date: 2021-10-22
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

However, the above methods use motion amplification to amplify all samples indiscriminately. Although the amplified micro-expression features become more obvious, this ignores the fact that the intensity of each micro-expression sample differs.
Moreover, most existing micro-expression recognition methods based on video motion amplification use Eulerian Video Magnification (EVM), which requires manually designed parameters, makes the process complicated, and is prone to introducing noise or excessive blur.



Examples


Embodiment 1

[0045] The micro-expression recognition method based on adaptive motion amplification and a convolutional neural network according to this embodiment specifically includes the following steps:

[0046] Step 1: Convert the micro-expression video into an image-sequence sample, and perform face cropping and alignment. This embodiment uses the Dlib face detector together with OpenCV to detect faces in the image sequence; facial key points are detected only on the first frame, and these key points are then used to crop and align the face in every frame of the sequence;

[0047] Step 2: Read the start frame of the micro-expression image sequence, and compute the vertex (apex) frame using a vertex-frame localization algorithm;

[0048] In this embodiment, the start frame is the image frame at the beginning of the micro-expression image sequence, and the vertex frame is the image frame at which the expression intensity is highest within the sequence. The present invention only use...
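A minimal sketch of vertex-frame localization along these lines. The patent's actual localization algorithm is not reproduced here; as an assumption, expression intensity is approximated by the mean absolute pixel difference from the start frame.

```python
import numpy as np

def locate_apex(frames):
    """Return the index of the vertex (apex) frame: the frame with the
    largest mean absolute pixel difference from the start frame, used
    here as a simple proxy for expression intensity."""
    onset = frames[0].astype(np.float64)
    scores = [np.abs(f.astype(np.float64) - onset).mean() for f in frames]
    return int(np.argmax(scores))

# Toy grayscale sequence: intensity peaks at frame index 2.
seq = [np.zeros((4, 4)), np.full((4, 4), 2.0),
       np.full((4, 4), 5.0), np.full((4, 4), 1.0)]
print(locate_apex(seq))  # -> 2
```

In practice the frames would be the cropped, aligned face images from Step 1, and a more robust intensity measure (e.g. one restricted to facial-action regions) could replace the whole-frame difference.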

Embodiment 2

[0089] To verify the effectiveness of the micro-expression recognition method provided by the present invention, this embodiment pre-trains the ME-Xception network model on the CK+ macro-expression dataset, and then conducts Leave One Subject Out (LOSO) experiments on the CASME II, SAMM, and SMIC datasets. Among them, the CASME II dataset is a spontaneous micro-expression dataset proposed in 2014 by Fu Xiaolan's team at the Institute of Psychology, Chinese Academy of Sciences. The SMIC spontaneous micro-expression dataset was designed and collected in 2012 by Zhao Guoying's team at the University of Oulu, Finland. The SAMM spontaneous micro-expression dataset was proposed in 2018 by Moi Hoon Yap's research team at Manchester Metropolitan University.
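The LOSO protocol above can be sketched as follows; the helper function and the sample subject IDs are illustrative, not taken from the patent.

```python
def loso_splits(subject_ids):
    """Yield (train_idx, test_idx) pairs in which each split holds out
    every sample from one subject -- Leave One Subject Out (LOSO)
    cross-validation, as used in micro-expression benchmarks."""
    subjects = sorted(set(subject_ids))
    for held_out in subjects:
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        yield train, test

# Example: 5 video samples from 3 subjects -> 3 folds.
ids = ["s01", "s01", "s02", "s03", "s03"]
for train, test in loso_splits(ids):
    print(train, test)
```

LOSO guarantees the model never sees the test subject during training, which is important because micro-expression datasets have few subjects and strong per-subject appearance cues.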

[0090] In this embodiment, the micro-expression video samples are divided into three categories: negative, positive, and surprise. Among them, the negative micro-expression la...



Abstract

The invention discloses a micro-expression recognition method based on adaptive motion amplification and a convolutional neural network. The method comprises the following steps: 1, converting a sample of a micro-expression video into an image sequence, and carrying out face cropping and alignment; 2, reading the start frame of the image sequence, and calculating the vertex frame using a vertex-frame localization algorithm; 3, determining a suitable magnification factor by an adaptive motion magnification method, and performing motion magnification on the vertex frame according to the determined factor so as to enhance the micro-expression features; 4, obtaining the optical-flow features of the micro-expression video from the start frame and the amplified vertex frame, yielding the horizontal optical flow, the vertical optical flow, and the optical strain; 5, establishing a convolutional neural network model for micro-expression recognition, and performing transfer learning from macro-expressions to micro-expressions with the model; and 6, inputting the optical-flow features into the model after transfer learning, outputting spatio-temporal features, and training the model to realize micro-expression recognition.
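The optical strain of step 4 can be derived from a dense flow field (u, v). The sketch below uses the strain-magnitude formula common in the micro-expression literature, eps = sqrt(u_x^2 + v_y^2 + 0.5*(u_y + v_x)^2), with finite differences; the exact formulation used by the patent is an assumption, and the flow itself would come from an estimator (e.g. TV-L1 or Farneback) between the start frame and the amplified vertex frame.

```python
import numpy as np

def optical_strain(u, v):
    """Optical strain magnitude of a dense flow field (u, v):
    eps = sqrt(ux^2 + vy^2 + 0.5*(uy + vx)^2), where the subscripts
    denote spatial derivatives, computed here with np.gradient
    (which returns the axis-0 (y) derivative first, then axis-1 (x))."""
    uy, ux = np.gradient(u)
    vy, vx = np.gradient(v)
    return np.sqrt(ux**2 + vy**2 + 0.5 * (uy + vx)**2)

# Toy flow: a uniform translation deforms nothing, so strain is zero.
u = np.ones((5, 5))
v = np.zeros((5, 5))
print(optical_strain(u, v).max())  # -> 0.0
```

Strain is useful as a third channel alongside the horizontal and vertical flow because it responds to local deformation (muscle movement) while ignoring rigid head translation.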

Description

Technical field

[0001] The invention relates to a micro-expression recognition method based on adaptive motion amplification and a convolutional neural network, and belongs to the technical field of image processing.

Background technique

[0002] A micro-expression (ME) is a facial expression of short duration, small muscle-movement range, and involuntary occurrence. It usually appears when a person tries to hide their true inner emotions, and it is difficult to fake. Micro-expressions are produced unconsciously, yet they can often effectively reveal a person's true emotions. Therefore, as a clue for detecting lies, micro-expressions have been extensively studied and applied in psychology, criminal investigation, and security. Compared with macro-expressions, micro-expressions occur quickly and last only briefly; relevant studies show their duration is usually 1/25 to 1/5 of a second, and there will be 1 to 4 small areas in the face mot...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08; G06T7/246; G06T7/269
CPC: G06N3/08; G06T7/248; G06T7/269; G06T2207/10016; G06T2207/20004; G06T2207/20081; G06T2207/20084; G06T2207/30201; G06N3/045; G06F18/22; G06F18/24; G06F18/214
Inventors: 高美凤, 陈汤慧, 于力革
Owner: JIANGNAN UNIV