Action recognition model training method, and action recognition method and device

A technology for action recognition and model training, applied in character and pattern recognition, acquisition/recognition of facial features, instruments, etc. It addresses the problems of failing to obtain usable recognition results and of being limited by expression-capture accuracy and node selection, and it achieves good recognition accuracy, increases the amount of effective data, and reduces dependence on the precision of sample data.

Active Publication Date: 2021-01-05
HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL

Problems solved by technology

Node information extracted from an image sequence often captures only one aspect of the pixels (the R, G, B values of a pixel are treated as a single feature aspect). As a result, action recognition by a graph neural network that uses feature points as nodes, and expression recognition in particular, is limited by the accuracy of expression capture and by the node selection: when the capture accuracy is poor or the node selection is inaccurate, ideal recognition results often cannot be obtained.


Embodiment Construction

[0079] In order to make the above objects, features and advantages of the present invention more comprehensible, specific embodiments of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0080] In related technologies, when a graph convolutional neural network is used for image and video processing to achieve action recognition such as human posture estimation, face recognition, and expression recognition, the node data are updated by convolution calculation. The adjacency matrix representing the connection relationship between nodes is usually a 0/1 matrix, in which 1 marks the existence of a connection edge; some directed graphs also introduce -1 to indicate direction. Overall, however, the adjacency matrix is static during the convolution calculation. Moreover, in the update process of the node data, the adjacency matrix participates in the convolution calculation as a part ...
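For concreteness, the sketch below illustrates the related-art pattern just described: a graph-convolution layer whose 0/1 adjacency matrix is fixed and merely participates in the node update. This is a minimal illustration assuming PyTorch; the class and variable names are ours for illustration, not from the patent.

# Minimal sketch of the related-art graph convolution described in [0080],
# assuming PyTorch. All names here are illustrative, not from the patent.
import torch
import torch.nn as nn

class StaticAdjacencyGCNLayer(nn.Module):
    """One graph-convolution layer where the adjacency matrix A is a
    fixed 0/1 matrix (1 = connection edge exists) that never changes
    during training, i.e. it is static in the sense of the passage above."""

    def __init__(self, in_features: int, out_features: int, adjacency: torch.Tensor):
        super().__init__()
        # A is registered as a buffer: it participates in the forward
        # computation but receives no gradient updates.
        self.register_buffer("A", adjacency)
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_features). The node update aggregates neighbor
        # features through the fixed adjacency matrix: H' = relu(A @ H @ W).
        return torch.relu(self.linear(self.A @ x))

# Example: 4 nodes on a chain graph, binary symmetric adjacency with self-loops.
A = torch.tensor([[1., 1., 0., 0.],
                  [1., 1., 1., 0.],
                  [0., 1., 1., 1.],
                  [0., 0., 1., 1.]])
layer = StaticAdjacencyGCNLayer(in_features=3, out_features=8, adjacency=A)
out = layer(torch.randn(4, 3))  # -> (4, 8)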


Abstract

The invention provides an action recognition model training method, and an action recognition method and device. The training method comprises the steps of: obtaining node data of all nodes in a preset action node set; calculating a connection edge set of all nodes in the node set, and the edge data of each connection edge, according to the node data; constructing a graph structure of the action data according to the node data and the edge data of the connection edges; and, taking the graph structure of the action data as model input and an expression recognition classification result as model output, carrying out supervised training on a preset graph convolutional neural network expression recognition model, wherein the edge data of the connection edges in the graph structure of the action data are used as model input. Because the action data are recorded and calculated on the basis of the graph structure, the amount of effective data participating in deep learning is increased, better recognition accuracy can be obtained, and dependence on the precision of sample data is reduced.
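To make the flow of the training method concrete, here is a minimal, hedged sketch assuming PyTorch. The rule used to derive edge data (pairwise feature distance with a threshold) and all names (build_graph, EdgeAwareRecognizer) are illustrative assumptions; the abstract does not fix them.

# Illustrative sketch of the abstract's training flow, assuming PyTorch.
# The edge-data rule and all names below are hypothetical, not from the patent.
import torch
import torch.nn as nn

def build_graph(node_data: torch.Tensor, threshold: float = 1.0):
    """Derive the connection-edge set and per-edge data from node data alone.
    node_data: (N, F). Returns edge index pairs and edge features.
    Self-loops are included since a node's distance to itself is 0."""
    dist = torch.cdist(node_data, node_data)        # pairwise distances
    src, dst = torch.nonzero(dist < threshold, as_tuple=True)
    edge_data = dist[src, dst].unsqueeze(-1)        # edge feature: the distance
    return (src, dst), edge_data

class EdgeAwareRecognizer(nn.Module):
    """Toy classifier whose input includes edge data, not only node data."""
    def __init__(self, node_dim: int, edge_dim: int, num_classes: int):
        super().__init__()
        self.node_mlp = nn.Linear(node_dim, 16)
        self.edge_mlp = nn.Linear(edge_dim, 16)
        self.head = nn.Linear(32, num_classes)

    def forward(self, node_data, edge_data):
        # Pool node and edge embeddings, concatenate, then classify.
        h = torch.cat([self.node_mlp(node_data).mean(0),
                       self.edge_mlp(edge_data).mean(0)])
        return self.head(h)

# One supervised training step on a single sample (shapes/labels illustrative).
nodes = torch.randn(10, 2)                          # 10 action nodes, 2-D features
(_, _), edges = build_graph(nodes)
model = EdgeAwareRecognizer(node_dim=2, edge_dim=1, num_classes=5)
loss = nn.functional.cross_entropy(model(nodes, edges).unsqueeze(0),
                                   torch.tensor([3]))
loss.backward()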

Description

Technical Field

[0001] The present invention relates to the technical field of artificial intelligence, and in particular to an action recognition model training method and an action recognition method and device.

Background

[0002] At present, human body action recognition, especially expression recognition, is mainly based on the time series of actions or facial expressions in collected images or videos, and the classification of actions or expressions is realized through deep neural networks, especially deep convolutional neural networks. However, when recognizing expressions and actions from images and video sequences, especially micro-expressions and micro-actions, convolutional-neural-network algorithms are often limited by the accuracy of expression capture, and it is difficult to achieve high accuracy. Micro-expression classification is more difficult than macro-expression (emotion) classification, mainly because the data are sparse and it needs to be loc...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V40/168, G06V40/174, G06F18/214
Inventors: 王勃然, 姜京池, 刘劼
Owner: HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL