Human body behavior recognition method based on deep space-time inference network and electronic equipment

A behavior-recognition method based on spatio-temporal technology. It addresses the inability of existing methods to recognize human behavior in video data, and achieves improved feature utilization, a higher recognition rate, and fast, accurate recognition.

Pending Publication Date: 2021-03-16
PENG CHENG LAB

AI Technical Summary

Problems solved by technology

[0005] Aiming at the above-mentioned defects of the prior art, the present invention provides a human behavior recognition method and an electronic device based on a deep spatio-temporal inference network, with the aim of solving the problem that existing deep-learning-based human behavior recognition methods can only recognize still pictures and cannot recognize human behavior in video data.




Embodiment Construction

[0044] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0045] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, embodiments of the present invention. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application or uses. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.



Abstract

The invention discloses a human body behavior recognition method based on a deep spatio-temporal inference network, and an electronic device. The method comprises the steps of: inputting target video data into a pre-trained deep spatio-temporal inference network model, and determining a first activation count corresponding to each clustering center of each node in the model; constructing a first behavior feature tree according to the first activation counts; obtaining a first frequent subtree set from the first behavior feature tree using a frequent-subtree mining algorithm; and identifying human body behaviors in the target video data according to the first frequent subtree set. Behavior recognition is performed through the dynamic spatio-temporal features of human behavior extracted by the deep spatio-temporal inference network together with the cross-level feature relationships reflected by the frequent subtrees. The method attends both to the features of individual body parts in a behavior and to the association and coordination between parts, so that feature utilization within the network is improved, the recognition rate is improved, and human behaviors in video data can be recognized quickly and accurately.
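The pipeline in the abstract (per-node activation counting, feature construction from those counts, frequent-pattern extraction, and matching) can be illustrated with a toy sketch. Everything here is an assumption for illustration: the node set, the 1-D cluster centers, the nearest-center activation rule, and especially the "mining" step, which is reduced to a simple support threshold rather than a real frequent-subtree algorithm.

```python
from collections import Counter

# Toy stand-in for the pretrained network: each node owns a few 1-D
# cluster centers (a real model would use high-dimensional centers
# arranged in a spatio-temporal hierarchy).
MODEL_NODES = {
    "arm": [0.0, 1.0],
    "leg": [0.0, 1.0],
}

def activation_counts(video, nodes=MODEL_NODES):
    """Count, per node, how many frames activate each cluster center,
    using nearest-center assignment as the activation rule."""
    counts = {n: Counter() for n in nodes}
    for frame in video:                      # frame: {node_id: feature}
        for node, centers in nodes.items():
            center = min(centers, key=lambda c: abs(c - frame[node]))
            counts[node][center] += 1
    return counts

def frequent_patterns(counts, min_support):
    """Crude stand-in for frequent-subtree mining: keep (node, center)
    pairs whose activation count reaches the support threshold."""
    return {(n, c) for n, ctr in counts.items()
            for c, k in ctr.items() if k >= min_support}

def recognize(video, templates, min_support=2):
    """Match the video's frequent pattern set against per-behavior
    template sets; return the behavior with the largest overlap."""
    pats = frequent_patterns(activation_counts(video), min_support)
    return max(templates, key=lambda b: len(pats & templates[b]))

# Hypothetical templates: "waving" activates the arm's high center,
# "walking" the leg's high center.
templates = {
    "waving":  {("arm", 1.0), ("leg", 0.0)},
    "walking": {("leg", 1.0), ("arm", 0.0)},
}
video = [{"arm": 0.9, "leg": 0.1},
         {"arm": 1.1, "leg": 0.0},
         {"arm": 0.8, "leg": 0.2}]
print(recognize(video, templates))  # → waving
```

The point of the sketch is the structure of the claim: features are not used frame by frame, but aggregated into activation statistics whose frequent co-occurrence patterns (subtrees, in the patent) encode coordination between body parts.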

Description

Technical field

[0001] The invention relates to the technical field of behavior recognition, in particular to a human behavior recognition method and an electronic device based on a deep spatio-temporal inference network.

Background technique

[0002] With the development of computer video surveillance technology, intelligent surveillance systems are widely used in public places such as traffic intersections, airports, shopping malls, parking lots and commercial office buildings. Human behavior recognition has become one of the most challenging problems in the field of intelligent monitoring. Its goal is to enable computers to recognize human behavior from human body information, especially motion information.

[0003] According to the recognition approach, current research methods on behavior recognition can be divided into two categories: traditional methods and deep learning methods. Traditional methods mainly include methods based on template matching...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08, G06N5/04
CPC: G06N3/088, G06N5/04, G06V40/20, G06N3/045, G06F18/23, G06F18/2155
Inventor: 丁玉隆, 崔金强, 尉越
Owner: PENG CHENG LAB