
Human action recognition method, neural network generation method therefor, device, and electronic equipment

A human action recognition and neural network generation technology, applied in the field of human action recognition and neural network generation, which addresses the problem that existing image recognition neural networks have low action-recognition ability.

Active Publication Date: 2020-10-27
BEIJING KUANGSHI TECH CO LTD

AI Technical Summary

Problems solved by technology

[0005] In view of this, the object of the present invention is to provide a human action recognition method, a neural network generation method therefor, a device, and electronic equipment, so as to solve the technical problem that image recognition neural networks in the prior art have low recognition ability for action recognition.



Examples


Embodiment 1

[0061] An embodiment of the present invention provides a neural network generation method for human action recognition, namely a neural network generation method that fuses human key point information. As shown in Figure 1, the neural network generation method includes:

[0062] S11: Detect the target image to obtain a human target point detection result.

[0063] The target image may be a dynamic video, a still picture, or the like, acquired by an image acquisition device such as an ordinary camera or a depth camera. The human target point detection result may include the position information of several human key points and the angle information between those key points.

[0064] In this embodiment, the target image to be subjected to action recognition is detected first; that is, before the target image is formally input into the action recognition neural network, the target image is detected first...
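The "position plus angle" detection result described above can be illustrated with a small sketch. This is not the patent's implementation: the keypoint coordinates and the angle convention (the angle at the middle joint, in degrees) are assumptions made only for illustration.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at keypoint b, formed by the segments b->a and b->c."""
    a, b, c = (np.asarray(p, dtype=float) for p in (a, b, c))
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Clip guards against floating-point drift outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Example: three hypothetical keypoints (e.g. shoulder, elbow, wrist)
# arranged at a right angle around the middle joint
print(joint_angle((0, 1), (0, 0), (1, 0)))  # 90.0
```

Angles computed this way are invariant to translation and scale of the keypoints, which is one plausible reason such angle information is useful alongside raw positions.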

Embodiment 2

[0077] An embodiment of the present invention provides a neural network generation method for human action recognition, namely a neural network generation method that fuses human key point information. As shown in Figure 2, the neural network generation method includes:

[0078] S21: Detect the target image with a human pose estimation algorithm to obtain the human target point detection result.

[0079] In this step, the target image is detected and recognized based on human pose estimation, and the human target point detection result is obtained. The human target point detection result includes at least one of: position information of human joint points, angle information of human joint points, position information of key body parts, and angle information of key body parts. For example, the human target points may be the top of the head, the neck, the left shoulder, the ...
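As a rough illustration of turning named pose-estimation landmarks into a feature vector, the sketch below assumes a hypothetical landmark list and simple normalization by image size; the patent does not specify the detection result format at this level of detail.

```python
import numpy as np

# Hypothetical landmark order, loosely following the examples in the text
# (head top, neck, shoulders); a real pose estimator defines its own topology.
LANDMARKS = ["head_top", "neck", "left_shoulder", "right_shoulder"]

def keypoint_feature(detections, image_size):
    """Flatten named (x, y) pixel detections into a normalized feature vector."""
    w, h = image_size
    feat = []
    for name in LANDMARKS:
        x, y = detections[name]
        feat.extend([x / w, y / h])  # normalize coordinates to [0, 1]
    return np.array(feat, dtype=float)

det = {"head_top": (64, 10), "neck": (64, 40),
       "left_shoulder": (40, 60), "right_shoulder": (88, 60)}
vec = keypoint_feature(det, (128, 128))
print(vec.shape)  # (8,)
```

A fixed landmark order matters here: the downstream fusion network expects each feature dimension to always refer to the same body part.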

Embodiment 3

[0108] An embodiment of the present invention provides a human action recognition method that fuses human key point information. As shown in Figure 4, the human action recognition method includes:

[0109] S31: Detect the target image to obtain a human target point detection result.

[0110] S32: Recognize the target image to obtain a preliminary action recognition result.

[0111] S33: Fuse features according to the human target point detection result and the preliminary action recognition result to obtain a fusion result.

[0112] S34: Generate an action recognition neural network through training according to the fusion result.

[0113] As a preferred solution, the specific implementation of steps S31, S32, S33, and S34 is the same as in the first or second embodiment and is not repeated here.

[0114] S35: Recognize the target image through the motion recognit...
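Steps S31 through S34 can be sketched at a structural level. Everything in this sketch is a stand-in: the two detectors are random stubs, the feature sizes are invented, and the fusion is plain concatenation, chosen only to show how the keypoint result and the preliminary recognition result flow together; the patent does not prescribe any of these specifics.

```python
import numpy as np

rng = np.random.default_rng(0)

def detect_keypoints(image):
    # S31 stand-in: a pose detector would return keypoint features here
    return rng.random(8)           # e.g. 4 keypoints x (x, y)

def preliminary_recognition(image):
    # S32 stand-in: an image-recognition branch would return class scores
    logits = rng.random(5)         # e.g. 5 candidate action classes
    return logits / logits.sum()

def fuse(keypoint_feat, action_scores):
    # S33: the simplest possible fusion -- concatenate the two vectors;
    # a learned fusion layer would replace this in practice
    return np.concatenate([keypoint_feat, action_scores])

image = np.zeros((128, 128, 3))
fused = fuse(detect_keypoints(image), preliminary_recognition(image))
# S34/S35 would train a classifier head on `fused` and then run inference
print(fused.shape)  # (13,)
```

The point of the structure is that the final classifier sees both the appearance-based preliminary result and the geometry-based keypoint result, rather than either signal alone.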



Abstract

The invention provides a human action recognition method, a neural network generation method therefor, a device, and electronic equipment, and relates to the technical field of image recognition. The neural network generation method for human action recognition includes: detecting the target image to obtain a human target point detection result; recognizing the target image to obtain a preliminary action recognition result; fusing features according to the human target point detection result and the preliminary action recognition result to obtain a fusion result; and generating an action recognition neural network through training according to the fusion result. This solves the technical problem in the prior art that image recognition neural networks have low recognition ability for action recognition.

Description

Technical field

[0001] The invention relates to the technical field of image recognition, and in particular to a human action recognition method, a neural network generation method therefor, a device, and electronic equipment.

Background technique

[0002] At present, action recognition, as an important basis for automatic video analysis, plays an important role in application scenarios such as intelligent monitoring, new retail, human-computer interaction, and education.

[0003] For example, in security monitoring scenarios, reliably identifying abnormal behaviors such as pickpocketing, lock picking, and fighting can reduce manual monitoring costs and help maintain public security; in the new retail field, action recognition helps to better understand user behavior, automatically analyze customer preferences, and improve user experience.

[0004] However, the current neural network for action recognition mainly ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00, G06K9/46, G06K9/62
CPC: G06V40/20, G06V10/464, G06F18/24, G06F18/253
Inventors: 吴骞, 张弛
Owner: BEIJING KUANGSHI TECH CO LTD