
Real-time action recognition method and device and electronic equipment

A real-time action recognition technology, applied in the field of biometric identification, addressing problems such as slow recognition speed.

Active Publication Date: 2019-04-26
北京汉王智远科技有限公司

AI Technical Summary

Problems solved by technology

[0008] Embodiments of the present application provide a real-time action recognition method and device to at least solve the problem of slow recognition speed in existing real-time action recognition methods.



Examples


Embodiment 1

[0072] This embodiment provides a real-time action recognition method. As shown in Figure 1, the method includes Step 10 to Step 12.

[0073] Step 10: determine the real-time action image corresponding to the current action node during the occurrence of the action to be recognized.

[0074] A human action consists of a series of sequential process actions, and each process action can be regarded as an action node of that action. For example, when a "falling" action occurs, process actions such as "body leaning", "hand raising", and "falling to the ground" occur in sequence, and each of these process actions is regarded as one action node of the "falling" action. Actions differ in complexity, and so does the number of action nodes that constitute them. For example, for a "raise hand" action, the action nodes constituting the action may only include an act...
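To make the notion of an action node concrete, the following is a minimal Python sketch. The `ActionNode` container, the node names, and the image shape are illustrative assumptions for this description, not structures defined by the application.

```python
from dataclasses import dataclass
from typing import List

import numpy as np

@dataclass
class ActionNode:
    """One process action within a larger action (e.g. "body leaning")."""
    name: str          # hypothetical node label, e.g. "body_leaning"
    frame: np.ndarray  # the real-time action image captured at this node

# A composite action such as "falling" is an ordered list of nodes.
# The 224x224 image shape is an assumption; zeros stand in for real frames.
falling_nodes: List[ActionNode] = [
    ActionNode("body_leaning", np.zeros((224, 224, 3), dtype=np.uint8)),
    ActionNode("hand_raising", np.zeros((224, 224, 3), dtype=np.uint8)),
    ActionNode("falling_to_ground", np.zeros((224, 224, 3), dtype=np.uint8)),
]
```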

Embodiment 2

[0088] This embodiment provides a real-time action recognition method. As shown in Figure 2, the method includes Step 20 to Step 29.

[0089] Step 20: train a single-frame image action recognition model.

[0090] In some embodiments of the present application, before the step of inputting the real-time action image into the pre-trained single-frame image action recognition model and determining the single-frame image recognition result corresponding to the real-time action image of the action to be recognized, the method further includes: training a single-frame image action recognition model.

[0091] In a specific implementation, training the single-frame image action recognition model includes: obtaining a sample image set composed of several action images corresponding to at least one iconic action node in the occurrence of each preset action, and performing network training on the sample image set to obtain the single-frame image action recognition model. The preset action in the embodiment of...
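As an illustration of this training step, here is a hedged PyTorch sketch of training a small single-frame classifier on a labeled sample set. The network architecture, class count, input shape, and random stand-in data are all assumptions; the application does not specify a particular network.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical sample set: labeled frames of iconic action nodes.
NUM_CLASSES = 5                                 # assumed number of preset actions
frames = torch.randn(32, 3, 224, 224)           # stand-in for real action images
labels = torch.randint(0, NUM_CLASSES, (32,))   # stand-in node labels
loader = DataLoader(TensorDataset(frames, labels), batch_size=8, shuffle=True)

# A deliberately small CNN; purely illustrative of "network training".
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, NUM_CLASSES),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):                          # a few epochs, just to illustrate
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)           # classify each single frame
        loss.backward()
        optimizer.step()
```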

Embodiment 3

[0147] Correspondingly, as shown in Figure 4, the present application also discloses a real-time action recognition device, the device comprising:

[0148] The real-time action image determination module 41 is configured to determine the real-time action image corresponding to the current action node during the occurrence of the action to be recognized;

[0149] A single-frame image recognition module 42, configured to input the real-time action image into the pre-trained single-frame image action recognition model and determine a single-frame image recognition result corresponding to the real-time action image;

[0150] The to-be-recognized action recognition result determination module 43 is configured to determine the recognition result of the action to be recognized according to the single-frame image recognition result corresponding to the real-time action image and/or the image sequence recognition result corresponding to the image sequence associated with the real-time action image;

[0151] Wherein the image sequence associated with the real-time action image consists of the real-time action images corresponding to a preset number of action nodes before the current action node corresponding ...
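The three modules can be pictured with the following Python sketch. Here `frame_model` and `sequence_model` stand in for pre-trained recognizers, and the averaging fusion rule is an illustrative assumption rather than the application's exact combination logic.

```python
from collections import deque

import torch

class RealTimeActionRecognizer:
    """Hedged sketch of the device in Figure 4: three cooperating modules."""

    def __init__(self, frame_model, sequence_model, seq_len=3):
        self.frame_model = frame_model        # module 42: single-frame recognition
        self.sequence_model = sequence_model  # recognition over recent node images
        self.buffer = deque(maxlen=seq_len)   # module 41: recent action-node images

    def step(self, frame: torch.Tensor):
        # Module 41: record the real-time action image for the current node.
        self.buffer.append(frame)

        # Module 42: single-frame recognition result.
        frame_logits = self.frame_model(frame.unsqueeze(0))

        # Module 43: combine single-frame and/or sequence results.
        if len(self.buffer) == self.buffer.maxlen:
            seq = torch.stack(list(self.buffer)).unsqueeze(0)
            seq_logits = self.sequence_model(seq)
            return (frame_logits + seq_logits) / 2  # simple averaging, an assumption
        return frame_logits
```

Buffering only the last few node images keeps the per-frame work small, which is consistent with the application's stated goal of fast real-time recognition.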



Abstract

The invention provides a real-time action recognition method, belongs to the field of biological feature recognition, and solves the problem that a real-time action recognition method in the prior art is low in recognition speed. The real-time action recognition method provided by the invention comprises the steps of: determining a real-time action image corresponding to a current action node in a to-be-recognized action generation process; inputting the real-time action image into a pre-trained single-frame image action recognition model, and determining a single-frame image recognition result corresponding to the real-time action image; and determining an identification result of the action to be identified according to a single-frame image identification result corresponding to the real-time action image and/or an image sequence identification result corresponding to an image sequence associated with the real-time action image; wherein the image sequence associated with the real-time action image is formed by sequentially arranging the action images associated with the real-time action image, so that the problem of low identification speed during action identification in the prior art is solved.

Description

Technical Field

[0001] The present application relates to the field of biological feature recognition, and in particular to a real-time action recognition method, device and electronic equipment.

Background Technique

[0002] Behavior recognition has been a research hotspot in the field of computer vision in recent years. Action recognition is a kind of behavior recognition, which has been widely used in intelligent monitoring, human-computer interaction, virtual reality and other fields. Human actions have various modalities, such as appearance, depth, optical flow, and body skeletons. In the prior art, action recognition involves the following types of research methods:

[0003] Methods based on global features, including the frame difference method and the optical flow method. These methods are effective for extracting motion features, but are sensitive to changes in motion time intervals;

[0004] Methods based on local features include the SIFT algorithm and the Harris algorithm...
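As a concrete illustration of the frame difference method mentioned in [0003], the following OpenCV sketch scores motion by thresholding the per-pixel difference between consecutive grayscale frames. The video path and threshold value are placeholder assumptions.

```python
import cv2

cap = cv2.VideoCapture("video.mp4")   # placeholder path, an assumption
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)                 # per-pixel change
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    motion_pixels = cv2.countNonZero(motion_mask)       # crude motion score
    prev_gray = gray

cap.release()
```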


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06N3/045
Inventor: 白帆, 彭菲, 黄磊, 张健
Owner: 北京汉王智远科技有限公司