
Action recognition method, device, equipment and storage medium

An action recognition technology in the field of computer vision, addressing the problems of limited flexibility and scalability, low action recognition accuracy, and human skeleton key points deviating from their true positions, so as to achieve the effect of improving recognition accuracy.

Active Publication Date: 2021-06-11
BIGO TECH PTE LTD

AI Technical Summary

Problems solved by technology

[0004] In the process of realizing the present invention, the inventors found at least the following problems in the prior art. For the first method, since only the action type contained in the video can be obtained, the position information and speed information of the human skeleton key points cannot be obtained; the flexibility and scalability of the method are therefore limited. For the second method, since the human skeleton key point recognition algorithm is sensitive to lighting, environment and motion speed, the key points are often lost or deviate from their true positions, so the accuracy of action recognition is not high.

Method used



Examples


Embodiment

[0030] When action recognition is applied to human-computer interaction and entertainment games on mobile terminals, high demands are placed on the recognition accuracy, real-time performance and flexibility of the action recognition algorithm. Specifically: for recognition accuracy, accurate recognition results must still be obtained in complex and changing environments, so that accurate feedback can be generated from them and the user experience improved; for real-time performance, the algorithm must be able to run in real time on different application systems, such as the Android system or the iOS system; for flexibility, the algorithm must provide the action type of each video frame, and must also provide the position information and speed information of the human skeleton key points in each video frame, to meet the requirements of upper-layer applications.
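The flexibility requirement above implies a per-frame output that carries the recognized action together with the position and speed of the skeleton key points. A minimal sketch of such an output structure follows; the field names and types are assumptions for illustration and are not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FrameResult:
    """Hypothetical per-frame output covering the flexibility requirement:
    action type plus position and speed information of the skeleton key points."""
    action_type: str                                # recognized action for this frame
    state_probability: float                        # probability of the action state
    keypoint_positions: List[Tuple[float, float]]   # (x, y) of each human skeleton key point
    keypoint_velocities: List[Tuple[float, float]]  # per-frame change of each key point (speed)
```

An upper-layer application (for example a somatosensory game) could then read both the action label and the key point kinematics from the same result.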

[0031] None of the action recognition algorithms in the traditional technology...



Abstract

The invention discloses an action recognition method, device, equipment and storage medium. The method comprises: determining shallow features of each video frame according to the human skeleton key points of each video frame in the video to be recognized; obtaining image features of each video frame; obtaining action features of each video frame according to the shallow features and the image features; inputting the action features of each video frame into an action recognition model to obtain an action recognition result of each video frame, the action recognition result including the state of the action and the state probability; and determining the execution state of a target action for each video frame according to the action recognition results of the video frames. In the embodiments of the present invention, the action features combine shallow features and image features, which improves the accuracy of action recognition. When determining the execution state of the target action for a video frame, the determination is based not only on the action recognition result of that video frame but also on the action recognition results of other video frames, thereby further improving the accuracy of action recognition.
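Read procedurally, the abstract describes a per-frame pipeline: shallow features computed from the human skeleton key points are combined with image features of the frame, and the resulting action features are fed to an action recognition model that returns a state and a state probability. The sketch below follows that flow under stated assumptions: 2D key points, shallow features taken as key point positions plus finite-difference velocities, and the image backbone and action recognition model treated as opaque callables (all hypothetical placeholders, not the patent's actual components):

```python
import numpy as np

def compute_shallow_features(keypoints, prev_keypoints, fps=30.0):
    """Shallow features for one frame: key point positions plus their
    finite-difference velocities (an assumed, illustrative choice)."""
    kp = np.asarray(keypoints, dtype=np.float32)        # shape (K, 2): (x, y) per key point
    prev = np.asarray(prev_keypoints, dtype=np.float32)
    velocity = (kp - prev) * fps                        # speed information of the key points
    return np.concatenate([kp.ravel(), velocity.ravel()])

def recognize_actions(frames, keypoints_per_frame, image_backbone, action_model):
    """Per-frame pipeline following the abstract: shallow features + image features
    -> action features -> action recognition model -> (action state, state probability)."""
    results = []
    prev_kp = keypoints_per_frame[0]
    for frame, kp in zip(frames, keypoints_per_frame):
        shallow = compute_shallow_features(kp, prev_kp)
        image_feat = np.asarray(image_backbone(frame))  # e.g. a CNN embedding of the frame
        action_feat = np.concatenate([shallow, image_feat])
        state, prob = action_model(action_feat)         # per-frame recognition result
        results.append((state, prob))
        prev_kp = kp
    return results
```

The final step of the abstract, determining the execution state of the target action, would then combine these per-frame (state, probability) results across neighbouring frames rather than relying on any single frame, which is where the further accuracy improvement claimed by the abstract comes from.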

Description

Technical field

[0001] Embodiments of the present invention relate to computer vision technology, and in particular to an action recognition method, device, equipment and storage medium.

Background technique

[0002] Action recognition is one of the most challenging research directions in the field of computer vision. It is widely used in mobile entertainment and interaction, such as real-time short video production, real-time live-streaming interaction and somatosensory games.

[0003] In the prior art, the following two methods are usually used to recognize actions. Specifically: in method 1, the RGB images and optical flow information of the video are input into a convolutional neural network to obtain the type of action contained in the video; in method 2, a human skeleton key point recognition algorithm is used to obtain the human skeleton key points in each video frame, and these key points are input into a convolutional neural network to obtain the action types contained in the video...
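For concreteness, the second prior-art method in [0003] amounts to classifying a sequence of skeleton key points with a convolutional neural network. The sketch below is only illustrative of that idea; PyTorch is assumed, and the number of key points, layer sizes and number of action classes are arbitrary choices rather than anything specified in the patent:

```python
import torch
import torch.nn as nn

class KeypointActionNet(nn.Module):
    """Illustrative skeleton-key-point classifier in the spirit of "method 2":
    a small 1D CNN over the temporal sequence of key point coordinates."""
    def __init__(self, num_keypoints=17, num_actions=10):
        super().__init__()
        in_channels = num_keypoints * 2                  # (x, y) per key point, per frame
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                     # pool over the time axis
        )
        self.classifier = nn.Linear(64, num_actions)

    def forward(self, x):                                # x: (batch, frames, num_keypoints * 2)
        x = x.transpose(1, 2)                            # -> (batch, channels, frames)
        return self.classifier(self.features(x).squeeze(-1))

# Example: 2 clips of 30 frames with 17 key points each -> per-clip action logits.
logits = KeypointActionNet()(torch.randn(2, 30, 34))
```

As paragraph [0004] notes, such a key-point-only pipeline inherits the sensitivity of the key point detector to lighting, environment and motion speed.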

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06N3/04
CPC: G06V40/20; G06V20/40; G06N3/045; G06V40/25; G06V20/44; G06V10/82; G06N3/08; G06N3/063; G06N3/044; G06V20/41
Inventor: 张树业, 王俊东, 梁柱锦, 梁德澎, 张壮辉, 叶天才, 周卫
Owner: BIGO TECH PTE LTD