
Action recognition method and device

An action recognition technology based on a target frame, applied in the field of computer vision, which solves the problem of low recognition accuracy in the prior art and achieves the effect of improved accuracy.

Inactive Publication Date: 2022-02-18
PEKING UNIV
Cites: 5 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0004] To solve the technical problem of low recognition accuracy in the prior art, the present invention provides an action recognition method and device.




Embodiment Construction

[0063] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below, clearly and completely, in conjunction with the accompanying drawings.

[0064] Figure 1 is a schematic flowchart of an action recognition method provided in Embodiment 1 of the present invention. As shown in Figure 1, the action recognition method includes:

[0065] Step 101. Receive video data.

[0066] It should be noted that the execution subject of the present invention may specifically be an action recognition device, whose physical form may be a terminal device composed of hardware such as a processor, memory, logic circuits, and electronic chips.

[0067] Specifically, in step 101, a piece of video data is received, wherein the video data includes several frames of data information, and the source of...
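Although the embodiment text is truncated here, the subsequent steps (per the abstract) select a target frame and several consecutive frames preceding it from the received video data. A minimal sketch of that selection, assuming the video arrives as a NumPy array of frames; all names and sizes below are illustrative, not from the patent:

```python
import numpy as np

def select_window(video, target_idx, n_prev):
    """Return the target frame and the n_prev consecutive frames before it.

    `video` is assumed to have shape (T, H, W); the function name and
    parameters are hypothetical, chosen only for this sketch.
    """
    if target_idx - n_prev < 0:
        raise ValueError("not enough preceding frames before the target frame")
    prev_frames = video[target_idx - n_prev : target_idx]  # (n_prev, H, W)
    target_frame = video[target_idx]                       # (H, W)
    return target_frame, prev_frames

video = np.zeros((8, 4, 4))        # 8 dummy frames of size 4x4
target, prev = select_window(video, target_idx=5, n_prev=3)
print(target.shape, prev.shape)    # (4, 4) (3, 4, 4)
```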



Abstract

The action recognition method and device provided by the present invention determine a target frame and several consecutive frames preceding it in received video data, and extract the data information of the target frame and of those preceding frames from the video data. A preset number of convolution operations are performed on a preset number of gain parameters together with the data information of the target frame and of the preceding frames to obtain high-order feature data, and the high-order feature data is added to the video data to form the data to be extracted. Time-series feature extraction is then performed on the data to be extracted to obtain a feature vector, and finally the action recognition result is obtained from the feature vector. In this way, high-order features of the video data can be extracted and the accuracy of action recognition improved.
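The core of the pipeline the abstract describes — a gain-weighted combination of the target frame and its preceding frames, followed by a preset number of convolutions to produce high-order feature data — can be sketched as follows. This is illustrative only: the gain parameters and kernels are random stand-ins (in the patent they would be learned), and none of the names or sizes come from the patent text.

```python
import numpy as np

def conv2d(x, kernel):
    """'Valid' 2-D convolution via explicit sliding windows (illustrative only)."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i : i + kh, j : j + kw] * kernel)
    return out

def high_order_features(video, target_idx, n_prev, gains, kernels):
    """Gain-weight the target frame and its n_prev preceding frames,
    then apply a preset number of convolutions (one kernel each)."""
    frames = video[target_idx - n_prev : target_idx + 1]  # (n_prev+1, H, W)
    feat = np.sum(frames * gains[:, None, None], axis=0)  # gain-weighted sum
    for k in kernels:                                     # preset number of convs
        feat = np.maximum(conv2d(feat, k), 0.0)           # conv + ReLU
    return feat

rng = np.random.default_rng(0)
video = rng.standard_normal((8, 16, 16))            # 8 frames of 16x16 "video"
gains = rng.standard_normal(4)                      # stand-in gain parameters
kernels = [rng.standard_normal((3, 3)) for _ in range(2)]
feat = high_order_features(video, target_idx=5, n_prev=3,
                           gains=gains, kernels=kernels)
# In the full method, feat would be added back to the video data and a
# time-series feature extractor run over the result; we stop at the feature map.
print(feat.shape)  # (12, 12): two 'valid' 3x3 convolutions on 16x16 frames
```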

Description

Technical field

[0001] The invention relates to computer vision technology, and in particular to an action recognition method and device.

Background technique

[0002] With the development of computer vision technology, action recognition using video acquisition equipment has become a focus of research. Existing action recognition methods extract data such as joint positions from the video stream and input these data into a three-layer bidirectional long short-term memory (LSTM) recurrent neural network, which extracts the dynamic features of the data. The extracted dynamic features are then input into a classifier network, which finally outputs the action type corresponding to the video stream.

[0003] However, due to the limitations of the three-layer bidirectional LSTM recurrent neural network, it can only extract the dynamic features of the data over the entire time series, and cannot ex...
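The prior-art pipeline in [0002] — joint positions fed through a three-layer bidirectional recurrent network, then a classifier — can be sketched as follows. This is a minimal illustration with untrained random weights and a plain tanh recurrent cell standing in for an LSTM cell; all sizes and names are assumptions, not from the patent.

```python
import numpy as np

def rnn_pass(x, w_in, w_rec, reverse=False):
    """One plain recurrent pass (a simplified stand-in for an LSTM cell)."""
    T = x.shape[0]
    h = np.zeros(w_rec.shape[0])
    hs = np.empty((T, h.size))
    order = range(T - 1, -1, -1) if reverse else range(T)
    for t in order:
        h = np.tanh(x[t] @ w_in + h @ w_rec)
        hs[t] = h
    return hs

def bidirectional_layer(x, w_in_f, w_rec_f, w_in_b, w_rec_b):
    """Concatenate forward and backward hidden states at each time step."""
    fwd = rnn_pass(x, w_in_f, w_rec_f)
    bwd = rnn_pass(x, w_in_b, w_rec_b, reverse=True)
    return np.concatenate([fwd, bwd], axis=1)

rng = np.random.default_rng(0)
T, n_joints, hid, n_classes = 10, 30, 16, 5    # hypothetical sizes
x = rng.standard_normal((T, n_joints))         # joint positions per frame
for _ in range(3):                             # three stacked bidirectional layers
    d = x.shape[1]
    x = bidirectional_layer(
        x,
        rng.standard_normal((d, hid)) * 0.1, rng.standard_normal((hid, hid)) * 0.1,
        rng.standard_normal((d, hid)) * 0.1, rng.standard_normal((hid, hid)) * 0.1,
    )
w_cls = rng.standard_normal((x.shape[1], n_classes)) * 0.1
logits = x.mean(axis=0) @ w_cls                # pool over time, then classify
print(logits.shape)                            # (5,): one score per action class
```

The limitation noted in [0003] is visible here: the recurrent pass summarizes the whole sequence, with no mechanism for extracting higher-order features of local frame windows.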


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V40/20, G06V20/40, G06V10/764, G06V10/40, G06V10/82, G06N3/04, G06N3/08
CPC: G06V20/42, G06V20/46
Inventors: 胡越予, 刘家瑛, 张昊华, 郭宗明
Owner: PEKING UNIV