
Video action recognition method and device and machine equipment

A video action recognition technology, applied in the fields of character and pattern recognition, instruments, and computer components. It addresses problems such as high parameter and computational complexity, the inability to control that complexity, and the inability to explicitly model time-domain information, achieving the effect of improving recognition performance.

Active Publication Date: 2019-08-23
TENCENT TECH (SHENZHEN) CO LTD +1

AI Technical Summary

Problems solved by technology

[0007] To solve the technical problems in the related art that time-domain information cannot be explicitly modeled, that parameter and computational complexity are high, and that this complexity cannot be controlled, the present invention provides a video action recognition method, device, and machine equipment.




Detailed Description of Embodiments

[0042] Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; rather, they are merely examples of apparatuses and methods consistent with aspects of the invention as recited in the appended claims.

[0043] Figure 1 is a schematic diagram of the implementation environment involved in the present invention. In an exemplary embodiment, the implementation environment includes a video source 110 and a server 130. For videos captured by the video source 110, such as short videos, the server 130 performs recognition of the actions in the video to obtain action recognition results.

[0044] Fo...



Abstract

The invention discloses a video action recognition method, device, and machine equipment. The method comprises the following steps: acquiring video data for action recognition; when feature extraction is performed on the video data at each layer of a neural network, passing the extracted spatial-domain features of the video images into a bilinear layer through the network structure of the neural network; performing a bilinear correlation operation on adjacent frames of video images according to the spatial-domain features to obtain spatio-temporal features for each frame of video image in the video data; and classifying the actions in the video using the spatio-temporal features to obtain an action recognition result for the video data. For the feature extraction at each layer of the neural network, the parameters and computational complexity of the bilinear correlation operation in the bilinear layer are controlled, so that spatio-temporal features are extracted under controlled complexity. This realizes explicit modeling of the time-domain relationship and effectively improves action recognition performance.
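The abstract describes combining the spatial features of adjacent frames through a bilinear correlation whose parameter count is kept under control. The patent does not disclose concrete operators, so the following is only a minimal sketch of that idea: a low-rank factorized bilinear interaction between each frame's feature vector and its neighbour's. The names `bilinear_correlation`, `W_a`, `W_b`, and the low-rank factorization itself are assumptions for illustration, not the patent's actual method.

```python
import numpy as np

def bilinear_correlation(feats, rank=8, seed=0):
    """Illustrative sketch: fuse spatial features of adjacent frames into
    spatio-temporal features via a low-rank bilinear interaction.

    feats: (T, C) array, one C-dim spatial feature vector per frame.
    The factor matrices W_a, W_b (C x rank) are a hypothetical way to keep
    the parameter count and computation controlled, as the abstract suggests.
    """
    rng = np.random.default_rng(seed)
    T, C = feats.shape
    W_a = rng.standard_normal((C, rank)) / np.sqrt(C)
    W_b = rng.standard_normal((C, rank)) / np.sqrt(C)
    out = np.empty((T, rank))
    for t in range(T):
        t_next = min(t + 1, T - 1)  # the last frame pairs with itself
        # low-rank bilinear interaction between frame t and its neighbour
        out[t] = (feats[t] @ W_a) * (feats[t_next] @ W_b)
    return out

# toy usage: 5 frames, each with a 16-dim spatial feature
st = bilinear_correlation(np.random.default_rng(1).standard_normal((5, 16)))
print(st.shape)  # (5, 8)
```

A full bilinear map between two C-dim vectors would cost O(C^2) parameters per output; the factorized form above needs only 2·C·rank, which is one plausible reading of "controlling parameters and computational complexity."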

Description

Technical Field

[0001] The invention relates to the technical field of computer vision applications, and in particular to a video action recognition method, device, and machine equipment.

Background Technique

[0002] Video-based action recognition has long been an important area of computer vision research. Video action recognition mainly involves feature extraction and representation, followed by feature classification. For example, the classic dense trajectory method extracts different features near the trajectory points of the optical flow field, including histograms of optical flow and histograms of gradients, and uses Fisher encoding to obtain the final video feature representation; classifiers such as support vector machines and random forests are then trained on the training set to obtain the final classifier for feature classification. This is a method based on manually designed features. In recent years, with the powerful feature repres...
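The background describes the classic two-stage pipeline: hand-crafted descriptors (e.g., gradient histograms) followed by a learned classifier. Below is only a toy sketch of that pipeline under stated assumptions: a simplified gradient-magnitude histogram stands in for the trajectory-based descriptors, and a nearest-centroid rule stands in for the SVM/random-forest stage. The function names and the exact descriptor are hypothetical, not the methods cited in the patent.

```python
import numpy as np

def gradient_histogram(frames, bins=8):
    """Toy hand-crafted video descriptor in the spirit of the histogram
    features mentioned in the background: a normalized histogram of
    frame-to-frame intensity change pooled over the whole clip.

    frames: (T, H, W) grayscale array.
    """
    grad = np.abs(np.diff(frames.astype(float), axis=0))  # temporal gradient
    hist, _ = np.histogram(grad, bins=bins, range=(0.0, grad.max() + 1e-8))
    return hist / hist.sum()  # normalized, scale-invariant descriptor

def nearest_centroid_predict(x, centroids):
    """Stand-in for the final classifier stage (SVM / random forest in the
    text): assign x to the class whose mean descriptor is closest."""
    d = np.linalg.norm(centroids - x, axis=1)
    return int(np.argmin(d))

# toy usage: descriptors for two synthetic clips
rng = np.random.default_rng(0)
slow_clip = rng.standard_normal((10, 8, 8)) * 0.1   # little motion
fast_clip = rng.standard_normal((10, 8, 8)) * 5.0   # lots of motion
f_slow = gradient_histogram(slow_clip)
f_fast = gradient_histogram(fast_clip)
centroids = np.stack([f_slow, f_fast])
print(nearest_centroid_predict(f_slow, centroids))  # 0
```

The point of the sketch is the division of labor the text describes: the descriptor is fixed by hand, and only the classifier is learned, which is exactly the limitation the deep-learning approaches mentioned next aim to remove.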

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V40/20, G06V20/46, G06F18/241
Inventors: 厉扬豪, 宋思捷, 刘家瑛, 刘婷婷, 黄婷婷, 马林, 刘威
Owner TENCENT TECH (SHENZHEN) CO LTD