Video-based human body interaction action recognition method

A technology for human body action recognition, applied in the field of computer vision, achieving the effects of improving rationality and reducing limiting requirements on features

Active Publication Date: 2018-07-03
NORTH CHINA UNIVERSITY OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a video-based human interaction action recognition technology, addressing the difficulty that existing human interaction action recognition methods have in effectively extracting the features of human interaction actions and in establishing complex interaction models between multiple targets.



Examples


Embodiment Construction

[0039] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Apparently, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0040] The present invention mainly consists of the following steps: moving target detection, feature extraction, initial classification, and human body interaction action recognition.
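The first of these steps can be illustrated with a short sketch. Below is a minimal, hypothetical Python rendering of inter-frame-difference moving target detection; the patent's experiments used OpenCV 2.4.2 under Visual Studio 2010, whereas this sketch assumes the OpenCV 4.x Python bindings, and the threshold value and 3x3 opening kernel are illustrative assumptions, not values from the patent.

```python
import cv2

def detect_moving_targets(prev_frame, cur_frame, thresh=25):
    """Inter-frame-difference moving target detection (minimal sketch)."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(cur_frame, cv2.COLOR_BGR2GRAY)
    # The absolute difference between consecutive frames highlights motion.
    diff = cv2.absdiff(cur_gray, prev_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # Morphological opening suppresses isolated noise pixels.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    # Bounding boxes of connected regions approximate the moving targets.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours]
```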

[0041] The experimental platform is a high-performance computer, with a development environment of Visual Studio 2010 configured with OpenCV 2.4.2.

[0042] For the block diagram of the overall design scheme, please refer to Figure 1; the detailed technical scheme is...
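Although the detailed scheme is truncated here, the abstract indicates that the global motion cue is an optical-flow histogram (HOF). The following sketch shows one plausible reading of that global feature using dense Farneback flow; the choice of Farneback flow, the bin count, and the magnitude weighting are assumptions not stated in the patent text.

```python
import cv2
import numpy as np

def global_hof(prev_gray, cur_gray, bins=8):
    """HOF-style global optical-flow histogram (illustrative sketch).

    The Farneback parameters and bin count are assumptions, not values
    given in the patent.
    """
    flow = cv2.calcOpticalFlowFarneback(prev_gray, cur_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Convert the per-pixel flow vectors to magnitude and orientation.
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    # Histogram of flow orientations, weighted by flow magnitude.
    hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi),
                           weights=mag)
    return hist / (hist.sum() + 1e-8)  # normalise to unit sum
```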



Abstract

The invention discloses a video-based human body interaction action recognition method comprising the following steps. (S1) Carry out moving target detection on an input video frame image by using an inter-frame difference method. (S2) Carry out feature extraction on the moving target obtained after processing, where step (S2) includes: (S21) extracting human body interaction action features from the moving target by combining local space-time features and global optical flow features; (S22) describing the optical flow and space-time interest points to form the feature descriptors HOF and HOG; and (S23) passing the local space-time features and the global optical flow features through a BP neural network to obtain a probability matrix of the action classes under each feature. (S3) Assign different weights to the probability matrices obtained from the different features and carry out weighted summation to obtain a fused probability matrix; the action class with the largest probability is the action class of the frame. (S4) Input the initial classification sequence into an improved normal distribution BP neural network to obtain the final interaction action classification.
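The fusion step (S3) reduces to a weighted sum of per-feature probability matrices followed by a per-frame argmax. A minimal sketch, assuming two feature channels and placeholder weights (the patent states only that different features receive different weights):

```python
import numpy as np

def fuse_and_classify(p_local, p_flow, w_local=0.5, w_flow=0.5):
    """Weighted fusion of per-feature class probability matrices.

    p_local, p_flow: (n_frames, n_classes) probability matrices from the
    local space-time feature channel and the global optical-flow channel.
    The weights here are placeholders, not values from the patent.
    """
    p_fused = w_local * p_local + w_flow * p_flow
    # The action class with the largest fused probability is taken as
    # the initial class of each frame.
    return p_fused.argmax(axis=1)
```

Per step (S4), the resulting initial classification sequence is then fed to the improved normal distribution BP neural network for the final decision.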

Description

Technical Field

[0001] The invention belongs to the field of computer vision and can be used for research on the analysis of human body interaction postures or actions.

Background Technique

[0002] Vision-based human interaction analysis has long been one of the research hotspots in the field of computer vision; it has important theoretical significance as well as broad application prospects. In intelligent video surveillance systems, because of phenomena such as robbery and fighting, analyzing and understanding the interactions between people in a scene is particularly important. In huge video databases, human interaction action recognition enables automatic retrieval according to predefined action patterns, making it very convenient to retrieve specific events. Virtual reality mainly uses computers to perform visual operations and simulations on complex data to create virtual simulation scenes, and based on ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00; G06K 9/46; G06N 3/08; G06F 3/01
CPC: G06F 3/011; G06N 3/084; G06V 40/20; G06V 10/40
Inventors: 叶青 (Ye Qing), 郭新然 (Guo Xinran), 张永梅 (Zhang Yongmei)
Owner: NORTH CHINA UNIVERSITY OF TECHNOLOGY