
Human interaction action recognition method based on video

A human action recognition technology applied in the field of computer vision, which reduces limitations on feature requirements and improves the rationality of recognition results

Active Publication Date: 2021-09-07
NORTH CHINA UNIVERSITY OF TECHNOLOGY
View PDF · 9 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a video-based human interaction action recognition technology, addressing the difficulty that existing human interaction action recognition methods have in effectively extracting the features of human interaction actions and in establishing complex interaction models between multiple targets.


Image

  • Figure: Human interaction action recognition method based on video


Embodiment Construction

[0039] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art, based on the embodiments of the present invention and without creative effort, fall within the protection scope of the present invention.

[0040] The present invention mainly comprises the following steps: moving target detection, feature extraction, initial classification, and human interaction action recognition.
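The first of these steps, moving target detection, is performed with the frame difference method (step S1 in the abstract). The following is an illustrative sketch only, not the patent's exact implementation; the threshold value and the toy frames are our assumptions:

```python
import numpy as np

def frame_difference(prev_frame, curr_frame, threshold=25):
    """Detect moving pixels via the frame difference method (step S1).

    Both frames are 2-D grayscale arrays of equal shape. A pixel is
    marked as moving (1) when its inter-frame intensity change
    exceeds `threshold` (an assumed value), otherwise static (0).
    """
    # Widen to int16 so the subtraction cannot wrap around on uint8 input.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Toy example: a bright 2x2 block moves one pixel to the right.
prev = np.zeros((6, 6), dtype=np.uint8)
curr = np.zeros((6, 6), dtype=np.uint8)
prev[2:4, 1:3] = 200
curr[2:4, 2:4] = 200
mask = frame_difference(prev, curr)
# The vacated column and the newly occupied column are flagged as motion;
# the overlapping column, unchanged between frames, is not.
```

A practical pipeline would typically follow the binary mask with morphological filtering and connected-component extraction to obtain target regions.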

[0041] The experimental platform is a high-performance computer, with Visual Studio 2010 configured as the development environment for OpenCV 2.4.2.

[0042] For the block diagram of the overall design scheme, please refer to Figure 1; the detailed technical scheme is...


PUM

No PUM

Abstract

The invention discloses a video-based human interaction action recognition method, which comprises the following steps:

  • S1: Detect moving targets in the input video frame images using the frame difference method.
  • S2: Extract features from the detected moving targets, as follows.
  • S21: Extract human interaction action features from the processed moving targets by combining local spatiotemporal features and global optical flow features.
  • S22: Describe the optical flow and the spatiotemporal interest points to form the feature descriptors HOF and HOG.
  • S23: Pass the local spatiotemporal features and the global optical flow features through a BP neural network to obtain a probability matrix of action categories under each feature.
  • S3: Assign different weights to the probability matrices obtained from the different features and compute their weighted sum to obtain a fused probability matrix; the action category with the highest probability is taken as the action category of the frame.
  • S4: Input the initial classification sequence into an improved normal-distribution BP neural network to obtain the final interactive action classification.
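The weighted fusion of step S3 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the weights, class counts, and probability values are hypothetical, and the function name `fuse_probabilities` is ours:

```python
import numpy as np

def fuse_probabilities(prob_matrices, weights):
    """Weighted fusion of per-feature class probability matrices (step S3).

    prob_matrices: list of (n_frames, n_classes) arrays, one per feature
    channel (e.g. local spatiotemporal features and global optical flow).
    weights: one scalar per feature channel, normalised here to sum to 1.
    Returns the fused probability matrix and, per frame, the index of the
    most probable action category.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    fused = sum(wi * p for wi, p in zip(w, prob_matrices))
    return fused, fused.argmax(axis=1)

# Hypothetical 2-frame, 3-class example with two feature channels.
p_stip = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.5, 0.3]])   # BP output for spatiotemporal features
p_flow = np.array([[0.1, 0.7, 0.2],
                   [0.1, 0.2, 0.7]])   # BP output for optical flow features
fused, labels = fuse_probabilities([p_stip, p_flow], weights=[0.7, 0.3])
```

Because each input row sums to 1 and the weights are normalised, each fused row remains a valid probability distribution; the per-frame argmax then yields the initial classification sequence that step S4 feeds to the improved normal-distribution BP network.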

Description

Technical field

[0001] The invention belongs to the field of computer vision and can be used for the analysis of human interaction postures or actions.

Background technique

[0002] Vision-based human interaction analysis has long been one of the research hotspots in the field of computer vision; it has not only important theoretical significance but also broad application prospects. In intelligent video surveillance systems, because of phenomena such as "robbery" and "fighting", analyzing and understanding the interactions between people in a scene is particularly important. In large video databases, human interaction action recognition enables automatic retrieval according to predefined action patterns, making it very convenient to retrieve specific events. Virtual reality mainly uses computers to perform visual operations and simulations on complex data to create virtual simulation scenes, and based on ...

Claims


Application Information

Patent Timeline
No application data
Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06K9/46; G06N3/08; G06F3/01
CPC: G06F3/011; G06N3/084; G06V40/20; G06V10/40
Inventors: 叶青 (Ye Qing), 郭新然 (Guo Xinran), 张永梅 (Zhang Yongmei)
Owner: NORTH CHINA UNIVERSITY OF TECHNOLOGY