Multi-feature fusion behavior identification method based on key frame
A multi-feature fusion recognition method, applied in the field of key-frame-based behavior recognition from human motion sequences. It addresses the problems that redundant frames cause target information to be missed, that exhaustive frame-by-frame detection is difficult to realize, and that a single video feature cannot accurately express the video's information. Its effects are reducing the impact of subtle inter-frame differences and improving recognition accuracy.
Active Publication Date: 2019-08-06
Northwest University (CN)
Problems solved by technology
However, as video complexity continues to increase, a single video feature can no longer accurately express the required video information. Moreover, as the volume of video data grows, redundant frames cause important target information to be missed during behavior recognition, while exhaustively examining every frame of such huge amounts of data runs counter to the principles of video analysis and is difficult to implement.
Method used
Embodiment Construction
[0046] The technical solution of the present invention will be described in detail below in conjunction with the embodiments and the accompanying drawings, but is not limited thereto.
[0047] The present invention was developed on Ubuntu 16.04, on a system equipped with a GeForce GPU.
[0048] OpenCV 3.1.0, Python, and the other tools required for the experiments were configured, and an OpenPose pose-extraction library was built locally.
[0049] A key-frame-based multi-feature behavior recognition method of the present invention, as shown in Figure 1, includes the following steps:
[0050] Step 1. Input the video into the OpenPose pose-extraction library to extract the joint-point information of the human bodies in the video; each human body contains 2D coordinate information for 18 joint points. The human skeleton representation and indices are shown in Figure 2, and the joint-point coordinates and position sequence of each frame are def...
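Step 1 above can be sketched in code. The patent does not give implementation details, so this is a minimal illustration under stated assumptions: OpenPose's per-frame output is stood in for by an (18, 2) array of joint coordinates (the 18-joint COCO layout OpenPose uses), and the per-frame feature vector x(i) is assumed to be the flattened coordinates, centered on the neck joint for position invariance. The function names are hypothetical, not from the patent.

```python
import numpy as np

def frame_to_feature(joints):
    """Flatten an (18, 2) joint array into a per-frame vector x(i).

    Assumption: joints follow the 18-joint COCO layout, where index 1
    is the neck. Centering on the neck makes the feature invariant to
    where the person stands in the frame.
    """
    joints = np.asarray(joints, dtype=float)
    assert joints.shape == (18, 2)
    centered = joints - joints[1]        # translate neck to the origin
    return centered.reshape(-1)          # shape (36,)

def video_to_sequence(frames):
    """Stack per-frame vectors into the sequence S = {x(1), ..., x(N)}."""
    return np.stack([frame_to_feature(f) for f in frames])

# Tiny usage example: random "poses" standing in for a 5-frame clip.
rng = np.random.default_rng(0)
clip = rng.uniform(0, 100, size=(5, 18, 2))
S = video_to_sequence(clip)
print(S.shape)   # (5, 36)
```

In a real pipeline the `clip` array would come from running OpenPose on the video; everything after that point is independent of how the joints were obtained.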
Abstract
A multi-feature fusion behavior identification method based on key frames comprises the following steps. First, the joint-point feature vector x(i) of each human body in a video is extracted through an OpenPose human-body pose-extraction library, forming a sequence S = {x(1), x(2), ..., x(N)}. Second, a K-means algorithm is used to obtain K final clustering centers c' = {c'_i | i = 1, 2, ..., K}; the frame closest to each clustering center is extracted as a key frame of the video, yielding a key-frame sequence F = {F_i | i = 1, 2, ..., K}. Then the RGB information, optical-flow information, and skeleton information of the key frames are obtained and processed: the RGB and optical-flow information are input into a two-stream convolutional network model to obtain higher-level feature expressions, while the skeleton information is input into a spatio-temporal graph convolutional network model to construct spatio-temporal graph features of the skeleton. Finally, the softmax outputs of the networks are fused to obtain the identification result. This process avoids the time consumption and accuracy loss caused by redundant frames, so the information in the video is better used to express behaviors and recognition accuracy is further improved.
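The key-frame selection step of the abstract (K-means over the frame vectors, then picking the frame nearest each final center) can be sketched as follows. The patent does not specify the K-means implementation, so a minimal Lloyd's-algorithm version is written out here; the function names and the choice of Euclidean distance are assumptions for illustration.

```python
import numpy as np

def kmeans(X, K, iters=50, seed=0):
    """Minimal Lloyd's k-means over the frame vectors; returns the K centers c'."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=K, replace=False)]
    for _ in range(iters):
        # Assign each frame vector to its nearest center (Euclidean distance).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned frames;
        # keep the old center if a cluster went empty.
        new = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                        else centers[k] for k in range(K)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers

def select_key_frames(S, K):
    """For each final center c'_i, pick the index of the closest frame x(i)."""
    centers = kmeans(S, K)
    d = np.linalg.norm(S[:, None, :] - centers[None, :, :], axis=2)
    # Duplicates can occur if one frame is nearest to two centers; deduplicate.
    return sorted(set(d.argmin(axis=0)))

# Usage: 100 frames of 36-D skeleton features, 8 key frames.
rng = np.random.default_rng(1)
S = rng.normal(size=(100, 36))
F = select_key_frames(S, 8)
print(F)
```

The selected indices `F` would then be used to gather the RGB, optical-flow, and skeleton data of just those frames before they are fed to the downstream networks.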
Description
technical field [0001] The invention belongs to the technical fields of computer graphics and human-computer interaction, and in particular relates to a multi-feature fusion behavior recognition method based on key frames of human motion sequences. Background technique [0002] Vision is the most important medium for information transmission in human activities; studies have found that about 80% of information is obtained through vision. In recent years, with the development of computer technology and especially the rapid spread of the Internet, computer vision has become one of the most active and popular subjects in the computer field. Computer vision refers to using cameras and computers in place of human eyes to identify, track, and measure targets, and to further process the resulting images through recognition and analysis. Human action recognition, as an emerging research field in computer vision, has been extensively...