Image Feature Representation and Human Motion Tracking Method Based on the Second-Generation Bandelet Transform

A bandelet-transform-based human motion tracking technology, applied in image analysis, image data processing, instrumentation, and related fields. It addresses problems such as ambiguity in motion tracking and recovery, the inability to effectively represent geometric texture directions, and the blurring of gray-level discontinuities along closed boundaries.

Publication status: Inactive; publication date: 2011-12-07
XIDIAN UNIV


Problems solved by technology

However, most image feature representations used to describe the human body rely on contour and edge information. This is not theoretically rigorous and makes it difficult to describe the interior information of the image.
At the same time, a major problem with such edge-based methods is that rapid image variations often do not correspond to discontinuity jumps along edge curves: on the one hand this blurs gray-level discontinuities along closed boundaries, and on the other hand texture variations do not cluster along the geometric curves.
As a result, such representations cannot effectively capture the geometric texture directions in the image or fully describe the pose and feature information of the person in it, which introduces ambiguity into subsequent motion tracking and recovery.



Examples


Embodiment 1

[0049] The present invention is an image feature representation and human body motion tracking method based on the second-generation bandelet transform. Referring to Figure 1, the specific implementation process of the present invention is as follows:

[0050] (1) Input the training and test video image sets to be processed and convert them into continuous single-sequence images. Based on the image content, determine the main human target to be recognized, extract the rectangular frame containing the human body, and uniformly resize each image into an initial image of approximately 64 × 192 pixels, in proportion to the human figure, to serve as a training sample for subsequent processing. Because the present invention does not need to segment the background out of the video image, it saves computing resources and reduces time complexity.
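As a rough illustration of this preprocessing step, the Python/OpenCV sketch below crops the human bounding box from each frame and rescales it to 64 × 192 pixels. It is not the patent's implementation: the function name, the externally supplied boxes mapping (frame index to rectangle), and the grayscale conversion are assumptions made for the example.

    import cv2

    def extract_training_samples(video_path, boxes, size=(64, 192)):
        """Return 64x192 grayscale crops for the frames listed in `boxes`.

        `boxes` maps frame index -> (x, y, w, h), the rectangle containing
        the person in that frame (assumed to be supplied externally).
        """
        cap = cv2.VideoCapture(video_path)
        samples = []
        idx = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if idx in boxes:
                x, y, w, h = boxes[idx]
                crop = frame[y:y + h, x:x + w]
                gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
                # cv2.resize expects (width, height), so this yields a
                # 192-row by 64-column image.
                samples.append(cv2.resize(gray, size))
            idx += 1
        cap.release()
        return samples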

[0051] (2) Perform a two-dimensional discrete orthogonal wavelet transform on each training sample image, with the number of wavelet decomposition layers ...
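For concreteness, a minimal sketch of such a decomposition using the PyWavelets library is shown below; the wavelet family ('db2') and the decomposition depth (3 levels) are assumptions, since the excerpt is truncated before it states the number of layers actually used.

    import numpy as np
    import pywt

    def wavelet_decompose(sample, wavelet="db2", levels=3):
        """2D discrete orthogonal wavelet decomposition of one training sample."""
        coeffs = pywt.wavedec2(sample.astype(np.float64), wavelet, level=levels)
        # coeffs[0] is the coarse approximation; each following entry is a
        # (horizontal, vertical, diagonal) tuple of detail subbands per level.
        return coeffs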

Embodiment 2

[0076] The image feature representation and human body motion tracking method based on the second-generation bandelet transform is the same as in Embodiment 1, except that in step (3), for the Lagrangian function used to obtain the optimal value: λ is a penalty scaling factor with value 2/35; T is the quantization threshold with value 20; R_g is the number of bits required to entropy-code the optimal geometric flow parameter d, obtained by calculation; and R_b is the number of bits required to quantize and encode {Q(t)}, also determined by calculation.
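To show how these quantities interact, here is a minimal sketch assuming the usual second-generation bandelet cost L = D + λ·T²·(R_g + R_b), where D is the reconstruction distortion of a block. The exact expression used in step (3) is not reproduced in this excerpt, so the formula itself should be read as an assumption consistent with the stated λ = 2/35 and T = 20.

    # Assumed Lagrangian form: L = D + lambda * T^2 * (R_g + R_b)
    LAMBDA = 2.0 / 35.0   # penalty scaling factor from this embodiment
    T = 20.0              # quantization threshold from this embodiment

    def lagrangian_cost(distortion, r_g, r_b, lam=LAMBDA, threshold=T):
        """distortion: squared reconstruction error of the block;
        r_g: bits to entropy-code the optimal geometric flow parameter d;
        r_b: bits to code the quantized coefficients {Q(t)}."""
        return distortion + lam * threshold ** 2 * (r_g + r_b)

    # The candidate geometric flow direction with the smallest cost wins, e.g.:
    # best_d = min(candidates, key=lambda d: lagrangian_cost(D[d], Rg[d], Rb[d]))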

[0077] In addition, the regression learning method adopted in step (8) is realized by a double Gaussian process. As in Embodiment 1, feature extraction with clear edges can be obtained, in particular reflecting the texture information of the body in motion.
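The excerpt does not detail the double Gaussian process, so the sketch below substitutes a single scikit-learn Gaussian process regressor with an RBF kernel purely to illustrate the feature-to-pose regression; the function names and kernel choice are assumptions, not the patent's method.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def fit_pose_regressor(train_features, train_poses):
        """Learn a mapping from image feature vectors to 3D pose vectors."""
        kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(np.asarray(train_features), np.asarray(train_poses))
        return gp

    def predict_pose(gp, test_features):
        """Recover 3D poses for feature vectors extracted from new frames."""
        return gp.predict(np.asarray(test_features))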

Embodiment 3

[0079] The image feature representation and human body motion tracking method based on the second-generation bandelet transform is the same as in Embodiments 1-2; the present invention is verified by means of simulation.

[0080] (1) Experimental condition setting

[0081] In the present invention, the classification category of the moving images is taken to be "walking", and verification is carried out on different subcategories of recognized motion video sequence databases. The Matlab environment is used for the simulation programming.

[0082] As shown in Figures 2(a)-2(d), the "walking" video sequence shows a female subject walking in a circular gait on a red carpet, parallel to the direction of the camera's viewing angle. The original image size is 640×480; after the processing of step 1, each human body image has a size of 64×192, and the sequence includes frame segments both facing toward and facing away from the camera. Figure 2(a) is the first screenshot of the sequence, ...



Abstract

The invention discloses an image feature representation and human body motion tracking method based on the second-generation bandelet transform, which mainly solves the problem that existing feature extraction based on the human body contour and edges is ambiguous and cannot reflect the geometric flow direction and texture of the image interior. The implementation process is: input the video image to be processed and extract the frame containing the human body; perform a two-dimensional multi-scale wavelet transform on the image; use quadtree partitioning and a bottom-up fusion rule to find the optimal geometric flow direction; quantize the optimal geometric flow direction signal, apply a one-dimensional wavelet transform, and reorganize the result into two-dimensional form to obtain the Bandelet2 coefficient matrix; extract the maximum geometric flow statistical feature as the final image feature representation; learn the mapping relationship between image features and three-dimensional motion data through a regression process, and predict and recover the 3D pose from new video images outside the training set. The invention has the advantages of fast tracking, accurate results, a small amount of computation, and reduced blurring in the image representation, and can be used for human target recognition and three-dimensional posture reconstruction.
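As an informal reading of this pipeline, the skeleton below strings the stages together in Python pseudocode; bandelet_feature is only a placeholder for the feature-extraction chain, and fit_pose_regressor and predict_pose refer to the illustrative helpers sketched under Embodiment 2 above. None of these names come from the patent itself.

    def bandelet_feature(frame):
        """Placeholder for the feature chain: 2D multi-scale wavelet transform,
        quadtree search for the optimal geometric flow, quantization and 1D
        wavelet transform of the flow signal, reorganization into the Bandelet2
        coefficient matrix, and extraction of the maximum geometric flow
        statistical feature."""
        raise NotImplementedError

    def track_human_motion(train_frames, train_poses_3d, test_frames):
        train_feats = [bandelet_feature(f) for f in train_frames]
        regressor = fit_pose_regressor(train_feats, train_poses_3d)  # learn feature -> pose mapping
        test_feats = [bandelet_feature(f) for f in test_frames]
        return predict_pose(regressor, test_feats)                   # recover 3D poses for new frames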

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and video image processing, and mainly relates to a video image feature texture representation method, specifically an image feature representation and human body motion tracking method based on the second-generation bandelet transform, used for video human body motion tracking and three-dimensional posture recovery.

Background technique

[0002] Video human motion tracking has been one of the major hotspots in the field of computer vision over the past two decades. People are the core content of video and reflect the core semantic features of images. This type of technology has already found initial application in many fields such as motion capture, human-computer interaction, and video surveillance, and has great application prospects. The understanding and interpretation of video human motion tracking belongs to the category of video image processing, and also involves many disciplines such as pattern recognition, m...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20
Inventors: 韩红, 苟靖翔, 王瑞, 冯光洁
Owner: XIDIAN UNIV