Human body posture classification method based on computer vision

A computer vision and human body posture technology, applied to computer components, computation, image analysis, etc. It addresses the problems of extraction without a fixed length, ambiguity, and low accuracy and clarity, and achieves the effect of high accuracy.

Pending Publication Date: 2021-10-01
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0003] Existing computer vision-based human posture classification methods can all classify well. They use machines to model human movement with three models and then establish coordinates. However, after the coordinates are established, the extraction of an action is not performed according to a fixed length along the coordinates. Because the human body moves within a range of activity when performing an action, and the habit is to name and extract without a fixed length, the classification sometimes becomes ambiguous, resulting in low accuracy and clarity.



Examples


Embodiment 1

[0053] A computer vision-based human posture classification method, comprising the following steps:

[0054] S1. Skeleton data acquisition: the purpose of constructing a human body representation based on 3D skeleton data is to extract compact and discriminative feature descriptors that represent the human body posture or action;

[0055] 1) Use the light projector of the structured-light color-depth sensor to emit structured light; after it is projected onto the surface of the object to be measured, the light is modulated by the surface height of the object;

[0056] 2) The modulated structured light is collected by the receiver and sent to the computer for analysis and calculation to obtain the three-dimensional surface data of the measured object;

[0057] 3) After processing with a specific algorithm, the depth information of the object can be obtained. In addition, the color camera on the color-depth sensor can be used to obtain the color image frame correspo...
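The acquisition path in S1 (structured light, then a depth frame, then 3D data) can be illustrated with standard pinhole back-projection. This is a minimal sketch, assuming the sensor SDK already delivers a registered depth frame in meters together with camera intrinsics (fx, fy, cx, cy); the joint pixel locations are hypothetical inputs from any 2D pose estimator, not part of the patented method itself.

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth frame (in meters) to 3-D camera coordinates
    with the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grids, shape (h, w)
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.stack([x, y, depth_m], axis=-1)  # (h, w, 3)

def skeleton_3d(joints_px, depth_m, fx, fy, cx, cy):
    """Look up 3-D positions for 2-D joint detections.
    `joints_px` maps a joint name to its (u, v) pixel, e.g. {"head": (320, 90)};
    this mapping is a hypothetical input, not defined by the patent."""
    cloud = depth_to_point_cloud(depth_m, fx, fy, cx, cy)
    return {name: cloud[v, u] for name, (u, v) in joints_px.items()}
```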

Embodiment 2

[0083] This embodiment is roughly the same as the method provided in Embodiment 1, and the main difference is that in step S5022, marking is performed every 15 cm in length.

Embodiment 3

[0085] This embodiment is roughly the same as the method provided in Embodiment 1, the main difference being that in step S5022, the marking is performed every 20 cm in length.
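Across the three embodiments only the marking interval changes (5 cm, 15 cm, 20 cm). Below is a minimal sketch of how fixed-length marking of the sorted coordinates might look; the function name and the binning scheme are illustrative assumptions, since step S5022 itself is not reproduced in this excerpt.

```python
import numpy as np

def mark_coordinates(values_cm, interval_cm=5.0):
    """Sort coordinate values from small to large and place one mark every
    `interval_cm` (5 cm in Embodiment 1, 15 cm / 20 cm in Embodiments 2 and 3).
    Returns the mark positions and, for each sorted value, its mark index."""
    values = np.sort(np.asarray(values_cm, dtype=float))
    lo, hi = values[0], values[-1]
    marks = np.arange(lo, hi + interval_cm, interval_cm)
    indices = np.floor((values - lo) / interval_cm).astype(int)
    return marks, indices

# Example: a coarser interval yields fewer marks but lower resolution.
marks, idx = mark_coordinates([3.0, 7.5, 21.0, 40.2], interval_cm=15.0)
```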



Abstract

The invention relates to the technical field of computer vision human body recognition and classification, in particular to a human body posture classification method based on computer vision. The method comprises the following steps: segmenting the obtained coordinates from small values to large values; marking once every 5 cm in length; obtaining the value of each mark; collecting, according to the marked values, the motion frame numbers of the human body movement obtained by the color-depth sensor light projector; acquiring the video corresponding to those frame numbers, sorting and collecting the video, and labeling it; after labeling, putting the labels into a table; and numbering and classifying the different actions by table screening. The invention has high accuracy and is very clear, with no fuzzy concept generated; a hyperlink is inserted into each label, so the video of an action can be viewed directly when its label is clicked; and finally, proofreading is carried out, which is very convenient.
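To make the labeling-and-table step concrete, here is a minimal sketch of the record table and the table-screening classification the abstract describes. The record fields and names (ActionRecord, video_url, classify_by_table) are hypothetical, chosen only to mirror the listed steps, including the hyperlink that opens the clip when a label is clicked.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ActionRecord:
    label: str                   # action label attached to the collected video
    mark_value: float            # marked coordinate value (cm) that triggered collection
    frame_span: Tuple[int, int]  # (first, last) motion frame numbers
    video_url: str               # hyperlink: clicking the label opens the clip

def classify_by_table(table: List[ActionRecord], label: str):
    """Table screening: number and return all records of one action."""
    return [(number, rec) for number, rec in enumerate(table) if rec.label == label]

# Usage with placeholder data and URLs.
table = [
    ActionRecord("raise_arm", 15.0, (120, 180), "https://example.org/clip1"),
    ActionRecord("squat", 35.0, (181, 260), "https://example.org/clip2"),
]
print(classify_by_table(table, "squat"))
```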

Description

Technical field

[0001] The invention relates to the technical field of computer vision human body recognition and classification, in particular to a computer vision-based human body posture classification method.

Background technique

[0002] Human posture based on computer vision: extracting effective motion features from video sequences is an important part of human action recognition; it directly affects the accuracy and robustness of action recognition, and the ability of the same feature to describe different types of human actions is not the same. Therefore, different types of features are often chosen according to the video quality and the application scenario, which are related to the specific application scenario and the action categories that researchers care about. For example, in the distant-view case, trajectory analysis of the target can be used; in the close-view case, it is necessary to use the information extracted from the image sequence ...
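As one concrete reading of the distant-view case mentioned above, trajectory analysis can start from a centroid track over per-frame foreground masks. This is a minimal sketch under that assumption; how the masks are produced (background subtraction, detection, etc.) is outside this passage.

```python
import numpy as np

def centroid_trajectory(masks):
    """Track the target centroid across per-frame binary foreground masks;
    the resulting (num_frames, 2) track is a simple distant-view motion feature."""
    track = []
    for mask in masks:
        ys, xs = np.nonzero(mask)
        track.append((xs.mean(), ys.mean()) if xs.size else (np.nan, np.nan))
    return np.asarray(track)
```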

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06T7/66
CPC: G06T7/66
Inventor: 郭少龙, 张夏雨
Owner: XIDIAN UNIV