
Human body posture feature extracting method based on 3D joint point coordinates

A human body posture and joint point technology, applied in the field of human-computer interaction, addressing the problem that some joint-point-based global features cannot distinguish certain actions

Inactive Publication Date: 2017-02-22
CHINA AGRI UNIV


Problems solved by technology

However, some global features based on joint points cannot distinguish actions well in some cases. Therefore, a more effective human body posture feature extraction method based on joint point coordinates must be sought.



Detailed Description of the Embodiments

[0055] The present invention is described in further detail below with reference to the accompanying drawings and embodiments. The following examples illustrate the present invention but should not be construed as limiting its scope.

[0056] A human body posture feature extraction method based on 3D joint point coordinates, as shown in Figure 1, comprises the following steps:

[0057] S1. Obtain the joint point coordinates of the specific posture;

[0058] S2. Establish a user-space coordinate system, and convert the joint point coordinates from the device coordinate system to the user-space coordinate system;

[0059] S3. Extract global features based on the positions of the body parts;

[0060] S4. Extract local features based on the local joint structure;

[0061] S5. Fuse the global features and local features to form the final posture description feature.
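Steps S1–S5 can be sketched in code. This is a minimal illustration only: the joint indices, the choice of reference joints for the user-space frame, and the specific BPL/LJS feature definitions below are assumptions, since the patent text shown here does not disclose them at that level of detail.

```python
import numpy as np

def to_user_space(joints_dev, origin_idx=0, x_pair=(4, 8), up_idx=3):
    """S2: convert device-space joints (N x 3) to a user-centred frame.

    Assumed convention (hypothetical): origin at a torso joint, x-axis
    from left to right shoulder, y-axis toward the head, z completing a
    right-handed basis.
    """
    origin = joints_dev[origin_idx]
    x = joints_dev[x_pair[1]] - joints_dev[x_pair[0]]
    x /= np.linalg.norm(x)
    up = joints_dev[up_idx] - origin
    z = np.cross(x, up)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    R = np.stack([x, y, z])            # rows are the new basis vectors
    return (joints_dev - origin) @ R.T

def global_features(joints):
    """S3: global feature based on body part locations (BPL) --
    here simply the scale-normalized joint positions, flattened."""
    scale = np.linalg.norm(joints, axis=1).max() or 1.0
    return (joints / scale).ravel()

def local_features(joints, triples=((0, 1, 2), (1, 2, 3))):
    """S4: local feature based on local joint structure (LJS) --
    here the angle at the middle joint of each assumed joint triple."""
    feats = []
    for a, b, c in triples:
        u, v = joints[a] - joints[b], joints[c] - joints[b]
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        feats.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.array(feats)

def pose_feature(joints_dev):
    """S1-S5: device-space joints in, fused descriptor out."""
    joints = to_user_space(joints_dev)                 # S2
    return np.concatenate([global_features(joints),   # S3
                           local_features(joints)])   # S4 + S5
```

Concatenation is used here as the fusion step (S5) for simplicity; the patent may specify a different fusion scheme.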

[0062] Preferably, obtaining the joint point coordinates of the specifi...


Abstract

The invention discloses a human body posture feature extraction method based on 3D joint point coordinates. Two features are extracted: a global feature based on body part location (BPL) and a local feature based on local joint structure (LJS); the final feature is formed by fusing the two. Experiments show that the extracted features effectively describe human body postures, and postures are recognized well when the features are applied to a support vector machine classification model.
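The abstract's pipeline, fusing BPL and LJS features and classifying with an SVM, might look like the following toy sketch. The concatenation fusion, the minimal hinge-loss linear SVM trainer, and all data below are assumptions for illustration, not the patent's disclosed method.

```python
import numpy as np

def fuse(bpl, ljs):
    """Concatenation fusion of global and local features (assumed)."""
    return np.concatenate([bpl, ljs])

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=500):
    """Minimal linear SVM via hinge-loss sub-gradient descent; y in {-1, +1}."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # fold bias into weights
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mask = y * (Xb @ w) < 1                  # margin violators
        grad = lam * w
        if mask.any():
            grad = grad - (Xb[mask] * y[mask, None]).mean(axis=0)
        w -= lr * grad
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.sign(Xb @ w)

# Synthetic two-class pose data; the real input would be the fused
# descriptors produced by steps S3-S5.
rng = np.random.default_rng(0)
d_bpl, d_ljs, n = 36, 8, 40
pos = np.stack([fuse(rng.normal(1.0, 0.5, d_bpl),
                     rng.normal(1.0, 0.5, d_ljs)) for _ in range(n)])
neg = np.stack([fuse(rng.normal(-1.0, 0.5, d_bpl),
                     rng.normal(-1.0, 0.5, d_ljs)) for _ in range(n)])
X = np.vstack([pos, neg])
y = np.concatenate([np.ones(n), -np.ones(n)])

w = train_linear_svm(X, y)
accuracy = (predict(w, X) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In practice a library SVM (e.g. an RBF-kernel classifier) would replace the toy trainer; the point here is only the shape of the fuse-then-classify pipeline.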

Description

Technical Field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to a human body posture feature extraction method based on 3D joint point coordinates.

Background

[0002] In the field of human-computer interaction, accurate recognition of human motion is a prerequisite for somatosensory interaction, so an effective motion representation must be constructed; specifically, an effective human body posture feature extraction method must be proposed. Traditional human action recognition is based on RGB images: low-level image features are extracted to construct high-level semantic feature descriptions, yielding one feature description per action. This approach requires a large amount of training data to handle different viewpoints. In addition, the computational cost of the features is usually high, making them difficult to apply in real-time interactive systems.

[0003] In ...

Claims


Application Information

IPC(8): G06F3/01; G06K9/00
CPC: G06F3/011; G06V40/20
Inventors: 陈洪 (Chen Hong), 杜利强 (Du Liqiang), 王庆 (Wang Qing)
Owner CHINA AGRI UNIV