
Motion identification method based on human bone joint point distance

An action recognition technology based on joint-point distances, applied in character and pattern recognition, instruments, computer parts, etc. It addresses the problems that existing recognition is immature and does not meet the needs of interactive action recognition; it reduces the interference of random and subjective feature selection, yields objective and credible recognition results, and produces distinctive action features.

Inactive Publication Date: 2017-11-24
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

However, current skeleton-based action recognition technology is not yet mature.
[0003] The two patents CN106203503A and CN106228109A, previously filed by the inventors of this application, are the published patents most relevant to it. Their drawback is that they can only recognize actions performed by a single subject and cannot meet the needs of interactive action recognition in complex scenes.

Method used



Detailed Description of the Embodiments

[0033] The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments. The following embodiments are descriptive only, not restrictive, and do not limit the protection scope of the present invention.

[0034] An action recognition method based on the distances between human skeleton joint points; the steps are as follows:

[0035] 1) Mapping of bone sequence to picture

[0036] Suppose there is a set of skeleton sequences of human actions to be recognized. In general, the number of frames t_x of each action's skeleton sequence is not fixed.

[0037] In the first step, bilinear interpolation is used to fix the frame count of every action's skeleton sequence to t.
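The interpolation step above can be sketched as follows. This is a minimal illustration, not the patent's exact implementation: interpolation along the time axis is applied independently to each joint coordinate, and the function name, the target length t=64, and the array layout (frames × joints × xyz) are all assumptions for the example.

```python
import numpy as np

def resample_skeleton_sequence(seq, t=64):
    """Linearly interpolate a skeleton sequence along the time axis
    so that every action has exactly t frames.

    seq: array of shape (t_x, m, 3) -- t_x frames, m joints, xyz coords.
    Returns an array of shape (t, m, 3).
    """
    t_x = seq.shape[0]
    src = np.linspace(0.0, t_x - 1, num=t)   # fractional source frame indices
    lo = np.floor(src).astype(int)           # frame just below each target
    hi = np.minimum(lo + 1, t_x - 1)         # frame just above each target
    w = (src - lo)[:, None, None]            # per-target interpolation weight
    return (1.0 - w) * seq[lo] + w * seq[hi]

# Example: a 37-frame sequence with 20 joints becomes a 64-frame sequence.
seq = np.random.rand(37, 20, 3)
fixed = resample_skeleton_sequence(seq, t=64)
print(fixed.shape)  # (64, 20, 3)
```

Because the first and last source indices fall exactly on existing frames, the endpoints of the original sequence are preserved.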

[0038] In the second step, suppose m human skeleton joint points are extracted from each frame's skeleton map V_xyz, and use

[0039] to represent the three-dimensional position information of the j-th skeleton joint point ...
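The per-frame distance vectors and the sequence distance matrix described in the abstract can be sketched as below. This is a hedged illustration under assumed shapes (t frames, m joints, xyz coordinates); the function names and the use of the upper-triangular joint pairs are choices made for the example, not details stated in the source.

```python
import numpy as np

def frame_distance_vector(frame):
    """Euclidean distances between all unique joint pairs in one frame.
    frame: (m, 3) joint coordinates; returns a vector of length m*(m-1)/2."""
    m = frame.shape[0]
    i, j = np.triu_indices(m, k=1)           # indices of unique joint pairs
    return np.linalg.norm(frame[i] - frame[j], axis=1)

def sequence_distance_matrix(seq):
    """Stack the per-frame distance vectors in time order,
    giving a matrix of shape (t, m*(m-1)/2)."""
    return np.stack([frame_distance_vector(f) for f in seq])

seq = np.random.rand(64, 20, 3)              # t=64 frames, m=20 joints
D = sequence_distance_matrix(seq)
print(D.shape)  # (64, 190)
```

With m = 20 joints there are 20·19/2 = 190 joint pairs, so each row of the matrix is a 190-dimensional frame distance vector.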



Abstract

The invention relates to a motion identification method based on human bone joint point distances. The method comprises the following steps: (1) projecting each frame of a bone sequence onto the three planes of a Cartesian orthogonal system and generating a human bone distribution map; (2) extracting the bone joint points from the human bone distribution map; (3) calculating the Euclidean distances between the bone joint points of each frame of the bone sequence and forming the distances of all joint pairs into a frame distance vector; (4) arranging all the frame distance vectors of the bone sequence in time order to form a sequence distance matrix; (5) applying a pseudo-color coding method to the two-dimensional distance matrix to obtain a color texture map; and (6) using a deep learning method to classify the resulting pictures, thereby completing the human motion detection and identification task. The motion identification method based on bone joint point distances is not affected by the environment or by interactive motion and is suitable for interactive motion identification in complex scenes, so the method has wide application value.
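Step (5), pseudo-color coding the distance matrix into a color texture map, can be sketched as follows. The patent does not specify which color map is used, so the choice of matplotlib's `jet` colormap, the min-max normalization, and the function name are all assumptions for this illustration.

```python
import numpy as np
from matplotlib import cm

def distance_matrix_to_color_texture(D):
    """Pseudo-color code a distance matrix into an RGB texture image.
    D: (t, n) matrix of joint-pair distances over t frames.
    Returns a uint8 image of shape (t, n, 3)."""
    lo, hi = D.min(), D.max()
    norm = (D - lo) / (hi - lo + 1e-8)       # scale distances to [0, 1]
    rgba = cm.jet(norm)                      # map each scalar to an RGBA color
    return (rgba[..., :3] * 255).astype(np.uint8)

D = np.random.rand(64, 190)                  # a sequence distance matrix
img = distance_matrix_to_color_texture(D)
print(img.shape, img.dtype)  # (64, 190, 3) uint8
```

The resulting image can then be fed to an ordinary image classification network, which is how step (6) reduces action recognition to picture classification.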

Description

Technical Field

[0001] The invention belongs to the field of multimedia information processing; it relates to computer intelligence, pattern recognition and machine learning, and is an action recognition method based on the distances between human skeleton joint points.

Background Technique

[0002] With the continuous development of computer intelligence technology, human action recognition has broad application prospects in future life, for example in intelligent monitoring, human-computer interaction, somatosensory games, video retrieval, etc. Therefore, the study of action recognition methods has far-reaching research value. In recent years, with the maturity of computer vision technology, easy-to-use, low-cost depth sensors such as Kinect cameras have been widely used. Human action detection and recognition based on depth skeletal video sequences has attracted increasing attention due to the advantages of depth cameras, which are insensitive to illumination changes and capable of relia...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62
CPC: G06V40/23; G06V20/46; G06F18/254
Inventors: Hou Yonghong (侯永宏), Yang Mengdi (杨梦頔), Li Chuankun (李传坤), Wang Liwei (王利伟)
Owner: TIANJIN UNIV