Method and device for human target recognition

A method and device for human body target recognition, applied in the field of target recognition. The technology addresses the problems of a cumbersome maintenance and debugging process, inability to meet real-time application requirements, and insufficient accuracy; it achieves good uniqueness and spatial invariance, simplifies the detection and recognition process, and strengthens adaptivity.

Active Publication Date: 2019-02-22
山东佳音信息科技有限公司
View PDF · 3 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

In this way, at least two sets of algorithms are needed to complete detection and recognition; the programming is complicated, and the maintenance and debugging process is very cumbersome. At the same time, because detection and recognition are divided into two independent steps, the calculation speed is slow and cannot meet the needs of real-time applications. Human body part recognition is also strongly affected by the human body detection result, resulting in insufficient accuracy.


Examples


Example 1

[0058] Please refer to figure 1, which is a flow chart of the method for human body target recognition provided by this embodiment. The method includes:

[0059] Step S110, obtaining a depth image.

[0060] In this embodiment, the depth image is obtained by a depth sensor and records a depth value for each pixel.

[0061] Please refer to figure 2. Assume that the field of view of the depth sensor in this embodiment is (α, β) and the resolution of the obtained depth image is (m, n). A coordinate system is established on the depth image in units of pixels, and the depth value of pixel p = (x, y) is denoted D(x, y).
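The patent does not spell out how (α, β), (m, n), and D(x, y) combine, but a standard pinhole-style back-projection illustrates the role of these quantities. The function below is an illustrative sketch, not the patent's formula; the name `pixel_to_point` and the focal-length derivation from the field of view are assumptions.

```python
import math

def pixel_to_point(x, y, depth, alpha, beta, m, n):
    """Back-project pixel (x, y) with depth value D(x, y) to a 3D point.

    Sketch of a pinhole-style model (an assumption, not the patent's
    method): the horizontal/vertical field of view (alpha, beta) and
    resolution (m, n) determine the focal lengths in pixel units.
    """
    fx = (m / 2) / math.tan(alpha / 2)  # horizontal focal length, pixels
    fy = (n / 2) / math.tan(beta / 2)   # vertical focal length, pixels
    X = (x - m / 2) * depth / fx        # right of optical axis is positive
    Y = (y - n / 2) * depth / fy        # below optical axis is positive
    return (X, Y, depth)
```

A pixel at the image center maps to a point on the optical axis, e.g. `pixel_to_point(160, 120, 1.0, math.radians(60), math.radians(45), 320, 240)` returns `(0.0, 0.0, 1.0)`.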

[0062] Step S120, extracting image pixel features in the depth image.

[0063] The extracted image pixel features may include: depth gradient direction histogram features, local simplified ternary pattern features, depth value statistical distribution features, and depth difference features be...
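The text truncates before defining the depth difference feature. A common construction in depth-image body-part classification scales a pair of pixel offsets by the inverse depth so the feature is roughly invariant to the body's distance from the sensor; the sketch below follows that convention as an assumption (the patent's exact definition may differ), and the function name and `float('inf')` background value are illustrative.

```python
def depth_difference_feature(D, x, y, u, v):
    """Depth-normalized difference feature at pixel (x, y).

    D is the depth image as a list of rows; u and v are 2D offsets,
    each scaled by 1/D(x, y) before probing. Probes that fall outside
    the image return a large "background" depth (an assumed convention).
    """
    d = D[y][x]

    def probe(offset):
        ox = x + int(round(offset[0] / d))
        oy = y + int(round(offset[1] / d))
        if 0 <= oy < len(D) and 0 <= ox < len(D[0]):
            return D[oy][ox]
        return float('inf')  # out-of-image probes read as background

    return probe(u) - probe(v)
```

On a flat depth surface the feature is zero everywhere; it responds only where the two probes straddle a depth discontinuity, such as a body silhouette edge.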

Example 2

[0103] Please refer to Figure 7. The human target recognition device 10 provided in this embodiment includes:

[0104] A first acquisition module 110, configured to acquire a depth image;

[0105] A first feature extraction module 120, configured to extract image pixel features in the depth image;

[0106] A human body deep learning module 130, configured to identify and classify the input image pixel features;

[0107] A judging module 140, configured to judge whether the classification of the image pixel features matches the existing human body part labels in the human body deep learning model;

[0108] The output module 150 is configured to output the label corresponding to the pixel feature when the classification of the image pixel feature matches the existing label in the human body deep learning model.

[0109] In this embodiment, the human body deep learning model takes the image pixel features as the input of the bottom input layer and performs regression clas...
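Modules 110 through 150 form a simple pipeline: acquire, extract features, classify, match against known body-part labels, and output the matching labels. The class below sketches that flow; the class name, the callable `model` stand-in, and the toy classifier are all illustrative assumptions, not the patent's implementation.

```python
class HumanTargetRecognizer:
    """Illustrative sketch of the device's module pipeline (names assumed).

    Mirrors modules 110-150: extract pixel features from a depth image,
    classify them with a learned model, and keep only predictions that
    match the known human body part labels.
    """

    def __init__(self, model, known_labels):
        self.model = model                    # stands in for module 130
        self.known_labels = set(known_labels)

    def recognize(self, depth_image, extract_features):
        features = extract_features(depth_image)   # module 120
        predictions = self.model(features)         # module 130
        # modules 140/150: judge each prediction and output matches
        return [p for p in predictions if p in self.known_labels]


def dummy_model(features):
    # Toy two-class stand-in for the deep learning model (assumption):
    # labels each feature pair by which component is larger.
    return ["head" if f[0] > f[1] else "hand" for f in features]


recognizer = HumanTargetRecognizer(dummy_model, known_labels=["head", "hand"])
labels = recognizer.recognize(depth_image=None,
                              extract_features=lambda img: [(0.9, 0.1), (0.2, 0.8)])
```

Here `labels` comes back as `["head", "hand"]`; a model output not present in `known_labels` would simply be filtered out by the judging step.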



Abstract

The present invention provides a method and device for human target recognition. The method comprises: obtaining a depth image; extracting image pixel features from the depth image; inputting the image pixel features into a human body deep learning model for identification and classification; judging whether the classification of the image pixel features matches the existing human body part labels in the human body deep learning model; and, if it matches, outputting the label corresponding to the pixel feature. The invention adopts a deep learning model to identify image pixel features and completes the detection and identification of human body targets at the same time, which simplifies the detection and identification process and improves detection and identification efficiency.

Description

Technical field

[0001] The present invention relates to the technical field of target recognition, and in particular to a method and device for human target recognition.

Background technique

[0002] With the gradual maturation of depth image sensor technology, cheap depth image sensor devices have been widely used in various fields. Since the depth image is not affected by factors such as lighting, image color difference, and motion state, it is especially suitable for the field of human target recognition. Therefore, human target recognition based on depth images has become a research hotspot in this field.

[0003] Existing human target recognition based on depth images needs to detect the human body first and then identify human body parts on that basis. In this way, at least two sets of algorithms are needed to complete the detection and identification, the programming is complicated, and the maintenance and debugging process is also very cumbersome. At t...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/593; G06K9/00
CPC: G06T2207/20081; G06T2207/30196
Inventor: 谭志国, 滕书华, 李洪
Owner: 山东佳音信息科技有限公司