A Kinect-based object support relationship inference method

A method for inferring support relationships between objects, applied in the field of scene analysis. It addresses problems such as existing methods failing to reflect the relationships between objects, low algorithm accuracy, and insufficiently comprehensive extraction.

Active Publication Date: 2018-07-27
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] Existing methods focus on extracting the overall structural information of a scene and can outline only its coarse three-dimensional layout. Now that depth information can be readily collected, such coarse structural information no longer meets the needs of scene structure extraction.
[0005] Methods that extract object bounding boxes recover only the bounding box of each individual object, ignoring the connections between different objects in the scene, and therefore cannot reflect the relationships between objects well.
[0006] Research on support relationship extraction algorithms remains scarce. In the prior art, extraction of the core constraint relationships in a scene is not comprehensive enough, resulting in low algorithm accuracy.



Examples


Embodiment

[0122] To verify the effect of the algorithm of the present invention, experiments were conducted on RGB-D images No. 1001-1200 of the NYU-Depth V2 database. The depth and color information of each image was extracted, and the classifier was trained using the existing annotations. Finally, whether the correct supporting region was identified for each region was used as the evaluation criterion, and the results were compared with existing support relationship computation algorithms.

[0123] When judging the correctness of a support relationship, whether the structure-level classification of the supporting region is correct remains debatable. Therefore, accuracy is reported both with and without the structure level taken into account. The specific results are shown in Tables 1 and 2.
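The evaluation criterion described above can be illustrated with a minimal sketch. The region ids and support labels below are hypothetical placeholders, not data from the patent's experiments; the sketch only shows the "correct supporting region per region" accuracy computation:

```python
# Hypothetical support-relationship predictions vs. ground-truth annotations.
# Each dict maps a region id to the id of its supporting region.
predicted = {1: 0, 2: 1, 3: 1, 4: 2}
ground_truth = {1: 0, 2: 1, 3: 2, 4: 2}

# A region counts as correct when its predicted supporter matches the annotation.
correct = sum(1 for region, supporter in predicted.items()
              if ground_truth.get(region) == supporter)
accuracy = correct / len(ground_truth)
print(f"support accuracy: {accuracy:.2f}")  # 3 of 4 regions correct -> 0.75
```

A structure-level-aware variant would additionally require the predicted structure class of the supporting region to match before counting a region as correct.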

[0124] It can be seen from the above table that the method proposed by the present invention can more accurately and effectively extract the ...



Abstract

The invention discloses a Kinect-based object support relationship inference method. The method comprises the following steps: segmenting an RGB-D image with a hierarchical segmentation algorithm; building a scene structure level model, a structure-level-based scene support relationship model, and support relationship constraint rules; evaluating the scene support relationships by linear programming; and inferring the stability of objects in the scene from the support relationships. The method makes effective use of the depth data acquired by Kinect, extracts the support relationships in a scene in a novel way, and can effectively assist robot vision and object recognition.
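The linear programming step in the abstract can be sketched in miniature. Everything below is an assumption for illustration, not the patent's actual formulation: the support scores are invented, and the constraint set is reduced to "every non-floor region has exactly one supporter", solved with SciPy's `linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical support scores: score[i, j] = confidence that region j supports
# region i. Region 0 is treated as the floor and needs no supporter.
score = np.array([
    [0.0, 0.0, 0.0],   # floor
    [0.9, 0.0, 0.1],   # region 1: most likely resting on the floor
    [0.2, 0.7, 0.0],   # region 2: most likely resting on region 1
])
n = score.shape[0]

# Decision variable x[i*n + j] in [0, 1]: "region j supports region i".
# Maximizing total score is the same as minimizing its negation.
c = -score.flatten()

# Constraint: each non-floor region has exactly one supporter.
A_eq = np.zeros((n - 1, n * n))
for i in range(1, n):
    A_eq[i - 1, i * n:(i + 1) * n] = 1.0
b_eq = np.ones(n - 1)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * (n * n))
x = res.x.reshape(n, n)
supporters = {i: int(np.argmax(x[i])) for i in range(1, n)}
print(supporters)  # each non-floor region mapped to its inferred supporter
```

Because the constraints decompose per region, the LP relaxation here places all mass on the highest-scoring supporter; the patent's formulation additionally encodes structure-level and constraint-rule terms that this sketch omits.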

Description

Technical Field

[0001] The invention relates to the field of scene analysis, and in particular to a Kinect-based object support relationship inference method.

Background

[0002] With the development of camera acquisition equipment, image information including depth data can now be acquired. How to make good use of depth information so that it plays a greater role in robot vision and object recognition has become a research hotspot. In recent years, much research has been devoted to scene analysis based on RGB-D data, but analysis of scene structure is still lacking. The support relationships obtained from the scene structure can assist robot vision, determine the stability of objects in a scene, and provide novel recognition features for object recognition, and will surely become an important part of scene understanding in the future.

[0003] Among the existing scene structure extraction methods, the main meth...
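The depth data mentioned in the background is typically turned into 3D geometry before any structural reasoning. A standard pinhole back-projection (not a step claimed by the patent; the intrinsics below are commonly published approximate values for the Kinect v1 depth camera, used here only as an illustration) looks like this:

```python
import numpy as np

# Approximate Kinect v1 depth-camera intrinsics (illustrative values).
fx, fy = 525.0, 525.0      # focal lengths in pixels
cx, cy = 319.5, 239.5      # principal point

def depth_to_points(depth):
    """Back-project a depth map (H, W), in meters, to (H, W, 3) XYZ points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx   # horizontal offset scaled by depth
    y = (v - cy) * z / fy   # vertical offset scaled by depth
    return np.stack([x, y, z], axis=-1)

# A synthetic depth map: a flat surface 2 m from the camera.
points = depth_to_points(np.full((480, 640), 2.0))
print(points.shape)  # (480, 640, 3)
```

The resulting point cloud is the kind of input from which region segmentation and support reasoning can proceed.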

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/30, G06T7/50
CPC: G06T7/12, G06T7/50, G06T7/97, G06T2207/10024, G06T2207/10028
Inventors: 洪日昌, 何川, 汪萌, 刘学亮, 郝世杰, 杨勋
Owner: HEFEI UNIV OF TECH