
Kinect-based object supporting relationship inference method

A method for inferring the supporting relationships between objects, applied in the field of scene analysis. It addresses problems such as insufficiently comprehensive extraction of scene constraint relationships, low algorithm accuracy, and overall scene structure information that can no longer meet the needs of scene structure extraction.

Active Publication Date: 2015-04-08
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] Methods that focus on extracting the overall structural information of a scene can only outline simple, coarse three-dimensional information. Now that depth information can be collected, overall structural information alone can no longer meet the needs of scene structure extraction.
[0005] Methods that extract object bounding boxes in a scene can only extract the bounding box of a single object, ignoring the connections between different objects in the scene, and therefore cannot reflect the relationships between objects well.
[0006] Research on support relationship extraction algorithms is scarce. In the existing technology, the extraction of the core constraint relationships in the scene is not comprehensive enough, resulting in low algorithm accuracy.

Method used



Examples


Embodiment

[0120] To verify the effect of the algorithm of the present invention, experiments were conducted on RGB-D images No. 1001-1200 of the NYU-Depth2 database. The depth information and color information of each image are extracted, and the existing annotation results are used to train the classifier. Finally, the correctly identified supporting region of each region is used as the evaluation criterion, and the method is compared with existing support relationship calculation algorithms.

[0121] When judging the correctness of a support relationship, it is still debatable whether the structure-level classification of the supporting region must also be correct. Therefore, both the accuracy with the structure level taken into account and the accuracy without the structure level are reported. The specific results are shown in Tables 1 and 2.

[0122] It can be seen from the above table that the method proposed by the present invention can accurately and effectivel...
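The two accuracy figures described in paragraph [0121] (with and without the structure level) can be sketched as follows. This is an illustrative reconstruction, not the patent's code: the region ids, level names, and predictions below are invented for the example.

```python
# Hypothetical sketch of the evaluation criterion: a prediction is correct
# if the supporting region matches the annotation and, optionally, the
# structure level matches as well. All data here is illustrative.

def accuracy(predictions, ground_truth, check_level=True):
    """Fraction of regions whose predicted supporting region (and,
    optionally, structure level) matches the annotation."""
    correct = 0
    for region, (support, level) in ground_truth.items():
        pred_support, pred_level = predictions.get(region, (None, None))
        if pred_support == support and (not check_level or pred_level == level):
            correct += 1
    return correct / len(ground_truth)

# region id -> (supporting region id, structure level)
gt   = {1: (0, "floor"), 2: (1, "furniture"), 3: (2, "prop")}
pred = {1: (0, "floor"), 2: (1, "prop"),      3: (2, "prop")}

print(accuracy(pred, gt, check_level=True))   # with the structure level
print(accuracy(pred, gt, check_level=False))  # without the structure level
```

Reporting both numbers separates errors in choosing the supporter from errors in classifying its structure level, which is exactly the distinction paragraph [0121] calls debatable.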



Abstract

The invention discloses a Kinect-based object supporting relationship inference method. The method comprises the following steps: segmenting an image by using a hierarchical segmentation algorithm based on an RGB-D image; building a scene structure level model, a structure-level-based scene supporting relationship model, and supporting relationship constraint rules; evaluating the scene supporting relationships by linear programming; and inferring the stability of objects in the scene from the supporting relationships. With this method, the depth data acquired by the Kinect is used effectively, the supporting relationships in the scene are extracted in a novel way, and robot vision and target identification can be effectively assisted.
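The "evaluating the scene supporting relationships by linear programming" step can be illustrated with a minimal sketch. This is not the patent's actual formulation: it simply chooses one supporting region per segment by maximizing classifier confidence scores under a one-supporter-per-region constraint, and the score matrix is invented for the example.

```python
# Minimal LP sketch (assumed formulation, not from the patent): pick the
# supporting region for each segment that maximizes total classifier
# confidence, subject to each segment having exactly one supporter.
import numpy as np
from scipy.optimize import linprog

# score[i, j] = classifier confidence that region j supports region i
# (illustrative values)
score = np.array([[0.9, 0.1, 0.0],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.3, 0.6]])
n = score.shape[0]

# Variables x[i, j] in [0, 1], flattened row-major; linprog minimizes,
# so negate the scores to maximize them.
c = -score.ravel()

# Equality constraint: for each region i, sum_j x[i, j] = 1
A_eq = np.zeros((n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0
b_eq = np.ones(n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * (n * n))
support = res.x.reshape(n, n).argmax(axis=1)
print(support)  # index of the chosen supporting region for each region
```

Because the constraint matrix decomposes per region, the LP relaxation here puts all mass on the highest-scoring supporter; the patent's actual model additionally encodes structure-level constraints between the variables.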

Description

technical field

[0001] The invention relates to the field of scene analysis, and in particular to a Kinect-based object supporting relationship inference method.

Background technique

[0002] With the development of camera acquisition equipment, image information including depth data can now be acquired. How to make good use of depth information so that it plays a greater role in robot vision, object recognition and other fields has become a research hotspot. In recent years, a large number of studies on scene analysis based on RGB-D data have been done, but analysis of scene structure is still lacking. Obtaining support relationships through the scene structure can effectively assist robot vision, determine the stability of objects in the scene, and provide novel recognition features for target recognition; it will surely become an important part of scene understanding in the future.

[0003] The existing scene structure extraction methods mainly extract the...
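As a sketch of the kind of depth use the background section refers to, the following back-projects a Kinect depth map into a 3D point cloud, the usual first step before segmenting and reasoning about scene structure. The intrinsic parameters are typical Kinect v1 values assumed for illustration, not taken from the patent.

```python
# Hedged sketch: converting a Kinect depth image (metres) to an (H*W, 3)
# array of XYZ points via the pinhole camera model. fx, fy, cx, cy are
# assumed typical Kinect v1 intrinsics.
import numpy as np

def depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project a depth image into camera-frame 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

depth = np.full((480, 640), 2.0)  # toy example: a flat wall 2 m away
pts = depth_to_points(depth)
print(pts.shape)
```

From such a point cloud, planar regions and their relative heights can be estimated, which is the raw material for the structure-level and support-relationship models the description builds.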

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/00
CPC: G06T7/12; G06T7/50; G06T7/97; G06T2207/10024; G06T2207/10028
Inventor: 洪日昌, 何川, 汪萌, 刘学亮, 郝世杰, 杨勋
Owner: HEFEI UNIV OF TECH