Indoor pose estimation method based on heat maps for a single image

A technology for indoor object pose estimation, applied in computing, design optimization/simulation, biological neural network models, etc. It addresses problems of existing methods such as the inability to handle smooth, textureless objects, sensitivity to illumination and occlusion, and failure to meet accuracy requirements, and achieves a wide range of application, improved localization accuracy, and robustness to occlusion.

Active Publication Date: 2018-12-21
HANGZHOU NORMAL UNIVERSITY

AI Technical Summary

Problems solved by technology

[0003] To sum up, the problems in the existing technology are: traditional methods based on feature-point matching cannot handle smooth, textureless objects; methods based on template matching are sensitive to illumination and occlusion; methods based on dense feature matching must extract features over the whole sample space, which is time-consuming, and the resulting pose generally requires subsequent refinement; and end-to-end methods based on convolutional networks handle multi-object, cluttered scenes and inter-object occlusion poorly and cannot meet higher accuracy requirements.

Method used




Embodiment Construction

[0037] To make the technical solution of the present invention clearer, the invention is described in more detail below in conjunction with an embodiment; the protection scope of the invention is not limited to the following example.

[0038] Given a single RGB image and training data synthesized from ShapeNet, used as the CAD model library, the pose of the target object in a single indoor scene image is estimated. The overall flow is shown in Figure 2 and summarized in the following steps (a hedged code sketch follows the steps):

[0039] S10: extract features of the target object with the conv5 layers of a convolutional neural network;

[0040] S11: predict candidate boxes for the target objects in the indoor scene with an RPN;

[0041] S12: predict the heat maps corresponding to the 8 bounding-box vertices of the target object with an FCN, using the previously extracted object features and the candidate box;

[0042] S13: compute the 6D pose of the object with EPnP from the 2D vertex locations recovered from the heat maps and the corresponding 3D bounding-box vertices of the CAD model.
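The transition from step S12 to step S13 can be illustrated with a minimal sketch. This is an illustration under assumed shapes and names (vertices_from_heatmaps, pose_from_vertices, a known 3x3 intrinsic matrix K), not the patented implementation: it takes the peak of each heat map as a 2D vertex and recovers the pose with OpenCV's EPnP solver.

```python
# Hedged sketch of steps S12 -> S13: recover 2D vertices from the 8 heat maps,
# then solve the 6D pose with EPnP. Shapes, names, and the use of OpenCV's
# solvePnP are assumptions for illustration, not the patent's exact code.
import numpy as np
import cv2


def vertices_from_heatmaps(heatmaps):
    """heatmaps: (8, H, W) array of per-vertex heat maps -> (8, 2) pixel coords (x, y)."""
    pts = []
    for hm in heatmaps:
        y, x = np.unravel_index(np.argmax(hm), hm.shape)  # peak of each heat map
        pts.append((float(x), float(y)))
    return np.asarray(pts, dtype=np.float64)


def pose_from_vertices(pts_2d, box_3d, K):
    """pts_2d: (8, 2) image points; box_3d: (8, 3) CAD bounding-box corners; K: 3x3 intrinsics."""
    ok, rvec, tvec = cv2.solvePnP(box_3d, pts_2d, K, None, flags=cv2.SOLVEPNP_EPNP)
    R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix
    return R, tvec              # 6D pose: rotation + translation
```

Taking the per-map argmax is the simplest peak decoder; a weighted centroid around the peak would also fit the description and may be more accurate, but the patent text does not specify the decoding rule.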



Abstract

The invention belongs to the field of object pose estimation and discloses a heat-map-based method for estimating the pose of indoor objects in a single image. The method extracts candidate boxes for multiple target objects with an RPN, predicts heat maps of the eight bounding-box vertices of each object on the 2D image with a fully convolutional network (FCN), and then computes the 6D pose of each object with a PnP method. ShapeNet is used as the CAD model library to synthesize a large amount of training data. The heat-map-based pose estimation technique adopted by the invention is robust: it can estimate the poses of different indoor objects in cluttered indoor scenes even when objects are partially occluded, has a wide range of application, is insensitive to lighting, and does not require objects to have an obvious texture appearance.
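The abstract states that ShapeNet is used as the CAD model library to synthesize training data. The following is a minimal sketch of two ingredients such a data pipeline would plausibly need: the 8 axis-aligned bounding-box corners of a CAD model, and a Gaussian-shaped heat-map target for one projected vertex. The Gaussian form and the function names (bbox_corners, gaussian_heatmap) are assumptions for illustration; the patent does not specify the exact supervision signal.

```python
# Hedged sketch: 8 bounding-box corners of a CAD model and a Gaussian heat-map
# training target for one projected vertex. The Gaussian target is an assumption;
# the patent does not state the exact form of the heat-map supervision.
from itertools import product
import numpy as np


def bbox_corners(mesh_vertices):
    """mesh_vertices: (N, 3) points of a ShapeNet mesh -> (8, 3) axis-aligned box corners."""
    lo, hi = mesh_vertices.min(axis=0), mesh_vertices.max(axis=0)
    return np.array(list(product(*zip(lo, hi))), dtype=np.float64)


def gaussian_heatmap(height, width, center_xy, sigma=2.0):
    """Heat map peaked at the projected 2D vertex location (cx, cy)."""
    ys, xs = np.mgrid[0:height, 0:width]
    cx, cy = center_xy
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))
```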

Description

Technical field

[0001] The present invention relates to the technical field of pose estimation, and in particular to a heat-map-based method for estimating the pose of an indoor object in a single image.

Background technique

[0002] Pose estimation of objects in indoor scenes plays an important role in the motion planning of social robots and in human-computer interaction for virtual reality and augmented reality. Current research on pose estimation mainly divides into feature-point matching, template matching, dense feature matching, and end-to-end methods based on convolutional networks. Each of these methods has certain problems, and their performance is not very stable in complex practical environments. For example, the traditional method based on feature-point matching relies on texture to extract feature points and then computes the rotation and translation between the corresponding points of the 3D object from the 2D feature points...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/50; G06N3/04
CPC: G06F30/20; G06N3/045
Inventors: 刘复昌, 白玉, 孟凡胜
Owner: HANGZHOU NORMAL UNIVERSITY