
Iterative 6D pose estimation method and device based on deep learning

A pose estimation and deep learning technology, applied in the field of iterative 6D pose estimation based on deep learning, which addresses problems such as the high time cost of the RANSAC algorithm, loss functions that do not reflect the true goal of pose estimation, and the PnP algorithm not being part of the trainable network, and achieves a simple structure, reduced running time, and improved accuracy.

Pending Publication Date: 2022-03-01
HEBEI UNIV OF TECH
Cites: 0, Cited by: 2

AI Technical Summary

Problems solved by technology

The main problems of this type of method are: first, the loss function used to train the deep network focuses on finding the 2D projections of the object's 3D key points in the image and does not reflect the true objective of pose estimation; second, the PnP algorithm that solves the object's 6D pose is not part of the network, so the entire network cannot be trained end to end; third, the RANSAC algorithm is very time-consuming, especially when there are a large number of outliers.
Therefore, pose estimation that is simultaneously accurate, fast, and robust remains a challenging problem.
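
To make these criticisms concrete, the following sketch outlines the conventional two-stage baseline being described (keypoint prediction followed by PnP with RANSAC), not the invention itself; keypoint_net and kps_3d are hypothetical placeholders, while cv2.solvePnPRansac is the standard OpenCV routine such pipelines typically call.

# Minimal sketch (not the patent's method) of the conventional two-stage baseline
# it criticises: a keypoint network predicts 2D projections of known 3D key points,
# then RANSAC-based PnP recovers the 6D pose outside the network.
import numpy as np
import cv2


def two_stage_pose(image, keypoint_net, kps_3d, camera_matrix):
    """Estimate a 6D pose with the non-end-to-end keypoint + PnP/RANSAC pipeline."""
    # Stage 1: a deep network predicts N 2D key points (trained with a 2D keypoint
    # loss, which does not directly reflect the pose error).
    kps_2d = keypoint_net(image)                     # shape (N, 2), hypothetical

    # Stage 2: RANSAC + PnP solves the pose from the 3D-2D correspondences.
    # This step is outside the network (no gradients flow through it) and its
    # run time grows quickly when many predicted key points are outliers.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        kps_3d.astype(np.float64),
        kps_2d.astype(np.float64),
        camera_matrix.astype(np.float64),
        distCoeffs=None,
    )
    R, _ = cv2.Rodrigues(rvec)                       # rotation matrix from axis-angle
    return R, tvec, inliers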




Detailed Description of the Embodiments

[0027] In order to make the technical solution of the present invention clearer, the specific implementation of the present invention will be described more fully below in conjunction with the accompanying drawings of the embodiments of the present invention. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the scope of the present invention.

[0028] The following describes the iterative 6D pose estimation method and device based on deep learning according to the embodiments of the present invention with reference to the accompanying drawings. Figure 1 is a schematic flowchart of an iterative 6D pose estimation method based on deep learning provided by an embodiment of the present invention. The iterative 6D pose estimation method based on deep learning in the embodiment of the ...



Abstract

The invention relates to an iterative 6D pose estimation method and device based on deep learning. The method encodes the output of a 3D-2D key point correspondence extraction network into the input format of a coarse pose estimation network built from an MLP, a pooling layer, and fully connected layers, so that the two networks are combined into a single end-to-end 6D object pose estimation network that directly outputs the object's 6D pose; the loss function of this network is a function that directly reflects the solved 6D pose parameters. The 6D pose parameters output by the network are then refined with an orthogonal iteration algorithm. The method overcomes the long run time and poor repeatability of computing the target pose with the PnP algorithm when many outliers are present, and improves the efficiency, robustness, and accuracy of pose estimation.
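
As a rough illustration of the abstract above, and only a sketch under assumptions rather than the patent's exact architecture, the following PyTorch code shows one way a coarse pose estimation network built from a shared MLP, a pooling layer, and fully connected layers could map encoded 3D-2D correspondences to a 6D pose, together with a loss computed directly on the pose rather than on 2D key points; the class and function names, layer sizes, and quaternion parameterization are assumptions.

# Hedged sketch, assuming a PointNet-like design: an MLP applied per
# correspondence, a max-pooling layer, and fully connected layers that regress
# rotation (as a unit quaternion) and translation. Names, layer sizes and the
# quaternion parameterization are assumptions, not taken from the patent text.
import torch
import torch.nn as nn


class CoarsePoseNet(nn.Module):
    def __init__(self, in_dim=5):            # e.g. (x, y, z, u, v) per correspondence
        super().__init__()
        self.mlp = nn.Sequential(             # shared MLP over correspondences
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 256), nn.ReLU(),
        )
        self.fc = nn.Sequential(              # fully connected head after pooling
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 7),                # 4 quaternion + 3 translation values
        )

    def forward(self, corr):                  # corr: (B, N, in_dim)
        feat = self.mlp(corr)                 # (B, N, 256)
        pooled, _ = feat.max(dim=1)           # pooling layer over the N correspondences
        out = self.fc(pooled)                 # (B, 7)
        quat = nn.functional.normalize(out[:, :4], dim=1)
        trans = out[:, 4:]
        return quat, trans


def quat_to_matrix(q):                        # (B, 4) unit quaternion -> (B, 3, 3)
    w, x, y, z = q.unbind(dim=1)
    return torch.stack([
        1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y),
        2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x),
        2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y),
    ], dim=1).reshape(-1, 3, 3)


def pose_point_loss(quat, trans, quat_gt, trans_gt, model_pts):
    """Loss that reflects the pose itself: mean distance between model points
    transformed by the predicted pose and by the ground-truth pose."""
    R, R_gt = quat_to_matrix(quat), quat_to_matrix(quat_gt)
    pred = model_pts @ R.transpose(1, 2) + trans.unsqueeze(1)
    gt = model_pts @ R_gt.transpose(1, 2) + trans_gt.unsqueeze(1)
    return (pred - gt).norm(dim=2).mean()

Such a pose-level loss is one way to make the training objective reflect the 6D pose parameters directly; the refinement by orthogonal iteration mentioned in the abstract would be applied to the network's output pose as a separate post-processing step.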

Description

Technical Field

[0001] The invention belongs to the field of object 6D pose estimation based on monocular vision, and in particular relates to an iterative 6D pose estimation method and device based on deep learning.

Background Technique

[0002] Object 6D pose estimation refers to estimating the 6D pose of an object in the camera coordinate system, including rotation and translation, that is, obtaining the rotation and translation transformation from the object's own coordinate system to the camera coordinate system. Estimating the 6D pose of an object from a monocular RGB image is a fundamental problem in computer vision, with many important applications such as robotic grasping, autonomous navigation, and augmented reality. With the development of depth cameras, many recent solutions are based on depth maps. However, depth cameras have limitations in frame rate, field of view, resolution, and depth range, and have difficulty detecting small, thin, transparent,...
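
As a minimal numerical illustration of the definition above (not taken from the patent), the rotation R and translation t of a 6D pose map points from the object's own coordinate system into the camera coordinate system, and the camera intrinsics then project them onto the image plane:

# Minimal illustration (not from the patent): a 6D pose is the rotation R and
# translation t that map object-frame points into the camera frame; the
# intrinsic matrix K then projects them to pixel coordinates.
import numpy as np

def project_with_pose(points_obj, R, t, K):
    """points_obj: (N, 3) points in the object's own coordinate system."""
    points_cam = points_obj @ R.T + t          # object frame -> camera frame
    uvw = points_cam @ K.T                     # apply pinhole intrinsics
    return uvw[:, :2] / uvw[:, 2:3]            # perspective divide -> pixels

# Example values (purely illustrative).
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                                  # identity rotation
t = np.array([0.0, 0.0, 1.0])                  # 1 m in front of the camera
print(project_with_pose(np.array([[0.05, 0.0, 0.0]]), R, t, K))  # -> [[350. 240.]]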


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V10/46; G06V10/26; G06V10/82; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045
Inventor: 陈鹏; 郑逐隧
Owner: HEBEI UNIV OF TECH