
Pose estimation method based on region-level feature fusion

A pose estimation and feature fusion technology, applied in computing, computer components, instruments, etc. It addresses problems such as poor prediction performance, and achieves improved pose estimation accuracy and good robustness.

Pending Publication Date: 2022-03-08
SHANGHAI NORMAL UNIVERSITY +1
Cites: 0 · Cited by: 1

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to overcome the poor predictive performance of existing pose estimation methods under severe occlusion and complex backgrounds, and to provide a pose estimation method based on region-level feature fusion. The method combines depth information and color features, processes the resulting region-level fusion features with a neural network, and applies a symmetric reduction function to the multiple region-level fusion features to generate a global feature. This global feature is then appended to each region-level fusion feature, yielding more detailed, multi-scale fused color-and-depth region features that make the algorithm highly robust to cluttered backgrounds and severe occlusion. In addition, pose estimation is split into two steps: the 3D translation is predicted first, and the 3D rotation is then predicted more accurately from the region-level fusion features. This focuses the network's attention, reduces the size of the solution space, and makes the problem easier to solve, so the computation is faster and the response more timely.
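The global-feature step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the feature dimensions are arbitrary, and element-wise max is assumed as the symmetric (permutation-invariant) reduction, since the text does not specify which reduction is used.

```python
import numpy as np

def fuse_with_global(region_feats: np.ndarray) -> np.ndarray:
    """region_feats: (N, D) matrix holding N region-level fusion features.

    Reduces all regions to one global feature with a symmetric function
    (max over regions), then appends that global feature to every
    region-level feature, producing an (N, 2D) enriched feature matrix.
    """
    global_feat = region_feats.max(axis=0)                  # symmetric reduction -> (D,)
    tiled = np.broadcast_to(global_feat, region_feats.shape)
    return np.concatenate([region_feats, tiled], axis=1)    # (N, 2D)

rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 32))        # 5 hypothetical region-level features
enriched = fuse_with_global(feats)
print(enriched.shape)                   # (5, 64)
```

Because the reduction is symmetric, the global feature is invariant to the ordering of the regions, which is what makes the scheme robust to which regions happen to be visible under occlusion.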


Detailed Description of the Embodiments

[0037] The specific embodiments of the present invention are further described below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention.

[0038] In describing the present invention, it should be understood that terms indicating orientation or positional relationships, such as "upper", "lower", "front", "rear", "left", "right", "top", "bottom", "inner", and "outer", are based on the orientations or positional relationships shown in the drawings. They are used only for convenience and simplicity of description, and do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation. Therefore, they are not to be construed as limiting the invention.

[0039] The present invention proposes a pose estimation method based on region-level feature fusion...



Abstract

The invention belongs to the technical field of artificial intelligence, and discloses a pose estimation method based on region-level feature fusion, which comprises the following steps:
  • S1: acquiring images of an object to be detected through a three-dimensional camera, including a color image and a depth image;
  • S2: inputting the color image into a first neural network and extracting color features of the object to be detected;
  • S3: converting the corresponding region of the object in the depth image into a point cloud, inputting the point cloud into a second neural network, extracting geometric features, and generating a three-dimensional translation prediction;
  • S4: fusing the color features and the geometric features pixel by pixel to generate a plurality of region-level fusion features, and inputting them into a multi-layer perceptron to generate a plurality of three-dimensional rotation predictions with corresponding confidence scores;
  • S5: combining the three-dimensional translation prediction with the three-dimensional rotation prediction of maximum confidence to generate the 6D pose estimate.
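The pipeline in the abstract can be outlined in a toy sketch. Everything here is an assumption for illustration: the two feature extractors are replaced by pre-computed random features, the multi-layer perceptron by a single random linear head, and rotations are represented as unit quaternions (a common choice the abstract does not specify).

```python
import numpy as np

rng = np.random.default_rng(0)

N, C_RGB, C_GEO = 100, 32, 32               # assumed pixel count and feature sizes

color_feats = rng.normal(size=(N, C_RGB))   # stand-in for the first network (S2)
geo_feats   = rng.normal(size=(N, C_GEO))   # stand-in for the second network (S3)

# S4: pixel-by-pixel fusion of color and geometric features
fused = np.concatenate([color_feats, geo_feats], axis=1)    # (N, 64)

# Stand-in for the multi-layer perceptron: one linear head emitting
# 4 quaternion components plus 1 confidence logit per fused feature.
W = rng.normal(size=(fused.shape[1], 5))
out = fused @ W
quats = out[:, :4] / np.linalg.norm(out[:, :4], axis=1, keepdims=True)
conf = 1.0 / (1.0 + np.exp(-out[:, 4]))     # sigmoid confidence scores

# S5: keep the rotation prediction with the maximum confidence
best = int(np.argmax(conf))
best_rotation = quats[best]                 # unit quaternion for the 3D rotation
```

The final 6D pose would pair `best_rotation` with the translation predicted in S3 from the point cloud.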

Description

Technical Field

[0001] The invention belongs to the technical field of artificial intelligence, and relates to a pose estimation method based on region-level feature fusion.

Background Art

[0002] 6D pose estimation estimates the rotation and translation of objects in 3D space. Specifically, a 6D pose is represented by a rigid transformation [R|t], where R denotes the 3D rotation and t the 3D translation. 6D pose estimation is an important component of many real-world applications, such as robotic grasping and manipulation, autonomous navigation, and augmented reality.

[0003] Traditionally, 6D object pose estimation is solved by matching feature points between 3D models and images. However, these methods require richly textured objects in order to detect the feature points to be matched; they therefore cannot handle textureless objects, because the surface of an untextured object does not provide enough information to extract 2D keypoints, ...
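The rigid transformation [R|t] mentioned in paragraph [0002] can be made concrete with a small worked example; the specific rotation and translation below are arbitrary values chosen for illustration.

```python
import numpy as np

# A 6D pose is the rigid transform [R|t]: R is a 3x3 rotation matrix,
# t a 3-vector. Example: a 90-degree rotation about the z-axis plus a
# translation of (1, 0, 2).
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, 0.0, 2.0])

p_model = np.array([1.0, 0.0, 0.0])   # a point in the object's model frame
p_cam = R @ p_model + t               # the same point in camera coordinates
print(np.round(p_cam, 6))             # [1. 1. 2.]
```

Estimating the six degrees of freedom of this transform (three for R, three for t) from a single RGB-D observation is exactly the problem the method addresses.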


Application Information

IPC(8): G06V10/774; G06V10/82; G06V10/80; G06K9/62; G06N3/04; G06V10/56
CPC: G06N3/045; G06F18/253
Inventor: 安康, 王万诚, 曾莉, 宋亚庆, 上官倩芡, 管西强, 李一染
Owner SHANGHAI NORMAL UNIVERSITY