
Complex component point cloud splicing method and system based on feature fusion

A feature fusion and point cloud splicing technology, applied in the field of computer vision. It addresses problems such as the inability to accurately estimate the relative pose transformation between local point clouds, the inability to fit non-rigid transformations, and the failure to meet the application requirements of high-precision three-dimensional measurement. The achieved effects include improved accuracy, strong feature expression ability, and good robustness to noise.

Active Publication Date: 2021-07-23
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

The six-degree-of-freedom transformation model estimated by traditional rigid registration algorithms such as ICP and RANSAC cannot fit more complex non-rigid transformations, so the relative pose transformation between local point clouds cannot be estimated accurately, and the accuracy of point cloud stitching declines.
[0008] In summary, existing point cloud stitching methods are unreliable in the practical application context of 3D measurement of large, complex components, and cannot meet the specific application requirements of high-precision 3D measurement.
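The limitation described above can be made concrete: a rigid six-degree-of-freedom transform preserves all pairwise distances between points, so any deformation that changes those distances cannot be fitted by an ICP- or RANSAC-style rigid model, no matter how the six parameters are chosen. A minimal sketch with toy data (not from the patent; the bend angle and points are illustrative):

```python
import math

def dist(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Toy "scan": three points on a flat strip along the x-axis.
flat = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]

def bend(p, angle=0.3):
    """Hypothetical non-rigid deformation: bend the strip upward about x = 1."""
    x, y, z = p
    if x <= 1.0:
        return p
    dx = x - 1.0
    return (1.0 + dx * math.cos(angle), y, dx * math.sin(angle))

bent = [bend(p) for p in flat]

# Rigid (6-DoF) transforms preserve all pairwise distances.  The bend does not,
# so no single rigid transform can map `flat` onto `bent` exactly.
d_before = dist(flat[0], flat[2])  # 2.0
d_after = dist(bent[0], bent[2])   # strictly less than 2.0
print(d_before, d_after)
```

Because the end-to-end distance shrinks under the bend, the residual of any rigid fit is bounded away from zero, which is the failure mode the patent attributes to traditional registration.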

Method used


Examples


Embodiment 1

[0070] A complex component point cloud stitching method based on feature fusion, as shown in figure 1, includes the following steps:

[0071] (S1) Collect a 3D point cloud of each part of the component and perform positioning to obtain the pose of each local 3D point cloud; adjacent local 3D point clouds partially overlap;

[0072] Figure 2 shows 6 pairs of adjacent local 3D point clouds; each pair of adjacent local 3D point clouds overlaps;

[0073] In practical applications, each local 3D point cloud model can be obtained with a 3D scanner, and the positioning of each piece of point cloud can be obtained through a multi-sensor target positioning method, so as to obtain the pose of each local 3D point cloud in the world coordinate system;

[0074] During positioning, local 3D point clouds that are spatially adjacent are also adjacent in positioning order. ...
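The positioning step yields a pose for each local cloud, and the subsequent conversion into the world coordinate system amounts to applying a 4x4 homogeneous transform to every point. A minimal stdlib-Python sketch, with a hypothetical pose (the rotation and translation values are illustrative, not from the patent):

```python
import math

def apply_pose(pose, points):
    """Apply a 4x4 homogeneous pose matrix (row-major) to a list of 3-D points."""
    out = []
    for x, y, z in points:
        v = (x, y, z, 1.0)
        out.append(tuple(sum(pose[r][c] * v[c] for c in range(4)) for r in range(3)))
    return out

# Hypothetical pose from the positioning step: 90-degree rotation about z,
# plus a translation of 1 along x.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
pose = [
    [c,  -s,  0.0, 1.0],
    [s,   c,  0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]
world = apply_pose(pose, [(1.0, 0.0, 0.0)])
print(world)  # the point is rotated onto the y-axis, then shifted along x
```

Each local cloud is pushed through its own pose in this way, after which adjacent clouds roughly overlap in the shared world frame and only a residual registration remains to be solved.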

Embodiment 2

[0149] A method for splicing point clouds of complex components based on feature fusion. This embodiment is similar to embodiment 1 above; the difference is that in step (S3) of this embodiment, the feature fusion matching network takes two adjacent point clouds as input, extracts multiple features of the points in each point cloud, fuses them into feature descriptors of the corresponding points, and calculates the matching correspondence between the two adjacent point clouds based on the fused feature descriptors. The features extracted by the feature fusion matching network can be FPFH, SHOT, Super4PCS, etc.;
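One way to read the fusion described here is: compute several hand-crafted descriptors per point, normalize each, concatenate them into one fused descriptor, and match by nearest neighbour in the fused space. A toy sketch with made-up two-dimensional stand-ins for the descriptors (the patent's actual network, and real FPFH/SHOT vectors, are far richer; all values below are illustrative):

```python
import math

def normalize(v):
    """Scale a descriptor to unit length so no single feature dominates the fusion."""
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def fuse(desc_a, desc_b):
    """Fuse two per-point descriptors by normalizing each and concatenating."""
    return normalize(desc_a) + normalize(desc_b)

def match(src, dst):
    """Nearest-neighbour matching in the fused descriptor space."""
    pairs = []
    for i, f in enumerate(src):
        j = min(range(len(dst)),
                key=lambda k: sum((a - b) ** 2 for a, b in zip(f, dst[k])))
        pairs.append((i, j))
    return pairs

# Two toy clouds, two points each; each point carries two descriptor types
# (think FPFH-like and SHOT-like vectors, here just 2-D stand-ins).
src = [fuse([1.0, 0.0], [0.2, 0.8]), fuse([0.0, 1.0], [0.9, 0.1])]
dst = [fuse([0.0, 1.1], [0.8, 0.2]), fuse([1.1, 0.0], [0.1, 0.9])]
print(match(src, dst))  # [(0, 1), (1, 0)]
```

The fused descriptor is more discriminative than either component alone, which is the motivation the patent gives for combining multiple feature types before computing correspondences.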

[0150] Structurally, the feature fusion matching network in this embodiment is also similar to that of embodiment 1 above. The difference is that in this embodiment, the first feature fusion extraction module in the feature fusion matching network is used...

Embodiment 3

[0157] A complex component point cloud stitching system based on feature fusion, including: a data acquisition module, a preprocessing module, a feature fusion matching module, and a registration stitching module;

[0158] The data acquisition module is used to collect the 3D point cloud of each part of the component and perform positioning to obtain the pose of each local 3D point cloud; adjacent local 3D point clouds partially overlap;

[0159] The preprocessing module is used to convert each local 3D point cloud to the world coordinate system according to the positioning result, perform uniform downsampling, and extract the key points of each local 3D point cloud to obtain the corresponding key point cloud;
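The uniform downsampling performed by the preprocessing module can be realized with a voxel grid: bucket the points by voxel index and keep each voxel's centroid. A minimal stdlib sketch (the voxel size and data are illustrative, not taken from the patent):

```python
from collections import defaultdict

def voxel_downsample(points, voxel):
    """Uniformly downsample a point cloud: average the points in each voxel."""
    buckets = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel) for c in p)  # integer voxel index per axis
        buckets[key].append(p)
    # One representative (centroid) per occupied voxel, in a deterministic order.
    return [tuple(sum(c) / len(ps) for c in zip(*ps))
            for key, ps in sorted(buckets.items())]

cloud = [(0.01, 0.0, 0.0), (0.02, 0.0, 0.0), (1.5, 0.0, 0.0)]
result = voxel_downsample(cloud, voxel=1.0)
print(result)  # two voxels -> two representative points
```

Downsampling before keypoint extraction keeps the point density uniform across scans, so the later descriptor matching is not biased toward densely sampled regions.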

[0160] The feature fusion matching module is used to input the key point cloud into the feature fusion matching network to obtain the matching correspondence between any two adjacent key point clouds; the feature fusion matching network us...



Abstract

The invention discloses a complex component point cloud splicing method and system based on feature fusion, belonging to the field of computer vision. The method comprises the steps of: collecting the local three-dimensional point clouds of a component and performing positioning, wherein adjacent local three-dimensional point clouds partially overlap; converting the local three-dimensional point clouds to a world coordinate system according to the positioning result, and extracting key points after uniform downsampling to obtain the corresponding key point clouds; inputting the key point clouds into a feature fusion matching network to obtain the matching correspondence between any two adjacent key point clouds, wherein the feature fusion matching network takes two adjacent point clouds as input, extracts multi-scale features or multiple features of the points in the point clouds, and fuses them into feature descriptors of the corresponding points so as to calculate the matching correspondence of the adjacent point clouds; and carrying out point cloud registration using the matching correspondence of adjacent key point clouds, and splicing the point clouds into an integral three-dimensional point cloud model based on the registration result. The invention can improve the precision of point cloud splicing for complex components.

Description

technical field

[0001] The invention belongs to the field of computer vision, and more specifically relates to a method and system for splicing complex component point clouds based on feature fusion.

Background technique

[0002] Large, complex components, including aircraft wings, ship blades, wind turbine blades, high-speed rail car bodies, etc., are widely used in major technical fields such as aerospace, energy, and transportation. To achieve high-precision machining of the surfaces of large, complex components, an accurate and reliable three-dimensional measurement system must be developed to evaluate the surface accuracy of their various parts. Just as measuring an object with a ruler yields a reliable result only if the ruler itself is sufficiently accurate, the same holds for measuring the surface accuracy of complex components. In order to obtain reliable measurement results, three-dim...

Claims


Application Information

IPC(8): G06T7/33; G06T5/50; G06T3/40; G06K9/62; G06K9/46
CPC: G06T7/344; G06T5/50; G06T3/4038; G06T2200/32; G06T2207/10028; G06T2207/20221; G06V10/462; G06F18/22; G06F18/214
Inventors: 陶文兵 (Tao Wenbing), 张世舜 (Zhang Shishun)
Owner HUAZHONG UNIV OF SCI & TECH