
A method and system for point cloud splicing of complex components based on feature fusion

A feature fusion and point cloud splicing technology, applied in the field of computer vision, addressing problems of existing methods such as the inability to accurately estimate the relative pose transformation between local point clouds, to fit non-rigid transformations, or to meet the requirements of high-precision 3D measurement applications. The claimed effects are improved accuracy, good feature expression ability, and good noise robustness.

Active Publication Date: 2022-04-22
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

The six-degree-of-freedom transformation model estimated by traditional rigid registration algorithms such as ICP and RANSAC cannot fit more complex non-rigid transformations, so the relative pose transformation between local point clouds cannot be estimated accurately, and the accuracy of point cloud stitching declines.
[0008] In summary, existing point cloud stitching methods are not reliable in the practical setting of 3D measurement of large, complex components and cannot meet the specific application requirements of high-precision 3D measurement.
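For context, the rigid model these algorithms estimate can be written down directly: a single rotation and translation shared by every point. The sketch below (plain NumPy, with hypothetical values not taken from the patent) makes that assumption explicit, which is exactly what breaks down when the relationship between overlapping scans is not a pure rigid motion.

```python
import numpy as np

def rigid_transform(points, rotation, translation):
    """Apply one global six-degree-of-freedom rigid transform p' = R p + t
    to an (N, 3) array of points."""
    return points @ rotation.T + translation

# Hypothetical values: a small rotation about the z-axis plus a translation.
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.02, -0.01, 0.005])

local_cloud = np.random.rand(1000, 3)          # stand-in for a local scan
aligned = rigid_transform(local_cloud, R, t)   # every point gets the same R, t

# Because R and t are shared by all points, this model cannot absorb a
# spatially varying (non-rigid) deformation between overlapping scans,
# which is the limitation described above.
```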



Examples


Embodiment 1

[0070] A complex component point cloud stitching method based on feature fusion, as shown in Figure 1, includes the following steps:

[0071] (S1) Collect the 3D point cloud of each part of the component and perform positioning to obtain the pose of each local 3D point cloud; adjacent local 3D point clouds partially overlap;

[0072] Figure 2 shows 6 pairs of adjacent local 3D point clouds; each pair of adjacent local 3D point clouds overlaps;

[0073] In practical applications, each local 3D point cloud model can be obtained with a 3D scanner, and each piece of point cloud can be positioned with a multi-sensor target positioning method, so that the pose of each local 3D point cloud in the world coordinate system is obtained;

[0074] During positioning, local 3D point clouds that are spatially adjacent are also adjacent in positioning order. ...
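As an illustration of how the collected scans and their measured poses from (S1) can be carried into the world coordinate system and reduced to key point clouds (the preprocessing described elsewhere in this document), the following sketch uses Open3D; the library choice, file names, voxel size, and ISS keypoint detector are assumptions, not requirements of the patent.

```python
# Sketch only: Open3D, the file paths, the voxel size, and the ISS keypoint
# detector are assumptions; the patent only requires transforming to the
# world frame, uniform downsampling, and key point extraction.
import numpy as np
import open3d as o3d

def preprocess_local_scans(scan_paths, poses, voxel_size=0.005):
    """Load each local scan, move it to the world frame using its measured
    4x4 pose from the positioning step, uniformly downsample it, and extract
    a key point cloud."""
    key_clouds = []
    for path, pose in zip(scan_paths, poses):
        pcd = o3d.io.read_point_cloud(path)       # local 3D point cloud
        pcd.transform(pose)                        # 4x4 pose -> world frame
        pcd = pcd.voxel_down_sample(voxel_size)    # uniform downsampling
        # ISS is just one possible key point detector (recent Open3D builds).
        keypts = o3d.geometry.keypoint.compute_iss_keypoints(pcd)
        key_clouds.append(keypts)
    return key_clouds

# Hypothetical usage with identity poses:
# key_clouds = preprocess_local_scans(["scan_0.ply", "scan_1.ply"],
#                                     [np.eye(4), np.eye(4)])
```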

Embodiment 2

[0149] A method for splicing point clouds of complex components based on feature fusion. This embodiment is similar to Embodiment 1 above; the difference is that in step (S3) of this embodiment, the feature fusion matching network takes two adjacent point clouds as input, extracts multiple features of the points in each point cloud, fuses them into feature descriptors of the corresponding points, and computes the matching correspondence between the two adjacent point clouds from the fused feature descriptors. The features extracted by the feature fusion matching network can be FPFH, SHOT, Super4PCS, etc.;

[0150] Structurally, the feature fusion matching network in this embodiment is also similar to the feature fusion matching network of Embodiment 1 above; the difference is that in this embodiment, the first feature fusion extraction module in the feature fusion matching network is used...
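The feature fusion matching network itself is not reproduced here. As a crude stand-in for the idea of fusing several per-point features into one descriptor and matching on it, the sketch below concatenates FPFH (one of the features named in this embodiment) with point normals (an assumed second feature) and matches adjacent key point clouds by nearest neighbour in descriptor space; Open3D and SciPy are assumed tools, not part of the patent.

```python
# Sketch only: the patent's network learns the fusion; concatenating FPFH
# with normals below is merely an illustrative stand-in for a fused
# per-point descriptor.
import numpy as np
import open3d as o3d
from scipy.spatial import cKDTree

def fused_descriptors(pcd, radius=0.02):
    """Compute a crude fused descriptor per point: FPFH (33-D) + normal (3-D)."""
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=radius, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=2 * radius, max_nn=100))
    return np.hstack([np.asarray(fpfh.data).T, np.asarray(pcd.normals)])

def match_correspondences(desc_a, desc_b):
    """Nearest-neighbour matching in fused descriptor space; returns (i, j) pairs."""
    tree = cKDTree(desc_b)
    _, idx = tree.query(desc_a, k=1)
    return np.stack([np.arange(len(desc_a)), idx], axis=1)
```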

Embodiment 3

[0157] A complex component point cloud stitching system based on feature fusion, including: a data acquisition module, a preprocessing module, a feature fusion matching module, and a registration stitching module;

[0158] The data acquisition module is used to collect the 3D point cloud of each part of the component and to perform positioning to obtain the pose of each local 3D point cloud; adjacent local 3D point clouds partially overlap;

[0159] The preprocessing module is used to convert each local 3D point cloud to the world coordinate system according to the positioning result, perform uniform downsampling, and extract the key points of each local 3D point cloud to obtain the corresponding key point cloud;

[0160] The feature fusion matching module is used to input the key point clouds into the feature fusion matching network to obtain the matching correspondence between any two adjacent key point clouds; the feature fusion matching network us...
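A minimal structural sketch of the four modules named in this embodiment follows; class and method names are illustrative placeholders, and only the module boundaries and the data flow (local scans with poses -> key point clouds -> correspondences -> stitched model) come from the text.

```python
# Structural sketch only: class and method names are placeholders.
class DataAcquisitionModule:
    def acquire(self):
        """Collect each local 3D point cloud and its measured pose."""
        raise NotImplementedError

class PreprocessingModule:
    def to_key_point_clouds(self, scans_with_poses):
        """Transform to the world frame, downsample uniformly, extract key points."""
        raise NotImplementedError

class FeatureFusionMatchingModule:
    def match(self, key_cloud_a, key_cloud_b):
        """Run the feature fusion matching network on two adjacent key point clouds."""
        raise NotImplementedError

class RegistrationStitchingModule:
    def stitch(self, key_clouds, correspondences):
        """Register adjacent clouds from their correspondences and merge them
        into one overall 3D point cloud model."""
        raise NotImplementedError
```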



Abstract

The invention discloses a complex component point cloud splicing method and system based on feature fusion, belonging to the field of computer vision. The method includes: collecting and positioning the three-dimensional point clouds of each part of the component, where adjacent local three-dimensional point clouds partially overlap; converting each local 3D point cloud to the world coordinate system according to the positioning results, uniformly downsampling it, and extracting key points to obtain the corresponding key point cloud; inputting the key point clouds into a feature fusion matching network to obtain the matching correspondence between any two adjacent key point clouds, where the feature fusion matching network takes two adjacent point clouds as input, extracts multi-scale features or multiple features of the points in each point cloud, fuses them into feature descriptors of the corresponding points, and computes the matching correspondence of the adjacent point clouds; and using the matching correspondences of adjacent key point clouds for point cloud registration and stitching the key point clouds into an overall 3D point cloud model based on the registration results. The invention can improve the splicing accuracy of point clouds of complex components.
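As a generic illustration of the registration-and-stitching step at the end of the pipeline (not the patent's specific registration algorithm), the sketch below estimates a rigid transform from matched point pairs with the standard SVD (Kabsch) solution and merges the aligned clouds; function and variable names are hypothetical.

```python
# Generic illustration, not the patent's specific algorithm.
import numpy as np

def rigid_from_correspondences(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points
    via the SVD (Kabsch) solution."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def stitch_pair(cloud_a, cloud_b, corres):
    """Align cloud_b onto cloud_a using matched index pairs (i in A, j in B)
    and concatenate the result into one point set."""
    R, t = rigid_from_correspondences(cloud_b[corres[:, 1]], cloud_a[corres[:, 0]])
    return np.vstack([cloud_a, cloud_b @ R.T + t])
```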

Description

Technical Field

[0001] The invention belongs to the field of computer vision, and more specifically relates to a method and system for splicing complex component point clouds based on feature fusion.

Background Technique

[0002] Large, complex components, including aircraft wings, ship propeller blades, wind turbine blades, and high-speed rail car bodies, are widely used in major technical fields such as aerospace, energy, and transportation. To achieve high-precision machining of the surfaces of large, complex components, an accurate and reliable three-dimensional measurement system is needed to evaluate the surface accuracy of their various parts. Just as measuring an object with a ruler only gives a reliable result if the ruler itself is accurate enough, the same holds for measuring the surface accuracy of complex components: to obtain reliable measurement results, three-dim...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/33, G06T5/50, G06T3/40, G06K9/62, G06V10/46, G06V10/774, G06V10/74
CPC: G06T7/344, G06T5/50, G06T3/4038, G06T2200/32, G06T2207/10028, G06T2207/20221, G06V10/462, G06F18/22, G06F18/214
Inventor: 陶文兵, 张世舜
Owner: HUAZHONG UNIV OF SCI & TECH