Foreground point cloud and background point cloud fusion method and system, and equipment

A foreground/background point cloud fusion technology, applied in image data processing, instruments, computing, etc. It addresses the unreasonable fusion of foreground point cloud objects with a real background, the insufficient diversity of data samples, and the low efficiency of data set generation and labeling, with the effects of improving generalization ability, reducing manual repetitive work, and resolving the lack of sample diversity.

Active Publication Date: 2021-07-27
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0004] In order to solve the above-mentioned problems in the prior art, namely the unreasonable fusion of foreground point cloud objects with the real background in 3D perception, the insufficient diversity of data samples, the manual repetitive work, and the low efficiency of data set generation and labeling, the present invention proposes a fusion method for a foreground point cloud and a background point cloud.



Examples


Embodiment Construction

[0063] The application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the relevant invention and are not intended to limit it. It should also be noted that, for ease of description, only the parts related to the invention are shown in the drawings.

[0064] It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and embodiments.

[0065] The invention provides a fusion method for a foreground point cloud and a background point cloud. First, the entire background point cloud is divided in two dimensions from a top (bird's-eye) view, a two-dimensional grid is established, and the number of point clo...
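The paragraph above describes the bird's-eye-view gridding of the background point cloud. The following is a minimal Python/NumPy sketch of that step under stated assumptions: the function name, the cell size, and the exact set of per-cell statistics (point count, mean/min/max height, cell centres, non-empty indication map) are illustrative choices, not the patent's specification.

```python
import numpy as np

def build_bev_grid(background_points, cell_size=0.5):
    """Divide a background point cloud into a top-view (bird's-eye-view) 2D grid.

    background_points: (N, 3) array of x, y, z coordinates.
    cell_size: edge length of each square grid cell (illustrative value).
    Returns per-cell point counts and simple height statistics for non-empty cells.
    """
    xy_min = background_points[:, :2].min(axis=0)
    # Map each point to an integer (row, col) cell index in the top view.
    cell_idx = np.floor((background_points[:, :2] - xy_min) / cell_size).astype(int)
    n_rows, n_cols = cell_idx.max(axis=0) + 1

    counts = np.zeros((n_rows, n_cols), dtype=int)   # number of points per cell
    z_mean = np.full((n_rows, n_cols), np.nan)       # mean height of non-empty cells
    z_min = np.full((n_rows, n_cols), np.nan)        # lowest point (bottom surface)
    z_max = np.full((n_rows, n_cols), np.nan)        # highest point

    flat_idx = cell_idx[:, 0] * n_cols + cell_idx[:, 1]
    for flat in np.unique(flat_idx):
        mask = flat_idx == flat
        r, c = divmod(int(flat), int(n_cols))
        counts[r, c] = mask.sum()
        z = background_points[mask, 2]
        z_mean[r, c], z_min[r, c], z_max[r, c] = z.mean(), z.min(), z.max()

    non_empty = counts > 0                            # non-empty region indication map
    # Cell-centre coordinates in the original frame (a simple 2D grid feature).
    centers_x = xy_min[0] + (np.arange(n_rows) + 0.5) * cell_size
    centers_y = xy_min[1] + (np.arange(n_cols) + 0.5) * cell_size
    return counts, z_mean, z_min, z_max, non_empty, (centers_x, centers_y)
```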


Abstract

The invention belongs to the field of point cloud fusion, and particularly relates to a foreground point cloud and background point cloud fusion method, system, and device. It aims to solve the problems of unreasonable fusion of foreground point cloud objects with a real background in three-dimensional perception, insufficient data sample diversity, manual repetitive work, and low data set generation and labeling efficiency. The method comprises the following steps: dividing a background point cloud into two-dimensional grid cells; extracting, for each non-empty cell, the center point coordinates and the average, maximum, and minimum height of the bottom surface, and establishing a two-dimensional grid feature map and a non-empty region indication map; constructing a circular neighborhood Δ of the non-empty region and establishing a flat-position indication map; generating a size scaling factor, a position offset, and a plurality of angle values for the foreground target to be placed, and traversing and searching to construct an oriented rectangular neighborhood for each candidate placement; and placing the foreground object and fusing the foreground and background point clouds. The method fuses foreground point cloud objects with the point cloud background in a reasonable way, greatly reduces manual repetitive work, and improves data set generation and labeling efficiency.
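To make the placement-and-fusion step of the abstract concrete, the sketch below samples one candidate placement (scale, offset, yaw) for a foreground object and fuses it with the background only when its footprint lands on flat, free cells. This is a hedged Python/NumPy illustration: the function name, the sampling ranges, and the rejection test are assumptions, and the patent's oriented-rectangular-neighborhood search is only approximated here by a per-cell footprint check.

```python
import numpy as np

def try_place_foreground(foreground_points, background_points, flat_mask,
                         xy_min, cell_size, rng=None):
    """Sample one candidate placement for a foreground object and fuse it with
    the background if its footprint falls on flat, unoccupied grid cells.

    flat_mask: boolean bird's-eye-view map of "flat position" cells (e.g. derived
    from a grid like build_bev_grid plus a neighbourhood flatness test);
    xy_min / cell_size must match that grid. All ranges are illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = rng.uniform(0.9, 1.1)                 # size scaling factor
    yaw = rng.uniform(0.0, 2.0 * np.pi)           # rotation angle about the z axis
    n_x, n_y = flat_mask.shape
    xy_min = np.asarray(xy_min, dtype=float)
    offset = xy_min + rng.uniform(0.0, 1.0, size=2) * np.array([n_x, n_y]) * cell_size

    # Rotate, scale, and translate the foreground object in the x-y plane.
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s], [s, c]])
    placed = foreground_points.astype(float).copy()
    placed[:, :2] = (scale * placed[:, :2]) @ rot.T + offset
    placed[:, 2] *= scale

    # Footprint check: every cell under the transformed object must be flat/free.
    cells = np.floor((placed[:, :2] - xy_min) / cell_size).astype(int)
    inside = ((cells[:, 0] >= 0) & (cells[:, 0] < n_x) &
              (cells[:, 1] >= 0) & (cells[:, 1] < n_y))
    if not inside.all() or not flat_mask[cells[:, 0], cells[:, 1]].all():
        return None                               # reject; the caller resamples a placement

    return np.vstack([background_points, placed]) # fused foreground + background cloud
```

In use, a caller would typically retry with new samples until a placement is accepted, and then carry over the foreground object's original bounding-box label (rescaled, offset, and rotated by the same transform) as the annotation for the fused scene.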

Description

Technical field
[0001] The invention belongs to the field of point cloud fusion, and in particular relates to a fusion method, system and equipment for a foreground point cloud and a background point cloud.
Background technique
[0002] With the rapid development of computer vision, 2D vision tasks such as image classification, object detection, medical imaging and object tracking have been well solved and applied in many fields. However, 3D perception tasks are fraught with challenges due to the increased dimensionality of data. In areas with high safety requirements such as autonomous driving and unmanned aircraft, accurate 3D information understanding is essential. Depth cameras and LiDAR (Light Detection and Ranging) are common sensors for obtaining 3D information. Compared with depth cameras, LiDAR is more accurate and has a wider sensing range, making it a key component of many 3D perception systems. However, the annotation of point clouds generated by LiDAR sensors...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T5/50G06T3/60G06T3/00
CPCG06T5/50G06T3/60G06T3/0037G06T2207/10028G06T2207/20221
Inventor 王飞跃王晓田永林沈宇郭中正
Owner INST OF AUTOMATION CHINESE ACAD OF SCI