
A 3D Point Cloud Segmentation Method for Visual Positioning

A 3D point cloud segmentation technology applied in the field of visual positioning. It addresses the problem that existing methods are unsuitable for segmenting connected circular workpieces, and achieves improved recognition and positioning accuracy with a reasonable design.

Active Publication Date: 2022-05-27
WUHU HIT ROBOT TECH RES INST

AI Technical Summary

Problems solved by technology

Prior patent solutions are dedicated to the segmentation of box-like objects and cannot be applied to the segmentation of circular barrels or barrel-like objects.




Embodiment Construction

[0030] The specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings.

[0031] The visual-positioning 3D point cloud segmentation method addresses two problems: after a lidar sensor acquires the 3D point cloud information, traditional methods (circle or ellipse detection) cannot completely recognize the object, so the recognition rate is low; and although measures exist for workpiece recognition, their positioning accuracy is low, i.e., the positioning error is large.

[0032] The visual positioning 3D point cloud segmentation method includes the following steps:

[0033] S1: The robot carries a three-dimensional laser sensor to obtain the three-dimensional point cloud information of the workpiece;

[0034] S2: use the PCL point cloud library to filter, sample, and smooth the point cloud, and remove t...
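Step S2's preprocessing corresponds to PCL's VoxelGrid downsampling and StatisticalOutlierRemoval filters. As a language-neutral illustration of those two operations, here is a minimal NumPy sketch; the function names and parameter values are illustrative assumptions, not from the patent.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Replace all points falling in the same voxel by their centroid
    (analogous in spirit to PCL's VoxelGrid filter)."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, points.shape[1]))
    np.add.at(sums, inverse, points)          # accumulate per-voxel sums
    counts = np.bincount(inverse).astype(float)[:, None]
    return sums / counts

def remove_outliers(points, k=8, std_mult=1.0):
    """Drop points whose mean distance to their k nearest neighbours is more
    than std_mult standard deviations above the global mean (analogous in
    spirit to PCL's StatisticalOutlierRemoval; brute-force, small clouds only)."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    knn_mean = np.sort(dists, axis=1)[:, 1:k + 1].mean(axis=1)  # col 0 is self
    threshold = knn_mean.mean() + std_mult * knn_mean.std()
    return points[knn_mean <= threshold]
```

In practice PCL's kd-tree-backed filters should be used instead of the O(n²) distance matrix above; the sketch only shows the statistical criterion itself.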



Abstract

The invention discloses a three-dimensional point cloud segmentation method for visual positioning. A robot carries a three-dimensional laser sensor to acquire the three-dimensional point cloud information of the workpieces; the peripheral density of the point cloud is used to establish the mathematical relationship between workpiece centers, and a segmentation method between workpieces is introduced. The method realizes segmentation between two circular or quasi-circular objects, improving the recognition and positioning accuracy of connected circular or quasi-circular workpieces. It solves the problem that, after the 3D point cloud information is obtained, the traditional method of circle or ellipse detection cannot completely recognize the object, which leads to a low recognition rate and a large positioning error.
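The abstract's core idea, separating two touching circular workpieces by exploiting point density between their centers, can be sketched as follows. This is an illustrative reconstruction under assumptions (splitting at the density valley along the principal axis joining the two centers), not the patent's actual algorithm; all names are hypothetical.

```python
import numpy as np

def split_connected_clusters(points, bins=40):
    """Split two touching, roughly circular clusters at the point-density
    valley along their principal axis. An illustrative stand-in for the
    patent's peripheral-density criterion, not its actual algorithm."""
    centered = points - points.mean(axis=0)
    # The first principal axis approximates the line joining the two centers.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    t = centered @ vt[0]                       # 1-D coordinate along that axis
    hist, edges = np.histogram(t, bins=bins)
    # Search for the sparsest bin in the middle half, where the junction lies.
    lo = bins // 4
    cut_bin = lo + int(np.argmin(hist[lo:3 * bins // 4]))
    cut = 0.5 * (edges[cut_bin] + edges[cut_bin + 1])
    return points[t < cut], points[t >= cut]

def sample_disk(center, radius, n, rng):
    """Uniform points inside a disk: a stand-in for one barrel's footprint."""
    ang = rng.uniform(0.0, 2.0 * np.pi, n)
    rad = radius * np.sqrt(rng.uniform(0.0, 1.0, n))
    return np.stack([center[0] + rad * np.cos(ang),
                     center[1] + rad * np.sin(ang)], axis=1)
```

For example, two unit disks tangent at a single point (centers 2.0 apart) project to a near-empty histogram bin at the junction, so the valley search recovers the boundary between them.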

Description

Technical field
[0001] The invention relates to the technical field of visual positioning, in particular to a three-dimensional point cloud segmentation method for connected circular or quasi-circular workpieces.
Background technique
[0002] At present, when robots load and unload circular or quasi-circular barrels in factories, accurate positioning must be obtained from external equipment. Two-dimensional machine vision can provide accurate barrel positioning under certain conditions, but factors at industrial sites such as lighting and dust severely limit its reliability, so 3D machine vision is applied here. However, after a lidar sensor obtains the 3D point cloud information, the traditional methods (circle or ellipse detection) cannot achieve complete recognition of the object, resulting in a low recognition rate and low positioning accuracy.
[0003] For example, as...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (IPC-8): G06T7/10, G06T5/30
CPC: G06T7/10, G06T5/30, G06T2207/10028, Y02P90/30
Inventors: 王磊, 樊璇, 陈健, 高云峰, 曹雏清
Owner: WUHU HIT ROBOT TECH RES INST