
Two-dimensional target detection-based point cloud recognition and segmentation method, device and system for workpiece most suitable for grabbing

A technology for target detection and identification of the workpiece most suitable for grabbing, applied in the fields of character and pattern recognition, image analysis and computer components. It addresses problems such as the large amount of calculation of existing methods, their inability to meet the requirement of rapid recognition and segmentation of out-of-order workpieces, and the fact that two-dimensional recognition results cannot be directly applied to pose estimation.

Pending Publication Date: 2021-09-10
SUZHOU ZIJINGANG INTELLIGENT MFG EQUIP

AI Technical Summary

Problems solved by technology

Three-dimensional point cloud recognition and segmentation is the common workpiece recognition and segmentation technique in out-of-order workpiece grasping systems. A template point cloud is registered one by one, according to feature point pairs, against every workpiece in the out-of-order workpiece point cloud, and the workpiece with the smallest registration error relative to the template is identified as the one most suitable for grabbing and is segmented separately. However, this 3D-point-cloud-based recognition and segmentation involves a large amount of calculation and is slow, so it cannot meet the requirement of rapidly recognizing and segmenting out-of-order workpieces.
With the rapid development of deep learning, two-dimensional image recognition and segmentation techniques based on deep learning have emerged in large numbers. Among them, two-dimensional target detection algorithms based on deep convolutional neural networks can autonomously learn target features and extract target feature information, enabling fast, high-precision recognition in 2D images of out-of-order workpieces. However, because the workpiece pose estimation stage of an out-of-order workpiece grasping system needs the 3D point cloud of the workpiece most suitable for grabbing as input in order to calculate its six degrees of freedom, the results of two-dimensional image recognition and segmentation cannot be applied directly to the pose estimation of out-of-order workpieces.
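For comparison, the registration-based prior approach described above can be illustrated with a short sketch: a template point cloud is aligned against every candidate workpiece cloud in turn, and the candidate with the smallest registration error is kept as the one most suitable for grabbing. This is only an illustration of the prior technique, assuming Open3D and plain point-to-point ICP; the candidate segmentation, file handling and registration algorithm are not specified in the source, and a feature-based coarse alignment would normally precede ICP.

```python
# Illustrative sketch of the registration-based prior approach: register a
# template point cloud against each candidate workpiece cloud and keep the
# candidate with the smallest registration error. Paths and parameters are
# placeholders, not values from the patent.
import open3d as o3d

def best_matching_workpiece(template_path, candidate_paths, max_corr_dist=0.005):
    template = o3d.io.read_point_cloud(template_path)
    best_idx, best_rmse = None, float("inf")
    for i, path in enumerate(candidate_paths):
        candidate = o3d.io.read_point_cloud(path)
        # Point-to-point ICP from the identity initial guess; in practice a
        # feature-based coarse alignment would be run first.
        result = o3d.pipelines.registration.registration_icp(
            template, candidate, max_corr_dist,
            estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
        if result.fitness > 0 and result.inlier_rmse < best_rmse:
            best_idx, best_rmse = i, result.inlier_rmse
    return best_idx, best_rmse
```

Running a full registration once per candidate workpiece is precisely the per-scene cost that the invention seeks to avoid.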



Examples


Embodiment 1

[0052] In this embodiment, a DLP4500 projector (resolution 912×1140) and two Daheng Mercury industrial cameras (resolution 2448×2048) are used to build a structured light camera. The selected computer is configured with an Intel Core i5-7400 CPU and an NVIDIA GeForce GTX 1050 graphics card. The selected out-of-order workpieces are small workpieces stacked out of order at the loading station of a manufacturing production line.

[0053] An embodiment of the present invention provides a method for recognizing and segmenting the point cloud of the workpiece most suitable for grabbing based on two-dimensional target detection, as shown in Figures 1-2, which specifically includes the following steps:

[0054] Step 1: input the color image (resolution 2448×2048) and disparity map (resolution 2448×2048) of the out-of-order workpieces generated by the binocular structured light camera, see Figure 2(a) and Figure 2(b) for details, and pre-p...
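The original text breaks off at the preprocessing step of Step 1. Below is a minimal sketch of such an input stage, assuming the color image and disparity map are read from files and that preprocessing is limited to a median filter on the disparity map to suppress speckle noise; neither the file names nor the choice of filter is stated in the patent.

```python
# Hypothetical input stage for Step 1: load the color image and disparity map
# produced by the binocular structured light camera and lightly denoise the
# disparity map. File names and the median filter are illustrative assumptions.
import cv2

color = cv2.imread("scene_color.png", cv2.IMREAD_COLOR)              # 2448x2048 BGR color image
disparity = cv2.imread("scene_disparity.png", cv2.IMREAD_UNCHANGED)  # 2448x2048 disparity map

# Suppress isolated disparity outliers before detection and reconstruction.
disparity = cv2.medianBlur(disparity, 5)
```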

Embodiment 2

[0066] Based on the same inventive concept as Embodiment 1, this embodiment of the present invention provides a device for recognizing and segmenting the point cloud of the workpiece most suitable for grabbing based on two-dimensional target detection, including the following units (sketched in code after the list):

[0067] The detection unit is used to perform two-dimensional target detection on the color image and, based on the target detection result, output the confidence levels and bounding-box positions of all detected workpieces;

[0068] The selection unit is used to select the workpiece with the highest confidence as the most suitable workpiece for grabbing;

[0069] The determination unit is used to determine and segment the area covered by the bounding box on the disparity map according to the position parameters of the bounding box of the workpiece with the highest confidence, so as to realize two-dimensional positioning and segmentation of the workpiece;

[0070] The generation unit is used to perform 3D reconstruction on th...
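As referenced above, the four units can be read as consecutive stages of one pipeline. The sketch below maps each unit to one step of a small Python class; the detector backend, its output format, and the reconstruct_roi helper are hypothetical placeholders, since the patent does not name a specific detection network or reconstruction routine.

```python
# Illustrative structure of the device in Embodiment 2. `detector` (any 2D
# detector returning boxes and confidence scores) and `reconstruct_roi`
# (disparity-to-point-cloud conversion) are hypothetical placeholders.
import numpy as np

class GraspTargetSegmenter:
    def __init__(self, detector, reconstruct_roi):
        self.detector = detector                # detection unit backend
        self.reconstruct_roi = reconstruct_roi  # generation unit backend

    def run(self, color, disparity):
        # Detection unit: 2D target detection on the color image.
        boxes, scores = self.detector(color)    # boxes: (N, 4) as x1, y1, x2, y2
        if len(scores) == 0:
            return None
        # Selection unit: pick the workpiece with the highest confidence.
        best = int(np.argmax(scores))
        x1, y1, x2, y2 = boxes[best].astype(int)
        # Determination unit: segment the bounding-box area on the disparity map.
        roi_disparity = disparity[y1:y2, x1:x2]
        # Generation unit: 3D reconstruction of the segmented region only.
        return self.reconstruct_roi(roi_disparity, (x1, y1))
```

Restricting reconstruction to a single bounding box, rather than the whole scene, is what keeps the point-cloud stage small compared with the registration-based approach.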

Embodiment 3

[0073] An embodiment of the present invention provides a system for recognizing and segmenting the point cloud of the workpiece most suitable for grabbing based on two-dimensional target detection, including a storage medium and a processor;

[0074] The storage medium is used to store instructions;

[0075] The processor is configured to operate according to the instructions so as to perform the method of Embodiment 1.



Abstract

The invention discloses a method, device and system, based on two-dimensional target detection, for recognizing and segmenting the point cloud of the workpiece most suitable for grabbing. The method comprises the steps of: carrying out two-dimensional target detection on a color image, and outputting the confidence coefficients of all detected workpieces and the positions of the bounding boxes to which they belong based on the target detection result; selecting the workpiece with the maximum confidence coefficient as the workpiece most suitable for grabbing; according to the position parameters of the bounding box to which the workpiece with the maximum confidence coefficient belongs, determining and segmenting the area covered by that bounding box on a disparity map, so as to realize two-dimensional positioning and segmentation of the workpiece; and performing three-dimensional reconstruction on the area covered by the bounding box on the disparity map to generate the point cloud of the workpiece most suitable for grabbing. By fusing three-dimensional reconstruction with two-dimensional target detection, the invention can quickly recognize the workpiece most suitable for grabbing among out-of-order workpieces and reconstruct its point cloud independently, avoiding the complex global feature matching otherwise needed to identify that workpiece during three-dimensional point cloud pose estimation. This remarkably reduces the amount of calculation and accelerates the recognition and segmentation of the workpiece most suitable for grabbing.
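The last step of the abstract, reconstructing the area covered by the bounding box on the disparity map into a point cloud, can be sketched with OpenCV's standard disparity-to-depth reprojection. This is an assumption made for illustration: the patent does not state that OpenCV is used, and the Q matrix (obtained from stereo rectification of the binocular structured light camera) and the box coordinates are placeholders.

```python
# Illustrative 3D reconstruction of the bounding-box area of a disparity map.
# Q is the 4x4 disparity-to-depth matrix from stereo rectification
# (for example cv2.stereoRectify); its values and `box` are placeholders.
import cv2
import numpy as np

def reconstruct_box(disparity, box, Q):
    x1, y1, x2, y2 = box
    # Keep only the disparity values inside the selected bounding box so that
    # only the chosen workpiece contributes points.
    masked = np.zeros_like(disparity, dtype=np.float32)
    masked[y1:y2, x1:x2] = disparity[y1:y2, x1:x2]
    # Reproject every pixel to metric 3D coordinates.
    points = cv2.reprojectImageTo3D(masked, Q)
    valid = masked > 0
    return points[valid]    # (N, 3) point cloud of the workpiece most suitable for grabbing
```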

Description

Technical field
[0001] The invention relates to the fields of two-dimensional target detection and point cloud recognition and segmentation, and in particular to a method, device and system for recognizing and segmenting the point cloud of the workpiece most suitable for grabbing based on two-dimensional target detection.
Background technique
[0002] In recent years, with the development of 3D vision technology, out-of-order workpiece grasping technology based on 3D vision has begun to be applied and promoted in the industrial field. In practical applications, out-of-order workpieces are stacked on top of one another, and occluded workpieces are not suitable for being grasped by a robot. Therefore, before an out-of-order workpiece grasping system based on 3D vision performs pose estimation and grasping, it must identify the workpiece with the lowest occlusion rate among the out-of-order workpieces, that is, the workpiece most suitable for grabbing.
[0003] The general recogn...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/10, G06T7/73, G06K9/62
CPC: G06T7/10, G06T7/73, G06T2207/10024, G06T2207/10028, G06T2207/30164, G06T2207/20228, G06T2207/20016, G06T2207/20084, G06F18/253
Inventor: 徐月同, 俞炯炎, 王郑拓, 傅建中
Owner: SUZHOU ZIJINGANG INTELLIGENT MFG EQUIP