
Visual 3D pick-and-place method and system based on a collaborative robot

A robot and 3D vision technology applied in the field of visual 3D pick-and-place. It addresses the problems of training data sets that are difficult to produce, difficulty in coping with scene illumination changes and occluded, stacked parts, and the high barrier to installing and using existing systems; the effects achieved are strong resistance to ambient-light interference, reliable and efficient operation, and satisfaction of the system's measurement accuracy requirements.

Active Publication Date: 2021-03-12
新拓三维技术(深圳)有限公司
Cites: 7 | Cited by: 37

AI Technical Summary

Problems solved by technology

[0004] Most existing visual perception pipelines can only perform 2D pose estimation of flat, single-target objects; they are easily disturbed by factors such as lighting and background, and struggle with real conditions such as scene illumination changes and occluded, stacked parts. Visual perception systems based on deep learning, despite recent progress, require training data sets that have a long production cycle and are difficult to make; at the same time, the network models generalize poorly and lack robustness, which is not conducive to practical application in indoor scenes.
[0005] Existing robot systems, moreover, have a high barrier to installation and use: programming is cumbersome, human-robot collaboration is not emphasized, and trajectories are mostly taught by hand without trajectory planning, so operating efficiency is low, reliability is poor, tasks fail, and the systems can only be used in structured environments.
[0006] The prior art lacks an automated, intelligent pick-and-place system based on a collaborative robot and high-precision vision algorithms.




Detailed Description of the Embodiments

[0040] To make the technical problems to be solved, the technical solutions, and the beneficial effects of the embodiments of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.

[0041] It should be noted that when an element is described as being "fixed to" or "disposed on" another element, it may rest directly on the other element or indirectly on it. When an element is described as being "connected to" another element, it may be directly or indirectly connected to the other element. In addition, a connection may serve both a fixing function and a circuit-communication function.

[0042] It is to be understood that the terms "length", "width", "top", "bottom", "front"...



Abstract

The invention provides a visual 3D pick-and-place method and system based on a collaborative robot. The method comprises the following steps: the internal and external parameters of the cameras of a binocular structured-light three-dimensional scanner are calibrated; hand-eye calibration of the collaborative robot is carried out to obtain a calibration result matrix; a three-dimensional digital model of the target objects to be picked and placed is acquired; the calibrated binocular structured-light three-dimensional scanner captures point cloud data of the randomly stacked target objects, and the point cloud is segmented to obtain scene point clouds of the multiple target objects; according to these scene point clouds, the target object with the highest expected grasp success rate is selected as the grasp target; the three-dimensional digital model of the grasp target is registered to the scene using point pair features, pre-defined pick-and-place pose points are registered into the scene, and the resulting pose estimate serves as the grasp pose of the grasp target; and a preliminary grasp path trajectory of the collaborative robot is planned. The target object can be recognized accurately, and the grasp positioning precision is high.
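The abstract describes a pipeline of concrete steps: camera and hand-eye calibration, point cloud segmentation, target selection, point-pair-feature registration, and path planning. The patent text itself contains no code, so the two sketches below only illustrate what the calibration and registration steps typically look like in practice; every file name, threshold, and variable in them is a hypothetical choice, not something taken from the patent.

First, hand-eye calibration. A minimal sketch using OpenCV's cv2.calibrateHandEye for an eye-in-hand configuration, run on synthetically generated, noise-free robot/target pose pairs so the recovered matrix can be checked against ground truth (an eye-to-hand setup, with the scanner fixed, would feed in the inverted robot poses instead):

```python
import cv2
import numpy as np

rng = np.random.default_rng(0)

def rand_pose(rot=0.5, trans=0.3):
    """Random rigid transform: 3x3 rotation (via Rodrigues) + 3x1 translation."""
    R, _ = cv2.Rodrigues(rng.uniform(-rot, rot, 3))
    return R, rng.uniform(-trans, trans, (3, 1))

# Hypothetical ground-truth camera-to-gripper transform to be recovered,
# and a fixed calibration target expressed in the robot base frame.
R_cg, t_cg = rand_pose()
R_bt, t_bt = rand_pose()

R_g2b, t_g2b, R_t2c, t_t2c = [], [], [], []
for _ in range(10):  # ten robot stations observing the target
    R_bg, t_bg = rand_pose()  # flange pose in the base frame
    # T_cam_target = inv(T_gripper_cam) @ inv(T_base_gripper) @ T_base_target
    R_t2c.append(R_cg.T @ R_bg.T @ R_bt)
    t_t2c.append(R_cg.T @ (R_bg.T @ (t_bt - t_bg) - t_cg))
    R_g2b.append(R_bg)
    t_g2b.append(t_bg)

R_est, t_est = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c,
                                    method=cv2.CALIB_HAND_EYE_TSAI)
print(np.allclose(R_est, R_cg, atol=1e-6), np.allclose(t_est, t_cg, atol=1e-6))
```

Second, segmentation and pose registration. The abstract specifies point pair features for registration; Open3D ships no built-in PPF matcher, so this sketch substitutes FPFH features with RANSAC for the coarse alignment and refines with ICP — a stand-in for the described step, not the patented algorithm. The "closest cluster to the camera" heuristic is likewise only a crude proxy for the abstract's grasp-success-rate criterion:

```python
import numpy as np
import open3d as o3d

# Hypothetical inputs: a scanned scene and the model point cloud of the part.
scene = o3d.io.read_point_cloud("scene.ply")
model = o3d.io.read_point_cloud("model.ply")

# 1. Remove the supporting plane (bin floor / table), cluster what remains.
_, floor = scene.segment_plane(distance_threshold=0.005, ransac_n=3,
                               num_iterations=1000)
objects = scene.select_by_index(floor, invert=True)
labels = np.array(objects.cluster_dbscan(eps=0.01, min_points=50))

# 2. Take the cluster closest to the camera (smallest mean z) as the least
#    occluded, most graspable candidate.
pts = np.asarray(objects.points)
best = min(range(labels.max() + 1), key=lambda l: pts[labels == l][:, 2].mean())
target = objects.select_by_index(np.where(labels == best)[0])

# 3. Coarse-to-fine registration of the model into the scene cluster.
voxel = 0.003  # metres; tune to the scanner's resolution
def preprocess(pcd):
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
    feat = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
    return down, feat

model_d, model_f = preprocess(model)
target_d, target_f = preprocess(target)
coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
    model_d, target_d, model_f, target_f, True, 1.5 * voxel,
    o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
    [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(1.5 * voxel)],
    o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
fine = o3d.pipelines.registration.registration_icp(
    model_d, target_d, voxel, coarse.transformation,
    o3d.pipelines.registration.TransformationEstimationPointToPlane())

# 4. Chain with the hand-eye result to express a model-defined grasp pose in
#    the robot base frame: T_base_grasp = T_base_cam @ T_cam_obj @ T_obj_grasp
T_cam_obj = fine.transformation
```

Chaining the estimated object pose with the calibration result matrix, as in the final comment, is what lets the pre-defined pick-and-place pose points travel from the model frame into robot coordinates.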

Description

Technical Field

[0001] The invention relates to the technical field of visual 3D pick-and-place, and in particular to a visual 3D pick-and-place method and system based on a collaborative robot.

Background Technique

[0002] With the increasing automation and intelligence of industrial manufacturing and logistics, industrial robot pick-and-place systems with multi-sensor integration are the core force of future automated intelligent manufacturing and intelligent logistics. At present, industrial robot pick-and-place systems are mainly used in workpiece assembly, material loading, product handling, object sorting, defect detection, and packaging. In the traditional structured environment, a robot pick-and-place system performing single repetitive operations through offline programming cannot handle randomly stacked target objects, nor can it estimate the pose of the grasp target in the scene. It is just a gr...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16; G06T7/246; G06T7/80
CPC: B25J9/1697; B25J9/1682; G06T7/246; G06T7/80
Inventors: 唐正宗, 赵建博, 冯超, 宗玉龙
Owner: 新拓三维技术(深圳)有限公司