Robot Vision Guidance Method and Device Based on RGB-D Data Fusion

A robot vision and data fusion technology, applied in the field of robot vision, that addresses problems such as point cloud analysis methods that are too time-consuming, insufficiently universal, and unable to meet the demands of high-speed production.

Active Publication Date: 2020-10-30
佛山缔乐视觉科技有限公司

AI Technical Summary

Problems solved by technology

[0003] The main purpose of the present invention is to provide a robot vision guidance method and device based on RGB-D data fusion, in order to solve the following technical problems: the positioning accuracy of processing-target localization methods based on 3D point clouds and deep learning is too poor, and their resolution too low, to meet current high-precision processing needs; and existing point cloud analysis methods are tailored to specific automated processing systems and therefore lack universality.


Embodiment Construction

[0029] It should be understood that the specific embodiments described herein are intended only to explain the present invention, not to limit it.

[0030] Various embodiments of the present invention will now be described with reference to the drawings. In the following description, suffixes such as "module", "part", or "unit" used to denote elements are adopted merely to facilitate the description of the present invention and have no specific meaning in themselves. Accordingly, "module" and "part" may be used interchangeably.

[0031] Reference is made to figure 1, which is a schematic flowchart of the first embodiment of the robot vision guidance method based on RGB-D data fusion of the present invention. In the embodiment illustrated in figure 1, the robot vision guidance method based on RGB-D data fusion includes the following steps:

[0032] S10, processing target data collection.

[0033] That is, the RGB two-dimensional image and depth data containing the ...
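
The paragraph above is truncated in the source, but step S10 amounts to acquiring a registered RGB two-dimensional image together with per-pixel depth data from the RGB-D composite sensor. As a minimal illustrative sketch only (the patent does not disclose a specific sensor or library), the following assumes an Intel RealSense camera driven through the pyrealsense2 Python bindings, with depth aligned to the color frame so that every RGB pixel has a corresponding depth value:

```python
# Minimal sketch of step S10 (processing target data collection).
# Assumption: an Intel RealSense RGB-D sensor and the pyrealsense2
# bindings; this illustrates the idea, not the patent's implementation.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

# Align depth to the color stream so RGB and depth are pixel-registered.
align = rs.align(rs.stream.color)

try:
    frames = align.process(pipeline.wait_for_frames())
    rgb = np.asanyarray(frames.get_color_frame().get_data())    # H x W x 3, uint8
    depth = np.asanyarray(frames.get_depth_frame().get_data())  # H x W, uint16 depth units
finally:
    pipeline.stop()
```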


Abstract

The invention discloses a robot vision guidance method and device based on RGB-D data fusion. Based on an RGB-D composite sensor, the method proceeds from processing target data collection through processing target identification, processing target segmentation, and processing path point acquisition to processing guide point conversion, finally obtaining a sequence of processing guide points. This reduces computation time to meet real-time processing requirements and lowers software and hardware performance requirements, which saves costs, reduces development difficulty, and meets the demands of high-speed, large-scale production.
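
As an illustration of the final stage only (processing guide point conversion), the sketch below maps camera-frame path points into the robot base frame with a homogeneous transform. The function name and the assumption that a 4x4 hand-eye calibration matrix T_cam_to_robot is available are hypothetical, not taken from the patent:

```python
# Hedged sketch of the guide point conversion stage: camera-frame path
# points -> robot base frame. Assumes a 4x4 hand-eye transform from a
# prior calibration; all names are illustrative, not the patent's API.
import numpy as np

def convert_to_guide_points(path_points_cam: np.ndarray,
                            T_cam_to_robot: np.ndarray) -> np.ndarray:
    """path_points_cam: N x 3 points in the camera frame (meters).
    T_cam_to_robot: 4 x 4 homogeneous transform, camera -> robot base.
    Returns the N x 3 processing guide point sequence in the robot frame."""
    pts_h = np.hstack([path_points_cam, np.ones((len(path_points_cam), 1))])
    return (pts_h @ T_cam_to_robot.T)[:, :3]

# Usage with dummy data: camera offset 0.5 m along the robot's x axis.
T = np.eye(4)
T[0, 3] = 0.5
print(convert_to_guide_points(np.array([[0.1, -0.2, 0.8]]), T))  # [[0.6 -0.2 0.8]]
```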

Description

Technical field

[0001] The invention relates to the field of robot vision, and in particular to a robot vision guidance method and device based on RGB-D data fusion.

Background technique

[0002] As a powerful tool for manufacturing, automation equipment (robot systems) must move towards high speed and intelligence. An important way to make automation equipment intelligent is to equip the machine with "eyes" and a "brain" that can cooperate with those eyes. The "eye" may be a monocular camera, a binocular camera, a multi-view camera, a three-dimensional scanner, or an RGB-D (RGB + Depth) sensor. The core work of intelligent automation equipment is to analyze the image data obtained by this "eye" (for example, image recognition) and then, based on the analysis results, to guide the robot system to complete specific processing or assembly operations. Therefore, image data analysis based on two-dimensional images, which is widely used at present, is a key foundational technology. However...

Claims

Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06T7/33G06T7/50G06T7/73G06T7/149G05B19/25
CPCG05B19/25G06T2207/10028G06T2207/20081G06T2207/20084G06T2207/30108G06T7/11G06T7/33G06T7/50G06T7/73
Inventor 刁世普郑振兴秦磊
Owner 佛山缔乐视觉科技有限公司