Visual positioning method of large-size material and carrying robot

A visual positioning technology for large-size materials, applied in the fields of instruments, image analysis, and image enhancement. It can solve the problems of high labor cost, inconvenient handling, and time-consuming, labor-intensive manual handling, ensuring that the material is uniformly stressed and eliminating potential safety hazards.

Active Publication Date: 2019-11-15
GUANGDONG BOZHILIN ROBOT CO LTD

AI Technical Summary

Problems solved by technology

However, as the size and weight of materials continue to increase, the labor intensity of construction personnel is high, the handling is inconvenient, the labor cost is high, and the handling process does not meet the requirements of safe building construction.

Examples

Example Embodiment

[0054] Example one

[0055] This embodiment provides a visual positioning method for large-size materials, which is suitable for positioning materials before material handling, and is especially suitable for materials with rectangular cross-sections.

[0056] The visual positioning method is executed by a large-size material handling robot. The handling robot includes a manipulator and a moving component: the manipulator is used to grab the material, and the moving component is used to move the robot to the place where the material is placed. The handling robot further includes a depth camera and a controller: the depth camera is used to obtain a depth image of the material to be transported, and its height relative to the ground is greater than the height of the wood; the controller is used to position the material according to the depth image obtained by the depth camera, to control the manipulator to grab the material according to the positioning, and to control th...
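The patent does not give an implementation, but as a rough sketch of how the controller might turn the depth camera's image into the three-dimensional coordinates used for positioning, the snippet below back-projects each pixel with a generic pinhole camera model. The function name, the intrinsics fx, fy, cx, cy, and the depth scale are assumptions introduced here for illustration only.

```python
# Rough sketch (not from the patent): back-project a depth image into
# camera-frame 3-D points with a pinhole model. The intrinsics
# fx, fy, cx, cy and depth_scale are assumed, hypothetical values.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Convert an H x W depth image (raw sensor units) into an
    (H*W, 3) array of XYZ coordinates in meters."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))  # pixel columns / rows
    z = depth.astype(np.float32) * depth_scale         # raw units -> meters
    x = (us - cx) * z / fx
    y = (vs - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```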

Example Embodiment

[0073] Example two

[0074] This embodiment builds on the above embodiment. After the handling robot transports the materials to the destination, the materials need to be placed according to the location of the wooden squares on site. Therefore, the wooden squares and the already placed materials at the placement location must be identified and positioned to ensure that the materials are placed properly.

[0075] When placing materials, it is necessary to judge whether the wooden squares are placed correctly with reference to the already placed materials or the wall, and then stack the newly transported materials according to the placement requirements. The already placed materials are positioned with the same method as in the above embodiment, and the method of identifying and positioning the wooden squares is as follows:

[0076] After analyzing and obtaining the coordinates of each point on the depth image in the three-dimensional coordinate system, according to the coordina...
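The text of this paragraph is truncated in the source, so the following is only a loose illustration, not the patent's method: one simple way to pick out wooden squares from the computed three-dimensional points is to keep the points whose height above the floor falls inside an assumed band for the squares and then cluster the survivors. The height band, the grid cell size, and the assumption that the third coordinate is height above the ground are all hypothetical.

```python
# Loose illustration only (not the patent's method): isolate candidate
# wooden-square points by an assumed height band above the floor, then
# cluster them on a coarse ground-plane grid. All thresholds are made up,
# and the points are assumed to be in a frame whose third axis is height.
import numpy as np
from collections import defaultdict

def locate_wooden_squares(points, floor_z, band=(0.02, 0.12), cell=0.05,
                          min_points=20):
    """Return approximate (x, y, z) centers of clusters whose height
    above floor_z lies inside `band` (meters)."""
    heights = points[:, 2] - floor_z
    candidates = points[(heights > band[0]) & (heights < band[1])]
    clusters = defaultdict(list)
    for p in candidates:
        # bucket each point into a coarse 2-D cell on the ground plane
        key = (int(p[0] // cell), int(p[1] // cell))
        clusters[key].append(p)
    return [np.mean(c, axis=0) for c in clusters.values()
            if len(c) >= min_points]
```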

Abstract

The invention discloses a visual positioning method for a large-size material and a carrying robot, and relates to the technical field of machine vision. The visual positioning method comprises the following steps: acquiring a depth image of a material through a depth camera; analyzing the depth image to obtain the coordinates of each point in a three-dimensional coordinate system; and identifying the material plane according to the coordinates of all points and calculating the center coordinates of the material plane. The depth image of the material is acquired from the viewing angle of the carrying robot through the depth camera; the plane of the material can be identified through image processing and analysis, and the coordinates of the geometric center of the material in the three-dimensional coordinate system can be calculated, so that visual guidance is provided for the carrying robot. The carrying robot can then grab the material at its center point, which ensures that the material is uniformly stressed and balanced during the carrying process. In this way, the carrying robot can complete material conveying independently, without the assistance of construction workers, and potential safety hazards are eliminated.
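The abstract describes the core pipeline: acquire a depth image, convert it to three-dimensional points, identify the material plane, and compute the plane's center. Below is a minimal sketch of the last two steps, assuming the points are already expressed in a three-dimensional coordinate system and using a generic RANSAC plane fit; the iteration count and inlier threshold are illustrative values, not parameters from the patent.

```python
# Minimal sketch of the abstract's pipeline, assuming the depth image has
# already been converted to an (N, 3) array of 3-D points. The RANSAC
# iteration count and inlier threshold are illustrative assumptions.
import numpy as np

def fit_dominant_plane(points, n_iters=200, dist_thresh=0.01, seed=0):
    """Fit the plane with the most inliers; return (normal, d, inlier_mask)."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # skip degenerate (collinear) samples
            continue
        normal = normal / norm
        d = -normal @ p0
        inliers = np.abs(points @ normal + d) < dist_thresh
        if best is None or inliers.sum() > best[2].sum():
            best = (normal, d, inliers)
    return best

def material_center(points):
    """Geometric center of the dominant plane's inlier points,
    used as the visual guidance target for the gripper."""
    _, _, inliers = fit_dominant_plane(points)
    return points[inliers].mean(axis=0)
```

A real controller would presumably also check the fitted plane against the material's expected rectangular outline before handing the center point to the manipulator.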

Description

Technical field

[0001] The invention relates to the technical field of machine vision, in particular to a visual positioning method for large-size materials and a handling robot.

Background technique

[0002] At present, the logistics handling of large-size materials is usually completed by manpower, mainly by multiple construction workers lifting the materials and transporting them to the destination with the help of simple trolleys. However, as the size and weight of materials continue to increase, the labor intensity of construction personnel is high, the handling is inconvenient, the labor cost is high, and the handling process does not meet the requirements of safe building construction. Taking construction as an example, an architectural PC prefabricated interior wall panel can reach 2600 mm × 600 mm × 100 mm in size and 150 kg in weight; manual handling is time-consuming and laborious, and the risk factor is high.

[0003] In actual industrial scenarios, large...

Application Information

IPC(8): G01B11/00, G06T1/00, G06T7/70
CPC: G01B11/002, G06T1/0014, G06T2207/10028, G06T7/70
Inventor: 孙伟俊黄洋阳化
Owner: GUANGDONG BOZHILIN ROBOT CO LTD