
Robot sorting method and system based on vision sense

A robot and vision technology, applied in the field of vision-based robot sorting methods and systems, which can solve the problem that existing part grasping methods cannot recognize and sort parts.

Inactive Publication Date: 2017-11-21
WUHAN UNIV OF SCI & TECH
Cites: 5 · Cited by: 18

AI Technical Summary

Problems solved by technology

[0007] Embodiments of the present invention provide a vision-based robot sorting method and system to solve the technical problem that part grabbing methods in the prior art cannot recognize and sort parts.



Examples


Embodiment 1

[0062] The present embodiment provides a vision-based robot sorting method, the method comprising:

[0063] Step S101: Obtain a top-view 3D model of the target part by using a structured light system;

[0064] Step S102: Match the top-view 3D model of the target part with the reference part models in the pre-built reference model library to obtain the type of the target part and the transformation matrix between the top-view 3D model of the target part and the corresponding reference part model;

[0065] Step S103: Obtain first position information of the target part according to the transformation matrix and the reference part model;

[0066] Step S104: According to the first position information of the target part, obtain second position information in the robot coordinate system;

[0067] Step S105: According to the second position information, pick up the target part, and combine the type of the target part to realize part sorting.

[0068] In the above method, since the t...
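As a concrete illustration of steps S101 to S105, the sketch below strings the pipeline together in Python. It is a minimal sketch under stated assumptions: the matching in step S102 is done here with ICP registration from Open3D (the patent does not name a specific matching algorithm), the reference model library is assumed to store a grasp point for each reference part model, and the hand-eye calibration matrix and robot interface (robot.pick, robot.place) are hypothetical names introduced only for illustration.

```python
# Minimal sketch of the sorting pipeline in Embodiment 1. The matching
# algorithm, grasp-point convention and robot interface below are assumptions
# for illustration, not the patent's own implementation.
import numpy as np
import open3d as o3d


def match_against_library(scene_cloud, model_library, threshold=0.01):
    """Step S102: match the top-view model against every reference model.

    model_library maps part type -> (reference point cloud, grasp point in
    the reference model frame). Returns the best-matching type, the 4x4
    transformation matrix from reference model to scene, and the grasp point.
    """
    best = None
    for part_type, (ref_cloud, grasp_point) in model_library.items():
        result = o3d.pipelines.registration.registration_icp(
            ref_cloud, scene_cloud, threshold, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        if best is None or result.fitness > best[0]:
            best = (result.fitness, part_type, result.transformation, grasp_point)
    _, part_type, T_ref_to_scene, grasp_point = best
    return part_type, T_ref_to_scene, grasp_point


def sort_one_part(scene_cloud, model_library, T_cam_to_robot, robot, bins):
    # Step S101 is assumed done: scene_cloud is the top-view 3D model
    # reconstructed by the structured light system (camera frame).
    part_type, T, grasp_point = match_against_library(scene_cloud, model_library)

    # Step S103: first position information -- the grasp point of the
    # reference model mapped into the camera frame by the transformation matrix.
    p_cam = T @ np.append(grasp_point, 1.0)

    # Step S104: second position information -- the same point expressed in
    # robot coordinates via a hand-eye calibration matrix (assumed known).
    p_robot = T_cam_to_robot @ p_cam

    # Step S105: pick the part and drop it in the bin for its type.
    robot.pick(p_robot[:3])            # hypothetical robot interface
    robot.place(bins[part_type])
    return part_type
```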

Embodiment 2

[0096] Based on the same inventive concept as Embodiment 1, Embodiment 2 of the present invention provides a vision-based robot sorting system. Please refer to Figure 5. The system includes:

[0097] The acquiring module 201 is used to acquire the top-view three-dimensional model of the target part by using the structured light system;

[0098] The matching module 202 is configured to match the top-view three-dimensional model of the target part with the reference part models in the pre-built reference model library to obtain the type of the target part and the transformation matrix between the top-view three-dimensional model of the target part and the corresponding reference part model;

[0099] A first obtaining module 203, configured to obtain first position information of the target part according to the transformation matrix and the reference part model;

[0100] The second obtaining module 204 is used to obtain the second position information under t...
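To make the module decomposition of Embodiment 2 concrete, here is a structural sketch in Python. It only mirrors the responsibilities the text assigns to modules 201 to 204 plus a sorting step; all class and method names (capture, match, locate, to_robot_frame, pick_and_sort) are hypothetical, introduced for illustration rather than taken from the patent.

```python
# Structural sketch of the system in Embodiment 2, assuming the modules are
# composed as plain Python objects with hypothetical method names.
class VisionSortingSystem:
    def __init__(self, acquiring, matching, first_obtaining, second_obtaining, sorting):
        self.acquiring = acquiring                # module 201: structured light capture
        self.matching = matching                  # module 202: match against reference model library
        self.first_obtaining = first_obtaining    # module 203: position from transform + reference model
        self.second_obtaining = second_obtaining  # module 204: convert to robot coordinates
        self.sorting = sorting                    # picks up the part and sorts it by type

    def run_once(self):
        top_view_model = self.acquiring.capture()
        part_type, transform = self.matching.match(top_view_model)
        first_position = self.first_obtaining.locate(transform, part_type)
        second_position = self.second_obtaining.to_robot_frame(first_position)
        self.sorting.pick_and_sort(second_position, part_type)
        return part_type
```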



Abstract

The invention discloses a vision-based robot sorting method and system. The method includes the following steps: a structured light system is used to obtain a top-view three-dimensional model of a target part; the top-view three-dimensional model of the target part is matched against the reference part models in a pre-built reference model library to obtain the type of the target part and a transformation matrix between the top-view three-dimensional model of the target part and the corresponding reference part model; first position information of the target part is obtained according to the transformation matrix and the reference part model; second position information in the robot coordinate system is obtained according to the first position information of the target part; and the target part is picked up according to the second position information and, in combination with the type of the target part, part sorting is accomplished. The method and system solve the technical problem that part grabbing methods in the prior art cannot recognize and sort parts.
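One way to read the coordinate chain in this abstract, as a hedged interpretation with symbols introduced here (the patent itself does not name them): let T be the transformation matrix found by matching, H a camera-to-robot calibration transform, and p_ref a grasp point defined on the reference part model. Then:

\[
p_{\text{cam}} = T\,p_{\text{ref}} \quad \text{(first position information)}, \qquad
p_{\text{robot}} = H\,p_{\text{cam}} \quad \text{(second position information)}.
\]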

Description

Technical Field

[0001] The invention relates to the technical field of industrial mechanical arm sorting, and in particular to a vision-based robot sorting method and system.

Background Technique

[0002] With the development of artificial intelligence technology, robots have replaced manual labor in many operations and are widely used in industrial environments; they have also become a hallmark technology of the transformation from traditional manufacturing to modern manufacturing.

[0003] In the prior art, a robot or an industrial mechanical arm can be used to grab objects.

[0004] While working out the technical solution of the present invention, the inventor of the present application found that the prior art has at least the following problems:

[0005] In the prior art, when robots or industrial mechanical arms are used to grab items, there are certain requirements on the type of parts and the posture in which they are placed, and only independent items or parts can...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): B07C5/34
CPC: B07C5/3412
Inventors: 韩浩, 伍世虔, 王欣, 宋运莲, 蒋俊, 张俊勇, 王建勋, 张琴
Owner: WUHAN UNIV OF SCI & TECH