Robot autonomous classification grabbing method based on YOLOv3

The invention relates to robot and degree-of-freedom technology applied to instruments, manipulators, and image analysis. It addresses problems such as poor robustness and weak generalization ability, and achieves high precision, strong generalization ability, and fast speed.

Pending Publication Date: 2020-04-28
TIANJIN UNIV
Cites: 4 · Cited by: 39

AI Technical Summary

Problems solved by technology

However, this algorithm is easily affected by the working environment and is sensitive to lighting changes.




Embodiment Construction

[0044] To further illustrate the content, features, and effects of the present invention, the following embodiments are enumerated and described in detail in conjunction with the accompanying drawings:

[0045] The English terms used in this application are defined as follows:

[0046] YOLOv3: a single-stage target detection algorithm proposed by Joseph Redmon in 2018;

[0047] PCA: principal component analysis;

[0048] Darknet-53: a deep convolutional neural network used to extract image features; the core module of the YOLOv3 algorithm;

[0049] Kinect: a visual sensor that can acquire both RGB information and depth information of objects;

[0050] RGB: a three-channel color image;

[0051] RGB-D: a general term for three-channel color images together with depth images;

[0052] R-CNN: regional convolutional neural network, a target detection algorithm proposed by Ross Girshick et al. in 2014;

[0053] Fast R-CNN: fast regional convolutional neural network, an improved version of R-CNN proposed in 2015.
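The Kinect and RGB-D definitions above can be illustrated with a short sketch of how a depth image is back-projected into a point cloud using the standard pinhole camera model. This is a minimal illustration, not the patent's implementation; the intrinsic values and the function name are assumed placeholders (real values come from the sensor's calibration).

```python
import numpy as np

# Hypothetical pinhole intrinsics; real values come from camera calibration.
FX, FY = 525.0, 525.0   # focal lengths in pixels (assumed)
CX, CY = 319.5, 239.5   # principal point in pixels (assumed)

def depth_to_point_cloud(depth, fx=FX, fy=FY, cx=CX, cy=CY):
    """Back-project an HxW depth image (meters) into an Nx3 point cloud
    in the camera frame, dropping invalid (zero-depth) pixels."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]
```

Each pixel (u, v) with depth z maps to camera coordinates ((u - cx) z / fx, (v - cy) z / fy, z); in practice the result would then be cropped to the detected object's bounding box before pose estimation.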



Abstract

The invention discloses a robot autonomous classification grabbing method based on YOLOv3. The method comprises: collecting and constructing a target object sample data set; training the YOLOv3 target detection network to obtain a target object recognition model; acquiring a color image and a depth image of the target object; processing the color image with the trained YOLOv3 target detection network to obtain the category information and position information of the target object to be grabbed, and further processing in combination with the depth image to obtain the point cloud information of the target object; and solving the minimum bounding box of the point cloud, calculating the main direction of the point cloud with the PCA algorithm, calibrating the coordinate data of the X, Y, and Z axes of the target object, and calculating the six-degree-of-freedom pose of the target object relative to the robot coordinate system. The method adopts the YOLOv3 algorithm, estimates the object grasping pose through point cloud preprocessing, PCA, and related methods, and then the robot grabs the target objects in a classified manner.
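The PCA step described in the abstract — finding the point cloud's main direction and an axis-aligned extent around it — can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation; the function names and the synthetic cloud are hypothetical, and a true minimum bounding box would require further refinement beyond the PCA-aligned box shown here.

```python
import numpy as np

def pca_principal_axes(points):
    """Centroid and principal axes (columns, in descending-variance order)
    of an Nx3 point cloud, via eigendecomposition of the covariance."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]           # reorder to descending variance
    return centroid, eigvecs[:, order]

def oriented_bounding_box(points):
    """Box aligned with the PCA axes: project the cloud into the principal
    frame and take the per-axis extent."""
    centroid, axes = pca_principal_axes(points)
    local = (points - centroid) @ axes          # coordinates in the PCA frame
    return centroid, axes, local.max(axis=0) - local.min(axis=0)
```

The first principal axis gives the object's main direction; together with the centroid and the remaining two axes it fixes the X, Y, Z orientation from which a six-degree-of-freedom grasp pose can be derived after transforming into the robot's coordinate system.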

Description

technical field

[0001] The invention relates to a robot autonomous classification and grasping method, in particular to a robot autonomous classification and grasping method based on YOLOv3.

Background technique

[0002] At present, China's population is aging rapidly and the labor force is in short supply, so the demand for service robots is increasing. However, the unstructured environments in which service robots work bring many technical problems, one of the most important being autonomous grasping by robots in such environments. Grasping, as one of the main ways for robots to interact with the real world, is an urgent problem to be solved. Unlike the grasping of workpieces by industrial robots in a structured environment, autonomous grasping by service robots in an unstructured environment faces many challenges, such as dynamic environments, lighting changes, mutual occlusion between objects, and the most import...

Claims


Application Information

IPC(8): G06T7/70; G06K9/00; G06K9/32; B25J11/00; B25J19/02
CPC: G06T7/70; B25J11/008; B25J19/023; G06T2207/10028; G06V20/64; G06V10/25
Inventor: 王太勇, 冯志杰, 韩文灯, 彭鹏, 张凌雷
Owner TIANJIN UNIV