
Robot target grabbing area real-time detection method based on SE-RetinaGrasp model

A real-time detection and robotics technology, applied to biological neural network models, instruments, computer components, etc. It addresses the problem of low grasping-area prediction accuracy, improves detection accuracy, ensures data diversity, and strengthens the model's ability to capture detail information.

Pending Publication Date: 2020-01-21
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] The present invention provides a method for real-time detection of robot target grasping areas based on the SE-RetinaGrasp model in order to overcome the defect of low grasping-area prediction accuracy in the above-mentioned prior art.




Embodiment Construction

[0041] The accompanying drawings are for illustration only and shall not be construed as limiting the patent.

[0042] To better illustrate this embodiment, some parts in the drawings are omitted, enlarged, or reduced; they do not represent the dimensions of the actual product.

[0043] Those skilled in the art will understand that certain well-known structures and their descriptions may be omitted from the drawings.

[0044] The technical solutions of the present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0045] Figure 1 shows the flow chart of the real-time detection method for the robot target grasping area based on the SE-RetinaGrasp model of this embodiment.

[0046] This embodiment proposes a real-time detection method for a robot target grasping area based on the SE-RetinaGrasp model, including the following steps:

[0047] S1: Download the training data set through the ...
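The output the abstract calls a "grabbing frame" is, in most grasp-detection work, an oriented rectangle. As an illustrative sketch only (the 5-parameter Cornell-style convention below is an assumption, not stated in the patent), such a grasp can be converted to its four corner points for drawing or evaluation:

```python
import math

def grasp_to_corners(x, y, theta, w, h):
    """Convert a 5-parameter grasp (center x, y; rotation theta in radians;
    gripper opening width w; jaw height h) into the four corners of the
    oriented grasp rectangle. Parameter names are illustrative, following
    the common Cornell-style convention, not the patent's notation."""
    dx, dy = w / 2.0, h / 2.0
    c, s = math.cos(theta), math.sin(theta)
    corners = []
    # Rotate each axis-aligned offset by theta, then translate to the center.
    for ux, uy in ((-dx, -dy), (dx, -dy), (dx, dy), (-dx, dy)):
        corners.append((x + ux * c - uy * s, y + ux * s + uy * c))
    return corners
```

For example, a grasp centered at the origin with theta = 0, width 4, and height 2 yields the axis-aligned corners (-2, -1), (2, -1), (2, 1), (-2, 1).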


Abstract

The invention provides a real-time detection method for a robot target grasping area based on an SE-RetinaGrasp model. The method comprises the following steps: downloading a training data set through an interface, and collecting images containing target objects to be grasped by a robot through a visual sensor to construct the training data set; preprocessing the images in the training data set; constructing a grasping detection model from a RetinaNet model and an SENet module; inputting the preprocessed training data set into the grasping detection model and training it with a transfer learning method and a stochastic gradient descent method; and collecting a to-be-detected robot target grasping image in real time through the visual sensor, then inputting it into the grasping detection model to obtain a target grasping area detection image with a grasping frame. The method improves the prediction effect and detection accuracy of the grasping area and effectively strengthens the model's ability to capture detail information.
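The abstract combines RetinaNet with an SENet module but gives no code. As a minimal sketch of the channel-attention step that an SENet module performs (function and weight names, and the reduction ratio, are assumptions for illustration, not taken from the patent), squeeze-and-excitation can be expressed in NumPy:

```python
import numpy as np

def se_block(feature_map, w1, w2):
    """Squeeze-and-Excitation channel recalibration (illustrative sketch).

    feature_map: array of shape (C, H, W) -- one feature map from the backbone
    w1: squeeze FC weights, shape (C // r, C) for some reduction ratio r
    w2: excitation FC weights, shape (C, C // r)
    """
    # Squeeze: global average pooling over the spatial dims -> (C,)
    z = feature_map.mean(axis=(1, 2))
    # Excitation: FC -> ReLU -> FC -> sigmoid gives per-channel weights in (0, 1)
    s = np.maximum(w1 @ z, 0.0)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))
    # Recalibrate: scale each channel of the feature map by its learned weight
    return feature_map * s[:, None, None]
```

In an SE-RetinaNet-style design, such a block is inserted after backbone convolution stages so informative channels are amplified before the detection heads; here the weights would be learned jointly with the rest of the network rather than passed in by hand.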

Description

Technical Field

[0001] The invention relates to the technical field of robot grasping and, more specifically, to a method for real-time detection of a robot target grasping area based on the SE-RetinaGrasp model.

Background

[0002] In the field of intelligent robots, autonomous grasping is a key capability. Methods currently applied to robot target grasping area detection include the grasping area detection method based on a sliding-window detection framework, the global grasping prediction method, and the second-order grasping detection method.

[0003] Among them, the sliding-window method searches for the grasping area with a sliding window, which is time-consuming and computationally expensive and cannot meet the real-time requirements of robot grasping detection; the global grasping prediction method is easy to lead to the ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06N3/04; G06T3/60; G06T7/11
CPC: G06T3/60; G06T7/11; G06N3/048; G06N3/045; G06F18/211; G06F18/214
Inventors: 卢智亮, 曾碧, 林伟
Owner: GUANGDONG UNIV OF TECH