
Robot target grabbing detection method based on scale invariant network

A scale-invariant detection method, applied to neural learning methods, biological neural network models, instruments, and related areas. It addresses the problems of grasping failure, interference with effective extraction of the grasping area, and the algorithm's lack of scale invariance; it is easy to implement, ensures the accuracy and robustness of grasp detection, and improves the grasp success rate.

Active Publication Date: 2020-06-19
SHANXI UNIV
17 Cites · Cited by 2

AI Technical Summary

Problems solved by technology

It is not difficult to see that the imaging scale of the object to be grasped interferes with effective extraction of the grasping area. Existing methods typically improve robustness by increasing the diversity of training samples or by introducing multi-scale network models, but such measures cannot fundamentally make the algorithm invariant to scale.
[0004] In addition, to make full use of the powerful feature learning and representation capabilities of deep learning, existing grasp detection methods usually define the grasp pose of the end effector with an oriented rectangle. This representation, however, can only predict a limited number of grasp regions and cannot reflect the truly continuous distribution of feasible grasps. Representations based on a grasping path, which use one or more line segments distributed over the object to describe the continuous distribution of the grasping area, solve the continuity problem better, but one grasping path can correspond to only a single grasping state. In particular, when the object deforms significantly, a single grasping state produces varying degrees of grasping deviation, resulting in grasping failure.
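For concreteness, the oriented-rectangle grasp representation criticized above can be sketched as a five-parameter record. This is a generic illustration in Python; the class and field names are hypothetical and not taken from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class OrientedGraspRect:
    """One grasp candidate: center (x, y), opening width w, jaw
    height h, rotation theta in radians (hypothetical field names)."""
    x: float
    y: float
    w: float
    h: float
    theta: float

    def corners(self):
        """Four rectangle corners in image coordinates, obtained by
        rotating the axis-aligned offsets by theta about the center."""
        c, s = math.cos(self.theta), math.sin(self.theta)
        offsets = [(-self.w / 2, -self.h / 2), (self.w / 2, -self.h / 2),
                   (self.w / 2, self.h / 2), (-self.w / 2, self.h / 2)]
        return [(self.x + dx * c - dy * s, self.y + dx * s + dy * c)
                for dx, dy in offsets]

g = OrientedGraspRect(x=100.0, y=80.0, w=40.0, h=20.0, theta=0.0)
print(g.corners()[0])  # -> (80.0, 70.0)
```

Because each such rectangle encodes exactly one discrete grasp, a detector emitting them can only enumerate finitely many candidates, which is precisely the limitation the paragraph above describes.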



Examples


Embodiment

[0044] Referring to Figures 1-3, the present invention proposes a robot target grasping detection method based on a scale-invariant network, consisting of five parts: image acquisition, feature extraction, target positioning and scaling, quadrilateral grasp representation detection, and boundary re-optimization.

[0045] A robot target grasping detection method based on a scale-invariant network, comprising the following steps:

[0046] Step 1, image collection: use an optical camera to collect an RGB image containing the target to be grasped, as the input for the subsequent steps;

[0047] Step 2, feature extraction: construct a feature extraction module consisting of 13 convolutional layers, 13 rectified linear unit (ReLU) layers, and 4 pooling layers, and take the output of the module's 30th layer, a ReLU layer, as the feature map of the current image;
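A stack of 13 convolutional layers, 13 ReLU layers, and 4 pooling layers matches a VGG-16-style backbone. Assuming the common configuration of 3×3 "same"-padded convolutions and 2×2 stride-2 pooling (an assumption, not stated in the text), the feature map has 1/16 of the input resolution, which a small helper can check:

```python
def feature_map_size(height, width, n_pool=4, pool_stride=2):
    """Spatial size of the output feature map, assuming every
    convolution preserves resolution ('same' padding) and each of
    the n_pool pooling layers downsamples by pool_stride."""
    for _ in range(n_pool):
        height //= pool_stride
        width //= pool_stride
    return height, width

print(feature_map_size(480, 640))  # -> (30, 40)
```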

[0048] Step 3, target positioning and scaling:

[0049] ...



Abstract

The invention belongs to the field of computer vision and intelligent robots, and particularly relates to a robot target grasping detection method based on a scale-invariant network. The objective of the invention is to remedy the shortcomings of existing grasp detection algorithms in adapting to changes of target scale and in representing diverse grasp states. No complex multi-scale network structure or multi-viewpoint samples are needed: the method adaptively estimates a scale transformation factor according to the size of the object to be grasped and its actual distance from the camera, obtains consistent output for the target object, and thereby guarantees the robustness of the grasp detection result at different scales.
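The abstract states that a scale transformation factor is estimated adaptively from the object's apparent size, but gives no formula. One plausible illustration, purely an assumption with hypothetical names, normalizes the detected target's longer side to a fixed canonical size:

```python
def scale_factor(bbox_w, bbox_h, canonical=224.0):
    """Illustrative scale normalization: map the detected target's
    longer bounding-box side to a fixed canonical size so that the
    downstream grasp detector always sees the object at roughly the
    same scale. (Assumed formula; the abstract does not specify one.)"""
    longer = max(bbox_w, bbox_h)
    return canonical / longer

print(scale_factor(112, 56))  # -> 2.0
```

A target region resized by this factor would then be passed to grasp detection, consistent with the abstract's claim of obtaining consistent target output across scales.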

Description

Technical field

[0001] The invention belongs to the field of computer vision and intelligent robots, and in particular relates to a robot target grasping detection method based on a scale-invariant network.

Background technique

[0002] With the increasingly serious problems of labor shortage and population aging, robots are playing an ever more important role in human production and life. Because they can replace humans in some household tasks, domestic service robots have received growing attention. Existing home service robots are concentrated mainly in educational machines, sweeping robots, and the like; they are weak in intelligence and lack the ability to effectively perceive the external environment, which makes effective human-computer interaction difficult. Extensive research shows that a robot's ability to grasp objects is an important function for realizing human-computer interaction and home services, for example picking up garbage ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V20/10, G06V10/462, G06N3/045, G06F18/23213, G06F18/24
Inventor: 陈路, 钱宇华, 吴鹏, 王克琪, 刘畅, 卢佳佳
Owner: SHANXI UNIV