
Grasping detection method of service robot based on dual-channel convolutional neural network

A convolutional neural network and service robot technology, applied in the field of service robots, achieving the effects of improved grasp quality, improved accuracy, and accurate grasping detection results

Active Publication Date: 2021-09-07
INST OF AUTOMATION CHINESE ACAD OF SCI +1
6 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0004] In order to solve the above problem in the prior art, namely that existing robot grasping detection methods struggle to achieve good real-time performance and good accuracy at the same time, the present invention proposes a service robot grasping detection method based on a dual-channel convolutional neural network, the method comprising:


Image

  • Grasping detection method of service robot based on dual-channel convolutional neural network

Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0040] In order to make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0041] The application will be described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the related invention, not to limit it. It should be noted that, in the case of no conflict, the embodime...


PUM

No PUM

Abstract

The invention belongs to the technical field of service robots, and specifically relates to a method, system, and device for service robot grasping detection based on a dual-channel convolutional neural network, aiming to solve the problem that existing robot grasping detection methods struggle to be both real-time and accurate. The method of the present invention includes: acquiring an original color image and an original depth image of the surrounding environment, and converting the original color image into an original grayscale image; acquiring the bounding box of a target object, and obtaining a first depth image area and a first grayscale image area; resizing the image areas to a set size; encoding the resized image areas and adding the resulting features; decoding the summed feature map to obtain a grasp quality feature map, a width feature map, a height feature map, a first angle feature map, and a second angle feature map; and obtaining the best grasping rectangle and its corresponding best grasping detection frame in the original color image, thereby realizing grasping detection of the target object. The invention ensures both the real-time performance and the accuracy of grasping.
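The abstract above describes a pipeline: grayscale conversion, cropping the detected bounding box from both channels, resizing, element-wise fusion of the two encoded channels, and decoding a grasp from the output maps. A rough illustrative sketch of those steps follows; it is not the patented network — the function names, the nearest-neighbour resize, the additive fusion, and the recovery of the angle from two maps as 0.5·atan2(sin 2θ, cos 2θ) are assumptions based on common grasp-detection designs, not the patent's specification.

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an HxWx3 color image to grayscale (ITU-R BT.601 weights)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def crop_and_resize(img, box, size):
    """Crop bounding box (x0, y0, x1, y1) and resize to (size, size)
    using nearest-neighbour sampling (a stand-in for a proper resize)."""
    x0, y0, x1, y1 = box
    patch = img[y0:y1, x0:x1]
    rows = np.arange(size) * patch.shape[0] // size
    cols = np.arange(size) * patch.shape[1] // size
    return patch[np.ix_(rows, cols)]

def decode_grasp(quality, width, height, angle_cos, angle_sin):
    """Pick the pixel with the highest grasp-quality score and read the
    grasp parameters there; the angle is recovered from the two angle
    maps as 0.5 * atan2(sin 2theta, cos 2theta)."""
    r, c = np.unravel_index(np.argmax(quality), quality.shape)
    theta = 0.5 * np.arctan2(angle_sin[r, c], angle_cos[r, c])
    return (r, c), width[r, c], height[r, c], theta

# Example: fuse grayscale and depth crops of a detected object.
rng = np.random.default_rng(0)
color = rng.random((100, 120, 3))
depth = rng.random((100, 120))
box = (10, 20, 90, 80)  # (x0, y0, x1, y1) from an object detector
fused = crop_and_resize(to_grayscale(color), box, 64) \
      + crop_and_resize(depth, box, 64)  # element-wise dual-channel fusion
```

In a real system the two crops would each pass through a convolutional encoder before the addition, and the decoder would produce the five output maps; the sketch only shows the data flow around the network.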

Description

Technical field

[0001] The invention belongs to the technical field of service robots, and in particular relates to a method, system, and device for service robot grasping detection based on a dual-channel convolutional neural network.

Background technique

[0002] In recent years, with the rapid development of artificial intelligence, sensing technology, and computing and processing technology, service robots have gradually entered people's daily life. Among them, service robots equipped with mechanical arms have become a focus of attention because they can provide object-grasping services. In order to grasp a target object, the robot first needs to detect it using a deep-learning-based object detection method (such as Faster R-CNN, YOLO, or SSD), then determine the best grasping detection frame of the target object in the image, and then guide the robotic arm to carry out the grasping operation, in which determining the best grasping detection frame of ...
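Once a grasp (centre, width, height, angle) has been predicted in the resized network input, it must be mapped back to an oriented rectangle in the original image to obtain the grasping detection frame mentioned above. A minimal sketch of that geometry follows; the function names and the centre/angle parameterization are illustrative assumptions, not the patent's notation.

```python
import math

def grasp_corners(cx, cy, w, h, theta):
    """Corner points of an oriented grasp rectangle centred at (cx, cy),
    with width w, height h, rotated by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [(cx + dx * c - dy * s, cy + dx * s + dy * c)
            for dx, dy in ((-w / 2, -h / 2), (w / 2, -h / 2),
                           (w / 2, h / 2), (-w / 2, h / 2))]

def crop_to_image(pt, box, size):
    """Map a point from the (size x size) network input back into the
    original image, undoing the crop-and-resize of bounding box
    (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    return (x0 + pt[0] * (x1 - x0) / size,
            y0 + pt[1] * (y1 - y0) / size)
```

For example, with theta = 0 the corners are the usual axis-aligned box around the centre; rotating by theta spins the jaw-opening direction with it, which is what the robotic arm needs to orient its gripper.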

Claims


Application Information

Patent Timeline
No application data
Patent Type & Authority: Patents (China)
IPC(8): G06T7/73, G06T7/90, G06T9/00, G06N3/04
CPC: G06T7/73, G06T7/90, G06T9/00, G06N3/045
Inventor: 曹志强 (Cao Zhiqiang), 李忠辉 (Li Zhonghui), 亢晋立 (Kang Jinli), 于莹莹 (Yu Yingying), 喻俊志 (Yu Junzhi), 谭民 (Tan Min)
Owner INST OF AUTOMATION CHINESE ACAD OF SCI