
Service robot grabbing detection method based on double-channel convolutional neural network

A convolutional neural network technology for service robots, applied in the field of service robots

Active Publication Date: 2020-10-23
INST OF AUTOMATION CHINESE ACAD OF SCI +1
View PDF · 6 Cites · 3 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] In order to solve the above problem in the prior art, namely that existing robot grasping detection methods find it difficult to achieve good real-time performance and good accuracy at the same time, the present invention proposes a service robot grasping detection method based on a dual-channel convolutional neural network, the method comprising:

Method used



Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0040] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art on the basis of these embodiments without creative effort fall within the protection scope of the present invention.

[0041] The application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the related invention, not to limit it. It should be noted that, where no conflict arises, the embodime...


PUM

No PUM

Abstract

The invention belongs to the technical field of service robots, and particularly relates to a service robot grasping detection method, system and device based on a dual-channel convolutional neural network. It aims to solve the problem that robot grasping detection methods find it difficult to achieve both real-time performance and accuracy. The method comprises the following steps: acquiring an original color image and an original depth image of the surrounding environment, and converting the original color image into an original grayscale image; obtaining a target object bounding box, and from it a first depth image region and a first grayscale image region; adjusting each image region to a set size; encoding the resized image regions, and adding the encoded feature maps; decoding the added feature maps to obtain a grasp quality feature map, a grasp width feature map, a grasp height feature map, a first grasp angle feature map and a second grasp angle feature map; and obtaining an optimal grasp rectangle and the corresponding optimal grasp detection frame in the original color image, thereby realizing grasp detection of the target object. The invention ensures both the real-time performance and the accuracy of grasping.
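The abstract's final steps read five decoded feature maps and pick the optimal grasp rectangle. A minimal sketch of that selection step, assuming the two angle maps encode the cosine and sine of twice the grasp angle (a common parameterization in grasp detection networks such as GG-CNN); the function names and map layout here are illustrative, not taken from the patent:

```python
import numpy as np

def to_grayscale(rgb):
    # ITU-R BT.601 luma weights; rgb is an H x W x 3 float array
    return rgb @ np.array([0.299, 0.587, 0.114])

def select_best_grasp(quality, width, height, angle_cos, angle_sin):
    # locate the pixel with the highest predicted grasp quality
    y, x = np.unravel_index(np.argmax(quality), quality.shape)
    # assumed parameterization: the angle maps hold cos/sin of twice the
    # grasp angle, recovered via 0.5 * atan2 (GG-CNN-style decoding)
    angle = 0.5 * np.arctan2(angle_sin[y, x], angle_cos[y, x])
    return (int(x), int(y), float(width[y, x]),
            float(height[y, x]), float(angle))

# toy 8x8 maps standing in for the five decoder outputs
rng = np.random.default_rng(0)
quality, width, height, a_cos, a_sin = (rng.random((8, 8)) for _ in range(5))
grasp = select_best_grasp(quality, width, height, a_cos, a_sin)
```

The returned tuple (center x, center y, width, height, angle) is one common way to express a grasp rectangle; mapping it back into the original color image would additionally undo the crop-and-resize applied during preprocessing.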

Description

technical field [0001] The invention belongs to the technical field of service robots, and in particular relates to a method, system and device for service robot grasping detection based on a dual-channel convolutional neural network. Background technique [0002] In recent years, with the rapid development of artificial intelligence, sensing technology and computing technology, service robots have gradually entered people's daily life. Among them, service robots equipped with robotic arms have become a focus of attention because they can provide object-grabbing services. In order to grasp a target object, the robot first needs to use a deep-learning-based object detection method (such as Faster R-CNN, YOLO or SSD) to detect the target object, then determine the best grasp detection frame for the target object in the image, and then guide the robotic arm to carry out the grasping operation, in which determining the best grasping detection frame of ...
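The two-stage workflow described in the background (detect the object, then estimate the best grasp inside its bounding box) can be sketched with placeholder stubs; `detect_object` and `predict_grasp` are hypothetical stand-ins for a real detector (e.g. Faster R-CNN, YOLO, SSD) and a grasp network, not the patent's implementation:

```python
def detect_object(image):
    # stub: a real detector would return the target object's
    # bounding box as (x, y, w, h) in image coordinates
    return (40, 30, 64, 64)

def predict_grasp(image, box):
    # stub: a real grasp network would return a grasp rectangle
    # (center x, center y, width, height, angle) inside the box
    x, y, w, h = box
    return (x + w // 2, y + h // 2, 20, 10, 0.0)

def grasp_pipeline(image):
    # stage 1: object detection; stage 2: grasp detection in the crop
    box = detect_object(image)
    return predict_grasp(image, box)

result = grasp_pipeline(None)  # → (72, 62, 20, 10, 0.0)
```

The patent's contribution sits in the second stage, where the dual-channel network fuses grayscale and depth information to keep this step both fast and accurate.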

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Application (China)
IPC(8): G06T7/73; G06T7/90; G06T9/00; G06N3/04
CPC: G06T7/73; G06T7/90; G06T9/00; G06N3/045
Inventors: 曹志强 (Cao Zhiqiang), 李忠辉 (Li Zhonghui), 亢晋立 (Kang Jinli), 于莹莹 (Yu Yingying), 喻俊志 (Yu Junzhi), 谭民 (Tan Min)
Owner INST OF AUTOMATION CHINESE ACAD OF SCI