Detection method for target object grab position based on deep learning robot

A deep learning detection technology, applied in the field of detecting the grasping position of a target object, which can solve the problems that existing methods require extensive manual participation, are time-consuming, and cannot accurately detect the grasping position of unseen target objects.

Inactive Publication Date: 2017-05-31
WUHU HIT ROBOT TECH RES INST

AI Technical Summary

Problems solved by technology

[0004] The embodiment of the present invention provides a method for detecting the grasping position of an object based on a deep learning robot, which aims to solve the shortcomings of existing methods that manually design object grasping features: they are time-consuming, require extensive manual participation, and cannot accurately detect the grasping position of a target object the robot has not seen before, so that the grasping action cannot be executed.




Embodiment Construction

[0014] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0015] Figure 1 is a flow chart of the method for detecting the grasping position of an object based on a deep learning robot provided in an embodiment of the present invention. The method includes the following steps:

[0016] S1. Collect an RGB-D image containing a target object through a sensor;

[0017] The embodiment of the present invention uses the Microsoft Kinect sensor to obtain a high-resolution RGB image and depth image of the object to be grasped. The RGB image contains the surface color information and texture information of the object to be grasped, and the depth image contains the spa...
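As an illustration of step S1, the RGB and depth frames delivered by such a sensor can be stacked into a single four-channel RGB-D array for the later steps. The frame shapes and the millimetre depth scale below are assumptions typical of the Kinect, not values stated in the patent; a real capture would come from a sensor driver (e.g. libfreenect or the Kinect SDK), for which placeholder arrays stand in here.

```python
import numpy as np

# Placeholder frames standing in for a real Kinect capture (in practice these
# would come from a driver such as libfreenect or the Kinect SDK).
rgb = np.zeros((480, 640, 3), dtype=np.uint8)    # 8-bit colour image
depth = np.zeros((480, 640), dtype=np.uint16)    # depth, assumed in millimetres

# Normalise both modalities and stack them into one H x W x 4 RGB-D array,
# the form consumed by the candidate-region steps that follow.
rgbd = np.concatenate(
    [rgb.astype(np.float32) / 255.0,
     (depth.astype(np.float32) / 1000.0)[..., None]],
    axis=2,
)
```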



Abstract

The invention is applicable to the field of robot grasping and provides a detection method for a target object grab position based on a deep learning robot. The method comprises the steps that: an RGB-D image containing a target object is collected through a sensor; candidate grab regions are delimited within the target region of the RGB-D image; each candidate grab region is magnified to the input size required by a neural network while its length-width ratio is kept unchanged; input vectors are constructed from the magnified candidate grab regions; the input vectors are whitened, and the whitened input vectors are fed into the trained neural network; and the score of each candidate grab region is acquired, with the highest-scoring candidate grab region determined as the grab position. The grab position of the target object can thus be determined just by acquiring its RGB-D image, the robot can grab any target object at this position, and no manual intervention is needed.
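The pipeline described in the abstract can be sketched end to end. Everything below is a hedged illustration, not the patented implementation: the function names, the 32x32 network input size, the per-example zero-mean/unit-variance whitening, and the stand-in linear scorer (used in place of the trained neural network) are all assumptions introduced for the sketch.

```python
import numpy as np

def resize_keep_aspect(patch, out_size):
    """Scale a patch so its longer side equals out_size, preserving the
    length-width ratio, then zero-pad to an out_size x out_size canvas.
    Nearest-neighbour resampling keeps the sketch dependency-free."""
    h, w = patch.shape[:2]
    scale = out_size / max(h, w)
    nh, nw = max(1, round(h * scale)), max(1, round(w * scale))
    rows = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = patch[rows][:, cols]
    canvas = np.zeros((out_size, out_size, patch.shape[2]), dtype=patch.dtype)
    canvas[:nh, :nw] = resized
    return canvas

def whiten(vec, eps=1e-8):
    """Per-example whitening (assumed here as zero mean, unit variance)."""
    return (vec - vec.mean()) / (vec.std() + eps)

def detect_grasp(rgbd, candidates, score_fn, out_size=32):
    """Score each candidate region (x, y, w, h) and return the best one."""
    best, best_score = None, -np.inf
    for (x, y, w, h) in candidates:
        patch = rgbd[y:y + h, x:x + w]
        vec = whiten(resize_keep_aspect(patch, out_size)
                     .astype(np.float64).ravel())
        score = score_fn(vec)       # trained network's forward pass in practice
        if score > best_score:
            best, best_score = (x, y, w, h), score
    return best, best_score
```

With a trained network, `score_fn` would wrap its forward pass; the surrounding candidate-generation, resizing, and whitening steps are unchanged.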

Description

Technical Field

[0001] The invention belongs to the field of robot grasping, and in particular relates to a method for detecting the grasping position of a target object based on a deep learning robot.

Background Technique

[0002] In order to care for the elderly, the disabled, and other people with impairments, grasping common objects in the home environment, such as teacups, beverage bottles, and books, has become an indispensable functional requirement for home service robots. Unlike the grasping of workpieces by industrial robots in a structured environment, the intelligent grasping of service robots in a home environment faces many challenges, such as dynamic environments, lighting changes, dozens or even hundreds of target objects, complex backgrounds, and mutual occlusion between objects.

[0003] At present, robot grasping detection technology includes the following types: artificially design the grasping characteristics of the t...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/70
CPC: G06T2207/20081; G06T2207/20084
Inventors: GAO Jing, LI Chao, CAO Chuqing
Owner: WUHU HIT ROBOT TECH RES INST