
Visual capture method and device based on depth image and readable storage medium

A depth-image and vision technology, applied to instruments, character and pattern recognition, and computer components. It addresses problems such as the misjudgment of objects with similar shape features, which degrades recognition accuracy, while saving storage space, reducing program running time, and improving robustness.

Inactive Publication Date: 2018-03-02
SHANTOU UNIV

Problems solved by technology

Although object recognition based on three-dimensional image information can make good use of an object's three-dimensional information, such as shape features, it cannot exploit the advantages of image-processing methods or the object's texture information. It can also misjudge objects with similar shape features, affecting the accuracy of object recognition.


Embodiment Construction

[0037] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0038] Referring to Figure 1, an embodiment of the present invention provides a visual capture method based on a depth image, comprising the following steps:

[0039] Step 100: Obtain a point cloud image, segment it using the RANSAC (random sample consensus) algorithm and the Euclidean clustering algorithm, and extract the target object to be identified.
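The two-stage segmentation in Step 100 can be sketched in plain numpy: a RANSAC fit removes the dominant support plane (e.g. a tabletop), and a greedy region-growing pass then groups the remaining points into object clusters. This is a minimal illustration under assumed data and thresholds, not the patent's implementation; a real system would use an optimized library such as PCL or Open3D.

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.01, rng=None):
    """Fit the dominant plane with RANSAC; return a boolean inlier mask."""
    rng = rng or np.random.default_rng(0)
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        idx = rng.choice(len(points), 3, replace=False)
        p0, p1, p2 = points[idx]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p0) @ normal)
        mask = dist < dist_thresh
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask

def euclidean_cluster(points, tol=0.05):
    """Greedy Euclidean clustering: group points closer than `tol`."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in list(unvisited)
                    if np.linalg.norm(points[i] - points[j]) < tol]
            for j in near:
                unvisited.discard(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(cluster)
    return clusters

# Toy scene (illustrative): a flat table plane plus one small object above it.
rng = np.random.default_rng(1)
table = np.c_[rng.uniform(0, 1, (200, 2)), np.zeros(200)]
obj = rng.normal([0.5, 0.5, 0.2], 0.02, (50, 3))
cloud = np.vstack([table, obj])

plane = ransac_plane(cloud)          # inliers of the tabletop
remaining = cloud[~plane]            # points left after plane removal
clusters = euclidean_cluster(remaining)  # candidate target objects
```

Removing the plane first is what makes the clustering step meaningful: without it, every object would be connected to every other object through the tabletop points.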

[0040] Referring to Figure 2, the point cloud image 101 of each object is first acquired with the Kinect depth camera; the point cloud image includes the position information (x, y, z) and color information (R, G, B) of the object. Before segmenting the target object from the point cloud image acquired by the Kinect, filter out the point cloud information far from the targ...
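The pre-segmentation filtering described above amounts to a pass-through filter on depth: points whose z coordinate falls outside the working range are discarded before segmentation. A minimal sketch, with an illustrative cloud layout (columns x, y, z, R, G, B) and depth limits that are assumptions, not values from the patent:

```python
import numpy as np

# Hypothetical point cloud: each row is (x, y, z, R, G, B).
cloud = np.array([
    [0.10, 0.05, 0.60, 200, 30, 30],   # near the target
    [0.12, 0.02, 0.65, 190, 40, 35],   # near the target
    [1.50, 0.90, 3.20,  80, 80, 80],   # background, too far away
])

# Keep only points whose depth z lies inside the assumed working range.
z_min, z_max = 0.3, 1.2
kept = cloud[(cloud[:, 2] >= z_min) & (cloud[:, 2] <= z_max)]
```

Discarding distant points up front shrinks the cloud, which is consistent with the stated effects of saving storage space and reducing program running time.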


Abstract

The invention discloses a visual capture method and device based on a depth image, and a readable storage medium. The method comprises the following steps: a point cloud image is acquired through a Kinect depth camera; the acquired point cloud image is segmented through the RANSAC (random sample consensus) algorithm and a Euclidean clustering algorithm, and the target object to be identified is obtained through segmentation; the 3D global features and color features of the object are respectively extracted and fused to form a new global feature; a multi-class support vector machine (SVM) classifier is trained offline using the new global feature of the object, and the category of the target object is identified by the trained multi-class SVM classifier according to the new global feature; the category and the grasping position of the target object are then determined; and lastly, according to the category and grasping position of the target object, a manipulator and gripper are controlled to grasp the target object and move it to the specified position. The advantage of the method is that the target object can be accurately identified and grasped.
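The feature-fusion step in the abstract, concatenating a 3D global shape descriptor with a color descriptor into one vector, can be sketched as follows. The specific descriptors here (a distance-to-centroid histogram and per-channel RGB histograms) are illustrative stand-ins, not the patent's features; the point is only that fusion is a concatenation whose result is fed to the multi-class SVM.

```python
import numpy as np

def shape_histogram(points, bins=8):
    """Toy global shape descriptor: normalized histogram of
    point distances to the cloud centroid."""
    d = np.linalg.norm(points - points.mean(axis=0), axis=1)
    h, _ = np.histogram(d, bins=bins, range=(0, d.max() + 1e-9))
    return h / h.sum()

def color_histogram(colors, bins=4):
    """Toy color descriptor: per-channel RGB histograms, concatenated."""
    h = [np.histogram(colors[:, c], bins=bins, range=(0, 256))[0]
         for c in range(3)]
    h = np.concatenate(h).astype(float)
    return h / h.sum()

# Illustrative segmented object: 100 points with positions and colors.
rng = np.random.default_rng(0)
pts = rng.normal(0.0, 0.03, (100, 3))
cols = rng.integers(0, 256, (100, 3))

# Fusion: concatenate shape and color descriptors into one global feature.
fused = np.concatenate([shape_histogram(pts), color_histogram(cols)])
```

One such fused vector per segmented object would serve as a training or query sample for the multi-class SVM classifier described in the abstract.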

Description

Technical Field

[0001] The present invention relates to the field of robot visual grasping, and in particular to a depth image-based visual grasping method, device and readable storage medium.

Background Art

[0002] Robot visual grasping is one of the most important research directions in robotics, chiefly because of its wide range of application scenarios. On factory assembly lines, large numbers of robotic arms rely on visual grasping to complete assembly work; in daily service robots, visual grasping is likewise required for everyday tasks. Moreover, with the rise of online shopping and the Internet of Things, intelligently classifying and picking items in warehouses is becoming increasingly important. At the same time, robotic-arm visual grasping involves a multidisciplinary approach, including disciplines such as automatic control science, image processing, m...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06K9/46
CPC: G06V10/50; G06V10/56; G06F18/23; G06F18/2411
Inventors: 范衠, 李中兴, 朱贵杰, 李冲, 王宇鹏
Owner: SHANTOU UNIV