
Robot 3D shape recognition method based on multi-view information fusion

A three-dimensional shape recognition technology applied in the field of multi-view visual information. It addresses the problems that a robot cannot obtain full-view visual information of a three-dimensional shape, that a single view carries insufficient information, and that existing methods are of limited practical use, and it achieves high recognition accuracy.

Active Publication Date: 2017-07-14
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

[0005] In summary, for robot 3D shape recognition, full-view methods can achieve relatively high recognition and classification accuracy, but they are of limited use in practice: a robot can rarely obtain full-view visual information of a 3D shape and usually has to perform the recognition task from the visual information of only a few viewpoints.
In contrast, single-view methods are computationally fast, but a single view carries too little information, which leads to low classification and recognition accuracy and cannot meet practical needs.

Method used


Examples


Embodiment Construction

[0035] Embodiments of the present invention are described in detail below. The embodiments are exemplary and are intended to explain the present invention; they should not be construed as limiting it.

[0036] Figure 1 shows the general flow of the robot's recognition of a three-dimensional shape as realized by the present invention. The purpose of the invention is to enable a robot to recognize three-dimensional shapes quickly and efficiently while in motion. The figure contains the view images of the three-dimensional shape captured from different perspectives as the robot moves. In the recognition process, the views are first sorted by visual similarity to obtain an ordered set of visual information; the ordered view images are then fed into a convolutional neural network to learn hierarchical deep features, which are in turn passed to a long short-term memory model to obtain deep features of the time-space sequen...
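As a rough illustration of the pipeline described above, the following minimal sketch shows how per-view CNN features could be fed to an LSTM for sequence-level shape classification. It is not the patent's implementation: the ResNet-18 backbone, the feature and hidden sizes, and the classification head are all assumptions chosen for demonstration.

```python
# Minimal sketch of the multi-view pipeline: CNN features per view -> LSTM over the
# ordered view sequence -> shape class. Backbone choice, feature size, and hidden size
# are illustrative assumptions, not values specified by the patent.
import torch
import torch.nn as nn
import torchvision.models as models

class MultiViewRecognizer(nn.Module):
    def __init__(self, num_classes: int, hidden_size: int = 256):
        super().__init__()
        backbone = models.resnet18(weights=None)                    # hierarchical deep features
        self.cnn = nn.Sequential(*list(backbone.children())[:-1])   # drop the final FC layer
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, views: torch.Tensor) -> torch.Tensor:
        # views: (batch, num_views, 3, H, W), already ordered by image similarity
        b, v, c, h, w = views.shape
        feats = self.cnn(views.reshape(b * v, c, h, w)).reshape(b, v, -1)  # (b, v, 512)
        _, (h_n, _) = self.lstm(feats)     # abstract space-time sequence feature
        return self.classifier(h_n[-1])    # class scores for the 3D shape

# Usage: eight 224x224 views of each shape, 40 candidate classes (illustrative numbers).
model = MultiViewRecognizer(num_classes=40)
scores = model(torch.randn(2, 8, 3, 224, 224))
print(scores.shape)  # torch.Size([2, 40])
```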



Abstract

The invention provides a robot 3D shape recognition method based on multi-view information fusion. The method combines the advantages of full-view and single-view methods while overcoming the disadvantages of both, and comprises the following steps: firstly, ordering the multi-view images of a 3D shape acquired by a robot in motion using an image similarity detection technique; secondly, extracting hierarchical depth features from the ordered views with a convolutional neural network; finally, learning these visual features, which carry a definite temporal and spatial order, with a long short-term memory model to obtain a highly abstract space-time feature. The method not only simulates the hierarchical learning mechanism of human beings but also adds a space-time sequence learning mechanism that mimics human learning, and it realizes high-precision classification and recognition of 3D shapes through multi-view information fusion.
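The abstract does not specify which similarity measure or ordering strategy is used. As one hedged illustration of the first step, the sketch below orders views greedily by a cosine-similarity chain over raw grayscale images; the measure, the greedy strategy, and the choice of starting view are all assumptions.

```python
# Illustrative view ordering by image similarity (greedy nearest-neighbour chain).
# The cosine measure and the greedy strategy are assumptions for demonstration only;
# the patent only states that views are ordered by an image similarity detection technique.
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def order_views_by_similarity(views: list, start: int = 0) -> list:
    """Return an index order where each next view is the most similar unused view."""
    remaining = set(range(len(views))) - {start}
    order = [start]
    while remaining:
        last = views[order[-1]]
        nxt = max(remaining, key=lambda i: cosine_sim(last, views[i]))
        order.append(nxt)
        remaining.remove(nxt)
    return order

# Usage: eight random grayscale "views"; real views would come from the robot's camera.
views = [np.random.rand(32, 32) for _ in range(8)]
print(order_views_by_similarity(views))  # e.g. [0, 5, 2, ...]
```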

Description

Technical field

[0001] The present invention relates to the fields of robotics and computer vision. Specifically, it uses the multi-view visual information obtained by the robot's visual sensor, together with a hierarchical deep learning network, a time-space sequence deep learning network, and an image similarity detection and sorting technique, to realize the robot's recognition and classification of 3D shapes.

Background technique

[0002] 3D shape recognition has long been a hotspot in robotics and computer vision, and fast, efficient recognition of three-dimensional shapes is of great significance in real life. For example, robots or unmanned aerial vehicles can quickly retrieve and identify objects in a database through three-dimensional shape matching, using it to find and lock onto targets or to avoid obstacles, thereby improving their degree of intelligence; public security and other fields use 3D matching technol...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06K 9/62; G06N 3/08
CPC: G06N 3/084; G06F 18/22
Inventor: 布树辉 (Bu Shuhui), 王磊 (Wang Lei), 刘贞报 (Liu Zhenbao)
Owner: NORTHWESTERN POLYTECHNICAL UNIV