
Real-time pose estimation method and positioning grabbing system for three-dimensional target object

A pose estimation technology for three-dimensional target objects, applied in the fields of computing, computer components, and image data processing, which can solve problems such as the low efficiency and low accuracy of object pose estimation.

Active Publication Date: 2020-01-03
SHENZHEN HUAHAN WEIYE TECH
Cites: 3 · Cited by: 27

AI Technical Summary

Problems solved by technology

[0008] In view of this, to solve the problem of low efficiency and low accuracy of object pose estimation when robots are relied on for part recognition in current industrial production, this application provides a real-time pose estimation method and a positioning and grasping system for three-dimensional target objects.



Examples


Embodiment 1

[0048] Referring to Figure 1, the present application discloses a real-time pose estimation method for a three-dimensional target object, comprising steps S100-S400, which are described in turn below.
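For orientation, the four steps can be read as a small processing pipeline. The sketch below is purely illustrative: the function names and the decomposition into helpers are assumptions made for this summary, not taken from the patent, and the individual stages are sketched in the examples that follow.

```python
# Illustrative skeleton of steps S100-S400. All names are assumptions made for this
# summary; the helper functions are sketched in the later examples.

def estimate_pose_realtime(scan_path, model_database):
    cloud = acquire_graphic_information(scan_path)    # S100: 3D graphic information
    descriptors = compute_local_descriptors(cloud)    # S200: local feature descriptors
    pose = estimate_target_pose(cloud, descriptors,   # S300: match against the
                                model_database)       #       pre-established model database
    print(pose)                                       # S400: output the pose information
    return pose
```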

[0049] Step S100, acquiring three-dimensional graphic information of a target object.

[0050] It should be noted that the target object here may be a product on an industrial assembly line, a mechanical part in a bin, a tool on an operating table, and so on; it is not specifically limited. The three-dimensional graphic information of the target object can then be acquired with a visual sensing instrument such as a camera or a laser scanner, and this graphic information may be part of the surface shape data of the target object.
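As a concrete illustration of step S100, the scan produced by such a sensor is commonly handled as a point cloud. The following is a minimal sketch assuming the scan has been saved to a file and that Open3D is available; the file name and the optional down-sampling are illustrative choices, not requirements of the patent.

```python
# Minimal sketch of S100, assuming the sensor output is available as a point-cloud file.
# Open3D, the file name, and the voxel size are illustrative assumptions.
import open3d as o3d

def acquire_graphic_information(scan_path="scan.pcd"):
    cloud = o3d.io.read_point_cloud(scan_path)         # partial surface scan of the target
    return cloud.voxel_down_sample(voxel_size=0.005)   # thin the cloud for real-time processing
```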

[0051] Step S200, calculating the local feature descriptor of the target object from the graphic information. In a specific example, referring to Figure 2, step S200 may inclu...
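The excerpt is cut off before the patent names its own descriptor, so the sketch below stands in with FPFH (Fast Point Feature Histograms), a widely used local feature descriptor for point clouds. The choice of FPFH, the search radii, and the Open3D API are all assumptions.

```python
# Sketch of S200 with FPFH as a stand-in local feature descriptor; the patent's own
# descriptor is not shown in the excerpt above, so this choice is an assumption.
import open3d as o3d

def compute_local_descriptors(cloud, radius_normal=0.01, radius_feature=0.025):
    # Surface normals are needed before FPFH can be computed.
    cloud.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=radius_normal, max_nn=30))
    return o3d.pipelines.registration.compute_fpfh_feature(
        cloud,
        o3d.geometry.KDTreeSearchParamHybrid(radius=radius_feature, max_nn=100))
```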

Embodiment 2

[0155] Referring to Figure 11, the present application discloses a positioning and grasping system for a target object. The system mainly includes a sensor 11, a processor 12, and a controller 13, which are described separately below.

[0156] The sensor 11 is used to collect images of the target object to form its three-dimensional graphic information. The sensor 11 may be a visual sensor with an image acquisition function, such as a camera or a laser scanner. The target object may be a product on an industrial assembly line, a mechanical part in a bin, a tool on an operating table, and so on; it is not specifically limited.

[0157] The processor 12 is connected to the sensor 11 and is used to obtain the pose information of the target object relative to the sensor 11 by means of the real-time pose estimation method disclosed in Embodiment 1.

[0158] The controller 13 is connected to the...
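To show how the three parts could fit together in software, the sketch below wires a processor and a controller around a sensor object, reusing the helper sketches given for Embodiment 1 and after the abstract below. The class and method names, the `capture()` interface, and the hand-eye transform `T_base_cam` are assumptions for illustration; the patent only fixes the roles of the three parts.

```python
# Illustrative wiring of sensor 11, processor 12 and controller 13. Class and method
# names, and the hand-eye calibration matrix, are assumptions.

class Processor:                                     # processor 12
    def __init__(self, sensor, model_database):
        self.sensor = sensor                         # sensor 11
        self.model_database = model_database

    def pose_of_target(self):
        cloud = self.sensor.capture()                           # 3D graphic information
        descriptors = compute_local_descriptors(cloud)          # Embodiment 1, step S200
        return estimate_target_pose(cloud, descriptors,
                                    self.model_database)        # pose relative to sensor 11

class Controller:                                    # controller 13
    def __init__(self, processor, T_base_cam):
        self.processor = processor
        self.T_base_cam = T_base_cam                 # assumed 4x4 sensor-to-robot-base transform

    def grab(self, motion_mechanism):
        T_cam_obj = self.processor.pose_of_target()
        T_base_obj = self.T_base_cam @ T_cam_obj     # express the pose in the robot base frame
        motion_mechanism.move_to(T_base_obj)         # command the grasp
```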



Abstract

The invention discloses a real-time pose estimation method and a positioning and grabbing system for a three-dimensional target object. The real-time pose estimation method comprises the following steps: acquiring three-dimensional graphic information of the target object; calculating a local feature descriptor of the target object from the graphic information; performing three-dimensional pose estimation of the target object with the local feature descriptor, using a pre-established three-dimensional model database, to obtain pose information of the target object; and outputting the obtained pose information. When the claimed real-time pose estimation method is applied to the positioning and grabbing system for the three-dimensional target object, the controller can control the motion mechanism to accurately grab the target object according to the pose information output by the processor. Grabbing accuracy can thus be effectively improved while grabbing efficiency is guaranteed, enhancing the practicality of the positioning and grabbing system in actual applications.
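The third step of the abstract, estimating the pose against a pre-established three-dimensional model database, is commonly realised as feature-based coarse registration followed by ICP refinement. The sketch below shows that common realisation using recent Open3D versions; it is not the patent's own matching procedure, and the database layout, thresholds, and RANSAC parameters are all assumptions.

```python
# Sketch of the database-matching step: RANSAC registration on feature correspondences,
# then ICP refinement, keeping the best-fitting model. A common realisation, not
# necessarily the patent's own procedure; all parameters are assumptions.
import open3d as o3d

def estimate_target_pose(scene_cloud, scene_fpfh, model_database, dist=0.01):
    best_pose, best_fitness = None, 0.0
    for model_cloud, model_fpfh in model_database:  # assumed entries: (point cloud, FPFH features)
        coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
            model_cloud, scene_cloud, model_fpfh, scene_fpfh,
            mutual_filter=True, max_correspondence_distance=dist)
        refined = o3d.pipelines.registration.registration_icp(
            model_cloud, scene_cloud, dist, coarse.transformation)
        if refined.fitness > best_fitness:
            best_pose, best_fitness = refined.transformation, refined.fitness
    return best_pose  # 4x4 pose of the best-matching model in the sensor frame
```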

Description

Technical Field

[0001] The invention relates to the technical field of machine vision, and in particular to a real-time pose estimation method and a positioning and grasping system for a three-dimensional target object.

Background

[0002] In today's manufacturing industry, the time spent on assembly accounts for 20%-70% of the entire manufacturing process, and its cost accounts for more than 40% of production cost, a large part of which is labor cost. To improve production efficiency and reduce labor cost, the use of robots for automated assembly has been explored. As an indispensable link in the automated assembly process, part recognition and grasp-position planning have a vital impact on assembly quality. Vision-based determination of part pose and planning of grasp positions can significantly improve the automation and flexibility of product assembly, reduce time consumption, and reduce cost...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73, G06T7/33, G06T17/00, G06K9/62, G06F16/51
CPC: G06T7/73, G06T7/33, G06T17/00, G06F16/51, G06T2207/10028, G06F18/23
Inventor: 杨洋
Owner: SHENZHEN HUAHAN WEIYE TECH