Method and device for determining pose of to-be-grabbed object and electronic equipment

A technology relating to object pose determination, applied in the field of devices and electronic equipment for determining the pose of an object to be grasped. It solves the problems of poor robustness, high labor and time cost, and reduced accuracy and success rate of object grasping, achieving the effects of improved accuracy and success rate while saving labor and time.

Active Publication Date: 2019-10-01
BEIJING ORION STAR TECH CO LTD
Cites: 4 | Cited by: 2

AI Technical Summary

Problems solved by technology

[0004] In practical applications, on the one hand, because objects vary widely in type and shape, the model data of an object is not known in advance. Before an object can be grasped, its model data must be obtained, that is, the object must be modeled manually, which consumes a great deal of labor and time.
On the other hand, due to sensor noise, object occlusion, modeling error, and other factors, pose estimation based on a pre-built object model is not robust, which degrades the accuracy and success rate of subsequent object grasping.



Examples


Embodiments

[0154] As an implementation manner of the embodiment of the present invention, the above method may also include:

[0155] According to the pose of the object to be grasped, the robotic arm is controlled to grasp the object to be grasped.

[0156] Since the pose of the object to be grasped relative to the coordinate system of the robotic arm can be determined, the electronic device can, according to that pose, control the robotic arm to move to the position of the object to be grasped and then grasp it.

[0157] It can be seen that in this embodiment, after the pose of the object to be grasped is determined, the electronic device can control the robotic arm to grasp the object according to that pose, with high grasping accuracy.
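To make paragraphs [0155]-[0157] concrete, here is a minimal sketch of commanding an arm to a grasp pose that is already expressed in the arm's own coordinate frame. The Pose and Arm types, the move_to/close_gripper calls, and the pre-grasp approach offset are assumptions of this sketch, not an API defined by the patent.

```python
from dataclasses import dataclass, replace

@dataclass
class Pose:
    x: float; y: float; z: float           # position in the arm base frame, meters
    roll: float; pitch: float; yaw: float  # orientation, radians

class Arm:
    """Hypothetical stand-in for a real manipulator driver."""
    def move_to(self, pose: Pose) -> None:
        print(f"moving end effector to {pose}")
    def close_gripper(self) -> None:
        print("closing gripper")

def grasp(arm: Arm, object_pose: Pose, approach_offset: float = 0.10) -> None:
    # Approach from above the estimated object pose, then descend and grip.
    pre_grasp = replace(object_pose, z=object_pose.z + approach_offset)
    arm.move_to(pre_grasp)    # pre-grasp pose above the object
    arm.move_to(object_pose)  # descend to the estimated pose
    arm.close_gripper()       # close on the object

grasp(Arm(), Pose(x=0.4, y=-0.1, z=0.05, roll=0.0, pitch=0.0, yaw=1.2))
```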

[0158] As an implementation manner of the embodiment of the present invention, as shown in Figu...


Abstract

The embodiment of the invention provides a method and device for determining the pose of a to-be-grabbed object, and electronic equipment. The method comprises the steps of: acquiring a target image, containing the to-be-grabbed object, collected by a first image sensor; rotating the target image in the imaging plane according to a preset rotation rule to obtain a preset number of rotation images; inputting the target image and the rotation images into a pre-trained deep neural network model for detection to obtain an output result for each image; and determining the pose of the to-be-grabbed object according to the output results and the preset rotation rule. Because no manual object modeling is needed, and because rotating the target image and detecting with the deep neural network model converts three-dimensional pose estimation into a one-dimensional rotation-angle problem, labor and time can be greatly saved, and the accuracy and success rate of subsequent object grabbing are improved.
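As a minimal sketch of this pipeline, assume the preset rotation rule is successive 90-degree in-plane rotations, and let detect() stand in for the pre-trained deep neural network model, returning a detection confidence and a residual in-plane angle. None of these names or choices come from the patent itself.

```python
import numpy as np

def detect(image: np.ndarray) -> tuple[float, float]:
    """Stub for the trained model: returns (confidence, residual angle in degrees)."""
    return float(image[0, 0, 0]), 10.0  # placeholder values

def estimate_in_plane_angle(target: np.ndarray) -> float:
    """Rotate the target image by the preset rule (k * 90 degrees), run the
    detector on every copy, and combine the best detection with the known
    applied rotation into a single in-plane angle for the object."""
    best_conf, best_angle = -np.inf, 0.0
    for k in range(4):                         # 0, 90, 180, 270 degrees
        rotated = np.rot90(target, k=k)        # in-plane rotation of the image
        conf, residual = detect(rotated)
        angle = (residual - k * 90.0) % 360.0  # undo the applied rotation
        if conf > best_conf:
            best_conf, best_angle = conf, angle
    return best_angle

print(estimate_in_plane_angle(np.random.rand(224, 224, 3)))
```

Arbitrary preset angles would need a general image rotation (for example scipy.ndimage.rotate) instead of np.rot90; the key point is that comparing detection results across known rotations reduces pose estimation to recovering one rotation angle.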

Description

Technical field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a method, device, and electronic equipment for determining the pose of an object to be grasped.

Background technique

[0002] In recent years, artificial intelligence technology has developed rapidly. In the field of artificial intelligence, in many applications such as industrial robots and service robots, grasping objects with a robotic arm is an indispensable technology. However, grasping objects with a robotic arm has always been a relatively difficult problem, especially for objects of arbitrary shape and position.

[0003] The classic method for a robotic arm to grasp an object is to first collect an image of the object, then perform object segmentation and object recognition on the collected image, then estimate the pose of the object according to the model data of the object, and then select the grasping point and determine the grasping sche...
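For contrast with the proposed method, the classic model-based pipeline of paragraph [0003] can be sketched as below; every function here is a hypothetical placeholder rather than anything the patent specifies.

```python
import numpy as np

def segment(image: np.ndarray) -> np.ndarray:
    return image                     # placeholder: isolate the object region

def recognize(region: np.ndarray) -> str:
    return "unknown"                 # placeholder: classify the object

def estimate_pose_from_model(region: np.ndarray, model) -> np.ndarray:
    # Placeholder for fitting a pre-built object model (CAD / point cloud)
    # to the observation: the step that requires manual modeling and that,
    # per the patent, degrades under noise, occlusion, and modeling error.
    return np.eye(4)                 # 4x4 homogeneous object pose

def classic_grasp_pipeline(image: np.ndarray, model_db: dict) -> np.ndarray:
    region = segment(image)          # object segmentation
    label = recognize(region)        # object recognition
    pose = estimate_pose_from_model(region, model_db.get(label))
    return pose                      # grasp-point selection would follow

print(classic_grasp_pipeline(np.zeros((64, 64, 3)), {}))
```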


Application Information

IPC(8): G06T7/70
CPC: G06T7/70; G06V2201/06; G06V10/82
Inventor: 赵哲
Owner: BEIJING ORION STAR TECH CO LTD