A Prediction Method of Target Capture Point Based on Online Confidence Discrimination

A capture point prediction method and confidence discrimination technology, applied in the fields of image processing and computer vision. It addresses problems such as passive methods being unable to effectively improve or guarantee the capture success rate and the increased difficulty of capture in cluttered scenes, so as to achieve real-time capture and easily obtain good capture conditions.

Active Publication Date: 2020-06-26
NORTHWESTERN POLYTECHNICAL UNIV


Problems solved by technology

[0004] 1. This framework reacts to the image only passively. When the current posture or viewing angle of the object to be captured is not suitable for grasping, the framework can only grasp at a capture point with low confidence, and the capture success rate cannot be guaranteed.
[0005] 2. For multi-object capture tasks in complex environments, capturing objects based solely on detection features may lead to grasping the lower objects in a pile before the upper ones, which increases the difficulty of capture.
[0006] For the first problem, existing improvements still focus mainly on the capture point prediction algorithm itself: by enlarging the training data set and using a more powerful predictor, the algorithm can obtain more accurate capture point locations even in extreme environments. However, this remains a passive prediction scheme and cannot effectively improve the capture success rate.




Detailed Description of the Embodiments

[0036] The present invention will now be further described in conjunction with the embodiments and the accompanying drawings:

[0037] The present invention is a target capture point prediction method based on online confidence discrimination. It consists of six parts: target recognition, target bounding box alignment, capture point prediction, online capture point discrimination, mechanical arm motion path planning, and secondary capture point prediction.

[0038] The method specifically includes the following steps:

[0039] 1. Target recognition

[0040] To avoid the low time efficiency of traditional algorithms based on sliding-window search, a target recognition method based on objectness sampling is adopted. First, the real-time image from the hand-eye camera is acquired. For the current image, the EdgeBoxes method [1] is used to predict all regions that may contain objects.

[0041] [1] C. L. Zitnick and P. Dollár, "Edge Boxes: Locating Object Proposals from Edges," in Proc. European Conference on Computer Vision (ECCV), 2014.
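Below is a minimal sketch of the EdgeBoxes proposal step from paragraph [0040], using the OpenCV contrib implementation (cv2.ximgproc). The structured-edge model file name, the frame file name, and the box limit are assumptions made for illustration only; this is not the patented implementation itself.

```python
# Hedged sketch: EdgeBoxes object proposals on one hand-eye camera frame.
# Requires opencv-contrib-python and a pre-trained structured-edge model
# ("model.yml.gz" below is an assumed path, not specified by the patent).
import cv2 as cv
import numpy as np

def propose_regions(bgr_image, edge_model_path="model.yml.gz", max_boxes=30):
    """Return candidate object regions (x, y, w, h) for the current frame."""
    detector = cv.ximgproc.createStructuredEdgeDetection(edge_model_path)
    # The structured-edge detector expects a float32 RGB image scaled to [0, 1].
    rgb = cv.cvtColor(bgr_image, cv.COLOR_BGR2RGB).astype(np.float32) / 255.0
    edges = detector.detectEdges(rgb)
    orientation = detector.computeOrientation(edges)
    edges = detector.edgesNms(edges, orientation)

    # EdgeBoxes scores a box by the edge groups it wholly encloses.
    edge_boxes = cv.ximgproc.createEdgeBoxes()
    edge_boxes.setMaxBoxes(max_boxes)
    result = edge_boxes.getBoundingBoxes(edges, orientation)
    # Recent OpenCV builds return (boxes, scores); older ones return boxes only.
    return result[0] if isinstance(result, tuple) else result

if __name__ == "__main__":
    frame = cv.imread("hand_eye_frame.png")  # placeholder image path
    if frame is not None:
        for (x, y, w, h) in propose_regions(frame):
            cv.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 1)
        cv.imwrite("proposals.png", frame)
```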



Abstract

The invention relates to a target capture point prediction method based on online confidence discrimination. The EdgeBoxes method is used to predict all regions that may contain objects in the real-time image of the hand-eye camera; a convolutional neural network trained on the Cornell Grasp Detection Dataset predicts the location of the capture point for each object; the capture point is then judged online to obtain the confidence of the current capture point and the rotation angle of the mechanical arm, so that the arm can move into a corrected capture state. Comparison with traditional target capture point prediction methods further verifies the advantages of the algorithm in adaptability to complex environments and in capture point prediction accuracy, and it has broad application prospects. The method can evaluate the current capture conditions online and change the posture of the target to be captured through the motion of the robot arm itself, so as to achieve better capture conditions.
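As a rough illustration of the online discrimination idea summarized above, the self-contained sketch below uses a stub grasp predictor and a stub arm-rotation routine: if the best capture point found in the current view is not confident enough, the arm changes the capture conditions and predicts again (secondary prediction). All names, the threshold, and the retry count are assumptions for illustration, not the patented implementation.

```python
# Hedged sketch of the capture-with-online-confidence-discrimination loop.
from dataclasses import dataclass
import random

@dataclass
class Grasp:
    x: float           # grasp centre in the image (pixels)
    y: float
    angle: float       # gripper rotation (radians)
    confidence: float  # predictor's confidence in this capture point

CONFIDENCE_THRESHOLD = 0.8  # assumed value, not from the patent
MAX_ATTEMPTS = 3            # assumed number of secondary predictions

def predict_grasp(view_id: int) -> Grasp:
    """Stand-in for the CNN capture point predictor (returns a mock grasp)."""
    rng = random.Random(view_id)
    return Grasp(x=120.0, y=80.0, angle=0.4, confidence=rng.uniform(0.5, 1.0))

def rotate_arm(delta_angle: float) -> None:
    """Stand-in for arm motion path planning and execution."""
    print(f"rotating end effector by {delta_angle:.2f} rad to improve the capture pose")

def grasp_with_online_discrimination() -> Grasp:
    grasp = predict_grasp(view_id=0)
    for attempt in range(1, MAX_ATTEMPTS):
        if grasp.confidence >= CONFIDENCE_THRESHOLD:
            break                               # current pose is good enough to grasp
        rotate_arm(delta_angle=0.3)             # actively change the capture conditions
        grasp = predict_grasp(view_id=attempt)  # secondary prediction on the new view
    return grasp

if __name__ == "__main__":
    print(grasp_with_online_discrimination())
```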

Description

Technical Field

[0001] The invention belongs to the field of image processing and computer vision, and relates to a target capture point prediction method based on online confidence discrimination.

Background

[0002] Vision is an important way for robots to interact with the external environment, and grasping with a robotic arm is an important means of robot manipulation. With the development of multi-degree-of-freedom manipulator technology and the continuous improvement in the flexibility of multi-joint grippers, a manipulator can complete most target grasping tasks more flexibly and effectively. In the traditional setting, when the robot arm grasps a target, the target to be captured is placed on a single, clean table, and there is often only one object in the field of view of the arm's hand-eye camera. In actual operation, however, the above conditions are difficult to meet, s...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00
CPC: G06V20/64
Inventors: 黄攀峰, 陈路, 张海涛, 孟中杰, 刘正雄, 张夷斋, 张帆
Owner: NORTHWESTERN POLYTECHNICAL UNIV