
Target capture point prediction method based on online confidence discrimination

A capture-point prediction method using confidence estimation, applied in the fields of image processing and computer vision. It addresses the problems that passive prediction cannot effectively improve the capture success rate, that success cannot be guaranteed when the target's pose is unfavorable, and that capture difficulty increases in cluttered scenes; the method meets real-time capture requirements and makes favorable capture conditions easy to achieve.

Active Publication Date: 2017-09-29
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

[0004] 1. This framework reacts to the image only passively. When the current pose or viewing angle of the object is unsuitable for capture, the system can only grasp at a capture-point position with low confidence, so the capture success rate cannot be guaranteed.
[0005] 2. In multi-object capture tasks in cluttered environments, grasping based solely on detection features may lead to "grabbing the lower objects before the objects stacked on top of them", which increases capture difficulty.
[0006] For the first problem, existing improvements still focus mainly on the capture-point prediction algorithm itself: by enlarging the training dataset and using a more powerful predictor, the algorithm can obtain more accurate capture-point locations even in extreme environments. However, it remains a passive prediction method and cannot effectively improve the capture success rate.



Detailed Description of the Embodiments

[0036] The present invention will now be further described in conjunction with the embodiments and accompanying drawings:

[0037] The present invention is a target capture-point prediction method based on online confidence discrimination. It consists of six parts: target recognition, target bounding-box alignment, capture-point prediction, online capture-point discrimination, robotic-arm motion path planning, and secondary capture-point prediction.
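The six parts named in [0037] form a sequential pipeline. The sketch below is a hypothetical Python skeleton of that pipeline; every function name, signature, and placeholder body is illustrative and not taken from the patent text.

```python
# Hypothetical skeleton of the six-stage pipeline described in [0037].
# All names and placeholder bodies are illustrative assumptions.

def recognize_targets(image):
    """Stage 1: propose candidate object regions (e.g. via EdgeBoxes)."""
    return [(10, 10, 50, 50)]  # placeholder boxes as (x, y, w, h)

def align_bounding_box(image, box):
    """Stage 2: refine and align the bounding box around the target."""
    return box  # placeholder: no refinement

def predict_capture_point(image, box):
    """Stage 3: CNN-based capture-point prediction inside the box."""
    x, y, w, h = box
    return (x + w / 2, y + h / 2)  # placeholder: box centre

def discriminate_online(image, point):
    """Stage 4: score the confidence of the current capture point."""
    return 0.5  # placeholder confidence in [0, 1]

def plan_arm_motion(point):
    """Stage 5: plan the robotic arm's motion path to the point."""
    return [point]  # placeholder single-waypoint path

def repredict_capture_point(image, box):
    """Stage 6: secondary prediction after the arm has moved."""
    return predict_capture_point(image, box)
```

In a real system each stub would be replaced by the corresponding model or planner; the skeleton only makes the data flow between the six stages explicit.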

[0038] The method specifically includes the following steps:

[0039] 1. Target recognition

[0040] To avoid the low time efficiency of traditional sliding-window search, a target recognition method based on objectness sampling is adopted. First, the real-time image from the hand-eye camera is acquired. For the current image, the EdgeBoxes method [1] is used to predict all regions that may contain objects.

[0041] [1] C. L. Zitnick and P. Dollár, "Edge Boxes: Locating Object Proposals from Edges," in Proc. European Conference on Computer Vision (ECCV), 2014.
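EdgeBoxes-style proposal generators typically return many overlapping scored boxes, which are usually deduplicated before further processing. As a minimal sketch of that common post-processing step (not a detail stated in the patent), the following pure-Python greedy non-maximum suppression keeps the highest-scoring proposals and drops heavily overlapping ones:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def nms(boxes, scores, thresh=0.5):
    """Greedy non-maximum suppression: visit boxes in descending score
    order, keeping a box only if its IoU with every already-kept box
    is at most `thresh`. Returns indices of the kept boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= thresh for j in keep):
            keep.append(i)
    return keep
```

For example, with two nearly identical boxes and one distant box, `nms` keeps the higher-scoring duplicate and the distant box.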


Abstract

The invention relates to a target capture-point prediction method based on online confidence discrimination. The method comprises the following steps: the EdgeBoxes method is used to predict all regions of the real-time hand-eye camera image that may contain objects; a convolutional neural network trained on the Cornell Grasp Detection Dataset predicts object capture-point positions; online discrimination of the capture points yields the confidence of the current capture point and a rotation angle for the robotic arm, and after correction the arm reaches a state suitable for capture. Compared with traditional capture-point prediction methods, the method shows better adaptability to complex environments and higher capture-point prediction precision, giving it broad application prospects. It can evaluate the current capture condition online and, through the arm's own motion, change the pose of the target to be captured so as to obtain a better capture condition.
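The abstract's key active element is the correction loop: when the current capture point's confidence is low, the arm moves (changing the viewpoint or the target's pose) and prediction is repeated. A minimal sketch of such a confidence-gated retry loop is shown below; the function names, threshold, and retry limit are illustrative assumptions, not values from the patent.

```python
def capture_with_correction(predict, confidence, rotate,
                            threshold=0.7, max_attempts=5):
    """Confidence-gated capture loop (illustrative sketch).

    `predict()` returns a candidate capture point, `confidence(point)`
    scores it in [0, 1], and `rotate()` moves the arm to change the
    viewpoint. If confidence is below `threshold`, rotate and re-predict,
    up to `max_attempts` times. Returns the best (point, confidence) seen.
    """
    best = None
    for _ in range(max_attempts):
        point = predict()
        conf = confidence(point)
        if best is None or conf > best[1]:
            best = (point, conf)
        if conf >= threshold:
            break  # good enough: proceed to grasp
        rotate()   # actively change the capture condition
    return best
```

The early break keeps the loop real-time friendly: the arm only moves when the current condition is judged unfavorable.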

Description

Technical field

[0001] The invention belongs to the field of image processing and computer vision, and relates to a target capture-point prediction method based on online confidence discrimination.

Background technique

[0002] Vision is an important way for robots to interact with the external environment, and grasping with a robotic arm is an important means of robot operation. With the development of multi-degree-of-freedom manipulator technology and the continuous improvement in the flexibility of multi-joint grippers, manipulators can now complete most target-grasping tasks more flexibly and effectively. Traditionally, when a robotic arm grasps a target, the target is placed on a single, clean table, and there is often only a single object in the field of view of the arm's hand-eye camera. In actual operation, however, these conditions are difficult to meet, s...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V20/64
Inventors: 黄攀峰, 陈路, 张海涛, 孟中杰, 刘正雄, 张夷斋, 张帆
Owner: NORTHWESTERN POLYTECHNICAL UNIV