
Feature point elimination method based on correlation

A feature-point and correlation technology, applied in the field of vision guidance, addressing problems such as feature-point mismatches, errors in the deviation solution, and robot grasping/assembly failures.

Active Publication Date: 2021-02-26
易思维(杭州)科技有限公司
Cites: 9 | Cited by: 0


Problems solved by technology

[0002] Vision guidance typically uses vision sensors to capture images of a workpiece, compares them with previously taught images of a standard workpiece, analyzes the deviation between the measuring-point coordinates of the two, and compensates the trajectory of the grasping/assembly robot accordingly, thereby realizing an intelligent assembly process. In this process, the workpiece to be assembled/grasped cannot be placed at the detection position in exactly the same pose as the standard workpiece, and there are also machining errors between individual workpieces. If the placement deviation of the workpiece at the detection position is large, or the machining-induced deformation at a particular measuring point is large, then when the measured coordinates of the feature points of the workpiece to be assembled/grasped are acquired, the measured coordinates of individual feature points may be computed incorrectly or feature points may be mismatched, which corrupts the subsequent deviation calculation and causes the robot grasp/assembly to fail.
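To make the failure mode concrete, the following is a minimal sketch (not taken from the patent text) of how a taught-vs-measured deviation is typically estimated as a rigid transform from matched feature points; a single grossly wrong or mismatched point biases the whole estimate, which is what the disclosed elimination method guards against. All names here are illustrative assumptions.

```python
# Illustrative sketch only: least-squares rigid deviation between taught
# (standard) and measured feature-point coordinates via the Kabsch method.
import numpy as np

def rigid_deviation(standard_pts: np.ndarray, measured_pts: np.ndarray):
    """Estimate R, t with measured ~= R @ standard + t for matched (N, d) point sets."""
    mu_s = standard_pts.mean(axis=0)
    mu_m = measured_pts.mean(axis=0)
    H = (standard_pts - mu_s).T @ (measured_pts - mu_m)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(H.shape[0])
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection
    R = Vt.T @ D @ U.T
    t = mu_m - R @ mu_s
    return R, t

# One badly measured or mismatched feature point perturbs R and t for every
# point, so the robot trajectory compensation derived from them is also wrong.
```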




Embodiment Construction

[0048] In this embodiment, the technical solution of the present invention is described in detail by taking car-door grasping guidance as an example.

[0049] A correlation-based feature point elimination method for use in a vision-guided assembly/grasping process; this embodiment takes grasping guidance of a body-in-white car door component as an example.

[0050] A detection position is set in advance and the standard car-door workpiece is placed there. Once the robot has been taught so that it can correctly grasp the door, the vision sensor acquires the coordinates of each feature point of the standard workpiece; these are recorded as the standard coordinates (image distortion correction has already been applied). The feature points are N measuring points selected on the standard workpiece in advance, and the coordinates of each feature point on the workpiece's digital model are recorded as the theoretical coordinates; N > 4.
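As an illustration of this setup (not the claimed implementation), the sketch below computes per-point reprojection errors from the theoretical (CAD-model) coordinates and the distortion-corrected image coordinates, assuming a calibrated camera matrix K is available; the function name and interface are assumptions.

```python
# Sketch under stated assumptions: theoretical 3-D coordinates from the CAD
# model, 2-D detections from the (distortion-corrected) sensor image, camera
# matrix K from calibration. Uses a PnP pose fit to get per-point errors.
import numpy as np
import cv2

def reprojection_errors(theoretical_3d, image_2d, K, dist=None):
    """Per-point reprojection error of a PnP fit (requires at least 4 points)."""
    pts3d = np.asarray(theoretical_3d, dtype=np.float64).reshape(-1, 1, 3)
    pts2d = np.asarray(image_2d, dtype=np.float64).reshape(-1, 1, 2)
    dist = np.zeros(5) if dist is None else dist
    ok, rvec, tvec = cv2.solvePnP(pts3d, pts2d, K, dist)
    if not ok:
        raise RuntimeError("PnP pose estimation failed")
    projected, _ = cv2.projectPoints(pts3d, rvec, tvec, K, dist)
    return np.linalg.norm(projected - pts2d, axis=2).ravel()   # shape (N,)
```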

[0051] In...



Abstract

The invention discloses a correlation-based feature point elimination method for use in a vision-guided assembly/grasping process. Standard coordinates and theoretical coordinates are acquired in advance; measurement coordinates are then acquired, and a first projection error is used to judge whether erroneous feature points need to be removed. The search for erroneous feature points proceeds as follows: the feature points are sorted; the i-th feature point is removed in turn, and the standard reprojection error array and the measured reprojection error array corresponding to that removal are calculated; the correlation coefficient between the two arrays is calculated; an abnormal point is then identified among the resulting correlation coefficients, and the erroneous feature point is located from that abnormal point. The advantage of the method is that the two arrays it produces are highly correlated and the magnitudes of their elements are close, so the accuracy of the remaining feature points is guaranteed and assembly/grasping accuracy is improved.
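The following is a minimal sketch of the leave-one-out correlation test described in this abstract, written against an assumed per-point error helper (for example, a reprojection_errors-style function as sketched above). The outlier criterion (a z-score threshold) and all names are illustrative assumptions; the patent's exact criterion is not visible in this extract.

```python
# Illustrative sketch of the correlation-based elimination step: remove each
# feature point in turn, recompute the standard and measured reprojection
# error arrays, correlate them, and flag the removal whose correlation
# coefficient stands out from the rest.
import numpy as np

def find_suspect_point(theoretical_3d, standard_2d, measured_2d,
                       error_fn, z_thresh=2.5):
    """error_fn(pts3d, pts2d) -> per-point reprojection errors (e.g. a PnP fit)."""
    pts3d = np.asarray(theoretical_3d, dtype=float)
    std2d = np.asarray(standard_2d, dtype=float)
    meas2d = np.asarray(measured_2d, dtype=float)
    n = len(pts3d)
    corr = np.empty(n)
    for i in range(n):
        keep = [j for j in range(n) if j != i]              # drop the i-th point
        std_err = error_fn(pts3d[keep], std2d[keep])        # standard reprojection errors
        meas_err = error_fn(pts3d[keep], meas2d[keep])      # measured reprojection errors
        corr[i] = np.corrcoef(std_err, meas_err)[0, 1]      # correlation of the two arrays
    # Flag correlation coefficients that deviate strongly from the rest.
    z = (corr - corr.mean()) / (corr.std() + 1e-12)
    outliers = np.where(np.abs(z) > z_thresh)[0]
    if outliers.size == 0:
        return None                                         # no erroneous point found
    # Among the flagged removals, the best candidate is the one whose removal
    # gives the highest correlation: the fit agrees best without that point.
    return int(outliers[np.argmax(corr[outliers])])
```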

Description

Technical field

[0001] The invention relates to the field of visual guidance, and in particular to a correlation-based feature point elimination method.

Background technique

[0002] Vision guidance typically uses vision sensors to capture workpiece images, compares them with previously taught images of a standard workpiece, analyzes the deviation between the measuring-point coordinates of the two, and compensates the trajectory of the grasping/assembly robot, thereby realizing an intelligent assembly process. In this process, since the workpiece to be assembled/grasped cannot be placed at the detection position in exactly the same pose as the standard workpiece, and there are also machining errors between different workpieces, if the placement deviation of the workpiece at the detection position is large or the machining-induced deformation at a particular measuring point is large ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T7/73; G06T7/80; G01B11/00
CPC: G06T7/0004; G06T7/73; G06T7/80; G01B11/002
Inventors: 郭寅, 郭磊, 尹仕斌, 付康龙
Owner: 易思维(杭州)科技有限公司