Three-D object recognition method and parts picking system using the method

A three-dimensional object recognition method and a parts picking system using the method. The method addresses the long processing times and low recognition efficiency of prior three-dimensional recognition approaches.

Inactive Publication Date: 2004-08-18
MATSUSHITA ELECTRIC WORKS LTD

AI Technical Summary

Problems solved by technology

In prior methods, the processing time is too long, making the recognition efficiency low.


Examples


Example 1

[0045] The three-dimensional object recognition method of the present invention is well suited to a parts picking system for conveying parts. As shown in Figure 1, the parts picking system comprises a pair of TV cameras (1A, 1B); an image processing unit 2 that performs the three-dimensional object recognition method of the present invention using the image information provided by the TV cameras; a robot 5 having a flexible manipulator capable of grasping a part 30; and a robot controller 4 that controls the robot 5 according to the output of the image processing unit 2, so that the manipulator grasps a part from a bin in which a plurality of parts 30 are piled in a random manner on a stage 3, and moves the grasped part to the required position.
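The two-camera arrangement above implies stereo triangulation: depth is recovered from the disparity between corresponding image points. As a minimal, hedged sketch (the focal length and baseline values below are illustrative, not taken from the patent), the standard rectified-pair relation can be written as:

```python
# Depth from disparity for a rectified stereo camera pair (1A, 1B).
# Z = f * B / d, where f is the focal length in pixels, B the baseline
# between the cameras in metres, and d the disparity in pixels.
# All numeric values here are illustrative assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the depth (metres) of a point seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_px * baseline_m / disparity_px

# Example: 800 px focal length, 0.1 m baseline, 40 px disparity.
print(depth_from_disparity(800.0, 0.1, 40.0))  # 2.0 (metres)
```

This is the geometric core that lets the image processing unit turn matched two-dimensional features into three-dimensional positions for the robot controller.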

[0046] In the image processing unit 2, the obtained video signals are converted into digital signals by the A/D converters (20A, 20B) and temporarily stored in the memory 21. Ne...

Example 2

[0116] As shown in FIG. 14, the parts picking system of the second embodiment is basically the same as that of the first embodiment, except that a third camera 1C is used in addition to the two television cameras, and the image processing unit, equipped with an additional A/D converter 20C for the third camera 1C, can perform the three-dimensional object recognition method shown in Figure 15.

[0117] The three-dimensional object recognition method of the present embodiment includes the following steps: step 200A, photographing the left image; step 201A, detecting the two-dimensional features of the object in the left image; step 202A, evaluating the reliability of the two-dimensional features; step 200B, photographing the right image; step 201B, detecting the two-dimensional features of the object in the right image; step 202B, evaluating the reliability of the two-dimensional features. These steps are performed in the same manner as in the first embodiment. The method of this embodiment further ...
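The per-image steps 200A-202B (capture, detect two-dimensional features, score reliability against model data) can be sketched as a small pipeline. Everything below is a hypothetical illustration: the feature representation and reliability rule are placeholders, not the patent's actual criteria.

```python
# Hypothetical sketch of steps 201A/201B (feature detection) and
# 202A/202B (reliability evaluation) for one image. The Feature fields
# and the membership-based reliability check are illustrative stand-ins.

from dataclasses import dataclass

@dataclass
class Feature:
    kind: str            # e.g. "parallel-line-pair"
    params: tuple        # geometric parameters of the feature
    reliability: float = 0.0

def detect_features(image) -> list:
    # Stand-in for edge extraction and 2-D feature grouping (step 201A/201B).
    return [Feature("parallel-line-pair", (0.0, 1.0))]

def evaluate_reliability(features: list, model: set) -> list:
    # Stand-in for comparing each feature with model data (step 202A/202B):
    # a feature kind present in the model is scored 1.0, otherwise 0.0.
    for f in features:
        f.reliability = 1.0 if f.kind in model else 0.0
    return features

model = {"parallel-line-pair"}
left_features = evaluate_reliability(detect_features(None), model)
print(left_features[0].reliability)  # 1.0
```

The same pipeline would run once per camera image (left, right, and, in this embodiment, the third camera's image).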

Example 3

[0126] The second embodiment described above showed that the accuracy of recognizing the three-dimensional position and attitude of an object can be improved by establishing correspondences of two-dimensional features between the left and right images and between the left image and the third image. In this embodiment, the three-dimensional object recognition method additionally takes into account the correspondence of two-dimensional features between the right image and the third image.

[0127] Specifically, as shown in Figure 16, a case is illustrated in which a rectangular solid 50 is used as the object, and two parallel straight-line pairs (51, 52) and (53, 54) are used as its two-dimensional features. First, an image of the rectangular solid is taken by each of the three cameras 1A, 1B and 1C. Next, the two parallel-line pairs (51, 52) and (53, 54) are extracted from each of the three images as two-dimensional features. The first process of establishing correspondence between t...
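Establishing correspondence of parallel-line-pair features between two images can be sketched as descriptor matching. The descriptor choice below (orientation angle, line spacing) and the thresholds are assumptions for illustration; the patent does not specify these fields.

```python
# Hedged sketch: greedy one-to-one matching of parallel-line-pair
# features between two images. Each feature is described by a tuple
# (angle_rad, spacing_px); both the descriptor and the cost weighting
# are illustrative assumptions, not the patent's method.

import math

def match_pairs(pairs_a, pairs_b, max_angle=0.1, max_spacing=5.0):
    """Return index pairs (i, j) matching features of image A to image B."""
    matches = []
    used = set()
    for i, (ang_a, sp_a) in enumerate(pairs_a):
        best, best_cost = None, float("inf")
        for j, (ang_b, sp_b) in enumerate(pairs_b):
            if j in used:
                continue
            d_ang = abs(ang_a - ang_b)
            d_sp = abs(sp_a - sp_b)
            if d_ang <= max_angle and d_sp <= max_spacing:
                cost = d_ang + d_sp / 100.0   # arbitrary weighting
                if cost < best_cost:
                    best, best_cost = j, cost
        if best is not None:
            used.add(best)
            matches.append((i, best))
    return matches

# Two line pairs per image, listed in different orders.
a = [(0.00, 20.0), (math.pi / 2, 35.0)]
b = [(math.pi / 2, 34.0), (0.02, 21.0)]
print(match_pairs(a, b))  # [(0, 1), (1, 0)]
```

Running the same matching over each image pair (left-right, left-third, right-third) yields the three sets of correspondences this embodiment combines.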



Abstract

A three-dimensional object recognition method which can be applied to a bin picking system, comprising: a step (A) of obtaining first and second images of an object; a step (B) of detecting two-dimensional features of the object from the first and second images; a step (C) of evaluating the reliability of the features detected in step (B) by comparison with model data of the object; a step (D) of matching the two-dimensional features of the first image against the two-dimensional features of the second image; a step (E) of evaluating the reliability of the matching in step (D) by comparing the two-dimensional features of the first image with those of the second image; a step (F) of recognizing the three-dimensional position and posture of the object in accordance with three-dimensional information of the two-dimensional features obtained by the matching; and a step (G) of evaluating the reliability of the recognized three-dimensional position and posture.
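The control flow of steps (A) through (G) can be summarized in a short orchestrator. Every helper below is a placeholder stub invented for illustration; only the sequencing and the reliability gates mirror the abstract.

```python
# Hypothetical end-to-end sketch of steps (A)-(G). All helper functions
# are placeholder stubs; only the control flow follows the abstract.

def detect_2d_features(img):
    # Step (B) stand-in: each feature is (label, confidence).
    return [("edge-pair", 0.9), ("corner", 0.3)]

def reliability(feat, model):
    # Step (C) stand-in: reliability against model data.
    return feat[1]

def match_features(feats1, feats2):
    # Steps (D)-(E) stand-in: pair up features across the two images.
    return list(zip(feats1, feats2))

def triangulate_pose(matches):
    # Step (F) stand-in: 3-D position and posture from matched features.
    return {"position": (0.0, 0.0, 1.0), "score": 0.8}

def pose_reliability(pose, model):
    # Step (G) stand-in: reliability of the recognized pose.
    return pose["score"]

def recognize_3d(img1, img2, model):
    feats1 = [f for f in detect_2d_features(img1) if reliability(f, model) > 0.5]
    feats2 = [f for f in detect_2d_features(img2) if reliability(f, model) > 0.5]
    matches = match_features(feats1, feats2)
    pose = triangulate_pose(matches)
    return pose if pose_reliability(pose, model) > 0.5 else None

result = recognize_3d(None, None, None)
print(result)  # {'position': (0.0, 0.0, 1.0), 'score': 0.8}
```

The two reliability gates (after detection and after pose recognition) are what distinguish this pipeline from naive stereo matching: unreliable features and unreliable poses are filtered out before the robot acts.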

Description

Technical Field

[0001] The present invention relates to a three-dimensional object recognition method by which objects can be recognized accurately and at high speed, and to a parts picking system using the method.

Background Art

[0002] Various methods have been proposed for recognizing the three-dimensional position and attitude or shape of an object by forming a stereoscopic image from two two-dimensional images. For example, Japanese Laid-Open Patent (KOKAI) No. 10-206135 describes "a method for determining the position and attitude of a three-dimensional object". In this method, a stereoscopic image is obtained by observing a free-form surface of a three-dimensional object, and an edge image is extracted from the stereoscopic image. The edge image is divided into segments according to its local features, and local geometric features are then attached to the segments. Next, the local geometry of each segment is compared with that of a pre-gener...

Claims


Application Information

IPC(8): B25J9/16, G06T7/00
CPC: B25J9/1697, G06T7/0044, G05B2219/40053, G06T7/0022, G06T7/74, G06T7/97
Inventors: 顾海松, 中原智治, 荒木秀和, 藤井裕之
Owner MATSUSHITA ELECTRIC WORKS LTD