
Device for producing shape model

A shape model generation device technology, applied in the field of visual recognition, which can solve the problems of considerable time spent on generating and teaching shape models and reduced production efficiency.

Publication Date: 2006-01-18 (Inactive)
FANUC LTD

AI Technical Summary

Problems solved by technology

Therefore, it takes considerable time (for example, more than 20 minutes) to generate a shape model and teach it to (store it in) the robot.
[0006] In addition, when teaching the shape model of another object to a robot that is performing a predetermined production operation on one object, the production operation must be temporarily stopped, and as a result, production efficiency may decrease.

Embodiment Construction

[0020] Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the drawings, the same or similar components are denoted by common reference numerals.

[0021] Figure 1 is a block diagram showing the basic configuration of a shape model generation device 10 according to the present invention. The shape model generation device 10 comprises: a shape data acquisition unit 14 that acquires three-dimensional shape data 12 of an object to be worked (not shown); a viewpoint setting unit 16 that sets, in the coordinate system to which the acquired three-dimensional shape data 12 belongs, a plurality of virtual viewpoints (not shown) from which an object placed at an arbitrary position in that coordinate system can be observed from mutually different directions; and a shape model generation unit 20 that generates, based on the three-dimensional shape data 12 and as a plurality of shape models, a plurality of two-dimensional image data of the object estimated when the object is observed in the coordinate system from the plurality of virtual viewpoints set by the viewpoint setting unit 16.
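To make the role of the viewpoint setting unit 16 concrete, the following is a minimal sketch, assuming Python with NumPy, of one way virtual viewpoints observing the object from mutually different directions could be placed around the origin of the shape data's coordinate system. The spherical placement strategy, the function name, and the parameter defaults are illustrative assumptions, not the arrangement disclosed in the patent.

```python
import numpy as np


def set_virtual_viewpoints(n_azimuth: int = 8, n_elevation: int = 3,
                           radius: float = 1.0) -> list[dict]:
    """Place virtual viewpoints on a sphere centred on the object so that it
    is observed from mutually different directions (illustrative only)."""
    viewpoints = []
    for i in range(n_elevation):
        # Spread elevations between 15 and 75 degrees (an arbitrary choice).
        elevation = np.deg2rad(15.0 + 60.0 * i / max(n_elevation - 1, 1))
        for j in range(n_azimuth):
            azimuth = 2.0 * np.pi * j / n_azimuth
            position = radius * np.array([
                np.cos(elevation) * np.cos(azimuth),
                np.cos(elevation) * np.sin(azimuth),
                np.sin(elevation),
            ])
            # Each viewpoint looks toward the origin of the shape data's
            # coordinate system, where the object is assumed to be placed.
            viewpoints.append({"position": position, "look_at": np.zeros(3)})
    return viewpoints


if __name__ == "__main__":
    viewpoints = set_virtual_viewpoints()
    print(f"{len(viewpoints)} virtual viewpoints set")  # 8 x 3 = 24
```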

Abstract

A device for producing a shape model used for a matching process of an object to be worked in a robot system is provided. The shape-model producing device includes a shape-data obtaining section for obtaining three-dimensional shape data of the object; a viewpoint setting section for setting, in a coordinate system to which the three-dimensional shape data obtained by the shape-data obtaining section belongs, a plurality of virtual viewpoints permitting the object placed in the coordinate system to be observed in directions different from each other; and a shape-model generating section for generating, as a plurality of shape models, a plurality of two-dimensional image data of the object, based on the three-dimensional shape data, the plurality of two-dimensional image data being estimated when the object is observed in the coordinate system from the plurality of virtual viewpoints set by the viewpoint setting section.
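As a rough illustration of the shape-model generating section, the sketch below, again assuming Python with NumPy, estimates a two-dimensional image of the object from one virtual viewpoint by building a simple look-at camera and applying a pinhole projection to the object's vertices. The camera model, the rasterisation into a sparse point image, and all names used here are assumptions for illustration; the patent does not prescribe a particular projection or rendering method.

```python
import numpy as np


def look_at(eye: np.ndarray, target: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return rotation R and translation t mapping world points into an
    OpenCV-style camera frame (x right, y down, z forward)."""
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    world_up = np.array([0.0, 0.0, 1.0])
    right = np.cross(forward, world_up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    R = np.stack([right, -true_up, forward])  # rows are the camera axes
    t = -R @ eye
    return R, t


def project_points(vertices: np.ndarray, eye: np.ndarray,
                   focal: float = 500.0, size: int = 256) -> np.ndarray:
    """Project (N, 3) world-frame vertices into a size x size point image."""
    R, t = look_at(eye, np.zeros(3))
    cam = vertices @ R.T + t                    # world -> camera coordinates
    cam = cam[cam[:, 2] > 1e-6]                 # keep points in front of camera
    uv = focal * cam[:, :2] / cam[:, 2:3] + size / 2.0
    image = np.zeros((size, size), dtype=np.uint8)
    px = np.round(uv).astype(int)
    ok = (px[:, 0] >= 0) & (px[:, 0] < size) & (px[:, 1] >= 0) & (px[:, 1] < size)
    image[px[ok, 1], px[ok, 0]] = 255           # mark projected vertices
    return image


if __name__ == "__main__":
    # A toy cube stands in for the acquired three-dimensional shape data.
    cube = np.array([[x, y, z] for x in (-0.1, 0.1)
                               for y in (-0.1, 0.1)
                               for z in (-0.1, 0.1)])
    img = project_points(cube, eye=np.array([1.0, 1.0, 0.5]))
    print("non-zero pixels:", int((img > 0).sum()))
```

Looping this projection over all of the virtual viewpoints and storing each resulting image would yield the plurality of shape models referred to in the abstract.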

Description

Technical Field

[0001] The present invention relates to visual recognition in a robot system, and more particularly to an apparatus for generating a shape model (teaching model) used for matching of an object to be worked.

Background Art

[0002] As is well known, when a robot performs work on an object, it recognizes the current position and posture of the object by comparing an actual image of the object, input through a visual sensor, with a shape model of the object (also called a "teaching model") stored in the robot in advance. For example, when a manipulator mounted on the front end of the robot arm grasps and picks out work objects, such as parts that are piled up irregularly and whose positions have not been corrected, the image data of the collection of irregularly placed objects, detected and input by a visual sensor (such as a camera), is compared with the shape model to determine the object to be grasped by the manipulator, and at the same time the robot is moved to a position and posture ...
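To illustrate the matching process mentioned in the background section, the following sketch compares a camera image of the scene against each stored two-dimensional shape model and keeps the best-scoring one. Normalised cross-correlation with OpenCV's matchTemplate is an assumed stand-in, since the patent does not name a specific matching algorithm, and the scene and model data here are synthetic.

```python
import cv2
import numpy as np


def match_shape_models(scene: np.ndarray,
                       shape_models: list[np.ndarray]) -> tuple[int, tuple[int, int], float]:
    """Return (best model index, top-left location in the scene, score)."""
    best = (-1, (0, 0), -1.0)
    for i, model in enumerate(shape_models):
        # Slide the model over the scene and score each location.
        result = cv2.matchTemplate(scene, model, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val > best[2]:
            best = (i, max_loc, max_val)
    return best


if __name__ == "__main__":
    # Synthetic data: two toy shape models and a scene containing one of them.
    model_a = np.zeros((40, 40), dtype=np.uint8)
    cv2.rectangle(model_a, (5, 5), (34, 34), 255, 2)
    model_b = np.zeros((40, 40), dtype=np.uint8)
    cv2.circle(model_b, (20, 20), 15, 255, 2)
    scene = np.zeros((200, 200), dtype=np.uint8)
    scene[60:100, 80:120] = model_a
    idx, loc, score = match_shape_models(scene, [model_b, model_a])
    print(f"best model {idx} at {loc}, score {score:.2f}")  # expect model 1 near (80, 60)
```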

Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): B25J9/22G05B19/42G06T3/00G06V10/772
CPCG06K9/6255G06T19/00G06T7/004G06T15/20G06T7/70G06V10/772G06F18/28
Inventor 长嘉治小林博彦
Owner FANUC LTD