Virtual assembling method and device

A virtual assembly technology in the field of computer vision. It addresses problems of existing assembly approaches, such as limited assembly effect, hazardous operation, and difficulty of wide application, and achieves the effects of improving assembly efficiency and assembly effect, enhancing the sense of interaction, and enabling wide application.

Status: Inactive
Publication Date: 2017-03-22
Applicant: HUNAN VISUALTOURING INFORMATION TECH CO LTD

Problems solved by technology

[0004] However, the prior art has the following deficiencies. First, the manual assembly method consumes considerable manpower, material, and financial resources, and carries a degree of operational risk; it is also constrained by the time and place of assembly teaching, so multiple trainees cannot practice simultaneously and repeatedly. Second, the virtual assembly method offers a weak sense of interaction, resulting in low assembly efficiency and limited assembly effect, which makes it difficult to apply widely. In addition, large-scale motion capture equipment makes the cost prohibitive.



Examples


Embodiment 1

[0031] Figure 1 is a flowchart showing the virtual assembly method according to Embodiment 1 of the present invention. The method may be performed by the apparatus shown in Figure 3.

[0032] Referring to Figure 1, in step S110, the 3D data of the assembly operator's hands and limbs, the 3D data of the assembly workshop, and the 3D data of the product to be assembled and its parts are acquired, together with the RGB image data and depth image data of the assembly operator's hands and limbs.

[0033] Specifically, a somatosensory interaction device such as the Kinect can be used to collect information about the assembly operator's arm, the assembly workshop, and the product to be assembled and its components, completing the input of 3D data and providing a data source for subsequent modeling and for producing teaching materials. At the same time, the Kinect body sensor acquires the RGB image and depth image data of the assembly operator, which provides a da...
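The following is a minimal sketch of the RGB-D capture described above, assuming a Kinect v1 accessed through the libfreenect Python bindings; the patent does not name a particular SDK, so the library choice and function names below are illustrative assumptions.

import numpy as np
import freenect  # libfreenect Python bindings (assumed driver stack)

def capture_rgbd_frame():
    """Grab one RGB frame and one depth frame from a connected Kinect."""
    rgb, _ = freenect.sync_get_video()    # (480, 640, 3) uint8 color image
    depth, _ = freenect.sync_get_depth()  # (480, 640) raw 11-bit depth values
    return np.asarray(rgb), np.asarray(depth)

if __name__ == "__main__":
    rgb, depth = capture_rgbd_frame()
    print("RGB:", rgb.shape, "depth:", depth.shape)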

Embodiment 2

[0050] Figure 2 is a flowchart illustrating a virtual assembly method according to Embodiment 2 of the present invention, which can be regarded as a specific implementation of Figure 1. The method may be performed by the apparatus shown in Figure 4.

[0051] Referring to Figure 2, in step S210, the 3D data of the assembly operator's hands and limbs, the 3D data of the assembly workshop, and the 3D data of the product to be assembled and its parts are acquired, together with the RGB image data and depth image data of the assembly operator's hands and limbs.

[0052] According to an exemplary embodiment of the present invention, acquiring in step S210 the three-dimensional data of the assembly operator's hands and limbs, the three-dimensional data of the assembly workshop, and the three-dimensional data of the product to be assembled and its parts may include: acquiring the three-dimensional data of the assembly operator through the Kinect somatosens...
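As one possible realization of this acquisition and of the subsequent model building, the sketch below fuses a color image and a depth image into a 3D point cloud; Open3D and the default PrimeSense-style camera intrinsics are assumptions for illustration, since the embodiment only specifies that the data come from the Kinect sensor.

import open3d as o3d

def rgbd_to_point_cloud(color_path, depth_path):
    """Convert one RGB-D frame into a point cloud (e.g. of the workshop or a part)."""
    color = o3d.io.read_image(color_path)
    depth = o3d.io.read_image(depth_path)
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_scale=1000.0, convert_rgb_to_intensity=False)
    # Default PrimeSense/Kinect-style intrinsics; replace with calibrated values.
    intrinsics = o3d.camera.PinholeCameraIntrinsic(
        o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)
    return o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsics)

pcd = rgbd_to_point_cloud("workshop_color.png", "workshop_depth.png")
print(pcd)  # reports the number of reconstructed 3D points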

Embodiment 3

[0072] Based on the same technical idea, Figure 3 is a logic block diagram showing the virtual assembly device according to Embodiment 3 of the present invention. The device can be used to execute the virtual assembly method described in Embodiment 1.

[0073] Referring to Figure 3, the virtual assembly device includes a data acquisition module 310, a model building module 320, a posture analysis module 330, a virtual assembly module 340, and an assembly effect analysis module 350.

[0074] The data acquisition module 310 is used to acquire the three-dimensional data of the assembly operator's hands and limbs, the three-dimensional data of the assembly workshop, and the three-dimensional data of the product to be assembled and its parts, and to acquire the RGB image data and depth image data of the assembly operator's hands and limbs.

[0075] The model building module 320 is used to establish, respectively, the corresponding hand and limb models, asse...
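For orientation, the skeleton below mirrors the five modules of paragraphs [0073] to [0075] as plain Python classes; the class and method names are illustrative assumptions, since the patent only specifies the modules and their responsibilities.

class DataAcquisitionModule:            # 310: 3D data plus RGB/depth image data
    def acquire(self): ...

class ModelBuildingModule:              # 320: hand/limb, workshop, and part models
    def build_models(self, raw_data): ...

class PostureAnalysisModule:            # 330: kinematics data from RGB/depth images
    def analyze(self, rgb, depth): ...

class VirtualAssemblyModule:            # 340: drives the assembly with kinematics data
    def assemble(self, models, kinematics): ...

class AssemblyEffectAnalysisModule:     # 350: evaluates the virtual assembly process
    def evaluate(self, assembly_log): ...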



Abstract

The embodiment of the invention provides a virtual assembly method and device. The method comprises the following steps: obtaining the three-dimensional data of the hands and limbs of an assembly operator, the three-dimensional data of an assembly workshop, and the three-dimensional data of a product to be assembled and its components, and also obtaining the RGB (red, green and blue) image data and the depth image data of the operator's hands and limbs; establishing, from these three-dimensional data, a corresponding hand and limb model, an assembly workshop model, and a model of the product to be assembled and its components; carrying out posture analysis on the obtained RGB image data and depth image data to obtain the kinematics data of the hands and limbs; on the basis of the hand and limb model, the assembly workshop model, and the model of the product to be assembled and its components, carrying out a virtual assembly process according to the kinematics data; and analyzing the virtual assembly process to obtain an assembly effect analysis result. With this method, a virtual product assembly process is realized, and assembly efficiency and assembly effect can be improved.
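As a concrete illustration of the step in which posture analysis yields kinematics data, the sketch below derives joint velocities and accelerations from per-frame hand and limb joint positions by finite differences; the joint count and the 30 Hz frame rate are assumptions, and in practice the positions would come from the Kinect skeleton and hand tracking rather than placeholder data.

import numpy as np

def joint_kinematics(positions, fps=30.0):
    """positions: (num_frames, num_joints, 3) array of 3D joint coordinates in metres."""
    dt = 1.0 / fps
    velocities = np.gradient(positions, dt, axis=0)      # m/s for each joint
    accelerations = np.gradient(velocities, dt, axis=0)  # m/s^2 for each joint
    return {"velocity": velocities, "acceleration": accelerations}

# Example with placeholder data: 90 frames (3 s) of 21 hand joints
frames = np.random.rand(90, 21, 3)
kin = joint_kinematics(frames)
print(kin["velocity"].shape)  # (90, 21, 3)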

Description

Technical Field

[0001] The invention relates to computer vision technology, and in particular to a virtual assembly method and device.

Background Art

[0002] As an important part of the product manufacturing process, assembly determines the development of the technology system over the entire product life cycle, and improving the level of product assembly is of great significance. As the complexity and precision of products increase, the professional level of assemblers faces a severe test; therefore, the training of assemblers is becoming more and more important.

[0003] At present, the existing assembly training methods are as follows: one is manual assembly teaching, carried out either through practical instruction by professional instructors in the assembly workshop or by watching pre-recorded videos; the other is virtual assembly teaching, which mostly uses PC desktop systems with keyboards and mice as data collectors, and large-scal...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06Q10/06; G06Q50/20; G06F3/01; G06K9/00; G06T19/00
CPC: G06F3/017; G06Q10/06311; G06Q50/2057; G06T19/006; G06V40/107
Inventors: 鲁敏, 滕书华, 张鹏
Owner: HUNAN VISUALTOURING INFORMATION TECH CO LTD