
Intelligent product assembling method based on machine vision recognition

A machine vision and assembly technology, applied in the fields of character and pattern recognition, instruments, and computer components. It addresses the problems of assembly failure by automated equipment and damage to the assembled parts, and achieves the effects of improved assembly efficiency and accuracy, low noise, and small regions of interest.

Status: Pending; Publication Date: 2020-05-22
Applicant: 广西柳州联耕科技有限公司

AI Technical Summary

Problems solved by technology

[0004] On the other hand, parts assembly production lines require automated assembly equipment. With conventional robot teaching, however, the assembly position is fixed; when the position of a part deviates even slightly, the automated equipment fails to assemble it and may damage the part.


Examples


Embodiment Construction

[0021] The present invention will be further described in detail below with reference to the examples, but the embodiments of the present invention are not limited thereto.

[0022] The visual guidance of the present invention is divided into two parts, hardware and software. On the hardware side, a Basler industrial camera is first selected to capture images of the parts, and a suitable industrial computer is then chosen to process the captured images and locate the parts. On the software side, an appropriate detection algorithm is first determined and then implemented. The detection algorithm adopted by the present invention is as follows:

[0023] 1. Detection algorithm combining deep learning and image processing

[0024] This method first uses an improved deep-learning target detection algorithm based on YOLOv2 to detect the workpiece to be assembled. Since the position detected by the target detection algorithm may have devia...
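The paragraph above is cut off on this page, but the abstract below describes the second stage: the detected position is taken as a region of interest in which classical image processing extracts a more precise contour. The following is a minimal sketch of that two-stage idea, assuming OpenCV for the classical stage; the YOLOv2-style detector is stubbed out (no network details or weights appear on this page), and the thresholding, contour, and centroid choices are illustrative assumptions rather than the patented implementation.

```python
# Sketch of the two-stage locating step: coarse deep-learning detection,
# then classical refinement inside the detected region of interest.
import cv2
import numpy as np

def detect_rough_bbox(image):
    """Stand-in for the improved YOLOv2-style detector.

    The real pipeline would run a trained model and return an (x, y, w, h)
    box for the workpiece; a fixed box is returned here purely so this
    sketch runs end to end without model weights.
    """
    h, w = image.shape[:2]
    return w // 4, h // 4, w // 2, h // 2          # hypothetical coarse box

def refine_in_roi(image, bbox):
    """Refine the coarse detection inside the ROI with classical image processing."""
    x, y, w, h = bbox
    roi = image[y:y + h, x:x + w]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)       # suppress sensor noise
    _, binary = cv2.threshold(blur, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)   # assume the part dominates the ROI
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # centroid in ROI coordinates
    return x + cx, y + cy                              # back to full-image coordinates

if __name__ == "__main__":
    # In production the frame would come from the Basler industrial camera
    # (e.g. via pypylon); a synthetic frame keeps this sketch self-contained.
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    cv2.circle(frame, (320, 240), 60, (255, 255, 255), -1)  # fake workpiece
    bbox = detect_rough_bbox(frame)
    print("refined part center (px):", refine_in_roi(frame, bbox))
```

In this arrangement only a small crop is processed by the classical stage, which is consistent with the stated effects of low noise and a small region of interest.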


Abstract

The invention adopts a method that combines deep learning with traditional digital image processing, balancing detection precision against detection speed. A deep-learning target detection algorithm first detects the rough position of an object even in a complex environment; the detected position is then taken as a region of interest, within which image processing extracts more accurate target contour information. The invention also improves the deep-learning target detection method, raising both detection speed and detection precision while greatly lowering the demands on workshop hardware and thus the production cost. Overall, the assembly efficiency and assembly precision of parts are greatly improved, and assembly failures caused by deviations in part positions are reduced.

Description

Technical field

[0001] The invention belongs to the technical field of automobile production, and in particular relates to an intelligent product assembly method based on machine vision recognition.

Background technique

[0002] Many parts must be assembled during car production. Traditional manual assembly requires a great deal of manpower yet yields low production efficiency, and can no longer meet the demands of automated production. At present, most automated parts assembly lines teach the robot and then assemble the parts at a fixed position. However, position deviations inevitably occur when parts are transferred to that fixed position, and assembling according to the fixed taught position then leads to assembly failure and even damage to the parts. To solve this problem, the present invention adopts a computer vision method to locate the assembly position of the parts to be assemb...
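The page stops at the vision step and does not show how a located pixel position is turned into coordinates the assembly robot can act on. One common way to do this for a flat work surface is a plane homography obtained from a short calibration; the sketch below is an assumed illustration of that step, not something stated in the patent text, and the calibration points, units, and the pixel_to_robot helper are hypothetical.

```python
# Assumed downstream step: map a refined pixel coordinate to planar
# robot-base coordinates via a homography from a one-time calibration.
import cv2
import numpy as np

# Hypothetical calibration: pixel positions of four reference marks and the
# corresponding robot-base coordinates (mm) measured once during setup.
pixel_pts = np.array([[100, 100], [540, 100], [540, 380], [100, 380]], dtype=np.float32)
robot_pts = np.array([[0.0, 0.0], [220.0, 0.0], [220.0, 140.0], [0.0, 140.0]], dtype=np.float32)

H, _ = cv2.findHomography(pixel_pts, robot_pts)    # plane-to-plane mapping

def pixel_to_robot(u, v):
    """Convert an image point (u, v) into planar robot coordinates (mm)."""
    p = np.array([[[u, v]]], dtype=np.float32)     # shape (1, 1, 2) as OpenCV expects
    q = cv2.perspectiveTransform(p, H)
    return float(q[0, 0, 0]), float(q[0, 0, 1])

# Example: feed in the refined part center from the vision step.
print(pixel_to_robot(320.0, 240.0))
```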


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/62G06T7/00G06T7/80G06T7/70
CPCG06T7/0004G06T7/80G06T7/70G06T2207/10004G06T2207/20081G06T2207/20084G06T2207/30164G06V20/20G06V2201/07G06F18/23213G06F18/214
Inventor: 何智成, 王振兴, 胡朝辉, 宋凯
Owner: 广西柳州联耕科技有限公司