
Printed matter defect identification method based on deep convolution generative adversarial network

A deep convolution and defect recognition technology, applied to character and pattern recognition, biological neural network models, neural learning methods, and the like. It addresses the problems that deep learning requires large amounts of training data that are hard to obtain in practice, and that too few training samples reduce the recognition rate, with the effect of enlarging the training sample set and improving detection efficiency and accuracy.

Pending Publication Date: 2020-09-01
FOSHAN UNIVERSITY

AI Technical Summary

Problems solved by technology

However, the deep learning methods currently used for printed matter defect identification still have some deficiencies. First, deep learning requires a large amount of data as training samples, and it is difficult to obtain a large amount of useful training data in practice. Second, too few training samples lead to over-fitting of the network, which reduces the recognition rate.



Examples


Embodiment 1

[0033] As shown in Figure 1, a method for identifying defects in printed matter based on a deep convolutional generative adversarial network includes the following steps:

[0034] Step 1: collect pictures of the printed matter to be identified and use them as training samples.

[0035] Step 2: construct a deep convolutional generative adversarial network.
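
The patent does not fix the network architecture or a software framework, so the following is only a minimal sketch of Step 2 in PyTorch, following the standard DCGAN layout (strided transposed convolutions in the generator, strided convolutions in the discriminator). The 64x64 image size, 3 channels, latent dimension of 100, and layer widths are illustrative assumptions rather than values taken from the patent.

```python
# Minimal DCGAN sketch for Step 2 (PyTorch assumed; the patent does not name a framework).
# Image size 64x64, 3 channels, and latent size 100 are standard DCGAN choices, not patent values.
import torch
import torch.nn as nn

LATENT_DIM = 100    # assumed length of the random noise vector fed to the generator
IMG_CHANNELS = 3    # assumed RGB print images


class Generator(nn.Module):
    """Maps a noise vector to a 64x64 synthetic print image with transposed convolutions."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(LATENT_DIM, 512, 4, 1, 0, bias=False),   # 1x1   -> 4x4
            nn.BatchNorm2d(512), nn.ReLU(True),
            nn.ConvTranspose2d(512, 256, 4, 2, 1, bias=False),          # 4x4   -> 8x8
            nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),          # 8x8   -> 16x16
            nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),           # 16x16 -> 32x32
            nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, IMG_CHANNELS, 4, 2, 1, bias=False),  # 32x32 -> 64x64
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)


class Discriminator(nn.Module):
    """Scores an image as real (collected print picture) or fake (generator output)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(IMG_CHANNELS, 64, 4, 2, 1, bias=False),           # 64x64 -> 32x32
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, 2, 1, bias=False),                    # 32x32 -> 16x16
            nn.BatchNorm2d(128), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 256, 4, 2, 1, bias=False),                   # 16x16 -> 8x8
            nn.BatchNorm2d(256), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(256, 512, 4, 2, 1, bias=False),                   # 8x8   -> 4x4
            nn.BatchNorm2d(512), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(512, 1, 4, 1, 0, bias=False),                     # 4x4   -> 1x1 score
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x).view(-1)
```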

[0036] Step 3: input the pictures of the printed matter to be identified into the deep convolutional generative adversarial network to generate printed matter picture samples.
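
Steps 2 and 3 together amount to adversarially training the two networks on the collected print pictures and then sampling new pictures from the trained generator. The sketch below is hedged in the same way: it reuses the Generator and Discriminator classes from the sketch above, and the data folder name, batch size, learning rate, epoch count, and number of generated samples are assumptions for illustration, not values specified by the patent.

```python
# Hedged sketch of adversarial training (Step 2) and sample generation (Step 3).
# Generator / Discriminator are the classes from the sketch above; folder layout,
# batch size, learning rate, and epoch/sample counts are illustrative assumptions.
import os
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, utils

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
G, D = Generator().to(device), Discriminator().to(device)
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
bce = nn.BCELoss()

# Assumed layout: prints_to_identify/<class>/*.png, resized to the 64x64 input size.
tf = transforms.Compose([transforms.Resize((64, 64)), transforms.ToTensor(),
                         transforms.Normalize([0.5] * 3, [0.5] * 3)])
loader = DataLoader(datasets.ImageFolder("prints_to_identify", tf),
                    batch_size=64, shuffle=True)

for epoch in range(100):                                   # assumed epoch count
    for real, _ in loader:
        real = real.to(device)
        b = real.size(0)
        fake = G(torch.randn(b, LATENT_DIM, 1, 1, device=device))

        # Discriminator step: push real images towards label 1, generated ones towards 0.
        loss_d = (bce(D(real), torch.ones(b, device=device))
                  + bce(D(fake.detach()), torch.zeros(b, device=device)))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # Generator step: try to make the discriminator label the fakes as real.
        loss_g = bce(D(fake), torch.ones(b, device=device))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Step 3: sample additional print pictures from the trained generator to enlarge the dataset.
os.makedirs("generated", exist_ok=True)
with torch.no_grad():
    samples = G(torch.randn(1000, LATENT_DIM, 1, 1, device=device))
for i, img in enumerate(samples):
    utils.save_image(img, f"generated/print_{i:04d}.png", normalize=True)
```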

[0037] Step 4: generate a training sample set from the pictures of the printed matter to be identified and the generated printed matter picture samples.

[0038] Step 5: construct a convolutional neural network and train it with the training sample set.

[0039] Step 6: use the trained convolutional neural network to detect printed matter defects.
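
Steps 4 to 6 then combine the collected and generated pictures into one training set, train a convolutional neural network classifier on it, and use the trained classifier to flag defects. A minimal sketch follows, assuming a simple two-class (qualified / defective) setup, folder-labelled images (including manually labelled generated samples), and an arbitrary small CNN; none of these details are fixed by the patent.

```python
# Hedged sketch of Steps 4-6: build the enlarged training set, train a CNN classifier,
# and use it for defect detection. Folder names, the qualified/defective class setup,
# the CNN architecture, and all hyperparameters are assumptions, not patent values.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, ConcatDataset
from torchvision import datasets, transforms
from PIL import Image

tf = transforms.Compose([transforms.Resize((64, 64)), transforms.ToTensor()])

# Step 4: training set = collected print pictures + (labelled) DCGAN-generated samples.
train_set = ConcatDataset([datasets.ImageFolder("prints_to_identify", tf),
                           datasets.ImageFolder("generated_labelled", tf)])
loader = DataLoader(train_set, batch_size=64, shuffle=True)

# Step 5: a small convolutional classifier (architecture assumed for illustration).
cnn = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),    # 64 -> 32
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
    nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
    nn.Flatten(),
    nn.Linear(128 * 8 * 8, 256), nn.ReLU(),
    nn.Linear(256, 2),                 # assumed classes: 0 = qualified, 1 = defective
)
opt = torch.optim.Adam(cnn.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

for epoch in range(20):                # assumed epoch count
    for x, y in loader:
        opt.zero_grad()
        ce(cnn(x), y).backward()
        opt.step()


# Step 6: detect defects on a new print picture with the trained classifier.
def detect_defect(path: str) -> bool:
    """Returns True if the CNN classifies the print picture at `path` as defective."""
    cnn.eval()
    with torch.no_grad():
        x = tf(Image.open(path).convert("RGB")).unsqueeze(0)
        return cnn(x).argmax(dim=1).item() == 1


print(detect_defect("new_print.png"))  # hypothetical file name
```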

[0040] As a p...



Abstract

The invention provides a printed matter defect identification method based on a deep convolutional generative adversarial network. The method comprises the following steps: 1, collecting pictures of the printed matter to be identified; 2, constructing a deep convolutional generative adversarial network; 3, inputting the pictures of the printed matter to be identified into the deep convolutional generative adversarial network to generate printed matter picture samples; 4, generating a training sample set from the pictures of the printed matter to be identified and the printed matter picture samples; 5, constructing a convolutional neural network and training it with the training sample set; and 6, detecting printed matter defects with the trained convolutional neural network. By constructing a deep convolutional generative adversarial network and using a convolutional neural network to detect printed matter defects, the invention addresses the problems of existing printed matter defect recognition methods: deep learning training samples are scarce, too few training samples cause network over-fitting and a reduced recognition rate, and traditional image processing algorithms are slow, have a low accuracy rate, and struggle to classify printed matter defects.

Description

Technical field

[0001] The present invention relates to the technical field of printed product defect detection, and in particular to a method for identifying printed product defects based on deep convolutional generative adversarial networks.

Background technique

[0002] With the improvement of living standards and the pursuit of a high-quality life, people have higher requirements for the printing efficiency and printing quality of printed matter. Due to the influence of technical precision and human or environmental factors, various defects may appear in printed matter during the printing process. These defects mainly include missing printing, stains, misregistration, scratches, flying ink, and pinholes. In order to strictly control the printing quality of printed matter, it is necessary to carry out real-time detection of the printing process. Printed product defect detection methods have also entered the rapid development of automated detection fr...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00, G06K9/62, G06N3/04, G06N3/08, G01N21/88
CPC: G06T7/0004, G06N3/08, G01N21/8851, G06T2207/10004, G06T2207/20081, G06T2207/30144, G01N2021/8887, G06N3/045, G06F18/24, G06F18/214, Y02P90/30
Inventor: 陈超庭
Owner: FOSHAN UNIVERSITY