Fully convolutional network fabric defect detection method based on attention mechanism

A fully convolutional network detection technology, applied in the field of attention-mechanism-based fabric defect detection, which addresses the problems of low detection accuracy and slow detection speed in existing methods and achieves improved representation ability, improved effectiveness, and good detection accuracy and adaptability.

Status: Inactive
Publication date: 2020-03-06
ZHONGYUAN ENGINEERING COLLEGE
Cites: 3 | Cited by: 19

AI Technical Summary

Problems solved by technology

[0006] Aiming at the technical problems of low detection accuracy and slow detection speed in existing fabric defect detection technology, the present invention proposes a fully convolutional network fabric defect detection method based on the attention mechanism. The method uses an enhanced VGGNet to extract multi-level, multi-scale features from fabric images, improving the representation ability of the fabric image; uses the attention mechanism to screen effective features; and connects the deep side outputs with the shallow outputs for feature fusion, so that the deep features better help the shallow side outputs predict defect regions and improve defect detection accuracy.
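The patent text does not give the exact formulation of the attention step. As one possible illustration only, the sketch below (assuming PyTorch and a squeeze-and-excitation style channel attention; the class name `ChannelAttention` and all parameters are hypothetical stand-ins, not the patent's design) shows how side-output features from a VGG-style backbone could be re-weighted to screen effective channels.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Hypothetical SE-style channel attention used to screen effective features.

    The patent does not specify its attention formulation; this is only a sketch.
    """
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: one global value per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                    # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                         # re-weight (screen) feature channels

# Example: screen a mid-level feature map of shape (N, 256, 128, 128).
feat = torch.randn(1, 256, 128, 128)
screened = ChannelAttention(256)(feat)
```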


Examples


Specific example

[0061] In this embodiment, several types of common defect images are randomly selected from a database containing 2000 fabric images, as shown in Figure 3(a)–(d), where Figure 3(a) is a hole, Figure 3(b) is a foreign body, Figure 3(c) is a stain, and Figure 3(d) is an oil stain; the image size is 512 × 512 pixels. During training and testing, the learning rate is set to 1e-6, the momentum parameter to 0.9, and the weight decay to 0.0005; the loss weight of each stage output is set to 1.0, and the fusion weights in the feature fusion module are all initialized to 0.2 during the training phase. For specific examples, see Figures 4 to 7.
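Paragraph [0061] fully specifies the optimizer hyperparameters. A minimal sketch of such a training configuration is shown below, assuming PyTorch with SGD; the stand-in model, the number of side-output stages, and the `total_loss` helper are hypothetical, since the patent does not name its implementation.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Stand-in module; the patent's actual network (improved VGG16 + attention + fusion)
# is not reproduced here.
model = nn.Conv2d(3, 1, kernel_size=3, padding=1)

# Hyperparameters stated in paragraph [0061].
optimizer = optim.SGD(
    model.parameters(),
    lr=1e-6,              # learning rate
    momentum=0.9,         # momentum parameter
    weight_decay=0.0005,  # weight decay
)

bce = nn.BCEWithLogitsLoss()

def total_loss(stage_outputs, fused_output, target):
    """Sum of per-stage losses (weight 1.0 each, per [0061]) plus the fused-map loss."""
    loss = bce(fused_output, target)
    loss += sum(1.0 * bce(out, target) for out in stage_outputs)
    return loss

# Dummy training step with 512x512 images, as stated in the embodiment.
imgs = torch.randn(2, 3, 512, 512)
gt = torch.randint(0, 2, (2, 1, 512, 512)).float()
pred = model(imgs)
loss = total_loss([pred], pred, gt)
loss.backward()
optimizer.step()
```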

[0062] Figure 4(a)–(d) are the ground-truth maps labeled pixel by pixel. Figure 5(a)–(d) are the saliency maps generated by the method of reference [1] (Hou Q, Cheng M M, Hu X, et al. Deeply Supervised Salient Object Detection with Short Connections. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018.)...



Abstract

The invention provides a fully convolutional network fabric defect detection method based on an attention mechanism. The method comprises the following steps: first, extracting multi-stage, multi-scale intermediate deep feature maps of a fabric image through an improved VGG16 network and processing them with the attention mechanism to obtain multi-stage, multi-scale deep feature maps; then, up-sampling the multi-level, multi-scale deep feature maps by bilinear interpolation to obtain multi-level feature maps of the same size, and fusing them through a short-connection structure to obtain multi-level saliency maps; and finally, fusing the multi-stage saliency maps by weighted fusion to obtain the final saliency map of the defect image. The method comprehensively considers the complex defect characteristics and varied backgrounds of fabric images, improves the representation ability of the fabric image by simulating the attention mechanism of human visual cognition, and suppresses the influence of noise in the image, so that the detection result has higher adaptivity and detection precision.
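To make the processing order in the abstract concrete, the following sketch outlines the described pipeline (VGG16-based backbone, attention, bilinear up-sampling, short-connection fusion, weighted fusion). It is an illustrative approximation assuming PyTorch and torchvision; the stage split, the attention form, the short-connection pattern, and all class and variable names are assumptions rather than the patent's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16

class FabricSaliencyNet(nn.Module):
    """Illustrative pipeline: VGG16 stages -> attention -> bilinear upsample ->
    short-connection fusion -> weighted fusion. Details are assumptions."""

    def __init__(self):
        super().__init__()
        feats = vgg16(weights=None).features
        # Split VGG16 into 5 stages at the max-pooling boundaries.
        self.stages = nn.ModuleList([
            feats[:4], feats[4:9], feats[9:16], feats[16:23], feats[23:30]
        ])
        chans = [64, 128, 256, 512, 512]
        # Simple spatial attention per stage (assumed form).
        self.attn = nn.ModuleList([nn.Conv2d(c, 1, 1) for c in chans])
        # 1x1 convs producing one saliency logit map per stage.
        self.side = nn.ModuleList([nn.Conv2d(c, 1, 1) for c in chans])
        # Learnable weights for the final weighted fusion, initialized to 0.2 (per [0061]).
        self.fusion_w = nn.Parameter(torch.full((5,), 0.2))

    def forward(self, x):
        h, w = x.shape[-2:]
        side_maps = []
        for stage, attn, side in zip(self.stages, self.attn, self.side):
            x = stage(x)
            x = x * torch.sigmoid(attn(x))   # attention screens effective features
            s = side(x)                      # per-stage saliency logits
            s = F.interpolate(s, size=(h, w), mode="bilinear", align_corners=False)
            side_maps.append(s)
        # Short connections: deeper side outputs are added into shallower ones.
        for i in range(len(side_maps) - 2, -1, -1):
            side_maps[i] = side_maps[i] + side_maps[i + 1]
        # Weighted fusion of the multi-stage saliency maps into the final map.
        fused = sum(wi * m for wi, m in zip(self.fusion_w, side_maps))
        return side_maps, fused

# Example: a single 512x512 fabric image.
net = FabricSaliencyNet()
stage_maps, saliency = net(torch.randn(1, 3, 512, 512))
```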

Description

Technical field
[0001] The invention relates to the technical field of textile image processing, in particular to an attention-mechanism-based fully convolutional network fabric defect detection method.
Background technique
[0002] Fabric defect detection is an essential step in quality control in the textile manufacturing industry. The types and characteristics of fabric defects are diverse, which increases the complexity of the defect detection problem and makes it difficult to design a generalized method. Many factories still rely on manual visual inspection, but human observation is limited: workers who inspect continuously for long periods may produce false detections or missed detections due to fatigue, resulting in additional material and financial losses. As a result, the textile industry has been developing automated fabric inspection in order to make a consistent assessment of fabric quality.
[0003] Early fabric detection algorithms mainly...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06N3/04; G06N3/08; G01N21/88
CPC: G06T7/0004; G06N3/08; G01N21/8851; G06T2207/10004; G06T2207/20081; G06T2207/30124; G01N2021/8887; G06N3/045
Inventors: 刘洲峰, 李春雷, 王金金, 董燕, 杨艳, 李碧草
Owner: ZHONGYUAN ENGINEERING COLLEGE