
Fabric defect pixel-level classification method based on deep learning

A deep-learning technology for fabric defect pixels, applied in the field of pixel-level classification of fabric defects based on deep learning. It addresses the poor detection of complex textured fabrics, the difficulty of meeting real-time requirements, and the limited accuracy of existing methods, and achieves fast computation with a small model, few parameters, and a low computational load.

Active Publication Date: 2019-11-22
XI'AN POLYTECHNIC UNIVERSITY

AI Technical Summary

Problems solved by technology

Statistics-based methods rely on the choice of parameters and suffer from poor accuracy and low precision.
Frequency-domain methods depend on the selection of the filter and perform poorly on fabrics with complex textures.
Model-based methods involve a large amount of computation and have difficulty meeting real-time requirements.
These methods have opened a promising path for machine-vision detection of texture defects, but further research is needed to find a robust method that adapts well to variable textures.



Examples


Embodiment 1

[0074] Perform step 1, using the FID data set as the picture set;

[0075] Perform steps 2 to 5 to obtain a trained Mobile-Unet network model;

[0076] Perform step 6: select a picture from the test set as the input picture, as shown in Figure 7; classify the input image with the Mobile-Unet network model trained in step 5 and output the classified image, as shown in Figure 8.
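
A minimal sketch of this classification step is given below, assuming the trained Mobile-Unet has been exported with TorchScript. The file names, the 256x256 input size, and the black/white rendering of the two-class output map are illustrative assumptions rather than details taken from the embodiment.

```python
# Hedged sketch of step 6: classify one test image with a trained Mobile-Unet.
# File names, input size, and the binary rendering are assumptions.
import numpy as np
import torch
from PIL import Image
from torchvision import transforms

model = torch.jit.load("mobile_unet_fid.ts")             # hypothetical exported model
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((256, 256)),                        # assumed training resolution
    transforms.ToTensor(),
])

img = Image.open("fid_test_sample.png").convert("RGB")   # hypothetical FID test image
x = preprocess(img).unsqueeze(0)                          # add a batch dimension

with torch.no_grad():
    logits = model(x)                                     # (1, num_classes, H, W)
    pred = logits.argmax(dim=1).squeeze(0).cpu().numpy()  # per-pixel class labels

# Render defect pixels white and background black (assumes two classes: 0/1).
Image.fromarray((pred.astype(np.uint8) * 255)).save("classified_output.png")
```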

Embodiment 2

[0078] Perform step 1, using the FID data set as the picture set;

[0079] Perform steps 2 to 5 to obtain a trained Mobile-Unet network model;

[0080] Perform step 6: select a picture from the test set as the input picture, as shown in Figure 9; classify the input image with the Mobile-Unet network model trained in step 5 and output the classified image, as shown in Figure 10.

Embodiment 3

[0082] Perform step 1, using the FID data set as the picture set;

[0083] Perform steps 2 to 5 to obtain a trained Mobile-Unet network model;

[0084] Perform step 6: select a picture from the test set as the input picture, as shown in Figure 11; classify the input image with the Mobile-Unet network model trained in step 5 and output the classified image, as shown in Figure 12.



Abstract

The invention provides a fabric defect pixel-level classification method based on deep learning. The method is implemented according to the following steps: step 1, collecting defective fabric images to form a picture set; step 2, establishing a MobileNetV2 network model; step 3, training the MobileNetV2 network model on the pre-training set; step 4, establishing a Mobile-Unet network model; step 5, training the Mobile-Unet network model on the training set; and step 6, classifying the input pictures with the trained Mobile-Unet network model and outputting the classified images. The method can segment defective fabrics at the pixel level, uses a smaller model with fewer parameters, and improves the robustness of the algorithm.
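
As a rough illustration of steps 2 and 4, the sketch below builds a Mobile-Unet-style network in PyTorch: a MobileNetV2 encoder from torchvision whose intermediate feature maps feed a small U-Net-style decoder through skip connections. The stage split points, decoder widths, input size, and two-class output are assumptions made for illustration; the patent does not publish its exact layer configuration, so this should be read as a sketch of the general idea rather than the authors' implementation.

```python
# Hedged sketch of a Mobile-Unet-style segmentation network (assumed details):
# MobileNetV2 encoder stages -> U-Net-style decoder with skip connections.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import mobilenet_v2


class MobileUNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Steps 2-3 of the patent pre-train MobileNetV2 separately; loading
        # pretrained weights here is optional in this sketch.
        backbone = mobilenet_v2(weights=None).features
        # Split the feature stack into stages so intermediate activations can
        # serve as skip connections (split indices are assumptions).
        self.stage1 = backbone[:2]    # 1/2 resolution,   16 channels
        self.stage2 = backbone[2:4]   # 1/4 resolution,   24 channels
        self.stage3 = backbone[4:7]   # 1/8 resolution,   32 channels
        self.stage4 = backbone[7:14]  # 1/16 resolution,  96 channels
        self.stage5 = backbone[14:]   # 1/32 resolution, 1280 channels

        def up_block(in_ch, skip_ch, out_ch):
            return nn.Sequential(
                nn.Conv2d(in_ch + skip_ch, out_ch, 3, padding=1, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )

        self.up4 = up_block(1280, 96, 96)
        self.up3 = up_block(96, 32, 32)
        self.up2 = up_block(32, 24, 24)
        self.up1 = up_block(24, 16, 16)
        self.head = nn.Conv2d(16, num_classes, kernel_size=1)

    def forward(self, x):
        s1 = self.stage1(x)
        s2 = self.stage2(s1)
        s3 = self.stage3(s2)
        s4 = self.stage4(s3)
        s5 = self.stage5(s4)

        def up_cat(feat, skip):
            # Upsample to the skip connection's spatial size, then concatenate.
            feat = F.interpolate(feat, size=skip.shape[2:], mode="bilinear",
                                 align_corners=False)
            return torch.cat([feat, skip], dim=1)

        d4 = self.up4(up_cat(s5, s4))
        d3 = self.up3(up_cat(d4, s3))
        d2 = self.up2(up_cat(d3, s2))
        d1 = self.up1(up_cat(d2, s1))
        # Per-pixel class logits, restored to the input resolution.
        logits = self.head(d1)
        return F.interpolate(logits, size=x.shape[2:], mode="bilinear",
                             align_corners=False)


if __name__ == "__main__":
    net = MobileUNet(num_classes=2)
    out = net(torch.randn(1, 3, 256, 256))
    print(out.shape)  # torch.Size([1, 2, 256, 256])
```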

Description

Technical field

[0001] The invention belongs to the technical field of image segmentation and relates to a deep-learning-based pixel-level classification method for fabric defects.

Background technique

[0002] Competition in the textile industry is becoming increasingly fierce. The last process after the cloth is woven is usually fabric defect detection, after which the product grade is evaluated; this places great pressure on inspection. Many domestic and foreign scholars have studied the detection of fabric surface defects. These detection methods can be divided into four categories: statistics-based methods, frequency-domain-based methods, model-based methods, and learning-based methods. Statistics-based methods rely on the choice of parameters and are less accurate and less precise. Frequency-domain methods depend on the selection of the filter and perform poorly on fabrics with complex textures. Model-based methods involve a large amount of computation and have difficulty meeting real-time requirements. …

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T 7/00; G06K 9/62
CPC: G06T 7/0004; G06T 2207/30124; G06F 18/241; G06F 18/214; Y02P 90/30
Inventors: 景军锋, 王震, 张缓缓, 苏泽斌
Owner: XI'AN POLYTECHNIC UNIVERSITY