
A method for identifying wetted areas of multi-color fabrics based on hyperspectral image processing

A hyperspectral image and area recognition technology, applied in the field of textile and clothing performance testing, which solves problems such as low contrast between the wetted and unwetted areas and mis-segmentation of the test area, with the effects of suppressing noise, improving image contrast, and improving adaptability and automation.

Active Publication Date: 2020-11-06
ZHEJIANG SCI-TECH UNIV
Cites: 9 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0008] The purpose of the present invention is to overcome the deficiencies of the above-mentioned prior art and to provide a method for identifying wetted areas of multi-color fabrics based on hyperspectral image processing, which aims to solve the technical problems of prior-art fabric wetness level detection based on image processing algorithms, namely interference from fabric color, fabric texture, uneven and varying illumination, reflections, low contrast between the wetted and unwetted areas, and test areas prone to mis-segmentation.




Embodiment Construction

[0040]In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below with reference to the accompanying drawings and embodiments. However, it should be understood that the specific embodiments described here are only used to explain the present invention, and are not intended to limit the scope of the present invention. Also, in the following description, descriptions of well-known structures and techniques are omitted to avoid unnecessarily obscuring the concept of the present invention.

[0041] The method of the invention has good versatility for identifying the wetted area and detecting the water-staining level of fabrics of different materials. Since there are many kinds of fabrics, the present invention uses only a Scottish worsted tweed fabric made of 100% wool as an implementation example; the image processing is completed in the Matlab R2014b software...
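The patent states that the processing was carried out in Matlab R2014b. Purely as a language-agnostic illustration, the sketch below shows in Python/NumPy how a single near-infrared band image might be pulled out of a hyperspectral cube before segmentation. The function name, the fake cube, the band wavelengths, and the chosen target wavelength (near the ~1450 nm water absorption band) are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def extract_nir_band(cube, wavelengths_nm, target_nm=1450.0):
    """Pick the single band closest to a target near-infrared wavelength.

    cube           : (rows, cols, bands) hyperspectral reflectance array
    wavelengths_nm : (bands,) centre wavelength of each band in nanometres
    target_nm      : hypothetical NIR wavelength where water absorbs strongly
    """
    band_idx = int(np.argmin(np.abs(np.asarray(wavelengths_nm) - target_nm)))
    return cube[:, :, band_idx], band_idx

# Hypothetical usage with a random stand-in for a real hyperspectral cube.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cube = rng.random((64, 64, 120))               # fake 120-band cube
    wavelengths = np.linspace(400.0, 1700.0, 120)  # fake band centres (nm)
    nir_img, idx = extract_nir_band(cube, wavelengths)
    print(f"selected band {idx} at {wavelengths[idx]:.0f} nm, shape {nir_img.shape}")
```

Selecting a band where water absorbs strongly is what gives the large reflectance gap between wetted and unwetted areas that the abstract relies on; the exact band used by the inventors is not disclosed in this summary.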



Abstract

The invention discloses a method for recognizing the wetted area of a multi-color fabric based on hyperspectral image processing. An EVA foam-modified gripper provides a precise boundary for extracting the image of the test area. Because the near-infrared reflectance difference between the wetted and unwetted areas is relatively large, imaging in a near-infrared band not only maximizes image contrast but also effectively overcomes color interference and the noise generated by fabric texture during wetted-area recognition. MNF transformation and a 2% linear stretch of the image remove most abnormal values and further increase image contrast, so that mis-segmentation is better avoided. Iterative threshold segmentation improves the adaptability and automation of the region segmentation, and selective filling of the wetted area effectively corrects reflective wetted regions. With this method, the wetted area of a fabric of any color, especially a multi-color fabric, can be accurately recognized, providing an accurate basis for automatic detection of the water-staining level of the fabric.
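As a rough illustration of the processing chain named in the abstract, the sketch below implements in plain NumPy an MNF-style transform (noise-whitened PCA, with noise covariance estimated from horizontally adjacent pixel differences), a 2% linear stretch, and iterative (mean-of-class-means) threshold segmentation. It is a simplified reading of the abstract under the stated assumptions, not the patented algorithm itself; the gripper-based test-area extraction and the selective filling of reflective wetted regions are omitted.

```python
import numpy as np

def mnf_first_component(cube):
    """MNF-style transform: noise-whitened PCA; returns the first component image.

    Noise covariance is estimated from differences between horizontally
    adjacent pixels, a common simplification (an assumption of this sketch).
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    noise = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, bands)

    cov_signal = np.cov(X, rowvar=False)
    cov_noise = np.cov(noise, rowvar=False) + 1e-9 * np.eye(bands)

    # Generalized eigenproblem cov_signal v = lambda cov_noise v; order by SNR.
    evals, evecs = np.linalg.eig(np.linalg.solve(cov_noise, cov_signal))
    order = np.argsort(evals.real)[::-1]
    first = X @ evecs[:, order[0]].real
    return first.reshape(rows, cols)

def linear_stretch_2pct(img):
    """Clip to the 2nd/98th percentiles and rescale to [0, 1]."""
    lo, hi = np.percentile(img, [2, 98])
    return np.clip((img - lo) / (hi - lo + 1e-12), 0.0, 1.0)

def iterative_threshold(img, tol=1e-4, max_iter=100):
    """Iterative threshold selection: repeat t = mean of the two class means."""
    t = img.mean()
    for _ in range(max_iter):
        low, high = img[img <= t], img[img > t]
        if low.size == 0 or high.size == 0:
            break
        t_new = 0.5 * (low.mean() + high.mean())
        if abs(t_new - t) < tol:
            t = t_new
            break
        t = t_new
    # Which of the two classes is the wetted area depends on the sign of the
    # MNF component; a real implementation would resolve this explicitly.
    return img > t

# Hypothetical end-to-end usage on a fake cube.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cube = rng.random((64, 64, 60))
    comp = mnf_first_component(cube)
    stretched = linear_stretch_2pct(comp)
    mask = iterative_threshold(stretched)
    print("segmented pixels:", int(mask.sum()))
```

The 2% stretch acts as the outlier clipping described in the abstract: extreme values beyond the 2nd and 98th percentiles are saturated before the data-driven threshold is computed, which keeps the iterative threshold from being dragged by specular highlights.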

Description

【Technical field】 [0001] The invention relates to the technical field of textile and garment performance testing, and in particular to a method for identifying wetted areas of multi-color fabrics based on hyperspectral image processing. 【Background art】 [0002] The identification of the wetted area of the fabric in traditional manual assessment of the fabric's water-staining grade is easily affected by the assessor's physiology and psychology, the lighting environment, the fabric color, and so on, so the experimental results have large errors and poor consistency. To solve these problems, image processing-based methods for identifying the fabric's wetted area have appeared for the spray method of the AATCC standard. [0003] The difficulty of image processing-based recognition of the fabric's wetted area is that the contrast between the wetted and unwetted areas to be segmented is small, and the segmentation results are greatly affected by uneven illumination...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01N13/00; G06T7/00; G06T7/11; G06T7/187; G06T7/136
CPC: G01N13/00; G06T7/0004; G06T7/11; G06T7/187; G06T2207/10024; G06T2207/30124
Inventor: 蒋晶晶; 祝成炎; 金肖克
Owner: ZHEJIANG SCI-TECH UNIV